# How does Matlab calculate the feedforward of a neural network?


Everton Estracanholli on 17 Aug 2023

Commented: Everton Estracanholli on 18 Aug 2023

I am working on an application that uses a neural network to process an optical absorption spectrum. After training, the network was saved to a ".mat" file. My application must be compiled into a standalone ".exe" that performs these calculations consecutively, as quickly as possible. When I compile the executable from the ".m" script that loads the network, the file becomes relatively large (~100 MB) and processing is very slow, presumably because all of the MATLAB libraries are loaded on each cycle. To work around this, I am trying to extract the weights, activation functions, and biases from the ".mat" file and recreate the feedforward step manually; in earlier tests, this approach was much faster.
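As an aside on the deployment problem itself (a suggestion based on standard Deep Learning Toolbox functionality, not something from the original post): `genFunction` can turn a trained network object into a plain MATLAB function containing only matrix math, which typically compiles into a much smaller and faster executable than the full network object. A minimal sketch:

```matlab
% Sketch, assuming Deep Learning Toolbox's genFunction is available.
% Generate a standalone function that reproduces the network's full
% feedforward pass, including any input/output processing steps.
load('redeNeuralTreinada_50N30N20.mat', 'redeNeuralTreinada');
genFunction(redeNeuralTreinada, 'redeNeuralFcn', 'MatrixOnly', 'yes');

% The generated redeNeuralFcn.m has no dependency on the network object:
y = redeNeuralFcn(espectro);
```

With `'MatrixOnly','yes'` the generated file uses only basic matrix operations, so it is well suited to MATLAB Compiler deployment.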

However, when I use the parameters extracted from the ".mat" file to recreate the feedforward pass, the result I obtain differs from the result of calling the network object directly. Does MATLAB apply these parameters in a way that differs from my approach? Thank you very much.

cd('C:\Processamento');
espectro = load('espectro.txt'); % Data for analysis

% Trained neural network
arquivo_mat = 'C:\Processamento\redeNeuralTreinada_50N30N20.mat';
load(arquivo_mat, 'redeNeuralTreinada');

redeNeuralTreinada.numInputs
ans = 1

redeNeuralTreinada.numLayers
ans = 4

redeNeuralTreinada.outputConnect
ans = 1×4 logical array
   0   0   0   1

redeNeuralTreinada.biasConnect
ans = 4×1 logical array
   1
   1
   1
   1

redeNeuralTreinada.inputConnect
ans = 4×1 logical array
   1
   0
   0
   0

redeNeuralTreinada.layerConnect
ans = 4×4 logical array
   0   0   0   0
   1   0   0   0
   0   1   0   0
   0   0   1   0

redeNeuralTreinada.inputWeights{1}.delays
ans = 0

redeNeuralTreinada.layerWeights{2,1}.delays
ans = 0

redeNeuralTreinada.layerWeights{3,2}.delays
ans = 0

redeNeuralTreinada.layerWeights{4,3}.delays
ans = 0

redeNeuralTreinada.layers{1}.netInputFcn
ans = 'netsum'

redeNeuralTreinada.layers{2}.netInputFcn
ans = 'netsum'

redeNeuralTreinada.layers{3}.netInputFcn
ans = 'netsum'

redeNeuralTreinada.layers{4}.netInputFcn
ans = 'netsum'

redeNeuralTreinada.layers{1}.transferFcn
ans = 'tansig'

redeNeuralTreinada.layers{2}.transferFcn
ans = 'tansig'

redeNeuralTreinada.layers{3}.transferFcn
ans = 'tansig'

redeNeuralTreinada.layers{4}.transferFcn
ans = 'purelin'

% Extract the synaptic weights and activation biases
layerWeights_1 = redeNeuralTreinada.IW{1};
layerWeights_1_to_layerWeights_2 = redeNeuralTreinada.LW{2, 1};
layerWeights_2_to_layerWeights_3 = redeNeuralTreinada.LW{3, 2};
layerWeights_3_to_output = redeNeuralTreinada.LW{4, 3};
bias_1 = redeNeuralTreinada.b{1};
bias_2 = redeNeuralTreinada.b{2};
bias_3 = redeNeuralTreinada.b{3};
bias_4 = redeNeuralTreinada.b{4};

% My feedforward neural network
layer_1 = layerWeights_1 * espectro + bias_1;
layer_1_output = tansig(layer_1);
layer_2 = layerWeights_1_to_layerWeights_2 * layer_1_output + bias_2;
layer_2_output = tansig(layer_2);
layer_3 = layerWeights_2_to_layerWeights_3 * layer_2_output + bias_3;
layer_3_output = tansig(layer_3);
layer_output = layerWeights_3_to_output * layer_3_output + bias_4;
My_Output = layer_output

My_Output = 2.2572

% MATLAB prediction from the .mat file
arquivo_mat = load('redeNeuralTreinada_50N30N20.mat');
MAT_Output = arquivo_mat.redeNeuralTreinada(espectro)

MAT_Output = 0.6188
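For context, a common cause of exactly this discrepancy (a hedged guess, since the training script is not shown): networks created with `fitnet`/`feedforwardnet` store input and output processing functions (by default `removeconstantrows` and `mapminmax`) that the network object applies automatically but that the manual matrix math above skips. A sketch of the feedforward with those steps included:

```matlab
% Sketch: manual feedforward including the network's stored processing
% steps. This assumes mapminmax is the first (or only) process function;
% check redeNeuralTreinada.inputs{1}.processFcns to confirm which
% functions and processSettings indices apply to this network.
xp = mapminmax('apply', espectro, redeNeuralTreinada.inputs{1}.processSettings{1});

a1 = tansig(redeNeuralTreinada.IW{1}   * xp + redeNeuralTreinada.b{1});
a2 = tansig(redeNeuralTreinada.LW{2,1} * a1 + redeNeuralTreinada.b{2});
a3 = tansig(redeNeuralTreinada.LW{3,2} * a2 + redeNeuralTreinada.b{3});
yp =        redeNeuralTreinada.LW{4,3} * a3 + redeNeuralTreinada.b{4};

% Undo the output normalization to return to the original target units.
y = mapminmax('reverse', yp, redeNeuralTreinada.outputs{4}.processSettings{1})
```

If the network lists more than one process function, each must be applied (in order) on the input side and reversed (in reverse order) on the output side.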


### Accepted Answer

Steven Lord on 17 Aug 2023
