Implementation of the feedforwardnet Neural Network
I have implemented a very simple neural network to estimate a sine function. The following is the code for generating and training the network:
% Generate Data
dataSize = 1000;
x = linspace(0, 2*pi, dataSize);
y = sin(x);
hold off
plot(x,y)
hold on
% Add noise to Data
yInput = y+randn(1,dataSize)./5;
% No need to separate training, validation, and test data; the train function does that automatically.
% Generate network: a very simple two-layer model with two nodes in the hidden layer.
net = feedforwardnet([2]);
% Train Network
net = train(net,x,yInput);
% Show result of trained network
yNN = net(x);
figure
plot(x,yNN, '*')
hold on
plot(x,y, '.')
Now my question: how is this network actually implemented? According to the literature, I should be able to recreate the network by copying the weights and biases, using the following function:
function [y] = mynet(net, x_val)
%MYNET A manual implementation of the feedforward network, to demonstrate functionality.
W1 = net.IW{1};     % input-to-hidden weights
b1 = net.b{1};      % hidden layer biases
W2 = net.LW{2,1};   % hidden-to-output weights
b2 = net.b{2};      % output layer bias
y = purelin(W2*tansig(W1*x_val + b1) + b2);
end
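A minimal side-by-side check of the two outputs (just an illustrative comparison, assuming the mynet function above is on the path):
% Compare the trained network's output with the manual recreation
yManual = mynet(net, x);
figure
plot(x, net(x), '*')
hold on
plot(x, yManual, 'o')
legend('net(x)', 'mynet(net, x)')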
However, the original net(x) and mynet(net, x) produce completely different results, even though the weights and biases are exactly the same. The transfer functions are also copied over directly; you can extract them from the network with:
>> net.layers{1}.transferfcn
ans =
'tansig'
>> net.layers{2}.transferfcn
ans =
'purelin'
Can anyone suggest where my implementation of the neural network is wrong? I am really hoping that it is a simple mistake, but I just can't see it at the moment.
Many thanks in advance
2 Comments
Joshua Ogbebor
on 21 Feb 2021
Hi
Can you run the code net.inputWeights{1,1}.weightFcn and see if it is dot product?
Accepted Answer
Steven Lord
on 22 Feb 2021
How are you handling the preprocessing functions and postprocessing functions?
net = feedforwardnet([2]);
net.inputs{1}.processFcns
net.outputs{2}.processFcns
See the description of the inputs, outputs, etc. subobjects on this documentation page for more information.
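If the defaults are in play, the input is mapped by the input processing functions before the weights are applied, and the output mapping has to be reversed afterwards. Below is a minimal sketch of the manual forward pass with those steps included; it assumes the stock feedforwardnet defaults ('removeconstantrows' and 'mapminmax' on both input and output), and the helper name mynetProcessed is only for illustration. Check net.inputs{1}.processFcns and net.outputs{2}.processFcns on your own network first.
function y = mynetProcessed(net, x_val)
%MYNETPROCESSED Manual forward pass including input/output processing (sketch).
% Apply each input processing function in order
xp = x_val;
for k = 1:numel(net.inputs{1}.processFcns)
    xp = feval(net.inputs{1}.processFcns{k}, 'apply', xp, ...
        net.inputs{1}.processSettings{k});
end
% Same forward pass as in the question, on the processed input
W1 = net.IW{1};   b1 = net.b{1};
W2 = net.LW{2,1}; b2 = net.b{2};
yp = purelin(W2*tansig(W1*xp + b1) + b2);
% Undo the output processing functions in reverse order
y = yp;
for k = numel(net.outputs{2}.processFcns):-1:1
    y = feval(net.outputs{2}.processFcns{k}, 'reverse', y, ...
        net.outputs{2}.processSettings{k});
end
end
With that in place, mynetProcessed(net, x) should track net(x) closely if the processing functions are indeed the missing piece.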
More Answers (1)
Joshua Ogbebor
on 21 Feb 2021
In general, net(x) is equivalent to sim(net, x), and this implementation depends on many properties of the feedforward network that you may not be aware of. I copied this from the sim function page and thought it would be helpful. You could investigate and see if any one of these could be the issue; a quick way to print them for your trained network is sketched after the list.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
sim uses these properties to simulate a network net.
net.numInputs, net.numLayers
net.outputConnect, net.biasConnect
net.inputConnect, net.layerConnect
These properties determine the network’s weight and bias values and the number of delays associated with each weight:
net.IW{i,j}
net.LW{i,j}
net.b{i}
net.inputWeights{i,j}.delays
net.layerWeights{i,j}.delays
These function properties indicate how sim applies weight and bias values to inputs to get each layer’s output:
net.inputWeights{i,j}.weightFcn
net.layerWeights{i,j}.weightFcn
net.layers{i}.netInputFcn
net.layers{i}.transferFcn
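For the two-layer network in the question, these can be inspected directly; the comments show the usual defaults for a plain feedforwardnet, so verify against your own output:
% Inspect the simulation-relevant properties of the trained network
net.inputWeights{1,1}.weightFcn   % typically 'dotprod'
net.layerWeights{2,1}.weightFcn   % typically 'dotprod'
net.layers{1}.netInputFcn         % typically 'netsum'
net.layers{2}.netInputFcn         % typically 'netsum'
net.layers{1}.transferFcn         % 'tansig'
net.layers{2}.transferFcn         % 'purelin'
net.inputWeights{1,1}.delays      % typically 0 for a static network
net.layerWeights{2,1}.delays      % typically 0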