Why are the results of my Elman network different every time?
Heather Zhang on 31 Jul 2015
Commented: Greg Heath on 13 Aug 2015
I created an Elman network, but the results are different every time I run the code: I get different "errors", "regression", and "avg_error" values. Could anyone tell me why? Thanks so much!
Here is the code.
clear all
load('input4_train.mat');
load('output4_train.mat');
load('input4_test.mat');
load('output4_test.mat');
inputSeries = tonndata(input4_train,false,false);
targetSeries = tonndata(output4_train,false,false);
inputTest = tonndata(input4_test,false,false);
outputTest = tonndata(output4_test,false,false);
% Create a Network
hiddenLayerSize = 5;
net=newelm(inputSeries,targetSeries,[10,3,1], {'tansig','logsig','purelin'});
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net.trainParam.epochs = 2000;
% Initialize the network
net = init(net);
% Train the Network
net = adapt(net,inputSeries,targetSeries);
% Test the Network
outputs = sim(net,inputTest);
errors = gsubtract(outputTest,outputs);
error = cell2mat(errors);
for i = 1:10
    error(i) = abs(error(i));
end
avg_error = sum(error)/10;
performance = perform(net,outputTest,outputs)
% View the Network
view(net)
% Plots
figure, plotregression(outputTest,outputs)
figure, plotresponse(outputTest,outputs)
figure, ploterrcorr(errors)
3 Comments
Greg Heath on 1 Aug 2015
Edited: Walter Roberson on 2 Aug 2015
% load('input4_train.mat');
% load('output4_train.mat');
% load('input4_test.mat');
% load('output4_test.mat');
%
% inputSeries = tonndata(input4_train,false,false);
% targetSeries = tonndata(output4_train,false,false);
% inputTest = tonndata(input4_test,false,false);
% outputTest = tonndata(output4_test,false,false);
whos
% % Create a Network
% hiddenLayerSize = 5;
Value never used
% net=newelm(inputSeries,targetSeries,[10,3,1], {'tansig','logsig','purelin'});
No justification for 3 hidden layers. One is sufficient.
% % Setup Division of Data for Training, Validation, Testing
% net.divideParam.trainRatio = 70/100;
% net.divideParam.valRatio = 15/100;
% net.divideParam.testRatio = 15/100;
The above 3 commands are unnecessary; those are the default values.
% net.trainParam.epochs = 2000;
%
% % Initial net
% net = init(net);
%
% % Train the Network
% net = adapt(net,inputSeries,targetSeries);
%
% % Test the Network
% outputs = sim(net,inputTest);
%
% errors = gsubtract(outputTest,outputs);
% error = cell2mat(errors);
% for i = 1:10
% error(i)=abs(error(i));
% end
% avg_error = sum(error)/10;
The above 5 commands are unnecessary.
help mae
doc mae
% performance = perform(net,outputTest,outputs)
% % View the Network
% view(net)
%
% % Plots
% figure, plotregression(outputTest,outputs)
% figure, plotresponse(outputTest,outputs)
% figure, ploterrcorr(errors)
Defaults: the above 3 commands are unnecessary.
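For reference, here is a rough sketch of what those comments amount to (not Greg's code): one hidden layer whose size of 10 is only an illustrative choice, the default data division, and mae in place of the hand-rolled abs/sum loop.
% Sketch only: single hidden layer, default data division, mae as the metric
net = newelm(inputSeries,targetSeries,10,{'tansig','purelin'});
net.trainParam.epochs = 2000;
net = adapt(net,inputSeries,targetSeries);
outputs = sim(net,inputTest);
errors = gsubtract(outputTest,outputs);
avg_error = mae(cell2mat(errors))   % mean absolute error replaces the loop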
Accepted Answer
Walter Roberson on 31 Jul 2015
Neural networks usually initialize their weights randomly. If you want repeatability, you can initialize the weights yourself or set the random number generator seed.
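For example, a minimal sketch of the seeding approach, assuming a MATLAB release where rng is available (the seed 0 is arbitrary; any fixed value works):
rng(0);   % fix the seed so init() draws the same initial weights every run
net = newelm(inputSeries,targetSeries,[10,3,1],{'tansig','logsig','purelin'});
net = init(net);                             % initial weights now identical on every run
net = adapt(net,inputSeries,targetSeries);   % training starts from the same point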