divideFcn & Validation Checks for Layer recurrent neural network
So, I want to use layrecnet for my NN. Since the data is time-stepped and can't be divided randomly, I want the training to be done on contiguous blocks rather than random samples. But train() later complains about my setting rNNet.divideFcn = 'divideblock'; any ideas?
Also: the Validation Checks counter in nntraintool won't budge (frozen at zero). Isn't it supposed to count epochs where the error on the validation data (outside the training set) fails to improve? Or could it be that my error keeps shrinking, so the counter never increments and therefore stays at zero? What have I misunderstood?
Here's some of the code:
if true
% code
[formatedInput, ~] = tonndata(allDataInput,true,false);
%%Setup NNetwork and training
layerDelays = 1:2; % def = 1:2, sets the "delay depth" for the NN
hiddenSizes = [8 8]; % def = 10, sets the number of neurons in each hidden layer
trainFcn = 'trainrp'; % def = 'trainlm'; alternatives: 'trainbr','trainscg','trainrp'
rNNet = layrecnet(layerDelays,hiddenSizes,trainFcn);
rNNet.trainParam.epochs = 10000; %goes pretty fast
[Xs,Xi,Ai,Ts] = preparets(rNNet,formatedInput,allDataOutput); % network, training data, target data
%rNNet.divideFcn = 'divideblock';
%rNNet.divideParam.trainRatio = 0.6;
%rNNet.divideParam.testRatio = 0.2;
%rNNet.divideParam.valRatio = 0.2;
%[trainInd,valInd,testInd] = divideblock(length(Ts),0.6,0.2,0.2);
[rNNet, trainingRecord] = train(rNNet,Xs,Ts,Xi,Ai);
%view(rNNet)
Y = rNNet(Xs,Xi,Ai);
perf = perform(rNNet,Y,Ts)
%%plot Performance
plotperf(trainingRecord)
%plotperformance(trainRecord)
%%Manually Test NNetwork
testInput = table2array(nikoGrip(2001:end, 2:end))';
testOutput = table2cell(nikosGripData(2001:end, 1))';
[formTestInput,wasMatrix] = tonndata(testInput,true,false);
testRes = rNNet(formTestInput);
%%Plot Manual Test
plot(cell2mat(testRes))
hold on
plot(cell2mat(testOutput))
%end code
end
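For reference, the frozen Validation Checks counter and the divide function are likely connected: recurrent networks created with layrecnet default to divideFcn = 'dividetrain', which puts all timesteps in the training set, so no validation error is ever computed and the counter stays at zero. A minimal sketch of how contiguous-block division is usually enabled (untested against this dataset; variable names follow the code above):

```matlab
% Sketch: enable contiguous train/val/test blocks on a recurrent net.
rNNet = layrecnet(1:2,[8 8],'trainrp');
rNNet.divideFcn  = 'divideblock';   % contiguous blocks instead of random samples
rNNet.divideMode = 'time';          % divide along timesteps (dynamic networks)
rNNet.divideParam.trainRatio = 0.6;
rNNet.divideParam.valRatio   = 0.2;
rNNet.divideParam.testRatio  = 0.2;
[Xs,Xi,Ai,Ts] = preparets(rNNet,formatedInput,allDataOutput);
[rNNet,trainingRecord] = train(rNNet,Xs,Ts,Xi,Ai); % Validation Checks should now move
```

With a validation set actually present, nntraintool's counter increments each epoch the validation error fails to improve, and training stops early at max_fail.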
Many thanx, M!
Answers (1)
Greg Heath
on 28 Sep 2017
Why LAYRECNET instead of the BETTER and EASIER NARXNET ???
The best way to get a successful result is to begin with as small a program as possible, with as many defaults as possible, on a MATLAB example dataset. For example, start with MINOR modifications of the documentation example code:
help narxnet
doc narxnet
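For concreteness, the core example from the narxnet documentation (simplenarx_dataset ships with the toolbox) is only a few lines:

```matlab
% Minimal narxnet example, adapted from "doc narxnet".
[X,T] = simplenarx_dataset;
net = narxnet(1:2,1:2,10);            % input delays, feedback delays, hidden nodes
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);
net = train(net,Xs,Ts,Xi,Ai);
Y = net(Xs,Xi,Ai);
perf = perform(net,Y,Ts)
```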
Next, run as many MATLAB example datasets as you need to get comfortable:
help nndatasets
doc nndatasets
Typically, with regression and classification, all you need to vary is
1. Number of hidden nodes, H
2. 10 trials of random initial weights for each trial value of H
Using the smallest successful value of H should result in the best net to use on nontraining (i.e., validation, testing and unseen) data.
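As a sketch, the search over H with multiple random restarts can look like the following (Hmax, Ntrials, and the use of divideblock are illustrative choices, not part of the answer above):

```matlab
% Sketch: H-by-Ntrials search; keep the net with the best validation performance.
[X,T] = simplenarx_dataset;           % stand-in for your own data
Hmax = 10; Ntrials = 10;
bestPerf = Inf;
for h = 1:Hmax
    for trial = 1:Ntrials
        net = narxnet(1:2,1:2,h);
        net.divideFcn = 'divideblock'; % so a validation set exists
        [Xs,Xi,Ai,Ts] = preparets(net,X,{},T);
        net = configure(net,Xs,Ts);    % fresh random initial weights
        [net,tr] = train(net,Xs,Ts,Xi,Ai);
        if tr.best_vperf < bestPerf
            bestPerf = tr.best_vperf;
            bestNet = net; bestH = h;
        end
    end
end
```

Preferring the smallest h that reaches acceptable validation error, rather than the absolute minimum, follows the advice above about generalization to non-training data.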
However, with timeseries you also have to vary the number of input and output delays.
I have found that the large peaks of the ABSOLUTE VALUES of the target autocorrelation function and the input-target cross-correlation function are very useful for choosing additional lags when the default 1:2 is not sufficient.
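One way to inspect those peaks, assuming xcorr (Signal Processing Toolbox) and zscore (Statistics Toolbox) are available; the 0.2 threshold and maxlag are illustrative:

```matlab
% Sketch: candidate delays from correlation peaks.
t = zscore(cell2mat(T));              % target series as a row vector
x = zscore(cell2mat(X));              % input series
maxlag = 50;
acf = xcorr(t,t,maxlag,'coeff');      % target autocorrelation
ccf = xcorr(t,x,maxlag,'coeff');      % input-target cross-correlation
lags = -maxlag:maxlag;
candFeedbackDelays = lags(abs(acf) > 0.2 & lags > 0)  % for feedback delays
candInputDelays    = lags(abs(ccf) > 0.2 & lags > 0)  % for input delays
```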
The beauty of starting this way is that
1. It is straightforward
2. We can help more easily.
Hope this helps.
Thank you for formally accepting my answer
Greg
PS: I have posted zillions of examples on both the NEWSREADER and ANSWERS
greg narxnet
greg narxnet Ntrials