If I want to use dividerand and retrain 10 times, should dividerand also be inside the loop of 10 iterations, or should it be outside the loop?
for j = 1:100                          % loop over hidden layer sizes
    net = feedforwardnet(j);
    % Set the division properties after creating the network;
    % feedforwardnet would otherwise overwrite them with defaults.
    net.divideFcn = 'dividerand';
    net.divideMode = 'sample';
    net.divideParam.trainRatio = 70/100;
    net.divideParam.valRatio   = 15/100;
    net.divideParam.testRatio  = 15/100;
    for i = 1:10                       % retrain 10 times
        net = configure(net, inputs, targets);    % reinitialize weights
        [net, tr, y, e] = train(net, inputs, targets);
        save(sprintf('net_%d_%d.mat', j, i), 'net');  % unique file per run
    end
end
If the function "configure" (together with rng) can define different initial weights, what about the random division of the data (which allocates 70/15/15 among train/val/test)? Should we use a different division of the dataset for each number of neurons or for each repetition, or should the division of the dataset be the same for each? Thanks
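For reference, here is one way each repetition could get its own reproducible split. This is only a sketch, assuming `inputs` and `targets` already exist in the workspace and using a fixed hidden layer size of 10 for illustration; with `'dividerand'`, `train` draws a fresh random division on every call, and seeding `rng` per iteration makes both the initial weights and that division repeatable:

rng(0);                               % base seed for reproducibility
net = feedforwardnet(10);             % example size; the question loops over j
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
for i = 1:10
    rng(i);                           % seed i -> different but repeatable
                                      % weights AND data division per repetition
    net = configure(net, inputs, targets);    % new random initial weights
    [net, tr] = train(net, inputs, targets);  % dividerand resplits here
    % tr.trainInd / tr.valInd / tr.testInd record the division actually used
end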