Final delay states for use in closed loop simulation – how do I get them if I have several external predictors?
I have trained a NARXNET to the point where I'm satisfied with its performance, but almost every time I convert the net into closed-loop form to predict ahead without targets, the first few predicted values are way off where they should be. For example, when simulating 60 steps ahead with external predictors, it is very common that predictions 1:5 are much less accurate than the remaining 6:60.
I suspect this problem has to do with the layer states I'm using in the closed-loop simulation. Some time ago Mark Hudson Beale posted an example showing how to acquire the correct delay states to use in closed-loop simulation:
%(This is just the last part of the example)
% Initial 2nd layer states for closed loop continuation will be the
% processed second input's final states. Initial 1st layer states
% will be zeros, as they have no delays associated with them.
Ai2 = cell2mat(Xf(2,:));
for i = 1:length(net.inputs{2}.processFcns)
    fcn = net.inputs{2}.processFcns{i};
    settings = net.inputs{2}.processSettings{i};
    Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
% Closed loop simulation on X2 continues from open loop state after X.
Y2 = sim(netc,X2,Xi2,Ai2);
When I run the code on his example it works fine, but my own problem has several external predictors, and when I try the same approach there I get an error:
Error using mat2cell (line 97)
Input arguments, D1 through D2, must sum to each dimension of the input matrix size, [37 2].
This happens because what comes out of the loop (Ai2) is a 27x2 matrix that has just been processed by the net's process function mapminmax, so after prepending zeros(10,2) the matrix is 37x2 and the hard-coded row split [10 1] no longer adds up.
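I assume the hard-coded sizes in the final mat2cell call come from that example's architecture (10 hidden neurons, 1 output, 2 layer delays, a single series), so presumably they would have to be read from the network object instead. A rough, untested sketch of what I mean (netc is the closed-loop net from the example, and I'm assuming a two-layer net and a single time series):
% Rough, untested sketch: take the sizes from the closed-loop net
% instead of hard-coding them (assumes a two-layer net and Q = 1).
H = netc.layers{1}.size; % hidden layer size (10 in the example)
O = netc.layers{2}.size; % output layer size (1 in the example)
LD = netc.numLayerDelays; % number of layer delays (2 in the example)
Ai2 = mat2cell([ zeros(H,LD); Ai2 ], [ H O ], ones(1,LD));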
Could someone advise me on how to get the correct delay conditions for closed loop simulation in a situation with multiple external predictors?
Thanks.
Accepted Answer
Greg Heath
on 2 Sep 2015
FROM: On Designing a Feedback Time-Series Neural Network for Operational Deployment http://www.mathworks.com/matlabcentral/newsreader/view_thread/332147#912806
PLEASE READ THE REFERENCE FIRST !!!
[ X,T ] = pollution_dataset;
x = cell2mat(X); t = cell2mat(T);
[ I N ] = size(x); % [ 8 508 ]
[ O N ] = size(t); % [ 3 508 ]
vart = mean( var( t',1) ) % 102.91
neto = narxnet;% (ID,FD,H) = (1:2,1:2,10)
neto.divideFcn = 'divideblock';
[ Xo Xoi Aoi To ] = preparets( neto, X, {}, T );
to = cell2mat( To ); varto = mean( var( to',1) )% 102.62
rng(4151941)
[ neto tro Yo Eo Xof Aof ] = train( neto, Xo, To, Xoi, Aoi );
% Yo = neto(Xo,Xoi,Aoi); Eo = gsubtract(To,Yo)
view(neto)
% Google: wikipedia/R-squared
Rsqo = 1 - mse(Eo)/varto % 0.70731
[ netc Xci Aci ]= closeloop( neto, Xoi, Aoi );
view( netc )
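% Note: preparets called on the closed-loop net returns Xci and Aci
% already sized for netc, whatever the number of external input series,
% so the manual mat2cell construction from the question is not needed.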
[ Xc Xci Aci Tc ] = preparets( netc, X, {}, T );
tc = cell2mat( Tc ); vartc = mean( var( tc',1) ) %102.62
[ Yc Xcf Acf ] = netc( Xc, Xci, Aci );
Ec = gsubtract( Tc, Yc);
Rsqc = 1 - mse( Ec ) / vartc % 0.39263
if Rsqc < 0.95*Rsqo
    [ netc trc Yc2 Ec2 Xcf2 Acf2 ] = train( netc, Xc, Tc, Xci, Aci );
    view(netc)
    Rsqc2 = 1 - mse(Ec2) / vartc % 0.5087
end
% DESIGN IMPROVEMENTS ARE DISCUSSED IN THE REFERENCE !!!
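To actually predict past the end of the data, a minimal continuation sketch (not part of the reference; Xnew is a stand-in here, in practice it would hold genuinely new external predictor values):
% Continuation sketch: predict beyond the data from the final states of
% the closed-loop run above (use Xcf2/Acf2 instead if continuing from
% the retrained netc's own simulation). The last 10 external input
% vectors are reused only to illustrate the call.
Xnew = X(end-9:end);
[ Ynew Xnewf Anewf ] = netc( Xnew, Xcf, Acf );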