
How is a NARX network actually used in closed loop?

Thanks Greg for your reply. I have evaluated the network open loop and I think it gives good results, but for now I do not believe them. My questions are as follows:
1 - I have evaluated the accuracy open loop; how would I do it closed loop? For a first closed-loop simulation I created an input matrix from the last two values of inputSeries plus, say, 30 inputs from outside the training sample, giving a 2 x 32 array. To create the newTargets matrix I take the last value of targetSeries and pad with 31 NaNs, which gives a 1 x 32 matrix. From here I simulate the network, and the values it returns, plotted together with the real targets, are a bit displaced. (I would like to send a screenshot of that graph; please tell me how to send it.) I think I am performing the operation correctly, although I would like to evaluate the closed-loop accuracy very strictly.
2 - My second question: suppose I want to use the net for real, i.e. today is March 4, 2013. From a platform I get the data (the RSI and EMA inputs), from which I build another matrix, newInputs, containing the last 2 values of inputSeries plus today's RSI and EMA values. (I think I am doing well so far.) To create another array, newTarget, I take the last value of targetSeries and pad with 2 NaNs, so that newInput (2 x 3) and newTarget (1 x 3) have matching dimensions, and I run the simulation to obtain the predicted value for March 5, 2013. To keep iterating I feed in the value of March 5 and get the value for March 6, much like a NAR network does. That is, with the preparets command:
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,targetSeries); and then evaluate the network outside the sample?
Basically these are my doubts.
Many thanks Greg

Accepted Answer

Greg Heath
Greg Heath on 5 Mar 2013
Edited: Greg Heath on 5 Mar 2013
I have evaluated the network open loop and I think it gives good results,
What are I, N, ID, FD, H, Ntrn, Nval, Ntst, R2trn, R2trna, R2val and R2tst ?
but for now I do not believe them.
Why?
What do you get from the same data using closeloop?
My questions are as follows:
1 - I have evaluated the accuracy open loop; how would I do it closed loop? For a first closed-loop simulation I created an input matrix from the last two values of inputSeries plus, say, 30 inputs from outside the training sample, giving a 2 x 32 array. To create the newTargets matrix I take the last value of targetSeries and pad with 31 NaNs, which gives a 1 x 32 matrix.
I don't understand where this data is coming from.
From here I simulate the network, and the values it returns, plotted together with the real targets, are a bit displaced. (I would like to send a screenshot of that graph; please tell me how to send it.) I think I am performing the operation correctly, although I would like to evaluate the closed-loop accuracy very strictly.
There is a way to post plots on ANSWERS. Find out how. Right now I need to see your code.
2 - My second question: suppose I want to use the net for real, i.e. today is March 4, 2013. From a platform I get the data (the RSI and EMA inputs), from which I build another matrix, newInputs, containing the last 2 values of inputSeries plus today's RSI and EMA values. (I think I am doing well so far.) To create another array, newTarget, I take the last value of targetSeries and pad with 2 NaNs, so that newInput (2 x 3) and newTarget (1 x 3) have matching dimensions, and I run the simulation to obtain the predicted value for March 5, 2013. To keep iterating I feed in the value of March 5 and get the value for March 6, much like a NAR network does. That is, with the preparets command:
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,targetSeries); and then evaluate the network outside the sample?
Basically these are my doubts.
I don't understand. Post the code.
Greg
  2 Comments
FRANCISCO
FRANCISCO on 6 Mar 2013
I have several scripts to evaluate autocorrelation, cross-correlation, hidden layers, etc. I will post the script that creates the network, checks it in closed loop, and attempts several iterations several steps ahead with the predicted outputs:
I have two files:
p (the inputs), consisting of 8 columns and 2200 rows
t (the targets), consisting of 1 column and 2200 rows
if true
p1=p(1:8,1:2100);
p2=p(1:8,2101:end);
t1=t(1,1:2100);
t2=t(1,2101:end);
%convert the data to cell (nndata) form
inputSeries = tonndata(p1,true,false);
targetSeries = tonndata(t1,true,false);
%compute MSE00 and MSE00a
MSE00=mean(var(t1,1)); % MSE00=0.0095
MSE00a=mean(var(t1,0)); % MSE00a=0.0095
%number of layers and delays
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 9;
%compute other parameters
[I N]=size(p); % I=8 ; N=2200
[O N]=size(t); % O=1 ; N=2200
Neq=N*O; % Neq=2200
Nw=(I+1)*hiddenLayerSize+(hiddenLayerSize+1)*O; % Nw=91
Ntrneq=0.7*Neq; % Ntrneq=1540
Ndof=Ntrneq-Nw; % Ndof=1449
%create the network
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
%prepare the data
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
%divide the data into training, validation and test sets
net.divideFcn='divideblock';
net.divideParam.trainRatio=0.70;
net.divideParam.valRatio=0.15;
net.divideParam.testRatio=0.15;
net.trainParam.goal=0.01*MSE00;
%train the network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
%outputs and errors
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
MSE = perform(net,targets,outputs); % MSE= 7.7982e-5
MSEa=Neq*MSE/(Neq-Nw); % MSEa= 8.1346e-5
R2=1-MSE/MSE00; % R2=0.9918
R2a=1-MSEa/MSE00a; % R2a=0.9915
MSEtrn=tr.perf(end); %MSEtrn=7.1240e-5
MSEval=tr.vperf(end); %MSEval=9.4383e-5
MSEtst=tr.tperf(end); %MSEtst=9.3139e-5
R2trn=1-MSEtrn/MSE00; %R2trn=0.9925
R2val=1-MSEval/MSE00; %R2val=0.9901
R2tst=1-MSEtst/MSE00; %R2tst=0.9902
precisiones=[MSE MSEa R2 R2a R2trn R2val R2tst];
% we close the loop and check with 50 values of p2
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
%number of predictions
NumberOfPredictions = 50;
% create new inputs and new targets
s=cell2mat(inputSeries);
t4=cell2mat(targetSeries);
a=s(1:8,2098:2100);
b=p2(1:8,1:50);
newInputSeries=[a b];
c=t4(1,2099:2100);
d=nan(1,51);
newTargetSet=[c d];
%convert the new inputs and new targets to cell (nndata) form
newInputSeries=tonndata(newInputSeries,true,false);
newTargetSet=tonndata(newTargetSet,true,false);
%prepare the data
[xc,xic,aic,tc] = preparets(netc,newInputSeries,{},newTargetSet);
%get the predicted outputs
yPredicted = sim(netc,xc,xic,aic);
w=cell2mat(yPredicted);
%errors
t3=t2(1,1:50);
MSE00C=mean(var(t3,1))
errores = gsubtract(t3,w); % compare against the matrix form of the predictions
t3=tonndata(t3,true,false);
MSEclosed = perform(netc,t3,yPredicted) % MSEclosed=2.456e-5
R2=1-MSEclosed/MSE00C
% I think this closed-loop error assessment is done correctly
%plot the predicted outputs
plot(cell2mat(yPredicted),'DisplayName','cell2mat(yPredicted)','YdataSource','cell2mat(yPredicted)');figure(gcf)
plot(t2,'r','DisplayName','targetsComprobacion')
hold on
plot(w,'b','DisplayName','salidasIteradas')
title({'ITERACCIONES'})
legend('show')
hold off
% Now I intend to use this network to step several days ahead with today's data,
% to predict the next 3-4 days. (This is the part I do not understand.)
% First I load today's data, which is a file of one row by 8 columns,
% and perform 1 iteration as follows.
% p7 is today's data (1 row by 8 columns):
%number of predictions
NumberOfPredictions = 1;
s=cell2mat(inputSeries);
t4=cell2mat(targetSeries);
a1=s(1:8,2098:2100);
b1=p7;
newInputSeries1=[a1 b1];
c1=t4(1,2099:2100);
d1=nan(1,2);
newTargetSet1=[c1 d1];
%convert the new inputs and new targets to cell (nndata) form
newInputSeries1=tonndata(newInputSeries1,true,false);
newTargetSet1=tonndata(newTargetSet1,true,false);
%prepare the data
[xc,xic,aic,tc] = preparets(netc,newInputSeries1,{},newTargetSet1);
%get the predicted outputs
yPredicted1 = sim(netc,xc,xic,aic);
w1=cell2mat(yPredicted1);
% Now this yPredicted1 should be fed back into the closed loop to step further
% ahead, but I do not know how. I tried the following, but I do not think it is right:
ttt=[c1 w1];
ttt=tonndata(ttt,true,false);
[xc2,xic2,aic2,tc2] = preparets(netc,a1,{},ttt);
yPredicted2 = sim(netc,xc2,xic2,aic2);
w2=cell2mat(yPredicted2);
%---------------------------2nd PREDICTED ITERATION--------------------
ttt2=[c1 w2];
ttt2=tonndata(ttt2,true,false);
[xc3,xic3,aic3,tc3] = preparets(netc,a1,{},ttt2);
yPredicted3 = sim(netc,xc3,xic3,aic3);
w3=cell2mat(yPredicted3);
% This is the part I do not understand. How do I get several steps ahead
% with the predicted outputs? I do not have future values of the inputs,
% which forces me to use the previously obtained outputs to keep iterating.
% I hope you can help me understand this last step, because I am completely stuck.
end
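One possible shape for that last step can be sketched as below. This is only a sketch under stated assumptions, not a verified answer: since future exogenous inputs are unknown, it simply reuses the last known input column as a placeholder (a crude assumption you would replace with your own input forecasts), and the names `lastInputs`, `lastTargets`, `xNew` are illustrative. It uses p1, t1 and net from the script above.

```matlab
% Sketch: iterate 1-step closed-loop predictions, feeding each prediction
% back as the newest feedback-delay value.
netc = closeloop(net);
nd = 2;                                   % inputDelays = feedbackDelays = 1:2
lastInputs  = p1(:, end-nd+1:end);        % prime the input delays (8 x 2)
lastTargets = t1(:, end-nd+1:end);        % prime the feedback delays (1 x 2)
for k = 1:4                               % predict 4 steps ahead
    xNew = lastInputs(:, end);            % unknown future input: placeholder
    X = tonndata([lastInputs xNew], true, false);
    T = tonndata([lastTargets NaN],  true, false);
    [xc,xic,aic] = preparets(netc, X, {}, T);
    yk = cell2mat(netc(xc,xic,aic));      % 1-step closed-loop prediction
    lastInputs  = [lastInputs(:,2:end)  xNew];  % slide the delay windows
    lastTargets = [lastTargets(:,2:end) yk];    % predicted output becomes feedback
end
```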
Many thanks Greg
FRANCISCO
FRANCISCO on 7 Mar 2013
Good. Should I change the number of inputs and network layers of netc? I keep thinking about how to perform multistep-ahead prediction, but I suspect the solution will be much easier than I think. Many thanks.


More Answers (1)

Greg Heath
Greg Heath on 8 Mar 2013
Please add a line of English translation beneath your Spanish comments.
% I have two files:
% p (the inputs), consisting of 8 columns and 2200 rows
% t (the targets), consisting of 1 column and 2200 rows
Which have to be transposed ...
% MSE00=mean(var(t1,1)); % MSE00=0.0095
% MSE00a=mean(var(t1,0)); % MSE00a=0.0095
Not quite. Should remove the initial delays and divide into separate trn/trna/val/tst estimates
% %number of layers and delays
% inputDelays = 1:2;
% feedbackDelays = 1:2;
% hiddenLayerSize = 9;
% %compute other parameters
% [I N]=size(p); % I=8 ; N=2200
% [O N]=size(t); % O=1 ; N=2200
% Neq=N*O; % Neq=2200
No. Should exclude N2 = 100 holdout data and 2 initial delay data points
Neq = (N-100-2)*O = 2098
% Nw=(I+1)*hiddenLayerSize+(hiddenLayerSize+1)*O; % Nw=91
No. Should include the delay weights
Nw = (NID*I+NFD*O+1)*H+(H+1)*O = 181
After configure or train can check with
Nw = net.numWeightElements
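That check can be sketched numerically. The dummy series below are illustrative (any series of the right dimensions will do); the formula counts input-delay weights, feedback-delay weights, and both bias vectors:

```matlab
% Sketch: verify the delay-aware weight count against net.numWeightElements.
ID = 2; FD = 2; H = 9; I = 8; O = 1;     % delays and dimensions from this thread
net = narxnet(1:ID, 1:FD, H);
X = tonndata(rand(I,20), true, false);   % dummy input series (illustrative)
T = tonndata(rand(O,20), true, false);   % dummy target series (illustrative)
net = configure(net, X, T);
Nw = (ID*I + FD*O + 1)*H + (H+1)*O       % = (16+2+1)*9 + 10 = 181
% Nw should match net.numWeightElements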
% Ntrneq=0.7*Neq; % Ntrneq=1540
Ntrn = N -2*round(0.15*N) % 1468
% Ndof=Ntrneq-Nw; % Ndof=1449
Ndof = 1287
% net.divideFcn='divideblock';
'divideind' will also work if you specify the integers
% net.trainParam.goal=0.01*MSE00;
net.trainParam.goal = 0.01*Ndof*MSEtrn00a/Ntrneq
to obtain R2trna ~ 0.990
% %train the network
% [net,tr] = train(net,inputs,targets,inputStates,layerStates);
[ net tr Ys Es Xf Af ] = ...
Ys,Es contain trn/val/test output and error time series
Xf,Af are the final delay data for use with your next N2 = 100 data points.
% %outputs and errors
% outputs = net(inputs,inputStates,layerStates);
% errors = gsubtract(targets,outputs);
% MSE = perform(net,targets,outputs); % MSE= 7.7982e-5
% MSEa=Neq*MSE/(Neq-Nw); % MSEa= 8.1346e-5
% R2=1-MSE/MSE00; % R2=0.9918
% R2a=1-MSEa/MSE00a; % R2a=0.9915
% MSEtrn=tr.perf(end); % MSEtrn=7.1240e-5
% MSEval=tr.vperf(end); % MSEval=9.4383e-5
% MSEtst=tr.tperf(end); % MSEtst=9.3139e-5
% R2trn=1-MSEtrn/MSE00; % R2trn=0.9925
% R2val=1-MSEval/MSE00; % R2val=0.9901
% R2tst=1-MSEtst/MSE00; % R2tst=0.9902
% precisiones=[MSE MSEa R2 R2a R2trn R2val R2tst];
Almost.
Need MSEtrna,R2trna and correct use of MSEtrn00, MSEtrn00a, MSEval00 and MSEtst00 .
Now close the loop and test on the original data as I explained in the recent answer to Nicholas.
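A minimal sketch of that closed-loop check on the original data, assuming net, inputSeries, targetSeries and MSE00 from the code above:

```matlab
% Close the loop and re-run the ORIGINAL series, then compare the
% closed-loop R2 with the open-loop R2 to see how much performance drops.
netc = closeloop(net);
[Xc,Xic,Aic,Tc] = preparets(netc, inputSeries, {}, targetSeries);
Yc = netc(Xc, Xic, Aic);
Ec = gsubtract(Tc, Yc);
MSEc = mse(cell2mat(Ec));
R2c = 1 - MSEc/MSE00     % compare with the open-loop R2 ~ 0.99
```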
% % we close the loop and check with 50 values of p2
% %close the loop
% netc = closeloop(net);
% netc.name = [net.name ' - Closed Loop'];
% view(netc)
% %number of predictions
% NumberOfPredictions = 50;
Why not use all 100 of p2(:,1:N2) ?
=============== MORE LATER
Greg
  3 Comments
Greg Heath
Greg Heath on 9 Mar 2013
Only the training error needs the DOF adjustment. The test error is unbiased. The val error is somewhat biased if its minimum caused training to stop. However, most of the time it is near the test set error. Besides, I have never seen anyone propose a way to mitigate val-set stopping bias.
FRANCISCO
FRANCISCO on 10 Mar 2013
Can you propose how to improve all these details? If you can, please put it as a code example, which is easier to understand; sometimes I do not quite understand the written instructions, and I would like to extend my knowledge of NARX.
Many thanks.
