Neural Network - Multi Step Ahead Prediction

Hi all, please, I need your help!
I've read all the posts here about time-series forecasting but still can't figure it out! I'm drained... :-(
I have a NARX neural network with 10 hidden neurons and 2 delays. As input I have a 510x5 matrix (called Inputx) and as output I have a 510x1 matrix (called Target).
I want to forecast 10 days ahead but it's really not working...
I tried the following code but I'm stuck now. :-(
Would you mind helping me? Some code would be awesome. :-(
inputSeries = tonndata(Inputx,false,false);
targetSeries = tonndata(Target,false,false);
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);

2 comments

Oleg Komarov
Oleg Komarov on 2 Sep 2011
Two things: please change the title of your post to something useful, and format the code: http://www.mathworks.com/matlabcentral/answers/13205-tutorial-how-to-format-your-question-with-markup#answer_18099
Constantine
Constantine on 21 Nov 2014
With respect to the accepted answer by Lucas García, I find the predicted data only agrees with the actual data as well as his does every once in a while.
1. It's important, before running the fit, to clear the variables, e.g. with 'clear all'. Re-running without clearing the variables leads to much worse fits.
2. Much better fits result from using a bigger delay, like 5, instead of the delay of 2 in his example, or from adding additional training data, such as the first or second time derivatives of the training data. Of course, doing this makes the fit considerably slower.
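Constantine's second suggestion can be sketched as follows. This is a hypothetical illustration using the variable names from the question (Inputx, Target), not code from the thread:

```matlab
% Hypothetical sketch: augment the 510x5 input matrix "Inputx" with the
% first and second time derivatives of the 510x1 target, as Constantine
% suggests. gradient() approximates d/dt with finite differences.
dT  = gradient(Target);           % 510x1 first derivative
d2T = gradient(dT);               % 510x1 second derivative
InputxAug = [Inputx, dT, d2T];    % 510x7 augmented input matrix
inputSeries  = tonndata(InputxAug,false,false);
targetSeries = tonndata(Target,false,false);
% ...train the NARX net on inputSeries/targetSeries as before (slower,
% since the network now has 7 input signals instead of 5)
```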


Accepted Answer

Lucas García
Lucas García on 7 Sep 2011
Edited: Lucas García on 3 Sep 2015
Hi Jack,
When using narxnet, the trained network performs only a one-step-ahead prediction. Therefore, you need to use closeloop to turn the network into its parallel (closed-loop) configuration and perform a multi-step-ahead prediction.
Take a look at this example of a multi-step-ahead prediction of N steps. It uses the dataset magdata.mat, which is available in the Neural Network Toolbox. Some of the inputs are held back for performing the multi-step-ahead prediction, and the results are validated against the original data. I hope the comments help you understand.
Edited in September 2015 to simplify step 5
%% 1. Importing data
S = load('magdata');
X = con2seq(S.u);
T = con2seq(S.y);
%% 2. Data preparation
N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries = X(1:end-N);
targetSeries = T(1:end-N);
% 2nd group: this is the new data used for simulation. inputSeriesVal will
% be used for predicting new targets. targetSeriesVal will be used for
% network validation after prediction
inputSeriesVal = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available
%% 3. Network Architecture
delay = 2;
neuronsHiddenLayer = 10;
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
%% 4. Training the network
[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries);
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai);
% Performance for the series-parallel implementation, only
% one-step-ahead prediction
perf = perform(net,Ts,Y);
%% 5. Multi-step ahead prediction
[Xs1,Xio,Aio] = preparets(net,inputSeries(1:end-delay),{},targetSeries(1:end-delay));
[Y1,Xfo,Afo] = net(Xs1,Xio,Aio);
[netc,Xic,Aic] = closeloop(net,Xfo,Afo);
[yPred,Xfc,Afc] = netc(inputSeriesVal,Xic,Aic);
multiStepPerformance = perform(net,targetSeriesVal,yPred); % perform(net,targets,outputs)
view(netc)
figure;
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')

24 comments

Jack
Jack on 7 Sep 2011
Hi Lucas,
Thank you very very much for your answer.
But my question is: I want to predict 10 steps ahead. Say I have a 1x510 target matrix and a 5x510 input matrix; I want to predict steps 511, 512, ..., 520. I don't have the "desired output" because I don't know it yet. I guess I won't use the inputs either, since I don't have them for 511 to 520. Am I wrong?
Again, thank you, because you're helping me and I'm pretty sure others as well.
Jack
Jack on 8 Sep 2011
It seems to me that the MAIN PROBLEM everyone is encountering here is predicting y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
Again thanks for your help :-)
Lucas García
Lucas García on 8 Sep 2011
Hi Jack,
In my example, I created two sets of inputs and targets. The arrays "inputSeries" and "targetSeries" are used for training the network. These are your 5x510 and 1x510 arrays, respectively.
In order to do a multi-step-ahead prediction, you need to provide the network with NEW inputs. If you are doing a 10-step-ahead prediction, it will be a 5x10 array of inputs. Since I didn't have a new set of inputs in my dataset, I used part of the inputs ("inputSeriesVal") for doing the prediction process, and checked how good the network was performing by plotting the predictions with the real outputs ("targetSeriesVal").
Some rearranging of data has to be done though, since you also need to provide the network with the last d inputs and targets from the training dataset (d is the number of delays):
- in the case of the inputs, your array for prediction will be the last d values from the training dataset (5xd) followed by the new 10 inputs, giving a 5x(d+10) array. Named "inputSeriesPred" in the example.
- in the case of the targets, you will provide the last d values from the training dataset (1xd) followed by 10 NaN values, giving a 1x(d+10) array. Named "targetSeriesPred" in the example.
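For Jack's concrete 10-step case, the rearranging Lucas describes might look like the sketch below. Here "newInputs" is an assumption: the 10 future exogenous input vectors must be supplied from somewhere, since the network cannot run without them.

```matlab
% Sketch of the data rearranging described above. Assumes a trained
% open-loop "net", the training cell arrays inputSeries/targetSeries,
% and the delay count in "delay".
numSteps = 10;
% newInputs is assumed: a 1x10 cell array of 5x1 future input vectors
inputSeriesPred  = [inputSeries(end-delay+1:end), newInputs];                 % 1x(d+10) cells
targetSeriesPred = [targetSeries(end-delay+1:end), con2seq(nan(1,numSteps))]; % NaN placeholders
netc = closeloop(net);                        % parallel (closed-loop) configuration
[Xs,Xi,Ai] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xs,Xi,Ai);                       % 1x10 cell array: the 10-step forecast
```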
Jack
Jack on 12 Sep 2011
Wow Lucas, you are a very great teacher !!!
If you ever come to Montreal drop a line here ;-)
Thank you very very much !!
Morten
Morten on 31 Oct 2011
An addition to this issue: when I copy the script by Lucas and run it again, it keeps showing different results. I understand that if I set the seed it will produce the same result, but then it is not the same as the one shown in the figure above?
I have some plots of the results without setting the seed:
http://dl.dropbox.com/u/2666399/Plot/test1.pdf
http://dl.dropbox.com/u/2666399/Plot/test2.pdf
http://dl.dropbox.com/u/2666399/Plot/test3.pdf
http://dl.dropbox.com/u/2666399/Plot/test4.pdf
http://dl.dropbox.com/u/2666399/Plot/test5.pdf
http://dl.dropbox.com/u/2666399/Plot/test6.pdf
Any suggestions about this strange behaviour? It is the same when I use other NN functions.
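For what it's worth, the run-to-run differences come from random weight initialization and the default random train/validation/test split. A minimal sketch of pinning both down (not from the thread):

```matlab
% Minimal sketch: make repeated training runs reproducible by fixing the
% random seed (controls weight initialization) and using a deterministic
% data division instead of the default random one.
rng('default');                    % fix the global random seed
net = narxnet(1:2,1:2,10);
net.divideFcn = 'divideblock';     % contiguous train/val/test blocks
% ...train as in the accepted answer; successive runs now match exactly
```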
Nguyen Linh
Nguyen Linh on 9 Apr 2012
I have the same problem as Morten, so could anyone point it out please!
Hi, I am using NARX to predict a daily stock market index (Sensex, a 2003x1 matrix) as target and another daily stock market index (Nifty) as input. I have done it using the example you have shown in:
The code:
%%%newNARX code 24/4/2013
%%1. Importing data
% Matrix of 2003x1 each are
% daily stock market indices data
% of Nifty & Sensex
load Nifty.dat;
load Sensex.dat;
% %%S = load('magdata');
% %%X = con2seq(S.u);
% %%T = con2seq(S.y);
% To scale the data it is converted to its log value:
lognifty = log(Nifty);
logsensex = log(Sensex);
X = tonndata(lognifty,false,false);
T = tonndata(logsensex,false,false);
% X = con2seq(x);
% T = con2seq(t);
%%2. Data preparation
N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries = X(1:end-N);
targetSeries = T(1:end-N);
% 2nd group: this is the new data used for simulation. inputSeriesVal will
% be used for predicting new targets. targetSeriesVal will be used for
% network validation after prediction
inputSeriesVal = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available
%%3. Network Architecture
delay = 2;
neuronsHiddenLayer = 50;
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
%%4. Training the network
[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries);
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai);
% Performance for the series-parallel implementation, only
% one-step-ahead prediction
perf = perform(net,Ts,Y);
%%5. Multi-step ahead prediction
inputSeriesPred = [inputSeries(end-delay+1:end),inputSeriesVal];
targetSeriesPred = [targetSeries(end-delay+1:end), con2seq(nan(1,N))];
netc = closeloop(net);
view(netc)
[Xs,Xi,Ai,Ts] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xs,Xi,Ai);
perf = perform(net,targetSeriesVal,yPred); % perform(net,targets,outputs)
figure;
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')
The network predictions are coming out very bad... I guess there is some problem with the closed loop's initial input states and initial layer states. Please help.
phuong
phuong on 28 Aug 2013
Please, I don't understand some ideas in the answer by Lucas García. If anyone knows, please show me. First, Lucas's code predicts N future values one step at a time: the value at t+1 is predicted from the past values at t-1, t-2, ..., t-d. Is that right? Second, why don't we use a "for" loop with i from 1 to N, where at each i we predict the value at t+i from the past values at t+i-1, ..., t+i-d? I think that would be more accurate than the first way.
Greg Heath
Greg Heath on 30 Aug 2013
No need for a loop. The inputs and targets are time series, not single points from a time series. Each step is performed automatically via sim or net.
When the loop is closed, there is only the input series. The target series is replaced by output feedback.
Greg
Muhammad Azeem
Muhammad Azeem on 17 Sep 2015
Can anybody provide a correct version (without the loop) of this code? The prediction results are very bad... thanks
Eric
Eric on 2 Mar 2016
Edited: Eric on 2 Mar 2016
Hello, Lucas!
Please, could you help adapt your code for NARnet?
I want to predict N steps into the future of the EURUSD close price.
My model has very good open-loop performance: an MSE of 5e-11 on unknown data.
Please, could you help me with closed-loop prediction using NARnet?
Thanks! God bless you!
Eric
Search
narnet netc tutorial
in BOTH the NEWSGROUP and ANSWERS.
Hope this helps.
Greg
Eric
Eric on 8 Mar 2016
Edited: Eric on 8 Mar 2016
Hello, Greg!
Please, can you offer your services as a freelancer?
I'm not that kind of capitalist guy, but I'm so tired after spending several months trying to run multi-step-ahead forecasting in MATLAB.
What disappoints me is that my performance on the open-loop test is very good even on unknown data, but I can't find a way to effectively run closed-loop multi-step-ahead forecasting.
Please, can I pay you for this freelance task? I need this very, very much.
I already did all your recommended searches on the forums and Google.
I also paid 20 USD to MATLAB experts on Freelancer.com, but not even they could solve the task. They refunded me. Another MATLAB PhD expert asked me for 30 USD, but couldn't solve the task either.
Thanks!
Eric
Diogo Goncalves
Diogo Goncalves on 15 Mar 2016
Edited: Diogo Goncalves on 15 Mar 2016
I can try... Contact me if you want me to try it:
Eric
Eric on 15 Mar 2016
Hello, Diogo!
I can't see your email address.
Mine: ericleonardocunha at hotmail.com
I WANT THAT MAGDATA PLZ HELP ME
Eric
Eric on 19 Mar 2016
Magdata is included in MATLAB. Just run the command:
S = load('magdata');
Fabio Muratore
Fabio Muratore on 3 Apr 2016
Why do you include this line:
[Xs1,Xio,Aio] = preparets(net,inputSeries(1:end-delay),{},targetSeries(1:end-delay));
I don't see the "end-delay" in any other example, like 1 or 2.
Thank you for your answer.
Fabio Retorta
Fabio Retorta on 22 Apr 2016
I'm trying to use your code, Lucas García, but it produces this:
Index exceeds matrix dimensions.
Error in preparets (line 293): xi = xx(:,FBS+((1-net.numInputDelays):0));
Error in Sript_Lucas_Garcia (line 43): [Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries);
I don't know why. I have 168 data points in total, and my multi-step-ahead N = 50.
Do this before you put the variables into preparets:
inputSeries = con2seq(inputSeries); targetSeries = con2seq(targetSeries);
rabi darji
rabi darji on 16 May 2017
After running the above code I am getting: Error using closeloop: Too many input arguments.
Error in magdata (line 35): [netc,Xic,Aic] = closeloop(net,Xfo,Afo);
Nils
Nils on 25 May 2020
"inputSeriesVal = X(end-N+1:end)"
What could be done if these values are not available?
Chris P
Chris P on 2 Aug 2020
I'm wondering the same thing as Nils. What if this is implemented in real-time and we don't have these inputs planned in advance? For instance if we are trying to run an optimal control scheme to determine the best input values to achieve a desired trajectory. Or if some of the NN inputs are uncontrollable but necessary to get a good model prediction.


More Answers (5)

Here is an example that may help. A NARX network is trained on input series X and targets T, and the simulation is then picked up at the end of X, using continuation input data X2, with a closed-loop network. The final states after open-loop simulation with X are used as the initial states for closed-loop simulation with X2.
% DESIGN NETWORK
[x,t] = simplenarx_dataset;
net = narxnet;
[X,Xi,Ai,T] = preparets(net,x,{},t);
net = train(net,X,T,Xi,Ai);
view(net)
% SIMULATE NETWORK FOR ORIGINAL SERIES
[Y,Xf,Af] = sim(net,X,Xi,Ai);
% CONTINUE SIMULATION FROM FINAL STATES XF & AF WITH ADDITIONAL
% INPUT DATA USING CLOSED LOOP NETWORK.
% Closed Loop Network
netc = closeloop(net);
view(netc)
% 10 More Steps for the first (now only) input
X2 = num2cell(rand(1,10));
% Initial input states for closed loop continuation will be the
% first input's final states.
Xi2 = Xf(1,:);
% Initial 2nd layer states for closed loop contination will be the
% processed second input's final states. Initial 1st layer states
% will be zeros, as they have no delays associated with them.
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{1}.processFcns)
fcn = net.inputs{2}.processFcns{i};        % second (feedback) input's preprocessing
settings = net.inputs{2}.processSettings{i};
Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
% Closed loop simulation on X2 continues from open loop state after X.
Y2 = sim(netc,X2,Xi2,Ai2);

4 comments

Jack
Jack on 12 Sep 2011
Thank you very much, Mark, for your answer! :-))
I have tried this code and it is great, but when I apply it to my problem I get really bad results. I tried changing the input and feedback delays, as well as the number of hidden neurons, but the results are always bad (figure; the green line is the multi-step prediction).
The code is given below:
% DESIGN NETWORK
ID=1:2;
HL=6
FD=1:2;
net = narxnet(ID,FD,HL);
[X,Xi,Ai,T] = preparets(net,x,{},WS);
net.divideFcn = 'divideblock';
net = train(net,X,T,Xi,Ai);
% SIMULATE NETWORK FOR ORIGINAL SERIES
[Y,Xf,Af] = sim(net,X,Xi,Ai);
% CONTINUE SIMULATION FROM FINAL STATES XF & AF WITH ADDITIONAL
% INPUT DATA USING CLOSED LOOP NETWORK.
% Closed Loop Network
netc = closeloop(net);
Xi2 = Xf(1,:);
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{1}.processFcns)
fcn = net.inputs{2}.processFcns{i};        % second (feedback) input's preprocessing
settings = net.inputs{2}.processSettings{i};
Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
Y2 = sim(netc,X2,Xi2,Ai2);
plot(1:length(WS),cell2mat(WS))
hold on
plot(1:length(Y),cell2mat(Y),'r')
plot(length(WS):length(WS)+length(Y2)-1,cell2mat(Y2),'g')
legend('Input data - target series','One-step ahead prediction','Multi-step prediction beyond target series');
WT
WT on 1 Mar 2015
May I know what "Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));" means?
Thank You
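In case it helps, that line just packs the layer states into the cell-array layout the closed-loop network expects. A small standalone illustration of the mat2cell call (not from the thread):

```matlab
% mat2cell splits a matrix into a cell array: the rows are split into
% blocks of 10 and 1 (hidden layer and output layer), and the columns into
% two blocks of 1 (the two delay steps), giving a 2x2 cell array of
% per-layer, per-delay initial states.
A = mat2cell([zeros(10,2); ones(1,2)],[10 1],ones(1,2));
% A{1,j} is 10x1 (hidden-layer state for delay j); A{2,j} is 1x1
% (output-layer state for delay j)
```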
IOANNIS4
IOANNIS4 on 5 Aug 2015
Please can someone explain this part a little bit more:
Xi2 = Xf(1,:);
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{1}.processFcns)
fcn = net.inputs{i}.processFcns{i};
settings = net.inputs{i}.processSettings{i};
Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
Y2 = sim(netc,X2,Xi2,Ai2);
Please, you would really help us. Kind regards, Ioannis


Greg Heath
Greg Heath on 25 Mar 2014

1 vote

When the loop is closed, the net should be retrained with the original data, using initial weights equal to the final weights of the open-loop configuration.
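A sketch of what Greg describes, assuming "net" is the trained open-loop network and inputSeries/targetSeries are the original training data. Note that closeloop already carries the open-loop weights over, so they serve as the initial weights for retraining:

```matlab
% Retrain the closed-loop network starting from the open-loop weights.
netc = closeloop(net);             % closed-loop copy; weights taken from net
% (the weights are also accessible directly via net.IW, net.LW and net.b)
[Xs,Xi,Ai,Ts] = preparets(netc,inputSeries,{},targetSeries);
netc = train(netc,Xs,Ts,Xi,Ai);    % retrain in the closed-loop configuration
```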

1 comment

Mario Viola
Mario Viola on 26 Feb 2021
Just one question: how can I access the final weights from the open-loop configuration? And how do I set them in the new closed-loop configuration?


mladen
mladen on 25 Oct 2013

0 votes

Be aware that predicting outputs this way (similar to a cascade realization of a linear system) is highly sensitive to parameter estimation errors, because they propagate through the process Mark Hudson Beale mentioned. This shows up most in hard, multiple-steps-ahead problems.
Parallel realizations (simultaneous output estimation, for instance 10 outputs of the neural network for the next 10 time steps) tend to be less sensitive to these errors. I have implemented this with my own code, which is always prone to error :) So my sub-question is:
Is there some specific way to prepare my data for training with a MATLAB function?
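I'm not aware of a single toolbox function that does exactly this, but the parallel (direct multi-output) data preparation mladen describes can be sketched by hand. The values of d and H and the use of fitnet below are assumptions for illustration:

```matlab
% Build samples that map d past values directly to the next H values, so a
% static network predicts all H steps jointly and one-step errors do not
% propagate through a feedback loop.
y = cell2mat(T);              % 1xN target series (T as in the accepted answer)
d = 5;                        % number of lagged values used as input
H = 10;                       % horizon: 10 outputs per sample
n = numel(y) - d - H + 1;     % number of training samples
X = zeros(d,n); Y = zeros(H,n);
for k = 1:n
    X(:,k) = y(k:k+d-1)';     % inputs: d past values
    Y(:,k) = y(k+d:k+d+H-1)'; % targets: the next H values, predicted jointly
end
net = fitnet(10);             % plain feedforward net, 10 hidden neurons
net = train(net,X,Y);
```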
Murat Akdag
Murat Akdag on 28 Mar 2015

0 votes

I'm trying to understand narnet but still can't solve this. I'm looking at the help section on this page: http://www.mathworks.com/help/nnet/ug/multistep-neural-network-prediction.html?searchHighlight=narnet%20multistep . I try the same code, but there is an error in >> [netc,xi,ai] = closeloop(net,xf,af); : too many arguments. I just need one working sample of narnet that can predict 12 steps ahead. I tried to do it with the GUI in the MATLAB NN section, but that predicts just 1 step ahead with the removedelay command. I need 12 steps ahead. Thanks for the help.

4 comments

Your error message is because you are using an obsolete version. Try
netc = closeloop(net,xf,af);
Then find xi and ai from preparets
Hope this helps.
Greg
Charles
Charles on 10 Jul 2017
Thank you for the NARX code. I am somewhat new to NARX and wish to leverage it to predict the trend and next-day value of a currency pair. I have used the opening price and closing price as inputs X and T. I have reduced the delay to 5. For now I have kept the rest of the script the same. What is a good source on how to interpret the output? I believe the goal is to have the smallest MSE possible. What are epochs?
Greg Heath
Greg Heath on 11 Jul 2017
Epochs are the number of loops the training goes through while trying to minimize the objective function ...
but you knew that already because you have GOOGLE & WIKIPEDIA
RIGHT?
Did you find this
https://www.quora.com/What-is-epochs-in-machine-learning
?
Charles
Charles on 12 Jul 2017
Edited: Charles on 12 Jul 2017
Yes, I was being lazy and eventually found the answer. Still working on my currency trend prediction NARX... looking to establish whether the next day's price will in fact be higher or lower than the last price... so not so much the trend.

Connectez-vous pour commenter.

hugo kuribayashi
hugo kuribayashi on 15 Apr 2015

0 votes

Considering all these examples... how can I calculate the MAPE error instead of MSE?

1 comment

Greg Heath
Greg Heath on 11 Jul 2017
Edited: Greg Heath on 12 Jul 2017
You mean "in addition to" ?
1. Learn with MSE or MSEREG
2. Report your findings with whatever floats your boat.
3. I prefer NMSE [0 1] for regression and time series
and
PCTERR [ 0 1 ]
for classification and pattern recognition
(;>)
Greg
P.S. Be aware of the shortcomings of MAPE and its attempted modifications:
https://en.wikipedia.org/wiki/Mean_absolute_percentage_error
Hope this helps.
Greg
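A sketch of computing MAPE (and Greg's preferred NMSE) from the variables in the accepted answer; note the caveat in the Wikipedia link above, since MAPE blows up when targets are near zero:

```matlab
% Compare predictions against the held-back targets with MAPE and NMSE,
% assuming yPred and targetSeriesVal are cell arrays as in the accepted answer.
y = cell2mat(yPred);              % predictions, 1xN
t = cell2mat(targetSeriesVal);    % true values, 1xN
mape = 100*mean(abs((t - y)./t));               % percent; unstable if t is near 0
nmse = mean((t - y).^2)/mean((t - mean(t)).^2); % 0 = perfect, 1 = mean predictor
```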

