NARX re-training in closed loop

Giuseppe Menga on 25 Nov 2017
Frequently, following the procedure suggested in the manual, the performance of a NARX neural network gets worse when moving from open to closed loop. But this result doesn't reflect the net's real potential. I have resorted to retraining the closed-loop network several times, always retaining the best result and each time initializing the parameters with random perturbations of the best weights found so far, which yields much better performance. However, the process is slow. Is there a better technique for finding a well-fitting closed-loop net?
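A minimal sketch of that retrain-and-perturb loop, assuming X and T come from tonndata and net is an already trained open-loop NARX network; the 20 trials and the perturbation scale sigma are illustrative choices, not values from the question:
% Sketch of the retrain-and-perturb loop described above.
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,X,{},T);
bestNet  = netc;
bestPerf = inf;
sigma    = 0.05;                                   % illustrative scale
for k = 1:20
    netc = train(netc,xc,tc,xic,aic);              % retrain in closed loop
    p = perform(netc,tc,netc(xc,xic,aic));
    if p < bestPerf
        bestPerf = p;
        bestNet  = netc;
    end
    wb   = getwb(bestNet);                         % best weights so far
    netc = setwb(bestNet, wb + sigma*randn(size(wb))); % perturbed restart
end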

Answers (4)

Greg Heath on 1 Jun 2019
Using 100 input delays makes no sense.
Only use delays that lie within the correlation length of the series.
See my tutorials.
Greg
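A minimal sketch of such a check, assuming row vectors xs (input series) and ts (target series); the maximum lag scanned and the ±1.96/sqrt(N) 95% significance bound are conventional choices, not necessarily what Greg's tutorials use:
% Sketch: keep only delays inside the significant correlation length.
N   = numel(ts);
ts0 = (ts - mean(ts))/std(ts);            % standardize target
xs0 = (xs - mean(xs))/std(xs);            % standardize input
maxLag = 50;                              % arbitrary upper bound to scan
acf = zeros(1,maxLag);
ccf = zeros(1,maxLag+1);
for k = 1:maxLag
    acf(k)   = ts0(1+k:end)*ts0(1:end-k).'/(N-k); % target autocorrelation
end
for k = 0:maxLag
    ccf(k+1) = ts0(1+k:end)*xs0(1:end-k).'/(N-k); % input-target crosscorr
end
bound = 1.96/sqrt(N);                     % ~95% significance bound
feedbackDelays = find(abs(acf) > bound);      % candidate feedback delays
inputDelays    = find(abs(ccf) > bound) - 1;  % lag 0 allowed for input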
2 Comments
Greg Heath on 2 Jun 2019
If you are having a problem in a long script, post only the part of the script that is involved in the problem.
Greg
Muhammad Adil Raja on 18 Mar 2020
Hi Greg,
Where are your tutorials? How can I find them?
Best,
MA



Greg Heath on 25 Nov 2017 (edited 25 Nov 2017)
I have not found one, and the MATLAB documentation IS OF VERY LITTLE HELP!
I have tried all of the example datasets in help nndatasets and doc nndatasets, and listed my successes and frustrations.
I would like to use DIVIDEBLOCK with the basic assumption that the summary statistics are stationary. However, when there are problems, that assumption really needs to be verified. I have found no guidance w.r.t. this. Some obvious ideas (see the sketch after this list):
1. Determine the summary statistics of the DIVIDEBLOCK training, validation, and test subsets.
2. If stationarity cannot be assumed, try the 1st or 2nd differences of the series.
3. Consider increasing the length of the lag time to reduce the complexity.
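A rough sketch of items 1 and 2, assuming a row-vector target series ts and the common 60/20/20 DIVIDEBLOCK split; the block boundaries are illustrative:
% Sketch: compare per-block summary stats, then difference if needed.
N  = numel(ts);
i1 = 1:round(0.6*N);                 % training block
i2 = round(0.6*N)+1:round(0.8*N);    % validation block
i3 = round(0.8*N)+1:N;               % test block
blockStats = [mean(ts(i1)) std(ts(i1)); ...
              mean(ts(i2)) std(ts(i2)); ...
              mean(ts(i3)) std(ts(i3))]   % rows should be comparable
dts  = diff(ts);                     % 1st difference if they are not
dts2 = diff(ts,2);                   % 2nd difference as a further option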
Please keep us informed if you have any luck in your experimentation.
Hope this helps.
Greg

VICTOR CATALA on 30 May 2019
How do you retrain a NARX NN once it has been transformed to closed loop? I can train the open-loop architecture, but I can't once I close the loop. Please see the code and error below.
Thanks
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by Neural Time Series app
% Created 24-Apr-2019 15:47:02
%
% This script assumes these variables are defined:
%
% ref - input time series.
% mic - feedback time series.
X = tonndata(ref,false,false);
T = tonndata(mic,false,false);
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainscg'; % Scaled conjugate gradient backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 0:100;
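% NOTE: 0:100 creates 101 input taps. Greg's answer above flags this,
% and it is the likely cause of the 55.4 GB allocation error during the
% closed-loop retraining at the end of this script.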
feedbackDelays = 1:10;
hiddenLayerSize = 5;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivision
net.divideFcn = 'divideblock'; % Divide data in contiguous blocks
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 60/100;
net.divideParam.valRatio = 20/100;
net.divideParam.testRatio = 20/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate', 'ploterrhist', ...
'plotregression', 'plotresponse', 'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,X,{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)
% Alternative: Close the Loop Using Final States from the Open-Loop Run
% The network call below also returns the final input and layer states
% XF and AF, which CLOSELOOP converts into initial closed-loop states.
[y,xf,af] = net(x,xi,ai);
[netc,xic,aic] = closeloop(net,xf,af); %close the loop
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,X,{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)
% Re-Train the Network
[netc,tr] = train(netc,xc,tc,xic,aic);
Error using zeros
Requested 1x7430768128 (55.4GB) array exceeds maximum array size preference. Creation of arrays greater
than this limit may take a long time and cause MATLAB to become unresponsive. See array size limit or
preference panel for more information.
Error in nnMex.perfsGrad (line 3)
TEMP = zeros(1,ceil(hints.tempSizeBG/8)*8);
Error in nnCalcLib/perfsGrad (line 294)
lib.calcMode.perfsGrad(calcNet,lib.calcData,lib.calcHints);
Error in trainscg>initializeTraining (line 151)
[worker.perf,worker.vperf,worker.tperf,worker.gWB,worker.gradient] = calcLib.perfsGrad(calcNet);
Error in nnet.train.trainNetwork>trainNetworkInMainThread (line 29)
worker = localFcns.initializeTraining(archNet,calcLib,calcNet,tr);
Error in nnet.train.trainNetwork (line 17)
[archNet,tr] = trainNetworkInMainThread(archNet,rawData,calcLib,calcNet,tr,feedback,localFcns);
Error in trainscg>train_network (line 145)
[archNet,tr] = nnet.train.trainNetwork(archNet,rawData,calcLib,calcNet,tr,localfunctions);
Error in trainscg (line 55)
[out1,out2] = train_network(varargin{2:end});
Error in network/train (line 373)
[net,tr] = feval(trainFcn,'apply',net,data,calcLib,calcNet,tr);

Juan Hynek on 25 Aug 2019
Hi Victor,
Your method seems correct. I would suggest changing the closed-loop training function from 'trainscg', as this solved the problem for me.
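For anyone hitting the same error, a sketch combining Juan's and Greg's suggestions; the smaller delay ranges and the swap to 'trainrp' are illustrative choices, not a fix verified in this thread:
% Sketch: shrink the input-delay window and use a lighter training
% function for the closed-loop retraining. X and T as in the script above.
net = narxnet(0:10,1:10,5,'open','trainlm');
[x,xi,ai,t] = preparets(net,X,{},T);
net = train(net,x,t,xi,ai);               % open-loop training first
netc = closeloop(net);
netc.trainFcn = 'trainrp';                % resilient backprop, low memory
[xc,xic,aic,tc] = preparets(netc,X,{},T);
[netc,trc] = train(netc,xc,tc,xic,aic);   % closed-loop retraining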
