
Predict Output with New Data to Already Trained Narxnet

ando on 28 Jan 2021
Answered: ando on 26 Feb 2021
Currently, I train and simulate an open-loop narxnet in a single step on my data (about 3000 samples), with X(t) inputs and Y(t) outputs. I omit the last X(t) sample from training/validation and present it to the net only for the final simulation/prediction. The relevant code is below. Network performance and the Y(t) predictions are acceptable for my purposes with this approach.
inputDelays = 1:5;
feedbackDelays = 1:5;
hiddenLayerSize = 60;
trainFcn = 'trainlm';                        % Levenberg-Marquardt (the narxnet default)
train_set = 0.7;                             % fraction of samples used for training
X = tonndata(input,true,false);              % convert matrices to cell arrays of time steps
T = tonndata(output,true,false);
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
[x,xi,ai,t] = preparets(net,X,{},T);         % shift inputs/targets for the tapped delay lines
totend = length(X)-1;                        % exclude the last sample from the random split
trainend = round(totend*train_set);
t_rand_index = randperm(totend);             % random permutation of indices for the train/val split
net.divideFcn = 'divideind';
net.divideParam.trainInd = t_rand_index(1:trainend);
net.divideParam.valInd = t_rand_index(trainend+1:totend);
net.divideParam.testInd = totend+1;          % save the last data point for testing
[net,~] = train(net,x,t,xi,ai);
[y1,xf,af] = net(x,xi,ai);                   % open-loop simulation; xf/af are the final delay states
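To make the two-step workflow I describe next concrete, the idea is to keep the trained open-loop net together with the final delay states returned by the simulation above. A minimal sketch of what I mean (the MAT-file name is only an example):
% persist the trained open-loop net and its final input/layer delay states
save('narx_trained.mat','net','xf','af');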
Now, for computational-resource reasons, I need to train/validate the network before I have the very last X(t), and then simulate the trained and saved network on that last X(t) at a later time. Since I don't use the very last X(t) for training/validation in my original code above, it seemed trivial to complete training/validation first and then wait for the final X(t) to simulate/predict. However, I have not been able to match the performance of my original one-step approach with this revised two-step approach. I've looked at the various posts from MATLAB and Greg Heath on closed-loop and narxnet topics and tried some of the examples, but net performance in the two-step approach remains worse. I've tried various versions of the code below, appended to the code above, and the results are poor:
%append to the code above
netc = closeloop(net,xf,af);                              % close the loop on the trained net
[x_test,xi_test,ai_test,t_test] = preparets(netc,X,{},T); % prepare closed-loop data/states
[x,xi,ai,t] = preparets(netc,X,{},T);                     % repeated to get the xi/ai used below
y = netc(x_test,xi,ai);                                   % note: mixes the outputs of the two preparets calls
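For reference, the state-converting form of closeloop, as I understand it from the documentation, would look roughly like the sketch below; xLast is an illustrative name for the newest external input sample, not a variable from my actual code:
[netc,xic,aic] = closeloop(net,xf,af);  % convert both the net and its final open-loop states
yClosed = netc({xLast},xic,aic);        % closed-loop (multi-step-ahead) prediction from here on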
I've also tried keeping the network open and simulating it with the new X(t):
%append to the code above
X = tonndata(input,true,false);    % updated to include the most recent X(t)
T = tonndata(output,true,false);
[y1,xf,af] = net(x,xi,ai);         % net is the trained network saved in the earlier step above
% alternatively, I've tried reusing the xf/af saved from the trained net above:
[y1,xf,af] = net(x,xf,af);         % net is the saved network from above
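What I think the open-loop replay should look like, rebuilding the shifted inputs and initial delay states from the full updated series instead of reusing the old x, xi and ai, is sketched below. This assumes output has been padded with a measured or placeholder value for the final Y(t) so both series have the same length, and the MAT-file name is only illustrative:
S = load('narx_trained.mat');                    % net, xf, af saved after the training step
net = S.net;                                     % the trained open-loop network
Xall = tonndata(input,true,false);               % now includes the newest X(t)
Tall = tonndata(output,true,false);              % same length as Xall
[xs,xis,ais,ts] = preparets(net,Xall,{},Tall);   % rebuild the shifted inputs and delay states
yAll = net(xs,xis,ais);                          % yAll{end} should be the estimate for the last step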
Just to be clear, I'm looking for Y(t) (not Y(t+1)) based on the last X(t). I would appreciate any suggestions.

Answers (1)

ando on 26 Feb 2021
Bump. Any thoughts, anyone?
