
How to resolve the strange behavior of a trained NARX neural network that generates outputs different from what theory predicts

I am facing a strange behaviour when using a NARX neural network in the MATLAB Neural Network Toolbox, and I would very much appreciate any comments towards understanding or fixing it.

The network was generated and trained using ntstool, the neural-network-based time series tool in the toolbox. It is a NARX network with two inputs; I chose 3 neurons in the hidden layer and 3 delays. I generated the network and trained it on a data set. Then I set all the input weights except net.IW{1,1}(1,1) to zero and gave that one weight a nonzero value, set all the layer weights to one, and set all the biases to zero. I then tested the neural network with the following data:

myInput = zeros(2,4);
myInput(1,1) = 1;
myTarget = ones(1,4);

Next I prepared it for the neural network as follows:

inputSeries = tonndata(myInput,true,false);
targetSeries = tonndata(myTarget,true,false);
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);

Finally, I tested the network and compared its output with the equation I expect it to represent:

outputs = cell2mat(net(inputs,inputStates,layerStates))
outputTheory = tansig(sum(net.IW{1,1}(:,1)) + net.b{1,1}'*[1;1;1])*net.LW{2,1}*[1;1;1] + net.b{2,1}
The results are not the same. Even when I set all the input weights to zero and keep everything else as described above, the output of the neural network is a positive value, not zero as expected. Below you can see the output from MATLAB.
Looking forward to your comments. Payam
First case, net.IW{1,1}(1,1) = 0.1:
outputs =
    0.2482
outputTheory =
    0.2990

Second case, net.IW{1,1}(1,1) = 0:
outputs =
    0.2576
outputTheory =
    0
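A quick way to double-check that the weights really ended up as intended is to flatten them with the toolbox's getwb helper (a minimal sketch, assuming the network object above):

wb = getwb(net)   % all biases and weights as one vector; after the
                  % modifications above it should contain only zeros
                  % (biases), ones (layer weights) and the single
                  % nonzero input weight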

Answers (2)

Greg Heath on 7 Sep 2012
The most common reason for disagreement between results like
y1 = sim( net, x )
and
y2 = b2 + LW * tansig( IW * x + b1 )
is that the latter does not take into account the default mapminmax normalization that was applied when training the net.
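A minimal way to check this on the posted network (a sketch, assuming the default processing chain {'removeconstantrows','mapminmax'} that ntstool attaches, so that the mapminmax settings sit at index 2):

net.inputs{1}.processFcns                 % input processing functions
net.outputs{2}.processFcns                % output processing functions
ts = net.outputs{2}.processSettings{2};   % stored mapminmax output settings
mapminmax('reverse', 0, ts)               % a raw network output of zero is
                                          % mapped back into the target range,
                                          % which would explain the nonzero
                                          % values reported above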
Hope this helps.
Thank you for accepting my answer.
Greg
  1 comment
Payam on 9 Sep 2012
The test inputs and targets are already in the [-1, 1] range. mapminmax is not the problem in this case.



renz on 4 Oct 2012
% Case 1: a net created with feedforwardnet keeps the default
% mapminmax input/output processing.
x = [1, 2];
t = [3, 4];
net = feedforwardnet(2);
net = configure(net, x, t);
net.biasConnect = [0; 0];   % remove all biases
net.IW{1} = [0; 0];         % zero the input weights
net.LW{2} = [0, 0];         % zero the layer weights
sim(net, 4)
ans = 3.5000

% Case 2: the same net built with newff, but with the input and output
% processing functions explicitly set to empty (IPF = {}, OPF = {}).
H = 2;
TF = {'tansig','purelin'};
BTF = 'trainlm';
BLF = 'learngdm';
PF = 'mse';
IPF = {};
OPF = {};
DDF = 'dividerand';
net = newff(x, t, H, TF, BTF, BLF, PF, IPF, OPF, DDF);
net.biasConnect = [0; 0];
net.IW{1} = [0; 0];
net.LW{2} = [0, 0];
sim(net, 4)
ans = 0
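In the first case the raw network output is zero, but the default mapminmax output processing maps it back into the target range [3, 4], giving the midpoint 3.5. In the second case, with IPF and OPF empty, no remapping happens and the zero comes through unchanged, which is presumably the same mechanism behind the nonzero 0.2576 in the original question.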
