
Assigning weight and bias values for a Network

Julix on 10 Jul 2016
Hello all. My network has 3 inputs, 5 neurons in a single hidden layer and 1 output. I observed that the toolbox automatically assigns (random) initial weight and bias values to my network, so training and simulating the network 5 times always generates 5 different output values. So after training/simulation I extracted the weight and bias values of the network using the code below:
IW = myFunc.IW{1,1};   % input-to-hidden weight matrix
LW = myFunc.LW{2,1};   % hidden-to-output weight matrix
b1 = myFunc.b{1,1};    % hidden layer bias vector
b2 = myFunc.b{2,1};    % output layer bias
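For the 3-5-1 architecture above, the extracted values should be a 5x3 input-weight matrix, a 1x5 layer-weight matrix, a 5x1 hidden bias vector and a 1x1 output bias. A quick sanity check (just an illustration, not part of my original script) would be:
% Expected sizes for a 3-input, 5-hidden-neuron, 1-output network
assert(isequal(size(IW), [5 3]));   % input-to-hidden weights
assert(isequal(size(LW), [1 5]));   % hidden-to-output weights
assert(isequal(size(b1), [5 1]));   % hidden layer biases
assert(isequal(size(b2), [1 1]));   % output bias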
I assigned these values to my network's IW, LW, b1 and b2 using the code below (InputData is a (3xN) matrix and OutputData is a (1xN) matrix):
NormInp = minmax(InputData);
myFunc = newff(NormInp, [5,1],{'tansig', 'purelin'});
myFunc = configure(myFunc,InputData, OutputData);
myFunc.IW{1,1} = IW;
myFunc.b{1,1} = b1;
myFunc.LW{2,1} = LW;
myFunc.b{2,1} = b2;
myFunc = train(myFunc, InputData, OutputData);
myOut = sim(myFunc,InputData);
Running this code about 5 times generated exactly the same output. But when I change the activation functions to
myFunc = newff(NormInp, [5,1],{'purelin','tansig'});
normalize the input/output data to [-1, +1] and de-normalize the output, each element of the output matrix turns out to be the same, i.e. I get a (1xN) matrix in which every element equals 77. Does anyone know what is responsible for the wrong output?
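For reference, the normalization/de-normalization step is roughly the following (a sketch using mapminmax; the variable names are illustrative and may not match my actual script exactly):
% mapminmax maps each row to [-1, +1] and returns the settings needed to reverse it
[NormIn,  inSettings]  = mapminmax(InputData);
[NormOut, outSettings] = mapminmax(OutputData);
myFunc  = train(myFunc, NormIn, NormOut);
NormSim = sim(myFunc, NormIn);
myOut   = mapminmax('reverse', NormSim, outSettings);  % back to original units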

Answers (0)


