Neural Network not updating weights after configuration
Dear all,
I created a very simple feedforwardnet using the following code:
% set up
numknot = 14;
net = feedforwardnet([numknot]);
% reLU = poslin
net.layers{1}.transferFcn = 'poslin';
net = init(net);
net = configure(net, X, Y);
[rowW, colW] = size(net.iw{1,1});
I then wanted to set the initial input weights for the 14 nodes. The biases are set to zero.
net.iw{1,1} = iwIN1;
% iwIN1 is a numknot x colX matrix (e.g. 14 x 20) with values between -1 and 1
% setting biases to 0
[rowB, ~] = size(net.b{1,1});
net.b{1,1} = zeros(rowB,1);
Unfortunately after I run:
[net, record] = train(net, X, Y,'useParallel','yes','useGPU','only');
The network runs only 3 iterations without updating anything, and it has little to no accuracy. iwIN1 is the same as iwIN2. The biases are still zero. If I run the code with zeros instead of my weight matrix iwIN1, it does update the weights & biases...
I don't know how to proceed now. Thanks for any help in advance.
Answers (2)
Uday Pradhan
on 11 Sep 2020
Edited: Uday Pradhan on 11 Sep 2020
Hi Lukas,
I have tried to implement your neural net with sample data (bodyfat_dataset) and it seems to run fine for the size and shallowness of the network used. I have also attached the code in the answer; do have a look.
A good practice would be to place the line
net = init(net);
after calling the "configure" function, as is suggested here. In addition, you can also try altering the default training parameters via "net.trainParam". For example, you may set:
net.trainParam.max_fail = 10; %default is 6
and observe how the training and validation losses progress. Hope this helps!
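Putting those pieces together, a minimal sketch of the suggested order (reusing X, Y, and iwIN1 from the question; the key point is that the custom weights are assigned after configure and init, so nothing overwrites them before train):

net = feedforwardnet(14);
net.layers{1}.transferFcn = 'poslin';      % ReLU
net = configure(net, X, Y);                % size the net to the data first
net = init(net);                           % then (re)initialise
net.iw{1,1} = iwIN1;                       % custom input weights go in last
net.b{1} = zeros(size(net.b{1}));          % zero biases, also after init
net.trainParam.max_fail = 10;              % default is 6
[net, record] = train(net, X, Y);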
5 comments
Uday Pradhan
on 13 Sep 2020
Edited: Uday Pradhan on 13 Sep 2020
Hi Lukas,
Thanks for your detailed reply.
Yes, you are right: the network should be able to train (however loosely) with the weights you have specified as well. As for the workaround you described, the network is not training with iwIN as the initial weights because calling net = init(net) just before training undoes all the changes you made to the weights. You can verify this by checking the values of net.iw{1} right after net = init(net): unless you have specified a custom initFcn, they will differ from iwIN.
Next, I was able to reproduce the situation you faced, i.e., the weights and biases are not updated after training. However, the following approach solved this issue:
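To see this overwrite happen, a quick check (assuming X, Y, and iwIN from the question are in the workspace):

net = feedforwardnet(14);
net = configure(net, X, Y);
net.iw{1,1} = iwIN;          % custom weights in place
disp(net.iw{1}(1,1:3))       % prints your values
net = init(net);             % re-initialisation overwrites them
disp(net.iw{1}(1,1:3))       % prints different, freshly initialised values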
net = feedforwardnet(14);
numNN = 10;
NN = cell(1, numNN);
perfs = zeros(1, numNN);
for i = 1:numNN
    fprintf('Training %d/%d\n', i, numNN);
    net = configure(net, xtrain, ttrain);
    net.iw{1,1} = IWIN;                  % custom initial input weights
    [rowB, ~] = size(net.b{1,1});
    net.b{1,1} = zeros(rowB, 1);         % zero initial biases
    NN{i} = train(net, xtrain, ttrain, 'useParallel', 'yes', 'useGPU', 'only');
    y2 = NN{i}(xtest);
    perfs(i) = mse(NN{i}, ttest, y2);
end
I trained the neural net 10 consecutive times and stored the trained networks as well as their performance metrics. Remember that even though we are initialising the first layer's weights and biases, the net still has a second layer whose weights and biases are initialised by default, so each call to train in this loop starts with a different set of initial weights and biases for the second layer, and with a different division of the dataset into training, validation, and test sets.
After this training, you can investigate the network which has the lowest mse of all and check if its initial weights and biases have been updated. I hope this will help!
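Picking out that lowest-MSE network from the loop above is a one-liner (bestNet and idx are illustrative names, not from the original code):

[bestPerf, idx] = min(perfs);   % lowest test MSE across the 10 runs
bestNet = NN{idx};
fprintf('Best network: run %d (test MSE %.4g)\n', idx, bestPerf);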