Custom regression output layer
Hello everyone, following the example at https://it.mathworks.com/help/deeplearning/ug/define-custom-regression-output-layer.html, I tried to build a custom regression output layer that uses the MSE error. Starting from the provided script, I wrote this loss function:
function loss = forwardLoss(layer, Y, T)
loss = mse(Y,T);
end
But when I tried training on a data set in MATLAB with
net = trainNetwork(bodyfatInputs,bodyfatTargets,layers,options);
it gave me:
Error using trainNetwork (line 170)
Error using 'forwardLoss' in Layer mseRegressionLayer. The function threw an error and could not be executed.
I built the layers as follows:
layers = [
sequenceInputLayer(13)
lstmLayer(100)
fullyConnectedLayer(1)
mseRegressionLayer('mse')];
What did I do wrong?
Thanks for your help
7 comments
Mohammad Sami on 7 Sep 2020
Edited: Mohammad Sami on 7 Sep 2020
It seems the layer should be valid; maybe something else is wrong. Try using the built-in regression layer (which also uses MSE) to verify that nothing else in your setup is the problem.
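For example, a minimal sketch of that check, reusing the layers and options from your question with the built-in regressionLayer swapped in:
% Sanity check: same network, but with the built-in MSE regression output layer.
% If training still fails here, the problem is not in the custom mseRegressionLayer.
layers = [
    sequenceInputLayer(13)
    lstmLayer(100)
    fullyConnectedLayer(1)
    regressionLayer];
net = trainNetwork(bodyfatInputs,bodyfatTargets,layers,options);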
Answers (1)
Uday Pradhan on 10 Sep 2020
Hi,
I tried to implement your network on my end and found two problems. First, when using "mse" as the loss function, it is advisable to specify the 'DataFormat' argument as well (for example, see this page). So, modify this line in your definition of 'mseRegressionLayer.m':
loss = mse(Y,T);
%change to
loss = mse(Y,T,'DataFormat','T'); %for sequences
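For context, here is a rough sketch of how the whole 'mseRegressionLayer.m' might look with that change, modeled on the template from the documentation page linked in the question (the Description text is just a placeholder):
classdef mseRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression output layer using mean squared error (sketch,
    % following the "Define Custom Regression Output Layer" template).
    methods
        function layer = mseRegressionLayer(name)
            % Set the layer name and a short description.
            layer.Name = name;
            layer.Description = 'Mean squared error';
        end
        function loss = forwardLoss(layer, Y, T)
            % Y holds the predictions, T the targets; 'DataFormat','T'
            % tells mse how to interpret the dimensions for sequence data.
            loss = mse(Y, T, 'DataFormat', 'T');
        end
    end
end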
Coming to the problem you are trying to solve:
The "bodyfat_dataset" consists of two arrays, X and T, where X is 13-by-252 and the targets T are 1-by-252. From my understanding, you would like to create an LSTM network that accepts a sequence of 13 features and predicts the body fat percentage. This is a sequence-to-one regression problem, and, as advised here, I redesigned your network as follows:
layers = [
sequenceInputLayer(13)
lstmLayer(100,'OutputMode',"last") % Output the last time step of the sequence
fullyConnectedLayer(1)
mseRegressionLayer('mse')];
However, with this output mode the input must be supplied as a cell array. To do this you may use the following:
N = 240;                        % number of training sequences
cellArrTrain = cell(N,1);       % each cell will hold one 13-by-1 sequence
for i = 1:N
    seq = xtrain(:,i);          % xtrain: 13-by-N matrix of training inputs
    seq = num2cell(seq,1);      % wrap the column in a 1-by-1 cell
    cellArrTrain(i) = seq;      % place the sequence into the cell array
end
% ------ FOR TRAINING PURPOSES -------%
net = trainNetwork(cellArrTrain,Ytrain,layers,options); % be cautious of the dimensions of
% cellArrTrain and Ytrain: they should match.
% Similarly convert the test data into a cell array too
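For reference, the xtrain / Ytrain variables above could be prepared along these lines (the 240/12 train/test split and the variable names xtest / Ytest are assumptions for illustration):
load bodyfat_dataset                     % provides bodyfatInputs (13-by-252) and bodyfatTargets (1-by-252)
N = 240;                                 % assumed size of the training split
xtrain = bodyfatInputs(:,1:N);           % 13-by-240 training inputs
Ytrain = bodyfatTargets(1:N)';           % 240-by-1 responses (column vector, one per sequence)
xtest  = bodyfatInputs(:,N+1:end);       % remaining 12 observations for testing
Ytest  = bodyfatTargets(N+1:end)';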
Hope this helps!
5 comments
Uday Pradhan on 4 Oct 2020
Looks like the regression loss is too high. Try normalizing the loss in the custom layer like this:
loss = mse(Y,T,'DataFormat','T')/size(Y,2); % for the mini-batch
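That is, inside 'mseRegressionLayer.m' the forwardLoss method would become something like this sketch:
function loss = forwardLoss(layer, Y, T)
    % Normalize the loss by the mini-batch size (second dimension of Y).
    loss = mse(Y, T, 'DataFormat', 'T') / size(Y, 2);
end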
Also, start with a smaller learning rate, around 0.001. Example of a standard set of training options:
options = trainingOptions('adam', ...
'MaxEpochs',500, ...
'GradientThreshold',1, ...
'InitialLearnRate',0.001, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropPeriod',100, ...
'LearnRateDropFactor',0.1,...
'Verbose',1, ...
'Plots','training-progress');
You can play around with the number of layers, the number of LSTM nodes, the learning rate, and the number of epochs. Also, it is advisable to use validation sets, because overfitting is quite common as the number of layers increases.
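As a sketch, validation data prepared in the same cell-array format (here called cellArrVal / Yval, assumed names) can be passed in through trainingOptions so overfitting shows up during training:
options = trainingOptions('adam', ...
    'MaxEpochs',500, ...
    'InitialLearnRate',0.001, ...
    'ValidationData',{cellArrVal,Yval}, ... % assumed held-out split, same format as cellArrTrain/Ytrain
    'ValidationFrequency',20, ...
    'Plots','training-progress');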