Why are test, train and validation performance so different from global performance in my neural network?
6 views (last 30 days)
ponseta07
on 23 May 2020
Commented: ponseta07
on 1 Jun 2020
Hi,
I've trained a shallow neural network using code generated by MATLAB.
x = inputs(:,[1:13 42:42+12])';
t = outputs';
t(2,:) = ~t(1,:);
% Choose a Training Function
trainFcn = 'trainrp';
% Create a Pattern Recognition Network
hiddenLayerSize = 6;
net = patternnet(hiddenLayerSize, trainFcn);
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'softmax';
% Choose Input and Output Pre/Post-Processing Functions
net.input.processFcns = {'removeconstantrows'};
net.output.processFcns = {};
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 65/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 20/100;
% Choose a Performance Function
net.performFcn = 'crossentropy'; % Cross-Entropy
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
tind = vec2ind(t);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
The resulting values for performance are:
performance =
0.0102
trainPerformance =
1.7494
valPerformance =
4.2279
testPerformance =
3.9898
I know that perform(net,t,y,ew) returns the network performance calculated according to net.performFcn (mse), but I don't understand why the performance values computed with the masks (to select only a specific subset) are so much larger than the global value.
In addition, if I try mse directly:
trp = mse(net,trainTargets,y)
vp = mse(net,valTargets,y)
tsp = mse(net,testTargets,y)
trp =
0.0050
vp =
0.0048
tsp =
0.0056
the values are completely different, when they are supposed to be interchangeable (https://es.mathworks.com/help/deeplearning/ref/mse.html).
Did I do anything wrong? Did I misunderstand the results?
Thanks in advance!
0 comments
Accepted Answer
Srivardhan Gadila
on 30 May 2020
As far as I know, the way you are calculating trainPerformance, valPerformance & testPerformance may not be correct. Instead, I would suggest using tr.trainInd, tr.valInd & tr.testInd rather than tr.trainMask, tr.valMask & tr.testMask, as follows:
trainPerformance = perform(net,t(:,tr.trainInd),y(:,tr.trainInd))
valPerformance = perform(net,t(:,tr.valInd),y(:,tr.valInd))
testPerformance = perform(net,t(:,tr.testInd),y(:,tr.testInd))
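The idea of evaluating each split by column indices (rather than by multiplying with a mask) can be sketched outside MATLAB as well. The Python below is purely illustrative, not a reproduction of the toolbox: the 65/15/20 split mirrors the divideParam ratios in the question, and the per-element cross-entropy normalization is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 2-class one-hot targets and noisy "network outputs",
# shaped (2, n) like the t and y matrices in the question
n = 100
labels = rng.integers(0, 2, n)
t = np.eye(2)[labels].T
y = np.clip(t + rng.normal(0, 0.1, t.shape), 1e-6, 1 - 1e-6)

# Random split into train/val/test indices, as dividerand would do
perm = rng.permutation(n)
train_ind, val_ind, test_ind = perm[:65], perm[65:80], perm[80:]

def cross_entropy(t, y):
    # Mean of -t*log(y) over all elements (assumed normalization)
    return np.mean(-t * np.log(y))

# Per-split performance computed by column indexing, not masking:
# each split is scored only on its own columns of t and y
train_perf = cross_entropy(t[:, train_ind], y[:, train_ind])
val_perf = cross_entropy(t[:, val_ind], y[:, val_ind])
test_perf = cross_entropy(t[:, test_ind], y[:, test_ind])
print(train_perf, val_perf, test_perf)
```

Because each split is scored on its own columns only, the three numbers stay on the same scale as the global value, which is what the index-based approach above achieves in MATLAB.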
Regarding the performance function: according to your code above, the performance function you are using is 'crossentropy', so perform(net,t,y,ew) returns the network performance calculated according to net.performFcn, which is 'crossentropy', not 'mse'.
Also, for the additional part, you may have to use
trp = crossentropy(net,t(:,tr.trainInd),y(:,tr.trainInd),{1},'regularization',net.performParam.regularization,'normalization',net.performParam.normalization)
and not
trp = mse(net,trainTargets,y)
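To see why the two metrics cannot be compared directly, here is a small Python sketch (not MATLAB; the per-element averaging is an assumption chosen for illustration) computing cross-entropy and MSE on exactly the same targets and outputs:

```python
import numpy as np

# Toy one-hot targets (2 classes x 5 samples) and softmax-like outputs
t = np.array([[1, 0, 1, 0, 1],
              [0, 1, 0, 1, 0]], dtype=float)
y = np.array([[0.9, 0.2, 0.8, 0.1, 0.7],
              [0.1, 0.8, 0.2, 0.9, 0.3]])

# Cross-entropy: mean of -t*log(y) over all elements (assumed normalization)
ce = np.mean(-t * np.log(y))

# MSE on the same data: mean of squared errors over all elements
mse = np.mean((t - y) ** 2)

print(ce, mse)
```

The two values differ even though the inputs are identical, because they are different loss functions on different scales; so a 'crossentropy' network's perform output should not be expected to match mse on the same data.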
Refer to the following resources: net.performFcn, crossentropy, Train Neural Network Using mse Performance Function, Analyze Shallow Neural Network Performance After Training, the training record, and help(net.performFcn) in the command window.
More Answers (0)