CNN Training progress plots - Validation accuracy Jumps at last iteration

Mariam Ahmed on 30 Jan 2019
Answered: Kenta on 16 Jul 2020
Dear colleagues,
I'm training a CNN in MATLAB and I noticed the behavior shown in the figures below. As the training progress plots show, the validation accuracy jumps at the very last iteration regardless of the number of epochs used in training. It is confusing. What could be the reason for that?
Thank you.
#Epochs = 5
[Training progress plot]
#Epochs = 10
[Training progress plot]
Another trial with #Epochs = 10
[Training progress plot]
2 comments
Don Mathis on 19 Feb 2019
Can you post your network layers and training options?
Mariam Ahmed on 19 Feb 2019
Yes, here it is:
NumClasses = 2;
nor = batchNormalizationLayer('Name','BN'); % Batch Normalization Layer
act1 = leakyReluLayer('Name','LeakyRELU'); % Non-linear Activation Layer
p = averagePooling2dLayer([2 2],'Name','POOL');
outputLayer = classificationLayer('Name','Classify');
s1 = softmaxLayer('Name','Softmax');
%---Dropout layer
Dlayer = dropoutLayer(0.35,'Name','drop1');
%%------------------------------------------------------------------------
%---Define Layers Activation functions
%%------------------------------------------------------------------------
actCONV = act1;
actFC1 = act1;
%%------------------------------------------------------------------------
%---Start Creating the CNN structure
%%------------------------------------------------------------------------
%----Number of filters in Conv layer
n_f = 6;
%----Number of units in FC1 layer
f1_numO_CH = 7;
%---Input Layer (inputSize1, inputSize2, and numCH are set earlier in the script)
inputLayer = imageInputLayer([inputSize1 inputSize2 numCH],'Name','Input');
%---1st Layer "Conv1"
c1_numF = n_f;
c1 = convolution2dLayer([f ff],c1_numF,'WeightL2Factor',0.7,'Name','Conv1');
c1.Weights = randn([f ff numCH c1_numF]) * 0.01;
c1.Bias = zeros([1 1 c1_numF]);
%---FC1
f1_numO = f1_numO_CH;
numFinal = c1_numF*ConvOUT; % ConvOUT is computed earlier in the script
f1 = fullyConnectedLayer(f1_numO,'Name','FC1');
f1.Weights = randn([f1_numO numFinal]) * 0.01;
f1.Bias = zeros([f1_numO 1]);
%---FC2
f2_numO = NumClasses;
f2 = fullyConnectedLayer(f2_numO,'Name','FC2');
f2.Weights = randn([f2_numO f1_numO]) * 0.01;
f2.Bias = zeros([f2_numO 1]);
%%------------------------------------------------------------------------
%---Define the Conv Net Structure
%%------------------------------------------------------------------------
convnet=[inputLayer;c1;nor;actCONV;Dlayer;f1;actFC1;f2;s1;outputLayer];
opts = trainingOptions('adam',...
'MaxEpochs',3,...
'MiniBatchSize',2^8,...
'Shuffle','every-epoch',...
'InitialLearnRate',0.05,...
'LearnRateSchedule','piecewise',...
'LearnRateDropPeriod',1,...
'LearnRateDropFactor',0.68,...
'ValidationData',{Xtest,Ytest},...
'ValidationPatience',Inf,...
'Verbose',false,...
'Plots','training-progress');
Thank you.


Accepted Answer

Kenta on 16 Jul 2020
If your network includes a batch normalization layer, the final validation accuracy and the one reported during training can differ. During training, batch normalization layers normalize using the statistics of each mini-batch; when training finishes, trainNetwork finalizes these layers using statistics computed over the whole training set. The validation metric at the very last iteration is evaluated on this finalized network, which is why it can jump relative to the values plotted during training. Hope it helps!
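One way to check this is to evaluate the returned (finalized) network on the validation set yourself and compare with the last plotted value. A minimal sketch, assuming training data `Xtrain`/`Ytrain` exist alongside the `Xtest`/`Ytest` used above:

```matlab
% trainNetwork returns the finalized network: its batch normalization
% layers now use statistics computed over the whole training set rather
% than per-mini-batch statistics.
net = trainNetwork(Xtrain, Ytrain, convnet, opts);

% Accuracy of the finalized network on the validation data. This should
% match the jumped value at the last iteration of the plot, not the
% mini-batch-based values reported during training.
YPred = classify(net, Xtest);
finalAcc = mean(YPred == Ytest);
fprintf('Finalized validation accuracy: %.2f%%\n', 100*finalAcc);
```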

More Answers (0)
