Why does my CNN model's validation accuracy vary so much every time I run the model with the same data?

Hello,
I am training a CNN model with a total of 80 images and 2 categories of classification. Every time I run the model, the validation accuracy varies significantly, e.g. the first time it was 95.8% and the second time it was 61.5%. Can someone please help me figure out why there is such a large difference? I would really appreciate any assistance. Thank you.
I am using the following code:
% Load the image data and split it into training and validation sets
clear all
clc
rootfolder = fullfile('CNN Data');
imds = imageDatastore(rootfolder,'IncludeSubfolders',true,'LabelSource','foldernames');
[imdsTrain, imdsValidation] = splitEachLabel(imds,0.70,'randomized');
% Preview 16 random training images
numTrainImages = numel(imdsTrain.Labels);
idx = randperm(numTrainImages,16);
figure
for i = 1:16
    subplot(4,4,i)
    I = readimage(imdsTrain,idx(i));
    imshow(I)
end
% Load pretrained AlexNet; note that the seed is set only after the random
% split above, so the train/validation split still changes from run to run
net = alexnet;
rng('default');
net.Layers(1)                       % inspect the input layer
net.Layers(end)                     % inspect the final classification layer
numel(net.Layers(end).ClassNames)   % number of ImageNet classes
inputSize = net.Layers(1).InputSize;
% Resize images (and convert grayscale to RGB) to match the network input
augimdsTrain = augmentedImageDatastore(inputSize,imdsTrain,'ColorPreprocessing','gray2rgb');
augimdsValidation = augmentedImageDatastore(inputSize,imdsValidation,'ColorPreprocessing','gray2rgb');
w1 = net.Layers(2).Weights;
w1 = mat2gray(w1);                  % first-layer filters, rescaled for display
% Extract 'drop7' activations from the pretrained network as features
featurelayer = 'drop7';
trainingFeatures = activations(net,augimdsTrain,featurelayer,'MiniBatchSize',32,'OutputAs','columns');
% Training options for fine-tuning the network
miniBatchSize = 32;
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.001, ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',3, ...
    'MaxEpochs',5, ...
    'MiniBatchSize',miniBatchSize, ...
    'Plots','training-progress', ...
    'Shuffle','once');
% Replace the last three layers of AlexNet with new layers for 2 classes
numClasses = 2;
layersTransfer = net.Layers(1:end-3);
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];
trainedNet = trainNetwork(augimdsTrain,layers,options);
% Train a linear ECOC classifier on the pretrained-network features
trainingLabels = imdsTrain.Labels;
classifier = fitcecoc(trainingFeatures,trainingLabels,'Learners','Linear','Coding','onevsall','ObservationsIn','columns');
% Note: validation features come from the fine-tuned network (trainedNet),
% whereas the training features above came from the pretrained net
validationFeatures = activations(trainedNet,augimdsValidation,featurelayer,'MiniBatchSize',32,'OutputAs','columns');
predictLabels = predict(classifier,validationFeatures,'ObservationsIn','columns');
validationLabels = imdsValidation.Labels;
confMatt = confusionmat(validationLabels,predictLabels);
validationAccuracy = sum(diag(confMatt))/sum(confMatt(:));   % fraction of validation images classified correctly
plotconfusion(validationLabels,predictLabels)
figure, cm = confusionchart(validationLabels,predictLabels);
cm.Title = 'Confusion Chart';   % title() is not supported for ConfusionMatrixChart; use the Title property
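(For context on the run-to-run variation: splitEachLabel with the 'randomized' option draws a new train/validation split on every run, and the rng call in the code above comes only after that split. A minimal sketch of how the split could be made repeatable, under the assumption that a fixed split is acceptable:)
% Sketch (assumption): seed the random number generator before the random
% split so the same images land in training/validation on every run
rng('default');
[imdsTrain, imdsValidation] = splitEachLabel(imds,0.70,'randomized');
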
2 comments
Mahesh Taparia on 15 Dec 2020
Hi
Can you try with more epochs? It might be that the loss function is not saturating in your 2nd case because of the small number of epochs. The weights are initialized differently on every run, so the accuracy will differ for each initialization, but the difference should not be large.
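As a rough sketch of this suggestion, the training options from the question could be rerun with a larger number of epochs (20 below is only an illustrative value, not one given in this thread); the other variables are the ones already defined in the question's code:
% Sketch only: same options as in the question, but with more epochs so the
% loss has a chance to converge before validation accuracies are compared
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.001, ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',3, ...
    'MaxEpochs',20, ...              % illustrative value, increased from 5
    'MiniBatchSize',32, ...
    'Plots','training-progress', ...
    'Shuffle','once');
trainedNet = trainNetwork(augimdsTrain,layers,options);
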
Vinay Chawla on 15 Dec 2020
Thank you for getting back on this, I appreciate it.
I actually tried a different initial learn rate, 0.0001, with the same number of epochs, and it worked well: the difference between runs is now reduced to only a few percentage points.
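For reference, the adjustment described above amounts to changing one value in the training options from the question (everything else is unchanged):
% Sketch of the change described above: lower the initial learning rate
% from 0.001 to 0.0001, keeping the remaining options as in the question
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.0001, ...   % reduced learning rate
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',3, ...
    'MaxEpochs',5, ...
    'MiniBatchSize',32, ...
    'Plots','training-progress', ...
    'Shuffle','once');
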


Answers (0)

Version: R2020a
