Same neural network training result every time with the 'dividerand' option

I wonder why my NN shows the same performance (error) on every run, even though I split the data into training, validation, and test sets randomly using 'dividerand'.
I understand the weights and biases are always the same because I call rng('default'); however, I think the results should differ from run to run because of the randomness of the data selection.
Or is it because I misunderstand how NNs work?
The code is the following:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 28-Dec-2016 14:15:01
%
% This script assumes these variables are defined:
%
% InputT12 - input data.
% TargetT12 - target data.
rng('default')
x = InputT12;
t = TargetT12;
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotfit(net,x,t)
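As background (a sketch, assuming the Deep Learning Toolbox): 'dividerand' draws its split indices from MATLAB's global random stream, so fixing the seed with rng('default') fixes not only the initial weights but also the train/validation/test split:

```matlab
% dividerand splits Q sample indices using the global random stream
rng('default')
[trA,vaA,teA] = dividerand(100, 0.70, 0.15, 0.15);  % split 100 samples
rng('default')
[trB,vaB,teB] = dividerand(100, 0.70, 0.15, 0.15);  % same seed again
isequal(trA, trB)   % true: identical split whenever the RNG state matches
```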

Accepted Answer

Greg Heath on 29 Dec 2016
When you reboot, the RNG goes into the default state.
Therefore, begin each run with an RNG state different from the default and from 0.
Hope this helps
Thank you for formally accepting my answer
Greg
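A minimal sketch of this suggestion (assuming x and t are defined as in the script above): seed the RNG away from its default state before training, either from the clock or with an explicit seed per trial.

```matlab
% Option 1: non-repeatable runs - seed from the system clock
rng('shuffle')
[net,tr] = train(fitnet(10,'trainlm'), x, t);

% Option 2: reproducible-but-different trials - one explicit seed per run
for k = 1:3
    rng(k)      % k = 1, 2, 3 ... each seed gives a different
                % weight initialization and data split
    [net,tr] = train(fitnet(10,'trainlm'), x, t);
end
```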

More Answers (1)

Greg Heath on 29 Dec 2016
You are mistaken.
When the RNG is repeatedly set to the same state, the resulting sequence of PSEUDO-RANDOM numbers repeats.
HOPE THIS HELPS
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
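For illustration, the repetition is easy to see with plain rand calls:

```matlab
rng('default'); a = rand(1,5);   % reset to the default state, draw 5 numbers
rng('default'); b = rand(1,5);   % reset again, draw 5 more
isequal(a, b)   % true: the same seed state yields the identical sequence
```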

1 comment

I understand your point. However, even after I delete the rng('default') line, the same results still appear.
For example:
  • 1st result: 67.6067
  • 2nd result: 46.0002 -> a different result, because the RNG line was deleted.
  • 3rd result: 26.4202
After rebooting MATLAB, the same results appear again:
  • 1st result: 67.6067
  • 2nd result: 46.0002
  • 3rd result: 26.4202
Is something wrong with my code?
[Merged from duplicate question — same question, run results, and script as above, except that the posted code omits the rng('default') line.]

