Can we say that overfitting occurs in this plot?

Answers (1)

Greg Heath on 23 Aug 2018
This is a case of
*OVERTRAINING AN OVERFIT NET*
There are at least 3 ways to avoid this:
1. *DO NOT OVERFIT:*
Make sure that the number of unknown weights, Nw, does not
exceed the number of training equations, Ntrneq.
2. *DO NOT OVERTRAIN:*
In particular, the problem is not necessarily the
overfitting itself. Overfitting is easily mitigated by NOT
OVERTRAINING:
a. Use a VALIDATION set to implement "EARLY STOPPING".
b. Use TRAINBR to implement "BAYESIAN REGULARIZATION".
See
https://www.mathworks.com/matlabcentral/answers/280818-how-to-solve-overtrained-nn-with-validation-stop
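The three precautions above can be sketched in MATLAB as follows (a minimal illustration, not Greg's exact code; `x`, `t`, and the hidden-layer size `H` are placeholders, and the Nw/Ntrneq formulas assume a single-hidden-layer fitnet):

```matlab
% Assume x (I-by-N inputs) and t (O-by-N targets) are given.
[I, N] = size(x);  [O, ~] = size(t);
H = 10;                           % hidden nodes (placeholder)

% 1. DO NOT OVERFIT: keep weights <= training equations
Ntrn   = floor(0.70*N);           % training cases (default 70% split)
Ntrneq = Ntrn*O;                  % number of training equations
Nw     = (I+1)*H + (H+1)*O;       % unknown weights of a fitnet
assert(Nw <= Ntrneq, 'Net is overfit: reduce H');

% 2a. DO NOT OVERTRAIN: early stopping via a validation set
net = fitnet(H);                  % default training with dividerand
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;  % nonzero val set enables early stopping
net.divideParam.testRatio  = 0.15;
[net, tr] = train(net, x, t);

% 2b. Or: Bayesian regularization via TRAINBR
netbr = fitnet(H, 'trainbr');
[netbr, trbr] = train(netbr, x, t);
```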
Hope this helps.
Thank you for formally accepting my answer
Greg

4 comments

KAE on 24 Aug 2018
What plot features indicate this overtraining is occurring, compared to the plot in the documentation, which is copied below?
Greg Heath on 25 Aug 2018
The increase in the validation error (green) for 6 (the default) consecutive epochs (9 to 15) while the training error (blue) is still decreasing.
Hope this helps.
Greg
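For reference, the 6-epoch limit Greg mentions is the `max_fail` training parameter, and the training record `tr` returned by `train` shows where the validation stop occurred (a fragment, assuming `net`, `x`, and `t` are already defined as in the answer above):

```matlab
net.trainParam.max_fail   % default 6: consecutive val-error increases allowed
[net, tr] = train(net, x, t);
tr.best_epoch             % epoch with minimum validation error
tr.vperf                  % validation error per epoch (the green curve)
tr.perf                   % training error per epoch (the blue curve)
```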
KAE on 5 Sep 2018
Edited: KAE on 5 Sep 2018
And in the first plot (original question), the validation error only increases for 5 consecutive epochs, during epochs 6-11, while the training error is decreasing. Since 5 is less than the default of 6, the first plot shows overtraining. Is this interpretation right?
Greg Heath on 23 Sep 2018
5-11, not 6-11
Greg



Asked: 22 Aug 2018
Commented: 23 Sep 2018
