How can we save the neural network with the best validation loss?

7 views (last 30 days)
Arjun Desai on 9 Jul 2018
Commented: Nethtrick on 22 Sep 2020
Currently I am using the trainNetwork command to train my network. I want to save the model with the best running validation loss. For example, say that at epoch 10 the validation loss is 0.2 and that is the lowest validation loss up to that point; I would save that network. Then at epoch 11 the validation loss drops to 0.1, so I would save that model as well (i.e., always keep the running-best validation-loss model).
My network contains batchNormalization layers, and as a result I cannot use the models saved at checkpoints, because their batchNormalization layers are not initialized.
Is there a workaround for this? I know that TensorFlow/Keras supports saving the model with the best validation loss even when it contains batchNormalization layers.
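
For reference, releases newer than the R2018a this question was posted against added an 'OutputNetwork' option to trainingOptions that does exactly this, returning the network from the iteration with the lowest validation loss. A minimal sketch, assuming a release that supports the option and existing datastores dsTrain/dsVal and a layer array layers:

% 'OutputNetwork' (added to trainingOptions after R2018a) makes trainNetwork
% return the network from the iteration with the lowest validation loss.
opts = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'ValidationData', dsVal, ...
    'OutputNetwork', 'best-validation-loss');

net = trainNetwork(dsTrain, layers, opts);   % net is the best-validation model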

Answers (1)

Pablo Rivas on 2 Jun 2019
I don't think this is possible yet.
** feature request **
It seems like, for now, you will have to save a checkpoint of your network at every epoch; then, at the end, the training summary shows which epoch gave you the best validation accuracy/error, and you can go back and find the checkpoint file that corresponds to that epoch.
However, this is space-consuming and not ideal at all. It would be really nice to have this feature. Right?
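
To make that concrete, here is a minimal sketch of the checkpoint workaround (unverified code; it assumes datastores dsTrain/dsVal, a layer array layers, and a hypothetical checkpoint folder). Note that checkpoints are written once per epoch, so the loaded network corresponds to the end of the epoch containing the best iteration, which only approximates the best-iteration weights:

% Save one checkpoint per epoch and keep the per-iteration validation loss.
checkpointDir = fullfile(tempdir, 'net-checkpoints');   % hypothetical folder
if ~exist(checkpointDir, 'dir'), mkdir(checkpointDir); end

opts = trainingOptions('sgdm', ...
    'MaxEpochs', 30, ...
    'ValidationData', dsVal, ...
    'CheckpointPath', checkpointDir, ...   % writes one .mat file per epoch
    'Plots', 'training-progress');

[net, info] = trainNetwork(dsTrain, layers, opts);

% info.ValidationLoss is a per-iteration vector (NaN on iterations without a
% validation pass); min skips the NaNs and returns the best iteration.
[bestLoss, bestIter] = min(info.ValidationLoss);

% Checkpoint files are named net_checkpoint__<iteration>__<timestamp>.mat, so
% parse the iteration numbers and load the first checkpoint saved at or after
% the best iteration.
files = dir(fullfile(checkpointDir, 'net_checkpoint__*.mat'));
iters = arrayfun(@(f) sscanf(f.name, 'net_checkpoint__%d'), files);
[sortedIters, order] = sort(iters);          % dir() sorts by name, not iteration
pick  = order(find(sortedIters >= bestIter, 1));
chk   = load(fullfile(checkpointDir, files(pick).name), 'net');
bestNet = chk.net;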
1 comment
Nethtrick on 22 Sep 2020
Unfortunately, the checkpoint approach does not work with the batch normalization layers. I am running into the same issue. It's an oversight not to have this built in, because "training" itself is defined as minimizing the loss function.
I also posted this question before I found your post:
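
For completeness, one possible (untested) way around the batch-norm limitation follows the documented resume-from-checkpoint pattern: reload the chosen checkpoint and train it for one more epoch at a near-zero learning rate, so that the batchNormalization statistics are recomputed at the end of that run while the weights stay essentially fixed. The file name and dsTrain below are placeholders:

% Untested sketch: finalize batch-norm statistics by resuming training.
chk = load('net_checkpoint__1170__2020_09_22__12_00_00.mat', 'net');   % hypothetical file

finalizeOpts = trainingOptions('sgdm', ...
    'InitialLearnRate', 1e-8, ...   % effectively freeze the learned weights
    'MaxEpochs', 1);

% Training from net.Layers runs one more pass over the data; the batch-norm
% means and variances are recomputed and finalized during that pass.
finalNet = trainNetwork(dsTrain, chk.net.Layers, finalizeOpts);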
