BatchNormalization layer with DropOut layer issue
I'm having issues with the BatchNormalization layer while training deep learning models (UNET, SegNet), in both their 2D and 3D variants.
This layer consistently causes much lower validation accuracy and a sharp jump in error values at the end of training, which leaves me unable to predict with the model. And if I try to load a particular checkpoint instead, some values needed to use it are missing (the mean, for example).
Is there a way to use both DropOut and BatchNormalization layers in the same model without running into this issue? I'm using MATLAB R2020a; is there perhaps a fix in more recent versions?
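For context on what I've tried: one arrangement that is often suggested for combining the two layers is to place the dropout layer after the batch normalization and activation, so the batch statistics are not computed on dropout-perturbed activations. A minimal illustrative layer stack (not my actual UNET/SegNet architecture, just a sketch of the ordering):

```matlab
% Illustrative stack only: dropout is placed AFTER batchNorm + ReLU,
% rather than before the normalization layers.
layers = [
    imageInputLayer([128 128 1])
    convolution2dLayer(3, 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.2)          % dropout after the normalized activations
    convolution2dLayer(3, 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];
```

Even with this ordering, the validation gap at the end of training remains.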
Answers (0)