Forward function with frozen batch normalization layers
In my application I have both batch normalization and dropout layers, and I would like to perform MC dropout using the forward function. Ideally I would freeze the TrainedMean and TrainedVariance parameters of the batch normalization layers, but I cannot work out whether this is possible. In my network the batch normalization layers come after the conv layers, and the dropout layer comes after the recurrent layer. Thank you in advance.
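One point worth noting: in a dlnetwork, forward runs batch normalization in training mode (it normalizes with mini-batch statistics and updates the State), so TrainedMean and TrainedVariance cannot be "frozen" inside a forward call. A common workaround is to go the other way: keep calling predict (which always uses the frozen TrainedMean/TrainedVariance) and make dropout stochastic at prediction time via a custom layer. The sketch below is a minimal, unverified example assuming a dlnetwork named net with a dropout layer named "dropout"; the class name mcDropoutLayer is hypothetical.

```matlab
% mcDropoutLayer.m - custom layer that applies dropout even in
% predict(), so MC dropout works while BN stays in inference mode.
classdef mcDropoutLayer < nnet.layer.Layer
    properties
        Probability  % dropout probability p
    end
    methods
        function layer = mcDropoutLayer(p, name)
            layer.Name = name;
            layer.Probability = p;
        end
        function Z = predict(layer, X)
            % Inverted dropout at inference time: random mask,
            % rescaled so the expected activation is unchanged.
            p = layer.Probability;
            mask = rand(size(X), 'like', X) >= p;
            Z = X .* mask ./ (1 - p);
        end
    end
end
```

Usage sketch (layer/variable names are placeholders): swap the built-in dropout layer for the custom one, then call predict repeatedly to collect MC samples while batch normalization uses its trained statistics.

```matlab
lgraph = layerGraph(net.Layers);
lgraph = replaceLayer(lgraph, "dropout", mcDropoutLayer(0.5, "dropout"));
mcnet  = dlnetwork(lgraph);
for t = 1:numSamples
    Y(:,:,t) = extractdata(predict(mcnet, dlX));  % stochastic per call
end
```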
1 comment
Imola Fodor
on 28 Feb 2024
Accepted Answer
More Answers (0)
