How can one use a dropout layer in a neural network during prediction?
I was hoping to use dropout layers at prediction time with an LSTM network in order to get confidence intervals.
Apparently, dropout layers only randomly set connections to 0 during training time.
From the dropout reference:
"A dropout layer randomly sets input elements to zero with a given probability. At training time, the layer randomly sets input elements to zero given by the dropout mask rand(size(X))<Probability, where X is the layer input and then scales the remaining elements by 1/(1-Probability). This operation effectively changes the underlying network architecture between iterations and helps prevent the network from overfitting. A higher number results in more elements being dropped during training. At prediction time, the output of the layer is equal to its input."
This explains why repeated calls to predictions with the same input result in the same output.
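The behavior described in the documentation quote can be illustrated with a few lines of MATLAB (the variable names `X` and `Probability` are placeholders for this sketch, not part of any toolbox API):

```matlab
% Training-time dropout, as described in the dropoutLayer documentation.
X = randn(4, 3);             % example layer input
Probability = 0.5;           % dropout probability

mask = rand(size(X)) >= Probability;   % elements kept (rand < Probability are dropped)
Z = X .* mask ./ (1 - Probability);    % scale surviving elements by 1/(1 - Probability)

% At prediction time, the built-in layer instead returns its input
% unchanged (Z = X), which is why repeated predict calls with the
% same input produce identical outputs.
```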
Has anyone come up with a workaround?
Thank you for your help,
-Dino
1 comment
Michael Phillips
on 12 Mar 2021
Hi Dino - did you ever create a custom dropout layer that works during network testing? If so would you be willing to share it? Thanks!
Answers (1)
Sourav Bairagya
on 10 Feb 2020
Usually, dropout layers are used during training to avoid overfitting of the neural network. Currently, the 'dropoutLayer' of the Deep Learning Toolbox does not perform dropout during prediction. If you want to use dropout during prediction, you can write a custom dropout layer that applies dropout in both its 'forward' and 'predict' methods.
You can refer to this link to get an idea of how to write custom layers:
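A minimal sketch of such a custom layer follows. It assumes the `nnet.layer.Layer` custom-layer interface from the Deep Learning Toolbox; the class name `mcDropoutLayer` and the property `Probability` are illustrative choices, not existing toolbox names. This is the Monte Carlo dropout idea: keeping the mask active in `predict` so that repeated predictions on the same input vary, which lets you estimate confidence intervals from the spread.

```matlab
% Sketch of a custom dropout layer that stays active at prediction time
% (Monte Carlo dropout). Assumes the Deep Learning Toolbox custom-layer
% interface; class and property names are illustrative.
classdef mcDropoutLayer < nnet.layer.Layer
    properties
        Probability  % dropout probability, e.g. 0.5
    end
    methods
        function layer = mcDropoutLayer(probability, name)
            layer.Name = name;
            layer.Probability = probability;
            layer.Description = "Monte Carlo dropout, p = " + probability;
        end
        function Z = predict(layer, X)
            % Unlike the built-in dropoutLayer, apply the dropout mask
            % here too, so repeated predict calls give different outputs.
            mask = rand(size(X)) >= layer.Probability;
            Z = X .* mask ./ (1 - layer.Probability);
        end
    end
end
```

To use it, replace the built-in `dropoutLayer` in your layer array with this layer, then call `predict` on the trained network many times with the same input and compute quantiles of the outputs to form confidence intervals.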
0 comments