Correct weight initialization in CNN
8 views (last 30 days)
Andres Ramirez
on 29 Jul 2018
Edited: Maria Duarte Rosa
on 5 Jul 2019
When a very deep DAG network is built from scratch, the default weight initialization used by MATLAB is not very good: it leads to a vanishing gradient problem, which prevents the CNN from learning.
Which function does MATLAB use to initialize CNN weights?
Why don't you implement initialization functions in MATLAB such as Xavier or ReLU-aware scaled initialization?
Thank you for your answers.
2 comments
Greg Heath
on 31 Jul 2018
I do not understand
"Why don't you implement initialization functions in MATLAB such as Xavier or ReLU-aware scaled initialization?"
Please explain.
Greg
Accepted Answer
Maria Duarte Rosa
on 5 Jul 2019
Edited: Maria Duarte Rosa
on 5 Jul 2019
In R2019a, the following weight initializers are available (including a custom initializer via a function handle):
'glorot' (default) | 'he' | 'orthogonal' | 'narrow-normal' | 'zeros' | 'ones' | function handle
Glorot is also known as the Xavier initializer.
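As a minimal sketch (assuming R2019a or later), the initializer can be chosen per layer through the 'WeightsInitializer' name-value pair, including a custom function handle; the layer sizes here are only illustrative:

```matlab
% Sketch: selecting built-in and custom weight initializers (R2019a+).
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'WeightsInitializer', 'he')      % He (ReLU-aware)
    reluLayer
    convolution2dLayer(3, 32, 'WeightsInitializer', 'glorot')  % Glorot/Xavier (default)
    reluLayer
    % Custom initializer: a function handle that maps the weight size sz
    % to an array of that size (here, small random normal values).
    fullyConnectedLayer(10, 'WeightsInitializer', @(sz) 0.01 * randn(sz))
    softmaxLayer
    classificationLayer];
```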
Here is a page comparing 3 initializers when training LSTMs:
I hope this helps,
Maria
0 comments
More Answers (2)
Andres Ramirez
on 31 Jul 2018
1 comment
Greg Heath
on 1 Aug 2018
Edited: Greg Heath
on 1 Aug 2018
Do you have a reference for
ReLU-aware scaled initialization?
I have no idea what this is.
Thanks
Greg
fareed jamaluddin
on 4 Aug 2018
I think you can take a look at this example: https://www.mathworks.com/help/images/single-image-super-resolution-using-deep-learning.html
I am also looking for weight initialization options; you can see in the example that it creates the initialization with the He method for every convolutional layer.
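For releases without a built-in 'he' option, He initialization can also be written as a custom function handle; this is only a sketch, and it assumes the convolution weight size sz has the form [filterHeight filterWidth numChannels numFilters], so the fan-in is prod(sz(1:3)):

```matlab
% Sketch: He initialization as a custom function handle.
% Draws weights from a normal distribution with variance 2 / fan-in,
% where fan-in = filterHeight * filterWidth * numChannels.
heInit = @(sz) randn(sz) * sqrt(2 / prod(sz(1:3)));

% Example use on a convolutional layer (R2019a+ syntax):
convLayer = convolution2dLayer(3, 64, 'WeightsInitializer', heInit);
```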
0 comments