What is the parameter minimum performance gradient (trainParam.min_grad) of traingd?
AntonyH on 25 Sep 2020
Commented: Mohamed Elsefy on 12 Nov 2020
I use the training function "traingd" to train a shallow neural network:
trainedNet = train(net,X,T)
For the training function "traingd": How is the parameter minimum performance gradient (net.trainParam.min_grad) defined?
The gradient used in gradient descent is usually a vector, but net.trainParam.min_grad is a scalar value, so I am confused.
Is it the change in the performance (loss) between two iterations, and if so, does it refer to the training, validation, or testing error?
Thanks in advance!
I use MATLAB 2013 and 2015 with the Neural Network Toolbox.
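For reference, here is a minimal sketch of my setup (the hidden layer size and the data below are placeholders, not my actual values):
% Placeholder data: 2 inputs, 100 samples, one target per sample
X = rand(2, 100);
T = sum(X, 1);
net = feedforwardnet(10, 'traingd');   % shallow network trained with gradient descent
disp(net.trainParam)                   % min_grad is listed among the stopping criteria
trainedNet = train(net, X, T);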
Accepted Answer
Rishabh Mishra on 28 Sep 2020
Edited: Rishabh Mishra on 28 Sep 2020
Hi,
Based on your description of the issue, I would state a few points:
- I agree that the gradient used in gradient descent is a vector quantity; it points in the direction of greatest change of the cost function.
- 'net.trainParam.min_grad' is a scalar (numeric) quantity. The parameter 'min_grad' denotes the minimum magnitude (a scalar) of the gradient (a vector) below which training of the neural network terminates.
- When the magnitude of the gradient falls below 'min_grad', the neural network model is considered optimized, and further training stops.
- In other words, it is not the change in performance between two iterations: it is the norm of the gradient of the training performance with respect to the network's weights and biases.
For better understanding, refer to the following links:
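As a rough illustration (the network size and data below are just placeholders, not taken from your question), this sketch shows how 'min_grad' is set and how to check whether it was the reason training stopped:
% Placeholder data and a small network trained with traingd
X = rand(2, 100);
T = sin(X(1,:)) + X(2,:);
net = feedforwardnet(5, 'traingd');
net.trainParam.min_grad = 1e-5;   % scalar threshold on the gradient magnitude
[net, tr] = train(net, X, T);
tr.stop                           % reason training stopped, e.g. 'Minimum gradient reached.'
tr.gradient(end)                  % gradient magnitude (a scalar) at the last epoch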
Hope this helps.