
Adam Optimizer with feedforward neural networks

6 views (last 30 days)
Manos Kav
Manos Kav on 30 Apr 2018
Commented: Bob on 18 Nov 2022
Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.
  2 comments
Abdelwahab Afifi
Abdelwahab Afifi on 14 Jun 2020
Have you gotten an answer?
Bob
Bob on 18 Nov 2022
Did any of you get an answer?


Answers (1)

Hrishikesh Borate
Hrishikesh Borate on 19 Jun 2020
Hi,
It’s my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the solver to 'adam' in the training options.
For example:
[XTrain,~,YTrain] = digitTrain4DArrayData;   % built-in digit regression data
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', ...        % select the Adam solver
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);
[XTest,~,YTest] = digitTest4DArrayData;      % evaluate on the test set
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
For more information, refer to the trainNetwork documentation.
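If you also want to tune Adam's own hyperparameters, trainingOptions exposes them directly. A short sketch of the relevant name-value pairs (values shown are the documented defaults; check trainingOptions in your release):

```matlab
% Adam-specific options in trainingOptions (Deep Learning Toolbox)
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...              % step size (alpha)
    'GradientDecayFactor',0.9, ...             % first-moment decay (beta1)
    'SquaredGradientDecayFactor',0.999, ...    % second-moment decay (beta2)
    'Epsilon',1e-8);                           % denominator offset
```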
  1 comment
Abdelwahab Afifi
Abdelwahab Afifi on 19 Jun 2020
'trainNetwork' is for deep learning networks. But I think he wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
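As far as I know, the shallow-network train function only offers its built-in algorithms (trainlm, trainscg, etc.) and has no Adam option, so one workaround is to apply the Adam update rule to the weight vector yourself. A minimal sketch, assuming getwb/setwb and defaultderiv('dperf_dwb',...) from Deep Learning Toolbox behave as in the shallow-network custom-training docs; verify defaultderiv's signature in your release:

```matlab
% Hedged sketch: manual Adam updates on a shallow feedforward network.
[x,t] = simplefit_dataset;            % built-in demo dataset
net = feedforwardnet(10);
net = configure(net,x,t);             % size the weights without training

wb = getwb(net);                      % all weights and biases as one vector
m = zeros(size(wb));  v = zeros(size(wb));
alpha = 0.001;  beta1 = 0.9;  beta2 = 0.999;  ep = 1e-8;

for k = 1:500
    g = defaultderiv('dperf_dwb',net,x,t);   % gradient of performance wrt wb
    m = beta1*m + (1-beta1)*g;               % first-moment estimate
    v = beta2*v + (1-beta2)*g.^2;            % second-moment estimate
    mhat = m / (1-beta1^k);                  % bias correction
    vhat = v / (1-beta2^k);
    wb = wb - alpha * mhat ./ (sqrt(vhat) + ep);
    net = setwb(net,wb);                     % write updated weights back
end
perf = mse(net, t, net(x))                   % final training error
```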

