Adam Optimizer with feedforward neural networks

Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.

2 comments

Have you gotten the answer?

Bob on 18 Nov 2022
Did any of you get the answer?


Answers (1)

Hi,
It's my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the appropriate training options.
For example:
% Load training images and regression targets
[XTrain,~,YTrain] = digitTrain4DArrayData;

% Simple convolutional regression network
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];

% Select the Adam solver via trainingOptions
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');

net = trainNetwork(XTrain,YTrain,layers,options);

% Evaluate on the test set
[XTest,~,YTest] = digitTest4DArrayData;
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
For more information, refer to the trainNetwork documentation.

1 comment

'trainNetwork' is used for deep learning networks. But I think the asker wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
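The classic 'train' function does not offer Adam among its training algorithms (it supports trainlm, trainscg, etc.). One possible workaround, sketched below under the assumption that Deep Learning Toolbox R2020b or later is available, is to rebuild the shallow network as a dlnetwork and run a custom training loop with the adamupdate function. The layer sizes, learning rate, and synthetic sin(x) data here are illustrative choices, not part of the original question.

```matlab
% Hedged sketch: Adam for a shallow feedforward net via a custom loop,
% since "train" has no Adam solver. Assumes Deep Learning Toolbox R2020b+.

% Illustrative synthetic regression data: y = sin(x)
x = linspace(-pi,pi,256);
y = sin(x);
X = dlarray(single(x),'CB');   % channel-by-batch format
Y = dlarray(single(y),'CB');

% Shallow network: one hidden layer of 10 tanh units,
% roughly analogous to feedforwardnet(10)
layers = [
    featureInputLayer(1)
    fullyConnectedLayer(10)
    tanhLayer
    fullyConnectedLayer(1)];
net = dlnetwork(layers);

% Adam state, initialized empty as adamupdate expects
averageGrad = [];
averageSqGrad = [];
learnRate = 0.01;

for iteration = 1:500
    % Evaluate loss and gradients with automatic differentiation
    [loss,gradients] = dlfeval(@modelLoss,net,X,Y);
    % One Adam step on the learnable parameters
    [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
        averageGrad,averageSqGrad,iteration,learnRate);
end

function [loss,gradients] = modelLoss(net,X,Y)
    YPred = forward(net,X);
    loss = mse(YPred,Y);
    gradients = dlgradient(loss,net.Learnables);
end
```

After training, predictions come from `predict(net,X)`; the manual loop trades the convenience of 'train' for full control over the optimizer.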



Asked: 30 Apr 2018
Commented: Bob on 18 Nov 2022

