Adaptive moment estimation (Adam)

Algorithm for deep learning optimization
148 downloads
Updated 17 Oct 2023


Adaptive moment estimation (Adam) is an optimization algorithm used for gradient-based optimization of objective functions, particularly in deep learning. Adam combines the benefits of two other popular optimization algorithms, momentum and RMSprop, and adds bias correction of the resulting moment estimates.
Momentum is an optimization method that adds a fraction of the previous update to the current gradient, smoothing the gradient descent path and accelerating convergence. RMSprop scales the learning rate for each parameter based on the magnitude of recent gradients for that parameter. Adam combines these two ideas by maintaining exponentially weighted moving averages of both the first and second moments of the gradients for each parameter. The first moment is the mean of the gradients; the second moment is the uncentered variance, i.e. the mean of the squared gradients.
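Written out, with g_t the gradient at step t and beta1, beta2 the decay rates discussed below (both averages start at zero), the two moving averages are:

m_t = beta1 * m_{t-1} + (1 - beta1) * g_t
v_t = beta2 * v_{t-1} + (1 - beta2) * g_t^2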
Adam also includes bias correction terms because the first and second moments are initialized as zero vectors, which biases the estimates toward zero early in training. The corrections divide the raw moment estimates by 1 - beta1^t and 1 - beta2^t, where beta1 and beta2 are the exponential decay rates of the first and second moments and t is the current step.
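As a minimal sketch of how these pieces fit together, one Adam update could be written in MATLAB as follows; the function name adamStep and its argument list are illustrative choices for this sketch, not the interface of this submission:

function [theta, m, v] = adamStep(theta, grad, m, v, t, alpha, beta1, beta2, epsilon)
    % Biased moving averages of the gradient and the squared gradient
    m = beta1 * m + (1 - beta1) * grad;
    v = beta2 * v + (1 - beta2) * grad.^2;
    % Bias correction: the moments start at zero, so divide by 1 - beta^t
    mHat = m / (1 - beta1^t);
    vHat = v / (1 - beta2^t);
    % Per-parameter update; epsilon keeps the denominator away from zero
    theta = theta - alpha * mHat ./ (sqrt(vHat) + epsilon);
end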
One of the advantages of Adam over momentum and RMSprop is that it performs well on a wide range of problems without extensive tuning of hyperparameters. Additionally, Adam is computationally efficient and can handle problems with large data sets and high-dimensional parameter spaces. Another advantage is that Adam adaptively scales the effective step size of each parameter based on the magnitude of its recent gradients, making it more robust to noisy or sparse gradients.
Overall, Adam is a powerful optimization algorithm that combines the strengths of momentum and RMSprop while adding some additional improvements. It has become a popular choice for deep learning applications due to its efficiency, adaptability, and ease of use.
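For illustration only, the sketch above can be driven by a plain loop to minimize a toy quadratic. The hyperparameter values below are the commonly used defaults (alpha = 0.001, beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8); they are assumptions for this example, not values taken from this submission:

theta = randn(5, 1);        % initial parameters
m = zeros(size(theta));     % first moment accumulator
v = zeros(size(theta));     % second moment accumulator
for t = 1:5000
    grad = 2 * theta;       % gradient of f(theta) = sum(theta.^2)
    [theta, m, v] = adamStep(theta, grad, m, v, t, 0.001, 0.9, 0.999, 1e-8);
end
disp(norm(theta))           % approaches zero as the loop runs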

Cite As

Mehdi Ghasri (2024). Adaptive moment estimation (Adam) (https://www.mathworks.com/matlabcentral/fileexchange/136679-adaptive-moment-estimation-adam), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2023b
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version    Published    Release Notes
1.0.0