Gradient-Based Optimizer

GBO, inspired by the gradient-based Newton’s method, uses two main operators: gradient search rule (GSR) and local escaping operator (LEO).
343 Downloads
Updated 25 Jun 2023

The GBO, inspired by the gradient-based Newton's method, uses two main operators, the gradient search rule (GSR) and the local escaping operator (LEO), together with a set of vectors to explore the search space. The GSR employs the gradient-based method to enhance the exploration tendency and to accelerate convergence toward better positions in the search space, while the LEO enables the GBO to escape from local optima. The performance of the algorithm was evaluated in two phases. First, 28 mathematical test functions were used to assess various characteristics of the GBO; then six engineering problems were optimized with it. In the first phase, the GBO was compared with five existing optimization algorithms and yielded very promising results owing to its enhanced exploration, exploitation, convergence, and avoidance of local optima. The second phase likewise demonstrated the superior performance of the GBO in solving complex real-world engineering problems.
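
For orientation, a minimal usage sketch is given below. It assumes the submission exposes a single solver function named GBO that takes a population size, an iteration budget, bound vectors, the problem dimensionality, and an objective-function handle; the actual file in the download may use a different name, argument order, or outputs, so check GBO.m before running this.

% Minimal usage sketch (assumed interface; verify against GBO.m in the download).
fobj  = @(x) sum(x.^2);    % sphere test function, minimum 0 at the origin
dim   = 30;                % number of decision variables
lb    = -100*ones(1,dim);  % lower bounds
ub    =  100*ones(1,dim);  % upper bounds
nP    = 50;                % population size
MaxIt = 500;               % iteration budget

% Hypothetical call: returns the best objective value and best position found.
[bestFit, bestPos] = GBO(nP, MaxIt, lb, ub, dim, fobj);

fprintf('Best fitness: %g\n', bestFit);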

Cite As

Iman Ahmadianfar (2026). Gradient-Based Optimizer (https://fr.mathworks.com/matlabcentral/fileexchange/131588-gradient-based-optimizer), MATLAB Central File Exchange. Retrieved .

Ahmadianfar, Iman, et al. “Gradient-Based Optimizer: A New Metaheuristic Optimization Algorithm.” Information Sciences, vol. 540, Elsevier BV, Nov. 2020, pp. 131–59, doi:10.1016/j.ins.2020.06.037.

MATLAB Release Compatibility
Created with R2023a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Version    Published    Release Notes
1.0.0