How to implement the augmented Lagrangian method?

Hi, I have a cost function like this:
f(u) = ||g - u||.^2 + lambda.*R
where g, u, and R are matrices of the same size, and lambda is a constant. Now I want to minimize this function using the augmented Lagrangian method. Any help?
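The question doesn't say what R is, so here is only a hedged sketch under an assumption: if R is a separable regularizer such as the l1 norm, R(u) = sum(abs(u(:))), then a common way to bring in the augmented Lagrangian is variable splitting (the ADMM form): introduce v with the constraint v = u, and alternate closed-form updates on u, v, and the scaled dual variable y. The data `g`, the choice of R, and the parameters `lambda` and `rho` below are all illustrative, not taken from the question.

```matlab
% Sketch: minimize ||g - u||^2 + lambda*R(u), with the ASSUMED choice
% R(u) = sum(abs(u(:))), via splitting v = u and the scaled augmented
% Lagrangian:  ||g-u||^2 + lambda*||v||_1 + (rho/2)*||v - u + y||^2.
g      = magic(4);          % example data (stand-in for your g)
lambda = 0.5;               % regularization weight (illustrative value)
rho    = 1;                 % penalty parameter of the augmented term

u = g; v = g;
y = zeros(size(g));         % scaled dual variable

for k = 1:100
    % u-step: min ||g-u||^2 + (rho/2)||v - u + y||^2 is quadratic in u,
    % so it has the element-wise closed form below.
    u = (2*g + rho*(v + y)) / (2 + rho);

    % v-step: min lambda*||v||_1 + (rho/2)||v - (u - y)||^2 is solved by
    % element-wise soft-thresholding at level lambda/rho.
    v = sign(u - y) .* max(abs(u - y) - lambda/rho, 0);

    % dual update: accumulate the constraint residual v - u.
    y = y + (v - u);
end
```

After convergence u and v agree (up to the residual tolerance), and u is the minimizer. If your R is something else (e.g. a total-variation term), the v-step changes to the proximal operator of that R, but the overall structure stays the same.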

Answers (0)
