This example was developed for use in teaching optimization in graduate engineering courses. It demonstrates how the gradient descent method can be used to solve a simple unconstrained optimization problem. Large step sizes can make the algorithm unstable, while small step sizes lead to low computational efficiency. A corresponding video can be found here:
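The step-size tradeoff described above can be illustrated with a minimal sketch (in Python rather than the submission's MATLAB, since the original .m file is not shown here). It applies fixed-step gradient descent to the simple quadratic f(x) = x², whose gradient is 2x; the function and parameter names are illustrative, not from the original code.

```python
def gradient_descent(grad, x0, step, iters=50):
    # Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Test objective f(x) = x^2, gradient 2x, minimizer at x = 0
grad = lambda x: 2.0 * x

# A moderate step size converges toward the minimizer
x_good = gradient_descent(grad, x0=5.0, step=0.1)

# For this objective any step > 1 flips and grows the iterates: instability
x_bad = gradient_descent(grad, x0=5.0, step=1.1)
```

With step = 0.1 each iteration multiplies x by (1 - 2·0.1) = 0.8, so the iterates shrink toward zero; with step = 1.1 the factor is -1.2, so they alternate in sign and blow up.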
Thank you for commenting, Ibrahim. A little while ago my original YouTube channel was deleted unexpectedly. I did restore most of my videos using a new channel: https://www.youtube.com/channel/UC0iwMWMB5raqo2pl_XIaVrQ
I hope to update the links on the file exchange, but it may take some time. In the meantime you should be able to find videos for most of my FX submissions at the above channel.
This video is unavailable.
This .m file was very helpful while preparing for my optimization methods class.
thank you, great work!
Is it batch or stochastic gradient descent?
Only changed the video link in the description. Code is unchanged.