Using fminunc() with very large-scale problems
I have delved into this topic before (http://www.mathworks.com/matlabcentral/newsreader/view_thread/300661#811585), but am returning to it now.
I'm working with another class of algorithms that are very large scale: multi-frame super-resolution. A good reference is "Fast and Robust Multiframe Super Resolution" by S. Farsiu et al., IEEE Transactions on Image Processing, Vol. 13, No. 10, pp. 1327-1344 (2004).
My question regards MATLAB's ability to perform optimization on truly large-scale problems. The authors of the aforementioned paper used steepest descent. I've coded up the algorithm with conjugate gradient (Polak-Ribiere).
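For reference, the iteration I implemented is essentially the standard nonlinear CG loop with the Polak-Ribiere (PR+) update. This is only a sketch: costgrad stands in for my actual cost/gradient function (which operates on the vectorized high-resolution image), and the backtracking line search is the simplest Armijo variant.

```matlab
% Sketch of nonlinear conjugate gradient with the PR+ update.
% costgrad is an assumed handle returning [cost, gradient] at x.
function x = cg_pr(costgrad, x0, maxIter, tol)
    x = x0;
    [f, g] = costgrad(x);
    d = -g;                                    % initial search direction
    for k = 1:maxIter
        % Backtracking line search (Armijo sufficient-decrease condition)
        t = 1;
        while costgrad(x + t*d) > f + 1e-4*t*(g'*d)
            t = 0.5*t;
        end
        x = x + t*d;
        gOld = g;
        [f, g] = costgrad(x);
        if norm(g) < tol, break; end
        % PR+ update: max(0, ...) restarts with steepest descent when needed
        beta = max(0, (g'*(g - gOld)) / (gOld'*gOld));
        d = -g + beta*d;
    end
end
```

The memory footprint is just a handful of vectors the size of the image, which is why this scales to 194,400 variables without trouble.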
I would be curious to try the trust-region algorithm of fminunc() on this problem. However, the size of the low resolution images I am using for super-resolution reconstruction is 135x90 (very small for real-world applications, but fine for testing). With 4X super-resolution the resultant image I'm reconstructing is 540x360. The algorithm considers every pixel of the output image to be a variable to be optimized. Thus there are 194,400 variables.
If I blindly use fminunc (letting it calculate the Hessian), it throws a memory error at a line in fminunc that reads
Hstr = sparse(ones(n));
n is apparently the number of variables; in my case it is 194,400.
Based on the aforementioned link, I tried setting 'HessMult' to @(H,Y)Y. The same memory error occurs at the same location.
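Roughly, the call I attempted looked like this (a sketch; superResCost and x0 stand in for my actual cost/gradient function and initial estimate):

```matlab
n = 540*360;                          % 194,400 unknowns
opts = optimset('LargeScale','on', ...
    'GradObj','on', ...               % objective returns the gradient too
    'HessMult',@(Hinfo,Y)Y);          % Hessian-times-vector as identity
x = fminunc(@superResCost, x0(:), opts);
```

One variation I have not verified: also supplying a sparse 'HessPattern' (e.g. speye(n)) might sidestep the dense sparse(ones(n)) default that triggers the memory error.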
Again, conjugate gradient techniques work well. I'm just surprised that I can't get this problem to work using a built-in function. It still seems that MathWorks provides either a function that uses no derivative information (fminsearch) or one that requires calculating second-order derivatives (fminunc). This is the second class of algorithms I've worked with in the last couple of years that really needs a solver that can use first-order derivatives but does not require second-order derivatives.
I did put in a service request in 2011 to add the conjugate gradient algorithm to the Optimization Toolbox (or another algorithm providing comparable functionality). I pointed out a quote from Roger Fletcher's "Practical Methods of Optimization", 2nd Edition (page 85): "Thus conjugate gradient methods may be the only methods which are applicable to large problems, that is problems with hundreds or thousands of variables." The emphasis on "large problems" is Fletcher's, not mine. I'm guessing my request didn't go very far.
I still wonder if there is some built-in function that will solve these problems and I'm just overlooking it. So my question is: Is there a MATLAB-provided function that allows for optimization problems of this scale?
Thanks, Eric