Using lsqnonlin on large-scale problems - Question about "JacobMult" option
Shayan
on 3 Feb 2014
Answered: françois anquez
on 9 Apr 2020
Hi,
I am trying to use lsqnonlin() on a very large problem set, in which the Jacobian has a size of 31457280x393216 and is sparse. I am able to calculate the Jacobian myself and pass it to lsqnonlin() using
options = optimoptions('lsqnonlin','Jacobian','on')
The problem is that lsqnonlin() is taking a very long time. I stepped through the code to find where the bottleneck is and tracked it down to the preconditioner step in the trust-region algorithm. That is, the function trdog() calls a preconditioner function aprecon(), which performs a QR decomposition on the Jacobian. Specifically, it is this section of aprecon() that takes an incredibly long time:
p = colamd(TM);
RPCMTX = qr(TM(:,p));
My question is, will using the JacobMult option of lsqnonlin() avoid this? According to the documentation:
For large-scale structured problems, this function computes the Jacobian matrix product J*Y, J'*Y, or J'*(J*Y) without actually forming J.
which to me sounds like it is only useful for large-scale problems in which the Jacobian is too large to even be stored, which is not the issue in my case. Additionally, I have read Jacobian Multiply Function with Linear Least Squares, and I am still confused about what the Jacobian multiply function actually does, and therefore unsure how to implement it.
To summarize: will the JacobMult option avoid the extremely time-consuming QR decomposition in lsqnonlin() for large-scale problems, and can anyone give a clear demonstration of what the JacobMult option actually does and how to implement it?
Thanks.
0 comments
Accepted Answer
Matt J
on 3 Feb 2014
Edited: Matt J
on 3 Feb 2014
If you use JacobMult, I imagine it will try to solve for the Newton steps iteratively, instead of with QR decomposition, which could be faster, depending on how sparse your Jacobian really is.
Basically, the Newton step amounts to solving a linear equation
J*x=F
where J is your Jacobian. There are iterative algorithms which can do this using a series of multiplications with J and J.'. In your case, the JacobMult function could be implemented simply as
function W = jmfun(J, Y, flag)
% Jacobian multiply function: here Jinfo is simply the Jacobian J itself
if flag == 0
    W = J'*(J*Y);
elseif flag > 0
    W = J*Y;
else % flag < 0
    W = J'*Y;
end
end
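To wire this into the solver, something along these lines should work (a minimal sketch; myfun and x0 stand in for your own residual function and starting point, and as far as I recall 'Jacobian' must stay 'on' so that lsqnonlin forwards the second output of your objective function to jmfun as Jinfo):
% Sketch only: myfun returns the residual and, as a second output, Jinfo
% (here the sparse Jacobian itself), which lsqnonlin passes on to jmfun
options = optimoptions('lsqnonlin', ...
    'Jacobian', 'on', ...      % keep supplying the Jacobian as the second output
    'JacobMult', @jmfun);      % use the multiply function above
x = lsqnonlin(@myfun, x0, [], [], options);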
3 comments
Matt J
on 3 Feb 2014
Edited: Matt J
on 3 Feb 2014
That sounds weird, if it's true. I suppose you could trick aprecon by computing Jinfo as
[m,n]=size(Jacobian);
[i,j,s]=find(Jacobian);
Jinfo={i,j,s,m,n};
and then rebuilding the Jacobian inside jmfun
J=sparse(Jinfo{:});
W=...
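Spelled out, that jmfun might look like this (purely a sketch; it rebuilds the sparse Jacobian from the {i,j,s,m,n} cell array and then applies the same flag logic as above):
function W = jmfun(Jinfo, Y, flag)
J = sparse(Jinfo{:});   % rebuild the sparse Jacobian from {i,j,s,m,n}
if flag == 0
    W = J'*(J*Y);
elseif flag > 0
    W = J*Y;
else % flag < 0
    W = J'*Y;
end
end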
You could also perhaps just reshape() the Jacobian into a different sized matrix Jinfo. Undoing that in jmfun might be faster.
In any case, bear in mind that not using a pre-conditioner could end up slowing convergence to a solution, with no clear net gains in speed.
More Answers (1)
françois anquez
on 9 Apr 2020
Dear Shayan, Dear Matt,
I do not know whether this will be useful given the time that has passed since you posted.
I was facing the same problem.
Something that helped for me was to set 'SubproblemAlgorithm' to 'cg'.
This seems to considerably reduce the time spent in aprecon with similar results.
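For reference, it can be set on top of whatever options you already use, something like this (a minimal sketch, assuming options is the optimoptions object you already pass to lsqnonlin):
% Sketch: 'options' is assumed to be your existing lsqnonlin options object
options = optimoptions(options, 'SubproblemAlgorithm', 'cg');  % CG subproblem step instead of factorization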
François.
0 comments