Single iteration with lsqnonlin (or fsolve), only compute new X0

Sargondjani on 29 June 2023
Commented: Sargondjani on 30 June 2023
I want lsqnonlin (or fsolve) to carry out only one iteration, i.e. compute the new X and then stop, with no further function evaluations.
Ideally I don't even want it to compute the new values of the objective function, but I definitely do not want extra function evaluations for the Jacobian or the first-order optimality conditions at the new guess for X.
(My question is similar to an earlier question of mine:
... but now function evaluations are even more expensive, and I want to use lsqnonlin, so I also don't know how to update X myself (which is easy for a Newton-Raphson step if you know the Jacobian), so the suggestions made there don't help me in this case.)
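For concreteness, here is a minimal sketch of the kind of manual update that is easy when the Jacobian is known, i.e. a single Newton/Gauss-Newton step. The residual function and starting point below are made-up placeholders, not the actual problem:

% Hypothetical residual function, for illustration only
resid_example = @(x) [x(1)^2 + x(2) - 3; x(1) - x(2)];
x0 = [1; 1];

% Evaluate the residual and a finite-difference Jacobian at x0
r = resid_example(x0);
n = numel(x0);
J = zeros(numel(r), n);
h = 1e-6;
for k = 1:n
    e = zeros(n,1);  e(k) = h;
    J(:,k) = (resid_example(x0 + e) - r)/h;
end

% One Gauss-Newton step: solve J*dx = -r in the least-squares sense
x1 = x0 - J\r;    % new guess after a single iteration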
  5 comments
Torsten on 29 June 2023
But you lose all information about the Jacobian at the iteration point, and it will take much more effort to recompute it in the next call to lsqnonlin than to continue the iterations.
Sargondjani on 29 June 2023
@Torsten I use projection methods, and I want to update the grid before making the next iteration, so all information gathered at the new X with the old grid could be useless (especially if the step in X is relatively large). I first want to update the grid, and only then do any further evaluations.
May I conclude it is not possible? Or at least not with a simple command?


Accepted Answer

Matt J on 30 June 2023
Edited: Matt J on 30 June 2023
This seems to be a feasible workaround. The important thing to realize is that even though the iterative display shows Func-count = 2, the objective function does no significant work after the first call, because the externally scoped stopflag has been raised by that point.
doOptimization()
                                         Norm of      First-order
 Iteration  Func-count     Resnorm        step         optimality
     0          1            4225                       1.04e+03
     1          2               0        4.0625                0

Local minimum found.

Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.
x = 6.0625
res = 0
function doOptimization
    clc
    [stopflag,r0,J0] = deal(0);
    opts = optimoptions('lsqnonlin','Display','iter',...
        'SpecifyObjectiveGradient',true,'MaxIterations',0);
    [x,res] = lsqnonlin(@resid,2,[],[],opts)

    function [r,J] = resid(x)
        if ~stopflag
            r = (x-10)^2+1;  J = 2*(x-10);  % normal evaluation of residual and Jacobian
            r0 = zeros(size(r));            % important that these be zero (but unclear why)
            J0 = zeros(size(J));
            stopflag = 1;                   % later calls just return the stored zeros
        else                                % do no work
            r = r0;  J = J0;
        end
    end
end
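If the goal is to alternate grid updates with single solver steps, one possible way to reuse this pattern is to wrap it in a helper and warm-start each call at the previous x. This is only a rough sketch: residOnGrid, updateGrid, and the grid variable grd are hypothetical placeholders for the projection-method code, and only the warm-starting pattern is the point here.

% Save as singleLsqnonlinStep.m: one "stopflagged" lsqnonlin call, as in the
% answer above, but parameterised by the starting point and the current grid.
% residOnGrid(x, grd) is a hypothetical user function returning [r, J].
function x = singleLsqnonlinStep(x0, grd)
    [stopflag, r0, J0] = deal(0);
    opts = optimoptions('lsqnonlin','Display','off',...
        'SpecifyObjectiveGradient',true,'MaxIterations',0);
    x = lsqnonlin(@resid, x0, [], [], opts);

    function [r, J] = resid(x)
        if ~stopflag
            [r, J] = residOnGrid(x, grd);   % the only real function evaluation
            r0 = zeros(size(r));            % zeros returned on any later call
            J0 = zeros(size(J));
            stopflag = 1;
        else                                % do no work
            r = r0;  J = J0;
        end
    end
end

The outer loop would then look something like:

x = 2;  grd = [];                        % initial guess and (placeholder) grid
for outer = 1:10
    grd = updateGrid(grd, x);            % hypothetical grid-update routine
    x = singleLsqnonlinStep(x, grd);     % one lsqnonlin step, warm-started at x
end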
  4 comments
Sargondjani on 30 June 2023
Yeah, that is strange indeed!
Sargondjani on 30 June 2023
Anyway, it works great! Super happy!


More Answers (0)
