Hi.
I have a function f(x,a) where a is a parameter that is provided exogenously. I want to minimize f(x,a) independently for different values of a (a1, a2, a3, ...). Is there any way to do that without using a for loop?
If I consider defining f(x,[a1 a2 a3 ...]) as a multidimensional function:
function fvals = obj(x, a)
fvals(1) = f(x, a(1));
fvals(2) = f(x, a(2));
fvals(3) = f(x, a(3));
% ...
end
and then minimize it by providing the vector a = [a1 a2 a3 ...], MATLAB will not treat the components of the function as independent, as I want. Do you have any solution?
Thank you.

1 comment

Walter Roberson on 3 Dec 2019
You can hide the loop with arrayfun, but otherwise you should still use a loop of some kind, since the objectives are independent.
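For example, hiding the loop with arrayfun might look like the following sketch. (This is illustrative, not from the thread: it assumes f(x,a) is a scalar-valued function already defined on the path, and that a single scalar initial guess x0 is reasonable for every subproblem.)

```matlab
% Hypothetical sketch: one independent minimization per parameter value a(i).
a  = [1, 2, 3];      % example parameter values
x0 = 0;              % shared initial guess (assumption)
xopt = arrayfun(@(ai) fminunc(@(x) f(x, ai), x0), a);
```

Each call to fminunc here solves one subproblem on its own; arrayfun merely replaces the explicit for loop with an implicit one.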


Accepted Answer

Matt J on 3 Dec 2019
Edited: Matt J on 3 Dec 2019


You would need something like the following:
options=optimoptions(@fminunc,'SpecifyObjectiveGradient', true);
allX = fminunc(@(x)obj(x,a),x0,options);
function [ftotal, gradient] = obj(x, a)
delta = 1e-6;
f0 = getfs(x, a);
ftotal = sum(f0); % additively separable: the sum of independent terms
if nargout > 1
    % Because each f(i) depends only on x(i), perturbing all components
    % at once yields the full gradient with a single extra evaluation.
    gradient = (getfs(x + delta, a) - f0) ./ delta; % finite difference approx.
end
end

function fvals = getfs(x, a)
fvals(1) = f(x(1), a(1));
fvals(2) = f(x(2), a(2));
fvals(3) = f(x(3), a(3));
% ...
end
You could also, of course, implement an analytic version of the gradient calculation instead of using finite differences.
However, you should keep in mind that a for-loop may be advantageous if the a(i) are separated by small increments. In that case the optimal solution x(i) is usually a good initial guess for x(i+1), so each subsequent pass of the for-loop needs fewer iterations.
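The warm-start idea described above might be sketched like this (illustrative only: it assumes f(x,a) is scalar-valued, x0 is an initial guess for the first problem, and a is sorted so consecutive a(i) are close together):

```matlab
% Hypothetical warm-start sketch: reuse x(i) as the initial guess for x(i+1).
xopt   = zeros(size(a));
xguess = x0;                              % initial guess for the first problem
for i = 1:numel(a)
    xopt(i) = fminunc(@(x) f(x, a(i)), xguess);
    xguess  = xopt(i);                    % warm start the next problem
end
```

When the a(i) are closely spaced, each solve starts near its optimum, which can make the explicit loop cheaper than one large joint minimization.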

7 comments

Idossou Marius Adom on 3 Dec 2019
Thank you very much, Matt J.
If I understand your answer correctly, the trick is to supply the gradient to the fminunc routine?
Matt J on 3 Dec 2019
Edited: Matt J on 3 Dec 2019
Well, you don't have to, but if you don't, fminunc will make N^2 calls to f(x,a) in order to do its own default finite difference approximation. In my proposal, only 2*N are made. Naturally, you might do even better with a proper analytical gradient calculation.
Idossou Marius Adom on 3 Dec 2019
OK. Thank you.
But then, is it certain that fminunc minimizes the components of f independently?
I ask because, for example, I saw that the multi-objective routine gamultiobj does not treat them independently.
And one more question: could fminsearch do the same as fminunc in your proposal?
Thank you very much.
Walter Roberson on 3 Dec 2019
If you have no constraints, or only bound or linear equality constraints, then you might be able to use fminunc or fmincon with the trust-region-reflective algorithm and supply a sparse Hessian.
fminsearch does not use gradient search as such, and so does not benefit from calculations that reduce the cost of the gradient.
Idossou Marius Adom on 3 Dec 2019
OK, I see. Thank you, Walter Roberson.
Matt J on 4 Dec 2019
"But then, is it certain that fminunc minimizes the components of f independently?"
As you can see, the objective function defined in the scheme above is additively separable: it is the sum of independent terms f(x(i),a(i)), and therefore minimizing the sum is equivalent to minimizing each f(x(i),a(i)) independently.
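A quick way to convince yourself of this equivalence is a toy example (my own choice, not from the thread) with f(x,a) = (x - a)^2, whose independent minimizers are x(i) = a(i):

```matlab
% Toy check of additive separability, using f(x,a) = (x - a).^2.
a   = [1, 5, -2];
obj = @(x) sum((x - a).^2);            % joint objective: sum of independent terms
xopt = fminunc(obj, zeros(size(a)));   % joint minimization
% xopt should approach a, i.e. each component ends up at its own
% independent minimizer, exactly as separability predicts.
```

Because no term couples x(i) with x(j), the joint minimizer coincides with the component-wise minimizers.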
Idossou Marius Adom on 4 Dec 2019
OK. You are right. Thank you Matt.


More Answers (0)
