Multi-parameter function minimization with many, many parameters
40 views (last 30 days)
Nathan Bblanc
on 22 Nov 2024 at 17:40
Commented: Nathan Bblanc
on 28 Nov 2024 at 17:27
I am trying to minimize (find a local minimum of) a function of 100-1000 parameters. I realize this is generally a nearly impossible task, but I do have an advantage: I know that I am very close to the minimum. In other words, my initial condition is very close to the minimum; all parameters are within 0.99-1.01 times their values at the minimum. I thought this would give me a fighting chance.
However, so far fminsearch, fmincon, and fminunc fail miserably. Are there any alternatives that might be more suitable for such a "multi-multi parameter" optimization?
Many thanks in advance
Nathan
4 comments
John D'Errico
on 23 Nov 2024 at 18:37
fminsearch is a waste of time there. It will NEVER work on a problem of that size. A reasonable upper limit for fminsearch is more like maybe 10 parameters, and I would try to keep it under 6 or 8.
As for why it did not succeed in your case: even though your search space is relatively small, it is not as small as you think. A 1000-dimensional space is immense, and even a 100-dimensional one is. Should fminunc work? Possibly, even probably; 1000 dimensions is not truly huge for a gradient-based solver to handle. But we don't know what is happening in your objective.
Why did it fail? What was the termination code? If you want better answers, it helps to provide as much information as possible. Otherwise people are left guessing, and those wild guesses always seem to be wrong.
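For example, asking the solver for its extra outputs shows why it stopped. A minimal sketch, where obj and x0 stand in for your objective and start point:
[x,fval,exitflag,output] = fminunc(obj,x0);
disp(exitflag)        % numeric termination code; see the fminunc docs for each value
disp(output.message)  % human-readable reason the solver stopped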
Accepted Answer
John D'Errico
on 23 Nov 2024 at 19:50
Let me try to help, a little at least.
Why is fminsearch so bad in high dimensions? It is just not designed for that class of problem, not in the least. For example, consider even a relatively small problem. (I can't solve a much larger one and still stay within the couple-of-minutes time slot we get on Answers.)
N = 100;
H = randn(N); H = H'*H;    % H'*H is (almost surely) symmetric positive definite
obj = @(x) x(:)'*H*x(:);   % quadratic form with its minimum at x = 0
This is a VERY simple objective function. H is positive definite, so the solution lies at all zeros. How much easier can it get?
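A quick sanity check of that claim, if you want one (this just confirms H has strictly positive eigenvalues, so x = 0 is the unique minimizer):
min(eig(H))   % strictly positive for a positive definite H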
x0 = 0.1*ones(N,1);               % start close to the minimum, as in the question
[xval,fval] = fminsearch(obj,x0);
max(abs(xval))                    % how far from the true minimizer at zero?
plot(x0,'rx')
hold on
plot(xval,'ob')
Did it reduce the objective at all?
disp([fval,obj(x0)])   % final vs. initial objective value
So, yes, it did reduce the objective, but I doubt anyone would be happy with the result.
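Part of the problem is budget: fminsearch defaults to 200*numberOfVariables function evaluations and iterations, so at N = 100 it quits long before Nelder-Mead could crawl to the answer. You can lift those caps, as in the sketch below (the option values are arbitrary), but convergence stays painfully slow:
opts = optimset('MaxFunEvals',1e6,'MaxIter',1e6);   % lift the default 200*N limits
[xval_long,fval_long] = fminsearch(obj,x0,opts);    % still slow, just allowed to run longer
fval_long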
How well does a gradient-based solver work?
[xval2,fval2] = fminunc(obj,x0);
norm(xval2)
fval2
And for only a few seconds of work, it did hugely better than fminsearch.
plot(xval2,'gs')
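Since this objective is a simple quadratic, its gradient is just 2*H*x, and supplying an analytic gradient often speeds fminunc up further in high dimensions. A minimal sketch (the names objGrad and opts are mine):
objGrad = @(x) deal(x(:)'*H*x(:), 2*H*x(:));                    % return [f, gradient]
opts = optimoptions('fminunc','SpecifyObjectiveGradient',true);
[xval2g,fval2g] = fminunc(objGrad,x0,opts);
norm(xval2g)                                                    % distance from the true minimizer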
That the gradient-based solver does so much better is to be expected. Next, for comparison, try a genetic algorithm (ga); here I give it box bounds on the variables.
ub = 0.3*ones(N,1);   % box bounds containing both x0 and the solution
lb = -ub;
[xval3,fval3] = ga(obj,N,[],[],[],[],lb,ub)
plot(xval3,'kx')
legend('Start point','fminsearch','fminunc','ga')
All of these behaviors are completely expected. ga will not be terribly fast, but it did better than fminsearch. Remember, however, that our genes were themselves determined by essentially a genetic algorithm; it is just that nature had many, many millions of years to refine them.
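If you do try ga on a problem of this size, note that its defaults are tuned for modest dimension; options such as PopulationSize and MaxGenerations control how much effort it spends. A sketch with arbitrary values:
optsGA = optimoptions('ga','PopulationSize',200,'MaxGenerations',500);
[xval3b,fval3b] = ga(obj,N,[],[],[],[],lb,ub,[],optsGA);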
As for why your own problem did not work out well, we cannot know without more information.
0 comments
More Answers (0)