Computational Efficiency for Decoupled Optimisation
Hi all
Just looking for some best practice advice for setting up an optimisation study with a large number of variables. The cost function consists of multiple sub-models that don't interact with each other, but the outputs of all of them are summed to provide the cost value. Does anyone know if it's more computationally efficient to have one large optimisation process or to optimise each of the sub-models separately?
Thanks for your help,
Martin
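For concreteness, the decoupled structure described above can be sketched with hypothetical quadratic sub-models (a Python/scipy stand-in, not Martin's actual cost function): when the sub-models share no variables, minimising each one separately reaches the same optimum as one joint minimisation over the summed cost.

```python
# Toy illustration (assumed quadratic sub-models, not from the thread):
# the full cost is f(x) = f1(x1) + f2(x2), where the sub-models share
# no variables.
import numpy as np
from scipy.optimize import minimize

def f1(x):          # sub-model 1: minimum at x = [1, 2]
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def f2(x):          # sub-model 2: minimum at x = [3]
    return (x[0] - 3.0) ** 2

def f_joint(x):     # full cost: sum of the decoupled sub-costs
    return f1(x[:2]) + f2(x[2:])

# One large optimisation over all 3 variables at once
x_joint = minimize(f_joint, np.zeros(3)).x

# Two smaller optimisations, one per sub-model, then concatenated
x_sep = np.concatenate([minimize(f1, np.zeros(2)).x,
                        minimize(f2, np.zeros(1)).x])

print(np.allclose(x_joint, x_sep, atol=1e-4))  # same optimum either way
```

The equivalence holds only because the sub-models are fully decoupled; any shared variable or coupling term would force the joint formulation.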
Accepted Answer
Alan Weiss
on 25 Aug 2020
Because the problems can be solved separately, it will certainly save memory to solve them separately. Whether it also saves time depends on many details. You can try solving just a few sub-models at once and then separately, timing both approaches, then take a few more and try again to see whether a pattern emerges. For a very large problem it will be necessary to solve sub-problems because of memory limits, but where the optimal cutoff in problem size lies, I cannot say.
Alan Weiss
MATLAB mathematical toolbox documentation
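The timing experiment Alan suggests could be sketched as follows (a Python/scipy sketch with hypothetical quadratic sub-models standing in for Martin's; the relative timings will depend entirely on the real cost function):

```python
# Sketch of the timing comparison: solve a batch of decoupled
# quadratic sub-problems jointly and separately, and compare wall time.
import time
import numpy as np
from scipy.optimize import minimize

n_sub, dim = 20, 5                 # 20 hypothetical sub-models, 5 variables each
targets = [np.full(dim, float(i)) for i in range(n_sub)]

def sub_cost(x, t):                # quadratic sub-model with minimum at t
    return np.sum((x - t) ** 2)

def joint_cost(x):                 # summed cost over all sub-models
    return sum(sub_cost(x[i * dim:(i + 1) * dim], t)
               for i, t in enumerate(targets))

# One large optimisation over all n_sub * dim variables
t0 = time.perf_counter()
x_joint = minimize(joint_cost, np.zeros(n_sub * dim)).x
t_joint = time.perf_counter() - t0

# One small optimisation per sub-model
t0 = time.perf_counter()
x_sep = np.concatenate([minimize(sub_cost, np.zeros(dim), args=(t,)).x
                        for t in targets])
t_sep = time.perf_counter() - t0

print(f"joint: {t_joint:.3f}s  separate: {t_sep:.3f}s")
```

Both routes reach the same optimum; only the time and memory footprints differ, which is why timing a few problem sizes, as suggested above, is the practical way to find the cutoff.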