Genetic Algorithm - ensemble weight optimization
Hi,
I have created different ensembles using neural networks and decision trees. I now have 250 decision trees and 38 neural networks. What I'd like to do is the following: each decision tree and neural network outputs a set of probabilities that predict the outcome (0 or 1 are the possible outcomes). I would now like to assign random weights to create ensembles, for example 250 random weights that add up to 1 for the decision trees. To create the weights I have used the code:
weights_nn = diff([0; sort(rand(249,1)); 1]);  % 250 random weights in [0,1] that sum to 1
Now I would like to optimize these weights so that the ensemble becomes a better predictive model. What is the best way of doing this?
Thanks for any help!
Accepted Answer
Greg Heath
on 9 June 2012
What is the dimensionality of your input vectors?
Can it be reduced without significantly degrading performance?
How many classes?
How many training, validation and testing vectors per class?
What is your error rate goal?
What range of error rates are you getting from your 288 classifiers?
You can probably get satisfactory performance by just using backslash to design a linear combiner of classifiers that are weakly correlated.
Furthermore, since the combination is linear, you can use STEPWISEFIT or STEPWISE to reduce the number of classifiers.
I doubt if you will need to combine more than a few tens of classifiers.
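As a rough illustration of this approach (assuming the classifier probability outputs are collected column-wise in a matrix P and the known 0/1 targets in a vector t; P, Pnew, and t are placeholder names):

% P : Ntrain-by-288 matrix of classifier output probabilities (one column per classifier)
% t : Ntrain-by-1 vector of known 0/1 targets
w = [ones(size(P,1),1) P] \ t;              % least-squares linear combiner; w(1) is a bias term

% Combine the same classifiers on new data (Pnew, same column order):
yhat = [ones(size(Pnew,1),1) Pnew] * w;

% Reduce the number of classifiers with a stepwise linear regression:
[b, se, pval, inmodel] = stepwisefit(P, t);
kept = find(inmodel);                       % indices of the classifiers retained by STEPWISEFIT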
Hope this helps.
Greg
More Answers (1)
Seth DeLand
on 8 June 2012
Hi Michiel, if you aren't 100% set on using a genetic algorithm, I recommend starting with a gradient-based optimization solver such as FMINCON from the Optimization Toolbox. Gradient-based solvers are typically faster, and 250 decision variables is on the big side for a genetic algorithm.
It looks like you want to find the 250 weights that result in an outcome closest to the probability that you already know. In that case your optimization problem will have 250 decision variables (one for each weight). You will also need a linear equality constraint that keeps the sum of the 250 weights equal to 1. Bounds should be used to make sure that the individual weights are between 0 and 1, but you may want to tighten these bounds based on your knowledge of the problem (i.e., the optimization routine may drive one variable to 1 and the other 249 to 0, which you most likely don't want).
In your objective function, you will need to take the current values of the weights and run your model. Running the model produces a probability, which you can compare to the already-known probability that you are targeting. That way the optimization algorithm will attempt to minimize the difference between the probability resulting from running the model and the probability that you are targeting.
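A minimal sketch of this setup, assuming the 250 decision-tree probabilities are collected column-wise in a matrix P_dt and the known targets in a vector t (both placeholder names), with a simple sum-of-squared-errors objective:

% P_dt : Ntrain-by-250 matrix of decision-tree output probabilities
% t    : Ntrain-by-1 vector of known 0/1 targets
nTrees = size(P_dt, 2);
objfun = @(w) sum((P_dt*w - t).^2);            % squared error between ensemble output and targets
Aeq = ones(1, nTrees);  beq = 1;               % linear equality constraint: weights sum to 1
lb  = zeros(nTrees, 1); ub = ones(nTrees, 1);  % each weight bounded between 0 and 1
w0  = diff([0; sort(rand(nTrees-1,1)); 1]);    % random starting point that already sums to 1
opts  = optimset('Algorithm', 'interior-point', 'Display', 'iter');
w_opt = fmincon(objfun, w0, [], [], Aeq, beq, lb, ub, [], opts);

If the objective really is a linear least-squares fit like this one, LSQLIN would also work; FMINCON is the more general choice if you later switch to a different error measure.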
Hope that helps.