feature reduction via regression analysis
Suppose you have a very large feature vector X, used to predict a vector of expected values y.
Is sequential linear regression,
e.g.: coeff = regress(y, X);
followed by sequential feature reduction,
e.g.: inmodel = sequentialfs(fun, X, y, 'direction', 'backward');
% where fun returns the prediction error of a linear fit on the test fold, e.g.:
% fun = @(XT,yT,Xt,yt) sum((yt - Xt*regress(yT,XT)).^2);
the easiest/best approach to get a reasonably sized feature vector when no other information is known?
From my testing, this method rarely captures the features that matter most, and I obtained better results by randomly selecting some of the features.
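For reference, the workflow described above can be sketched end to end as follows. This is a hedged, self-contained example: it uses the carsmall data purely as a stand-in for X and y, and the sum-of-squared-errors criterion is the standard form `sequentialfs` expects from its criterion function.

```matlab
load carsmall
X = [Acceleration Cylinders Displacement Horsepower];
ok = ~any(isnan([X MPG]), 2);      % carsmall contains NaN rows; drop them
X = X(ok,:);  y = MPG(ok);

% Criterion: sum of squared prediction errors on the test fold,
% from an ordinary least-squares fit on the training fold.
fun = @(XT,yT,Xt,yt) sum((yt - Xt*regress(yT,XT)).^2);

% Backward elimination: start from all features, drop them one at a time.
inmodel = sequentialfs(fun, X, y, 'direction', 'backward');

coeff = regress(y, X(:,inmodel));  % refit using only the selected features
```

`sequentialfs` returns a logical mask over the columns of X, so the final `regress` call refits on just the retained features.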
10 comments
Ilya
on 18 Jul 2012
I have trouble interpreting what you wrote in 1 because I still don't know what you mean by correlation. I thought you were saying that the correlation between each individual predictor and the observed response (measured y values) was small for all predictors but one. But setting one predictor to zero cannot have any effect on correlations between the other predictors and the response. And so "correlations went from near zero to back up again" is a mystery to me. Then perhaps by "correlation" you mean correlation between the predicted response and observed response? I don't get how that can be zero after you added the predictor with 94% correlation to the model either. If that happens, something must've gone bad with the fit.
Instead of re-running stepwisefit, I would recommend playing with 'penter' and 'premove' parameters.
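A hedged sketch of that suggestion, assuming the same carsmall predictors as in the accepted answer below: 'penter' and 'premove' set the p-value thresholds for a term to enter and to leave the model (defaults 0.05 and 0.10), and loosening them makes stepwisefit more willing to admit and retain marginal predictors.

```matlab
load carsmall
X = [Acceleration Cylinders Displacement Horsepower];
y = MPG;
% Raise both thresholds (penter must stay <= premove) so that
% borderline predictors survive the elimination step.
[b, se, pval, inmodel] = stepwisefit(X, y, 'penter', 0.10, 'premove', 0.15);
```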
Accepted Answer
Ilya
on 17 Jul 2012
If you prefer linear regression, use function stepwisefit or its new incarnation LinearModel.stepwise. For example, for backward elimination with an intercept term you can do
load carsmall
X = [Acceleration Cylinders Displacement Horsepower];
y = MPG;
stepwisefit([ones(100,1) X],y,'inmodel',true(1,5))
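The LinearModel.stepwise route mentioned above can be sketched like this (a hedged example on the same carsmall data; in current releases the same thing is spelled stepwiselm):

```matlab
load carsmall
tbl = table(Acceleration, Cylinders, Displacement, Horsepower, MPG);
% Start from the full linear model; stepwise may then remove terms.
% 'Upper','linear' caps the model at main effects (no interactions).
mdl = LinearModel.stepwise(tbl, ...
    'MPG ~ Acceleration + Cylinders + Displacement + Horsepower', ...
    'Upper', 'linear');
```

The returned model object carries coefficients, p-values, and diagnostics, which is often more convenient than the raw output vectors of stepwisefit.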
In general, there is no "best" approach to feature selection. What you can do depends on what assumptions you are willing to make (such as linear model), how many features you have and how much effort you want to invest.
0 comments
More Answers (0)