Feature selection to perform classification using Multinomial Logistic Regression

Hello, I need to do feature selection in a classification setting. I have two different problems: normal versus pathological, and the discrimination of five classes. I'm thinking of using multinomial logistic regression as implemented in the stepwisefit MATLAB function. After feature selection, the classification is performed using an SVM with an RBF kernel. Do you think it's a good approach? Must the data respect some constraints? I haven't found any paper using this approach!
Thank you.
Maria

Accepted Answer

Ilya on 6 Dec 2016
You should not use a linear model for feature selection and a nonlinear model for classification on the selected features. If you have the latest MATLAB (16b), the fscnca function in the Statistics and Machine Learning Toolbox can perform simultaneous feature selection and classification with an RBF kernel. If you do not have 16b, try sequential feature selection with sequentialfs, using an SVM with an RBF kernel as the selection criterion. Sequential feature selection could prove too slow if you have many features.
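For concreteness, a minimal sketch of both options (assuming X is an n-by-p predictor matrix and y a label vector; the weight threshold and the use of fitcsvm for the binary case are my assumptions, not part of the answer):

    % Option 1 (R2016b+): NCA-based selection with fscnca.
    nca = fscnca(X, y, 'FitMethod', 'exact');
    selected = find(nca.FeatureWeights > 0.01);   % threshold is a judgment call

    % Option 2: wrapper selection with sequentialfs, scoring candidate
    % subsets by the cross-validated error of an RBF-kernel SVM.
    % (Binary case shown; for five classes use fitcecoc with templateSVM.)
    critfun = @(XT, yT, Xt, yt) ...
        nnz(yt ~= predict(fitcsvm(XT, yT, 'KernelFunction', 'rbf'), Xt));
    inmodel = sequentialfs(critfun, X, y, 'cv', 10);  % logical mask of kept features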

More Answers (3)

Veronica Marques on 22 Dec 2016
Hi Ilya,
Thanks for the answer to my feature selection question. In the case of applying sequentialfs to perform the feature selection, I am following this procedure:
1. Divide the dataset into train and test sets (70%/30%).
2. On the train set, perform feature selection:
2.1 Call the sequentialfs function inside a 10-fold cross-validation, for each pair (C, sigma) of hyperparameters, to achieve the best performance of the SVM classifier.
2.2 Choose the set of features that gives the minimum error.
3. Then, using the whole train set with the selected features, train the SVM again to obtain the final model.
4. Apply the final model to the test set, which is used neither for feature selection nor for training the SVM.
What do you think? Some authors, for example in "A wrapper method for feature selection using Support Vector Machines" (Sebastián Maldonado, Richard Weber, Information Sciences 179 (2009) 2208-2217), perform the classifier model selection first, with all the features, and only afterwards, using the best parameters, do the feature selection. No further classifier training is done.
What will be the best methodology in order to avoid overfitting?
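If it helps, here is a rough sketch of steps 1-4 (grid values and variable names are illustrative, not from the thread; binary case shown, with fitcecoc and templateSVM as the multiclass substitute):

    cvp = cvpartition(y, 'HoldOut', 0.3);            % step 1: 70/30 split
    Xtr = X(training(cvp), :);  ytr = y(training(cvp));
    Xte = X(test(cvp), :);      yte = y(test(cvp));

    bestErr = Inf;
    for C = [0.1 1 10 100]                           % step 2: grid over (C, sigma)
        for sigma = [0.5 1 2 4]
            critfun = @(XT, yT, Xt, yt) nnz(yt ~= predict( ...
                fitcsvm(XT, yT, 'KernelFunction', 'rbf', ...
                        'BoxConstraint', C, 'KernelScale', sigma), Xt));
            [inmodel, history] = sequentialfs(critfun, Xtr, ytr, 'cv', 10);
            if min(history.Crit) < bestErr           % step 2.2: keep the best subset
                bestErr = min(history.Crit);
                bestIn = inmodel;  bestC = C;  bestSigma = sigma;
            end
        end
    end

    % Step 3: retrain on the whole train set with the selected features.
    mdl = fitcsvm(Xtr(:, bestIn), ytr, 'KernelFunction', 'rbf', ...
                  'BoxConstraint', bestC, 'KernelScale', bestSigma);

    % Step 4: unbiased error estimate on the untouched test set.
    testErr = nnz(yte ~= predict(mdl, Xte(:, bestIn))) / numel(yte);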

Ilya on 22 Dec 2016
You describe a procedure for selecting a set of features at fixed hyperparameter values. You do not say what you do, if anything, to optimize the hyperparameter values.
As long as you keep an independent test set not used for any sort of optimization, you are going to get an unbiased estimate of the classifier accuracy on that set. There is no overfitting in that sense.
You should keep in mind, though, that the 10-fold estimate of the model accuracy for the optimal feature subset is going to be biased high, because that same 10-fold estimate is what is used to perform feature selection.
Some overfitting is unavoidable, since sequential feature selection is greedy by nature. For example, if you start with an empty set and keep adding features as long as accuracy goes up, you will most likely end up with more features than necessary. The increase in accuracy may not be significant, but the feature is added to the optimal set anyway.
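One hedged way to rein in the greedy growth, reusing the sequentialfs setup sketched earlier: request the selection history and keep the smallest subset whose cross-validated error is within a tolerance of the minimum (the tolerance value below is arbitrary).

    [inmodel, history] = sequentialfs(critfun, Xtr, ytr, 'cv', 10);
    tol = 0.01;                                    % accept up to 1% extra error
    k = find(history.Crit <= min(history.Crit) + tol, 1, 'first');
    smallestGoodSubset = history.In(k, :);         % features included after step k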

Veronica Marques on 22 Dec 2016
Hi, I use a grid search to find the best hyperparameter values. For each pair of hyperparameter values I perform feature selection. I expect to obtain the best set of features together with the best classifier model.
1 comment
Ilya on 23 Dec 2016
Your procedure sounds sensible.
Hyperparameters such as the Gaussian kernel scale and the order of the polynomial kernel are usually sensitive to the number of features. libsvm, for example, sets the default sigma in exp(-(||x-z||/sigma)^2) proportional to the square root of the number of features. Optimizing sigma first and then performing feature selection strikes me as naive for that reason.
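To make the dependence explicit (my reading of the libsvm convention, not something stated in the thread): libsvm's default gamma in K(x,z) = exp(-gamma*||x-z||^2) is 1/p for p features, which corresponds to sigma = sqrt(p) in the parameterization above, so the implied default sigma shrinks when feature selection drops predictors.

    % Hypothetical illustration: implied default sigma before and after
    % selection, assuming the libsvm default gamma = 1/p.
    p = [50 10];        % feature counts before and after selection (made up)
    sigma = sqrt(p)     % roughly 7.1 vs 3.2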
