How does k-fold cross validation work in KNN?
4 views (last 30 days)
Samuel L. Polk
on 10 Dec 2018
Answered: Don Mathis
on 13 Dec 2018
My understanding is that KNN uses the classifications of the k data points nearest to a query point to inform the classification of that query point. I was wondering how k-fold cross-validation affects the KNN classifier in the classificationLearner toolbox in MATLAB. My understanding of k-fold CV is that it is used to make sure that out-of-sample data is predicted well. But cross-validation can also be used to optimize hyperparameters. Is something like this happening with KNN too? If so, which hyperparameters are being optimized when KNN is used in classificationLearner?
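For concreteness, a minimal command-line sketch of the KNN mechanism described above, using the built-in fisheriris data set (the value of k and the query point are purely illustrative):

load fisheriris                                   % built-in example data
mdl = fitcknn(meas, species, 'NumNeighbors', 3);  % k = 3, chosen only for illustration
queryPoint = [5.9 3.0 5.1 1.8];                   % a hypothetical new observation
predictedLabel = predict(mdl, queryPoint)         % majority class of its 3 nearest training points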
0 comments
Accepted Answer
Don Mathis
on 13 Dec 2018
There is currently no automatic hyperparameter optimization in the classificationLearner. It just uses the hyperparameters you have chosen and runs cross-validation to estimate the out-of-sample loss.
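For reference, a minimal command-line sketch of what that amounts to, using the built-in fisheriris data (hyperparameter values are illustrative): the model is trained with hyperparameters you fix yourself, and k-fold cross-validation is used only to estimate the out-of-sample loss.

load fisheriris                                      % built-in example data
mdl   = fitcknn(meas, species, 'NumNeighbors', 5);   % hyperparameters chosen by the user
cvmdl = crossval(mdl, 'KFold', 5);                   % 5-fold cross-validation
cvLoss = kfoldLoss(cvmdl)                            % estimated out-of-sample misclassification rate

% Optional: if you do want cross-validation-based tuning, fitcknn can
% optimize NumNeighbors and Distance itself at the command line
% (this is separate from what classificationLearner does, per the answer above):
optMdl = fitcknn(meas, species, 'OptimizeHyperparameters', 'auto');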
0 comments
More Answers (0)