fitcsvm cross-validation

João Mendes on 15 Apr 2021
Commented: João Mendes on 16 Apr 2021
Hi, I am training an SVM classifier with the following code:
SVM_1 = fitcsvm(X_train, y_train, ...
    'OptimizeHyperparameters', 'all', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'bayesopt', ...
        'AcquisitionFunctionName', 'expected-improvement-per-second-plus', ...
        'Kfold', 10, ...
        'ShowPlots', 0));
I was wondering whether it is possible to retrieve a performance metric for the classifier (AUC, for example) from the cross-validation, since I specify 10-fold cross-validation.
Thank you,
J
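
For illustration, one possible way to get a cross-validated AUC from the fitted model is to re-cross-validate it with the optimized hyperparameters and pass the out-of-fold scores to perfcurve. This is only a minimal sketch, not something confirmed in this thread; it assumes a binary problem and that the second entry of SVM_1.ClassNames is the positive class of interest:
% Minimal sketch (assumptions: binary classification; the second entry of
% SVM_1.ClassNames is the positive class you want the AUC for).
CVSVM = crossval(SVM_1, 'KFold', 10);        % re-cross-validate the optimized model
[~, score] = kfoldPredict(CVSVM);            % out-of-fold predicted labels and scores
cvLoss = kfoldLoss(CVSVM);                   % 10-fold cross-validation loss
posClass = SVM_1.ClassNames(2);              % assumed positive class
[~, ~, ~, AUC] = perfcurve(y_train, score(:, 2), posClass);
The columns of score follow the order of SVM_1.ClassNames, so score(:, 2) pairs with ClassNames(2).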

Accepted Answer

Alan Weiss on 16 Apr 2021
As shown in this doc example, the cross-validation loss is reported at the command line and plotted by default (I see that you turned off the plot). Is there something else that you need, or did I misunderstand you?
Alan Weiss
MATLAB mathematical toolbox documentation
3 comments
Alan Weiss on 16 Apr 2021
The "Objective" in the iterative display (the generated table of iterations) is the cross-validation loss. The "Best so far" is simply the minimum objective up to that iteration. There is a difference between the "best so far" estimated and observed; that is a function of the model that the solver is estimating, and that changes every iteration. The model is that the observations themselves are noisy, so simply observing a value doesn't mean that observing it again will give the same response.
In a nutshell, I think that the iterative display gives you the information you seek.
Alan Weiss
MATLAB mathematical toolbox documentation
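As a sketch of where those numbers live after the call in the question (assuming the 'bayesopt' optimizer specified there), the optimization record is attached to the trained model as a BayesianOptimization object:
results = SVM_1.HyperparameterOptimizationResults;   % BayesianOptimization object
results.MinObjective                     % best observed cross-validation loss
results.ObjectiveMinimumTrace            % observed "best so far" at each iteration
results.EstimatedObjectiveMinimumTrace   % model-estimated "best so far" at each iteration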
João Mendes on 16 Apr 2021
Thank you very much.
