How to import 'kfoldPredict' into an older MATLAB version (2023a)

2 views (last 30 days)
evelyn
evelyn on 27 Jul 2024
I want to use k-fold cross-validation with a decision tree model, but kfoldPredict seems to be for MATLAB R2024a.
Is there a way to import such a function/package into MATLAB 2023?
I tried saving the class after opening the 'kfoldPredict' function, but that does not seem to work.
Thank you for any possible solutions.

Answers (2)

Ayush Modi
Ayush Modi on 28 Jul 2024
Hi Evelyn,
"kfoldPredict" function was introduced in r2011a. You can use the function in r2023 as well.
However, "kfoldPredict" function is a part of "Statistics and Machine Learning Toolbox". So you would need the license to the r2023 version of the toolbox.
Refer to the following MathWorks documentation to learn more about the history of the function across various versions:
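For reference, kfoldPredict operates on a cross-validated model object (for example, the output of crossval or of a fit function called with the 'KFold' option), not on an ordinary fitted model. A minimal sketch with a decision tree, assuming the Statistics and Machine Learning Toolbox is installed (data and variable names are illustrative):
X = randn(100, 4);           % illustrative features
y = randi([0 1], 100, 1);    % illustrative binary response
treeModel = fitctree(X, y);                  % plain decision tree
cvTree = crossval(treeModel, 'KFold', 5);    % ClassificationPartitionedModel
yHat = kfoldPredict(cvTree);                 % out-of-fold predictions
confusionchart(y, yHat)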
  4 comments
Steven Lord
Steven Lord on 29 Jul 2024
How exactly are you calling kfoldPredict? The specific input arguments with which you're calling it could determine which kfoldPredict method gets called (or whether MATLAB can call that method at all!)
evelyn
evelyn on 30 Jul 2024
Thanks for your reply!
Here is my code; the data are randomly generated (not the real data that I used).
dataNum = 200;
featureNum = 4;
features = randn(dataNum, featureNum);
response = randi([0 1], dataNum, 1);
% 5-fold
t = templateTree('MinLeafSize', 10);
rfModel = fitcensemble(features, response, 'Method', 'Bag', 'Learners', t, 'OptimizeHyperparameters', 'auto');
[Hyperparameter optimization output omitted: 30 iterations comparing AdaBoostM1, GentleBoost, and RUSBoost. Optimization completed after reaching MaxObjectiveEvaluations of 30; total elapsed time 28.5134 seconds. Best observed and best estimated feasible point: Method = GentleBoost, NumLearningCycles = 12, LearnRate = 0.87379, MinLeafSize = 56; observed objective 0.395, estimated objective 0.43428.]
predictedSpecies = kfoldPredict(rfModel);
Incorrect number or types of inputs or outputs for function kfoldPredict.
confusionchart(response,predictedSpecies)
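For context, a likely explanation of the error is that fitcensemble here returns an ordinary ClassificationEnsemble rather than a cross-validated model, so kfoldPredict has no folds to work with. A minimal sketch of the pattern kfoldPredict expects, under that assumption (variable names follow the code above):
cvModel = crossval(rfModel, 'KFold', 5);      % ClassificationPartitionedEnsemble
predictedSpecies = kfoldPredict(cvModel);     % predictions on each fold's held-out data
confusionchart(response, predictedSpecies)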



Walter Roberson
Walter Roberson on 30 Jul 2024
  2 comments
evelyn
evelyn on 30 Jul 2024
Thank you for your reply!
Is there any way I can do k-fold cross-validation with a 'ClassificationBaggedEnsemble'?
Walter Roberson
Walter Roberson on 31 Jul 2024
Edited: Walter Roberson on 31 Jul 2024
I am not certain, but it appears that when you use Bag the output model is not a "cross-validated partitioned classifier", so I suspect that the answer is no (at least at present).
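One pattern that may be worth trying (a sketch, not verified against R2023a): fitcensemble documents cross-validation options such as 'KFold', which return a partitioned ensemble that kfoldPredict accepts, even with 'Method','Bag'. Per the documentation, these cross-validation options cannot be combined with 'OptimizeHyperparameters' in the same call, so any tuned settings would have to be passed explicitly:
% Sketch: cross-validated bagged ensemble with fixed settings
% ('KFold' cannot be combined with 'OptimizeHyperparameters').
t = templateTree('MinLeafSize', 10);
cvBagged = fitcensemble(features, response, ...
    'Method', 'Bag', 'Learners', t, 'KFold', 5);   % ClassificationPartitionedEnsemble
predicted = kfoldPredict(cvBagged);
confusionchart(response, predicted)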




Version

R2023a
