
MATLAB Leave-one-out Cross Validation for SVM

Angga Lisdiyanto on 8 Jun 2016
Commented: mizuki on 8 Sep 2016
Hello & Assalamu'alaikum, I was train a 2592 data with 512 for it's feature. I wonder, is my code are correct?
Because of the training is need a long time for training process (~9 hours).
I am using Matlab's Toolbox named Classification Learner. Then i generate the code and i got a KFold Cross Validation as a default in generated code. Then i modifying it for Leave-one-out as below :
cvp = cvpartition(response, 'LeaveOut');
% Initialize the predictions and scores to the proper sizes
validationPredictions = response;
numObservations = size(predictors, 1);
numClasses = uniqueClassTotal;
validationScores = NaN(numObservations, numClasses);
% Loop over the cross-validation folds; for leave-one-out the number of
% folds (allDataTotal) equals the number of observations
for fold = 1:allDataTotal
    trainingPredictors = predictors(cvp.training(fold), :);
    trainingResponse = response(cvp.training(fold), :);

    % Apply a PCA to the predictor matrix.
    % Convert the all-numeric predictor table into a matrix, ready for use by the pca function.
    numericPredictors = table2array(varfun(@double, trainingPredictors));
    % 'inf' values have to be treated as missing data for PCA.
    numericPredictors(isinf(numericPredictors)) = NaN;
    [pcaCoefficients, pcaScores, ~, ~, explained, pcaCenters] = pca(...
        numericPredictors, ...
        'Centered', true);
    % Keep enough components to explain the desired amount of variance.
    explainedVarianceToKeepAsFraction = 90/100;
    numComponentsToKeep = find(cumsum(explained)/sum(explained) >= explainedVarianceToKeepAsFraction, 1);
    pcaCoefficients = pcaCoefficients(:, 1:numComponentsToKeep);
    trainingPredictors = array2table(pcaScores(:, 1:numComponentsToKeep));

    % Train a classifier
    % This code specifies all the classifier options and trains the classifier.
    % Get the selected kernel function from the FungsiKernel popup menu
    nilaiFungsiKernel = get(handles.cbFungsiKernel, 'String');
    FungsiKernel = nilaiFungsiKernel{get(handles.cbFungsiKernel, 'Value')};
    template = templateSVM(...
        'KernelFunction', FungsiKernel, ...
        'Solver', 'SMO', ...
        'CacheSize', 'maximal', ...
        'Standardize', false);
    classificationSVM = fitcecoc(...
        trainingPredictors, ...
        trainingResponse, ...
        'Learners', template, ...
        'Coding', 'onevsall', ...
        'ClassNames', daftarKelas);
    pcaTransformationFcn = @(x) array2table(bsxfun(@minus, table2array(varfun(@double, x)), pcaCenters) * pcaCoefficients);
    svmPredictFcn = @(x) predict(classificationSVM, x);
    validationPredictFcn = @(x) svmPredictFcn(pcaTransformationFcn(x));

    % Compute validation predictions and scores
    validationPredictors = predictors(cvp.test(fold), :);
    [foldPredictions, foldScores] = validationPredictFcn(validationPredictors);

    % Store predictions and scores in the original order
    validationPredictions(cvp.test(fold), :) = foldPredictions;
    validationScores(cvp.test(fold), :) = foldScores;
end
The total number of fold-loop iterations is the same as the number of training samples, which is what makes the training process take so long. Is my implementation of leave-one-out in the code above correct?
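One quick way to confirm that expectation: for a 'LeaveOut' partition, cvpartition reports one test set per observation, so a check like this should pass:

% For 'LeaveOut', the number of test sets equals the number of observations,
% so the loop should run once per training sample
assert(cvp.NumTestSets == size(predictors, 1));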
Please help me, thanks in advance...
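For reference, fitcecoc can also perform the leave-one-out partitioning itself via its 'Leaveout' option. A minimal sketch, assuming predictors, response, template, and daftarKelas exist as above (note this skips the per-fold PCA step used in the question):

cvModel = fitcecoc(predictors, response, ...
    'Learners', template, ...
    'Coding', 'onevsall', ...
    'ClassNames', daftarKelas, ...
    'Leaveout', 'on');
% Predictions for each observation from the model trained without it
validationPredictions = kfoldPredict(cvModel);
% Overall leave-one-out classification error
looError = kfoldLoss(cvModel);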
1 Comment
mizuki on 8 Sep 2016
If you have Parallel Computing Toolbox, use "parfor" instead of "for" for parallelization.
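A minimal sketch of that parfor pattern, assuming predictors, response, cvp, template, and daftarKelas as defined in the question (the per-fold PCA step is omitted for brevity):

numObservations = size(predictors, 1);
foldPredictions = cell(numObservations, 1);
foldScores = cell(numObservations, 1);
parfor fold = 1:numObservations
    trainIdx = cvp.training(fold);
    testIdx = cvp.test(fold);
    mdl = fitcecoc(predictors(trainIdx, :), response(trainIdx, :), ...
        'Learners', template, ...
        'Coding', 'onevsall', ...
        'ClassNames', daftarKelas);
    % Second output is the negated average binary loss per class,
    % as in the generated code
    [foldPredictions{fold}, foldScores{fold}] = ...
        predict(mdl, predictors(testIdx, :));
end
% Reassemble the held-out predictions in the original observation order
validationPredictions = response;
validationScores = NaN(numObservations, numel(daftarKelas));
for fold = 1:numObservations
    testIdx = cvp.test(fold);
    validationPredictions(testIdx, :) = foldPredictions{fold};
    validationScores(testIdx, :) = foldScores{fold};
end

Collecting the per-fold outputs in cell arrays keeps parfor's slicing rules satisfied; logically-indexed assignment into a shared output array is not allowed inside the parallel loop, which is why the results are written back in a plain for loop afterwards.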


Answers (0)

