SVM Cross Validation Training
I am using K-fold cross-validation with K = 10.
I am supposed to run 10-fold cross-validation and take the average of the SVM performance.
How should I do this? Does running the cross-validation once generate the prediction for a single fold, or a complete 10-fold prediction?
1 Comment
Mohammad Sami
on 8 May 2020
According to the documentation, the reported result is averaged over all folds:
https://www.mathworks.com/help/releases/R2020a/stats/select-data-and-validation-for-classification-problem.html
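As an illustration of that averaging, here is a minimal sketch (the ionosphere data set and the RBF-kernel SVM are just assumptions for the example): "kfoldLoss" with 'Mode','individual' returns one loss per fold, and the default call returns their fold-weighted average.
load ionosphere
% Train an SVM and cross-validate it (10 folds by default) -- illustrative model choice
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
CVSVMModel = crossval(SVMModel);
% One classification loss per fold (10-by-1 vector)
foldLosses = kfoldLoss(CVSVMModel,'Mode','individual');
% Single value, averaged over all folds (matches mean(foldLosses) up to fold weighting)
avgLoss = kfoldLoss(CVSVMModel)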
Answers (1)
Gayathri
on 3 Jan 2025
I understand that you need to perform K-fold cross-validation for an SVM model. For this purpose, you can use the "crossval" function, and then the "kfoldLoss" function to get the classification loss of the cross-validated classification model. Please refer to the code below, which implements this.
load ionosphere
% Train an SVM classifier using a radial basis function kernel
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
% Cross-validate the SVM classifier (10 folds by default)
CVSVMModel = crossval(SVMModel);
% Estimate the out-of-sample misclassification rate
classLoss = kfoldLoss(CVSVMModel)
"crossval" by default uses 10-fold cross-validation.
Please also refer to the "Train and Cross-Validate SVM Classifier" example in the documentation.
Hope you find this information helpful!