

kfoldfun

Cross-validate function


Syntax

vals = kfoldfun(obj,fun)


Description

vals = kfoldfun(obj,fun) cross-validates the function fun by applying fun to the data stored in the cross-validated model obj. You must pass fun as a function handle.

Input Arguments


obj

Object of class RegressionPartitionedModel or RegressionPartitionedEnsemble. Create obj with fitrtree or fitrensemble along with one of the cross-validation options: 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'. Alternatively, create obj from a regression tree or regression ensemble with crossval.
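Either route produces the same kind of cross-validated object. As a minimal sketch, using the carsmall sample data shipped with Statistics and Machine Learning Toolbox:

load carsmall                           % sample data with Weight and MPG
obj1 = fitrtree(Weight,MPG,'KFold',5);  % cross-validate while fitting
t    = fitrtree(Weight,MPG);            % or fit a full tree first ...
obj2 = crossval(t,'KFold',5);           % ... then cross-validate it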


fun

A function handle for a cross-validation function. fun has the syntax

testvals = fun(CMP,Xtrain,Ytrain,Wtrain,Xtest,Ytest,Wtest)
  • CMP is a compact model stored in one element of the obj.Trained property.

  • Xtrain is the training matrix of predictor values.

  • Ytrain is the training array of response values.

  • Wtrain are the training weights for observations.

  • Xtest and Ytest are the test data, with associated weights Wtest.

  • The returned value testvals must have the same size across all folds.
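For instance, a fun that ignores the training arguments and returns each fold's test-set mean absolute prediction error could be sketched as follows (predict is called on the compact model CMP):

f = @(CMP,Xtrain,Ytrain,Wtrain,Xtest,Ytest,Wtest) ...
    mean(abs(predict(CMP,Xtest) - Ytest));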

Output Arguments


vals

The testvals outputs, concatenated vertically over all folds. For example, if testvals from every fold is a numeric vector of length N, kfoldfun returns a KFold-by-N numeric matrix with one row per fold.
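Concretely, if fun returns a 1-by-2 row vector from each fold, vals is a KFold-by-2 matrix. A sketch, assuming obj is an existing cross-validated regression model:

f = @(CMP,Xtrain,Ytrain,Wtrain,Xtest,Ytest,Wtest) [min(Ytest) max(Ytest)];
vals = kfoldfun(obj,f);   % one row per fold: size(vals,1) equals obj.KFold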


Examples

Cross-validate a regression tree, and obtain the mean squared error (see kfoldLoss):

load imports-85
t = fitrtree(X(:,[4 5]),X(:,16),...
    'predictornames',{'length' 'width'});
cv = crossval(t);
L = kfoldLoss(cv)

L =

Examine the result of simple averaging of responses instead of using predictions:

f = @(cmp,Xtrain,Ytrain,Wtrain,Xtest,Ytest,Wtest)...
    mean((Ytest - mean(Ytrain)).^2);  % squared error of predicting the training-set mean
kfoldfun(cv,f)

ans =