How to use cross-validation in fitrgp
I find that there are two places in fitrgp() where we can do cross-validation:
- cvgprMdl = fitrgp(x,y,'KernelFunction','squaredexponential','Holdout',0.25);
- gprMdl = fitrgp(x,y,'KernelFunction','squaredexponential','OptimizeHyperparameters','auto','HyperparameterOptimizationOptions',struct('Holdout',0.25));
I don't clearly understand what the difference is between the 'Holdout' used in these two places.
Thank you.
Answers (1)
Don Mathis
on 23 Mar 2017
Briefly: The first command specifies a holdout proportion for fitting a single model. The second command specifies the holdout proportion used inside the objective function of a Bayesian Optimization.
In more detail:
Your first command trains a single model on 75% of the dataset and outputs a "RegressionPartitionedModel". This contains the trained model in cvgprMdl.Trained{1}. You can get its holdout Loss by doing:
loss = kfoldLoss(cvgprMdl)
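For concreteness, here is a minimal sketch of that first workflow. The data are simulated purely for illustration; only the fitrgp and kfoldLoss calls correspond to your command.

rng(0);                                  % for reproducibility
x = linspace(0,10,200)';                 % 200 observations, one predictor
y = sin(x) + 0.2*randn(size(x));         % noisy target
% Train a single GP on a random 75% of the data, holding out the other 25%
cvgprMdl = fitrgp(x,y,'KernelFunction','squaredexponential','Holdout',0.25);
trainedMdl = cvgprMdl.Trained{1};        % the one trained (compact) model
ypred = predict(trainedMdl,x);           % predictions from that model
loss = kfoldLoss(cvgprMdl)               % MSE on the held-out 25%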
Your second command runs a Bayesian optimization in which 30 models (the default number of objective evaluations) are fit, each to the same 75% of the dataset but with different hyperparameters. The optimization searches for the hyperparameters that minimize the holdout loss on the remaining 25%. After the optimization completes, a final model is fit to 100% of the dataset using the optimal hyperparameters. The returned object is a "RegressionGP".
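A corresponding sketch of the second workflow, using the same simulated x and y as above. The HyperparameterOptimizationResults property is an assumption based on recent releases of the Statistics and Machine Learning Toolbox, where models fit with 'OptimizeHyperparameters' store the optimization record there.

% Each objective evaluation fits a GP to 75% of the data and scores it on the held-out 25%
gprMdl = fitrgp(x,y,'KernelFunction','squaredexponential', ...
    'OptimizeHyperparameters','auto', ...
    'HyperparameterOptimizationOptions',struct('Holdout',0.25));
ypredFinal = predict(gprMdl,x);                      % final model was refit on 100% of the data
results = gprMdl.HyperparameterOptimizationResults   % optimization record (assumed property name)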