Which cross-validation method is used in Hyperparameters optimisation for fitrnet?

In the example "Minimize Cross-Validation Error in Neural Network" (above link), the code is stated to reduce the cross-validation loss over some problem hyperparameters by using Bayesian optimization. However, a few things are not clear; can somebody help with the following?
  1. The data was split into 80% train and 20% test, so the 20% test set is obviously not used for cross-validation?
  2. If so, is it true that the cross-validation loss is calculated internally using the training set? And which type of cross-validation is it (kfold, holdout, leaveout, ...)?
  3. For the optimization process, the software first starts with a set of hyperparameters, trains that network, calculates the value of log(1 + loss), and then modifies the hyperparameters to minimize it. If I am wrong or missing something, can you help me understand the whole optimization procedure?
Sorry if these questions are too trivial as I am only a beginner in using these features.

Answers (1)

Daksh on 21 Dec 2022
Edited: Daksh on 21 Dec 2022
It is my understanding that you wish to know more about the cross-validation method used in hyperparameter optimization for fitrnet. Refer to these points for answers to your queries:
  • The training set (80%) is used for cross-validation and for training the model, while the test set is used only to assess how the model performs on new, unseen data.
c = cvpartition(height(cars),"Holdout",0.20);
  • The "Holdout" in the line above (from the example you linked) creates that 80/20 train/test split. The cross-validation loss minimized during the optimization itself is computed internally on the training set; by default, fitrnet uses 5-fold cross-validation for this, which you can change through the HyperparameterOptimizationOptions structure (its Kfold, Holdout, or CVPartition fields).
  • Refer to the following link on Bayesian Hyperparameter Optimization; it should help you better understand the concept of optimization and model training, with code, diagrams, and visualizations.
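The workflow described in the points above can be sketched as follows. This is a minimal sketch modeled on the documentation example, assuming the carbig data set and "auto" hyperparameter optimization; variable names such as carsTrain and carsTest are illustrative:

```matlab
% Minimal sketch (assumes the carbig data, as in the documentation example).
load carbig
cars = table(Acceleration,Displacement,Horsepower, ...
    Model_Year,Origin,Weight,MPG);

% 80/20 holdout split: the 20% test set takes no part in the optimization.
rng("default") % for reproducibility
c = cvpartition(height(cars),"Holdout",0.20);
carsTrain = cars(training(c),:);
carsTest  = cars(test(c),:);

% Bayesian hyperparameter optimization on the training set only. The
% optimizer repeatedly picks candidate hyperparameters, evaluates each
% candidate network by cross-validation on carsTrain, and minimizes
% log(1 + cross-validation loss).
Mdl = fitrnet(carsTrain,"MPG","OptimizeHyperparameters","auto", ...
    "HyperparameterOptimizationOptions", ...
    struct("AcquisitionFunctionName","expected-improvement-plus"));

% Only now is the held-out 20% used, to estimate generalization error.
testMSE = loss(Mdl,carsTest,"MPG");
```

Note that the cvpartition "Holdout" call here only separates train from test; the cross-validation inside the optimization loop is configured separately via HyperparameterOptimizationOptions.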
Hope it helps

Release

R2022b
