Hyper-parameter Optimization

Rhythm Shah on 8 Nov 2017
In the MATLAB documentation example 'Deep Learning Using Bayesian Optimization', under the section 'Choose Variables to Optimize', what are the different variables that one can choose to optimize? Also, can we include a custom variable and optimize it along with the others?

Answers (1)

Don Mathis on 9 Nov 2017
Edited: 9 Nov 2017
Actually, all of the variables are custom variables: they can be whatever you want. The objective function, which you write yourself, is passed a table containing a value for each optimizableVariable you have defined, and it is entirely up to you how the function uses those values.
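For illustration, here is a minimal sketch of how the search space might be declared. The first three variable names (NetworkDepth, InitialLearnRate, Momentum) come from the documentation example; 'FilterSize' is not in that example and is added here only to show that an extra, custom variable can be included in exactly the same way:

% Each optimizableVariable becomes one column of the table passed to your objective function.
optimVars = [
    optimizableVariable('NetworkDepth',[1 3],'Type','integer')
    optimizableVariable('InitialLearnRate',[1e-2 1],'Transform','log')
    optimizableVariable('Momentum',[0.8 0.98])
    optimizableVariable('FilterSize',[3 7],'Type','integer')];  % hypothetical extra variable

% ObjFcn is the objective function handle you write yourself (a sketch appears further down).
BayesObject = bayesopt(ObjFcn,optimVars, ...
    'MaxObjectiveEvaluations',30, ...
    'IsObjectiveDeterministic',false);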
The section 'Objective Function for Optimization' in that example shows how an objective function can be defined. The variable 'optVars' inside the objective function is a 1-row table with one column for each optimizableVariable you have defined. You can see the variables being used inside that function: look for 'optVars.NetworkDepth', 'optVars.InitialLearnRate', 'optVars.Momentum', and so on. You can create as many additional variables as you want and access them inside your objective function in the same way.
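A simplified sketch of such an objective function, assuming image data in XTrain, YTrain, XValidation and YValidation (placeholders for your own data) and using the hypothetical 'FilterSize' variable from the sketch above:

function ObjFcn = makeObjFcn(XTrain,YTrain,XValidation,YValidation)
% Returns a handle that bayesopt calls with a 1-row table (optVars),
% one column per optimizableVariable defined in the search space.
ObjFcn = @valErrorFun;
    function valError = valErrorFun(optVars)
        % Use the variables however you like when building and training the network.
        convBlock = [
            convolution2dLayer(optVars.FilterSize,16,'Padding','same')  % custom variable in use
            reluLayer];
        layers = [
            imageInputLayer([28 28 1])                    % adjust to your input size
            repmat(convBlock,optVars.NetworkDepth,1)      % depth controlled by a variable
            fullyConnectedLayer(10)
            softmaxLayer
            classificationLayer];
        options = trainingOptions('sgdm', ...
            'InitialLearnRate',optVars.InitialLearnRate, ...
            'Momentum',optVars.Momentum, ...
            'MaxEpochs',5, ...
            'Verbose',false);
        net = trainNetwork(XTrain,YTrain,layers,options);
        YPred = classify(net,XValidation);
        valError = 1 - mean(YPred == YValidation);        % value that bayesopt minimizes
    end
end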
This page explains a bit more about objective functions and optimizableVariables: https://www.mathworks.com/help/stats/bayesian-optimization-objective-functions.html
