How to use the genetic algorithm to optimize "trainingOptions" for trainNetwork?
Hi all, I am trying to use the genetic algorithm optimizer to tune hyperparameters in a neural network made using the Deep Network Designer. When setting up the objective function, I can't seem to make InitialLearnRate one of the hyperparameters to optimise. As far as I understand, the trainingOptions function requires the learn rate to be a positive integer, but how would I set it up if I want the genetic algorithm to vary that? I have attached the relevant section of code; any help will be greatly appreciated!
function f = SeqFunction(p)
% Hyperparameters for the optimization
numHiddenUnits = round(p(1));
LearnRate = round(p(2));

% Define the network
numFeatures = 7;
numResponses = 1;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, 'OutputMode', 'sequence')
    dropoutLayer(0.2)
    lstmLayer(numHiddenUnits, 'OutputMode', 'sequence')
    %dropoutLayer(0.2)
    fullyConnectedLayer(numResponses)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'GradientThreshold', 1, ...
    'InitialLearnRate', LearnRate, ... % This is the line I am struggling with
    'Shuffle', 'never', ...
    'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropPeriod', 125, ...
    'LearnRateDropFactor', 0.9, ...
    'Verbose', 0, ...
    'Plots', 'training-progress');

% Train the network
net = trainNetwork(XTrain, YTrain, layers, options);
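
For reference, ga only forces a variable to be an integer if you list it in the integer-constraint argument; any other variable is searched over a continuous range, and trainingOptions accepts InitialLearnRate as any positive scalar. A minimal sketch of one possible setup (not from the original post, with illustrative bounds, population size, and generation count, and assuming SeqFunction returns some validation loss as f):

% Search space: p(1) = numHiddenUnits (integer), p(2) = InitialLearnRate (continuous)
nvars  = 2;
lb     = [10, 1e-4];   % lower bounds (illustrative values)
ub     = [200, 0.1];   % upper bounds (illustrative values)
IntCon = 1;            % only the first variable is integer-constrained

gaOpts = optimoptions('ga', 'PopulationSize', 20, 'MaxGenerations', 10);
[pBest, fBest] = ga(@SeqFunction, nvars, [], [], [], [], lb, ub, [], IntCon, gaOpts);

% Inside SeqFunction, use the proposed value directly instead of rounding it:
% LearnRate = p(2);

With a setup like this, the genetic algorithm proposes real-valued learn rates within the given bounds, and only the hidden-unit count is rounded to an integer.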