Adam Optimizer with feedforward neural networks
Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.
Answers (1)
Hrishikesh Borate
on 19 Jun 2020
Hi,
It’s my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the appropriate training options.
For example:
[XTrain,~,YTrain] = digitTrain4DArrayData;   % YTrain holds the digit rotation angles (regression targets)
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', ...        % select the Adam solver
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);
[XTest,~,YTest] = digitTest4DArrayData;
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
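If you also need to tune Adam's own hyperparameters, trainingOptions('adam',...) exposes 'GradientDecayFactor' and 'SquaredGradientDecayFactor' (Adam's beta1 and beta2); the values below are just the defaults shown for illustration:
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...
    'GradientDecayFactor',0.9, ...
    'SquaredGradientDecayFactor',0.999);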
1 Comment
Abdelwahab Afifi
on 19 Jun 2020
'trainNetwork' is used for deep learning neural networks, but I think he wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
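As far as I know, the shallow-network 'train' function only supports its built-in training functions ('trainlm', 'trainscg', 'traingdx', and so on), so Adam is not available there. A possible workaround, shown as a minimal sketch below with placeholder data, layer sizes, and hyperparameters (my own sketch, not code from this thread), is a custom training loop using dlarray, dlnetwork, and adamupdate from the Deep Learning Toolbox:
% Shallow (one hidden layer) regression network trained with Adam
[x,t] = simplefit_dataset;            % example 1-by-94 inputs and targets
X = dlarray(single(x),'CB');          % channel-by-batch format
T = dlarray(single(t),'CB');
layers = [
    featureInputLayer(1)
    fullyConnectedLayer(10)           % one hidden layer
    tanhLayer
    fullyConnectedLayer(1)];
net = dlnetwork(layers);
avgGrad = []; avgSqGrad = [];         % Adam state, initialized empty
for iteration = 1:500
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);
    [net,avgGrad,avgSqGrad] = adamupdate(net,gradients, ...
        avgGrad,avgSqGrad,iteration,0.01);   % learn rate 0.01
end
function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);
    loss = mse(Y,T);                  % half mean squared error
    gradients = dlgradient(loss,net.Learnables);
end
The same loop structure should work for any shallow architecture you can express as a layer array; replace simplefit_dataset with your own data in channel-by-batch form.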