How can I apply the ReLU activation function in the Levenberg-Marquardt algorithm for training a neural network?
% Transpose so that samples are columns, as train() expects
x = input';
t = target';

% Levenberg-Marquardt backpropagation
trainFcn = 'trainlm';

% Feedforward network with one hidden layer of 10 neurons
hiddenLayerSize = 10;
net = feedforwardnet(hiddenLayerSize, trainFcn);

% 70/15/15 train/validation/test split
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

[net] = train(net, x, t);
Answers (1)
Varun Sai Alaparthi
on 22 Nov 2022
Hello Ashish,
You can use 'poslin' as the transfer function for the hidden layer, which is the same as applying the ReLU activation function:
net.layers{1}.transferFcn = 'poslin';
I hope this information helps; please reach out if you have any further issues.
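Putting the pieces together, here is a minimal sketch (assuming `input` and `target` are your data matrices, as in the question) that sets the hidden-layer transfer function to 'poslin' before calling train():

```matlab
% Transpose so that samples are columns, as train() expects
x = input';
t = target';

% Feedforward network with 10 hidden neurons, trained with
% Levenberg-Marquardt backpropagation ('trainlm')
net = feedforwardnet(10, 'trainlm');

% Replace the default 'tansig' hidden-layer transfer function
% with 'poslin' (the toolbox name for ReLU)
net.layers{1}.transferFcn = 'poslin';

% 70/15/15 train/validation/test split
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

% Train and evaluate the network
[net, tr] = train(net, x, t);
y = net(x);   % network predictions
```

Note that this only changes the hidden layer; the output layer keeps its default 'purelin' (linear) transfer function, which is the usual choice for regression targets.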