Problem 58882. Neural Nets: Activation functions
Return the values of the selected activation function type for scalar, vector, and matrix inputs.
y = Activation(x, id); where id is 1:4 for ReLU, sigmoid, hyperbolic tangent, and Softmax.
ReLU: Rectified Linear Unit; clips negatives, max(0,x). Trains faster than sigmoid.
Sigmoid: exponential normalization to [0,1], 1./(1+exp(-x)).
HyperTan: normalization to [-1,1], tanh(x).
Softmax: normalizes the output to sum to 1, with individual values in [0,1]. Used on the output node.
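A minimal sketch of how such an Activation function could look, assuming the id mapping above and a row-wise softmax (per the problem comment below); this is illustrative, not the reference solution:

function y = Activation(x, id)
    switch id
        case 1  % ReLU: clip negatives elementwise
            y = max(0, x);
        case 2  % Sigmoid: map to (0,1)
            y = 1 ./ (1 + exp(-x));
        case 3  % Hyperbolic tangent: map to (-1,1)
            y = tanh(x);
        case 4  % Softmax: normalize each row to sum to 1
            e = exp(x - max(x, [], 2));  % subtract row max for numerical stability
            y = e ./ sum(e, 2);
    end
end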
Working through a series of Neural Net challenges, from Perceptron, Hidden Layers, Backpropagation, ..., to a Convolutional Neural Net trained on handwritten digits from MNIST.
It might take a day or two to completely cover Neural Nets in a MATLAB-centric fashion.
Essentially, Out = Softmax(ReLU(X*W)*WP).
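As a hedged illustration of that pipeline, using the Activation sketch above (the sizes below are hypothetical; X is the input batch, W and WP the two weight matrices from the expression):

X  = rand(5, 4);           % 5 samples, 4 input features (hypothetical sizes)
W  = randn(4, 8);          % input-to-hidden weights
WP = randn(8, 3);          % hidden-to-output weights
H   = Activation(X*W, 1);  % hidden layer: ReLU
Out = Activation(H*WP, 4); % output layer: Softmax; each row of Out sums to 1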
Problem Comments
Richard Zapor, 21 Aug 2023:
Multi-Case Softmax should be y=exp(x)./sum(exp(x),2)
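For example, with one sample per row, the dim-2 sum normalizes each row independently:

x = [1 2 3; 4 4 4];
y = exp(x) ./ sum(exp(x), 2);
sum(y, 2)  % returns [1; 1] -- each row (sample) sums to 1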