Return the value of the selected activation function for scalar, vector, and matrix inputs.
y = Activation(x, id), where id is 1:4 for ReLU, sigmoid, hyperbolic tangent, and softmax.
ReLU (Rectified Linear Unit): clips negatives, max(0, x); trains faster than sigmoid.
Sigmoid: exponential normalization to [0, 1].
Hyperbolic tangent: tanh(x), normalization to [-1, 1].
Softmax: normalizes the output to sum to 1, with individual values in [0, 1]; used on the output node.
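A minimal sketch of one way such a function could be written, assuming the id order 1:4 listed above and the row-wise softmax noted in the problem comment (samples as rows; uses implicit expansion, so R2016b or later):

function y = Activation(x, id)
% Apply the activation selected by id to a scalar, vector, or matrix input.
switch id
    case 1   % ReLU: clip negatives to zero
        y = max(0, x);
    case 2   % Sigmoid: squash each element into [0, 1]
        y = 1 ./ (1 + exp(-x));
    case 3   % Hyperbolic tangent: squash each element into [-1, 1]
        y = tanh(x);
    case 4   % Softmax: normalize so each row sums to 1
        e = exp(x);
        y = e ./ sum(e, 2);
end
end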
Working through a series of Neural Net challenges, from the Perceptron, Hidden Layers, Back Propagation, ..., to a Convolutional Neural Net trained on handwritten digits from MNIST.
It might take a day or two to cover Neural Nets completely in a MATLAB-centric fashion.
Essentially, Out = Softmax(ReLU(X*W)*WP).
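For context, that forward pass could be evaluated with the same helper; the sizes and weight matrices below (X, W, WP) are made-up placeholders, not values supplied by the problem:

% Hypothetical example of the two-layer forward pass
X  = rand(5, 3);                      % 5 samples, 3 features each
W  = randn(3, 4);                     % input-to-hidden weights
WP = randn(4, 2);                     % hidden-to-output weights
Hidden = Activation(X * W, 1);        % ReLU on the hidden layer (id = 1)
Out    = Activation(Hidden * WP, 4);  % softmax on the output layer (id = 4)
sum(Out, 2)                           % each row of Out sums to 1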
Problem comment: the multi-case (matrix) softmax should be y = exp(x)./sum(exp(x),2), i.e., normalized along each row.
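A quick check of that row-wise form, using made-up values:

x = [1 2 3; 4 4 4];              % two rows, normalized independently
y = exp(x) ./ sum(exp(x), 2);    % row-wise softmax
sum(y, 2)                        % returns [1; 1]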