relu
Apply rectified linear unit activation
Syntax
Y = relu(X)
Description
The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.
This operation is equivalent to
f(x) = max(0, x).
Examples
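A minimal sketch of applying the operation to a small input. The specific array values and the "CB" (channel, batch) format label are illustrative assumptions, not taken from this page:

```matlab
% Create a small batch of input data as a formatted dlarray
% ("C" = channel dimension, "B" = batch dimension).
X = dlarray([-2 -1; 0 3], "CB");

% Apply the ReLU activation: every negative entry is set to zero,
% nonnegative entries pass through unchanged.
Y = relu(X);

% Convert back to a plain numeric array to inspect the result.
data = extractdata(Y)
```

Here `extractdata(Y)` returns the elementwise result of max(0, x), so the two negative inputs become zero while 0 and 3 are unchanged.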
Input Arguments
X — Input data, specified as a formatted or unformatted dlarray object.
Output Arguments
Y — Output activations, returned as a dlarray. If X is a formatted dlarray, Y has the same dimension format as X.
Extended Capabilities
Version History
Introduced in R2019b