Module juice::layers::activation::relu
Applies the nonlinear Rectified Linear Unit.
Non-linearity activation function: y = max(0, x)
This is generally the preferred choice over Sigmoid or TanH, since the max operation used in ReLU is usually cheaper to compute than the exponentiation required by a Sigmoid or TanH layer.
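A minimal sketch of the forward computation on plain `f32` slices, for illustration only; the actual layer operates on backend tensors rather than on slices like this:

```rust
// Elementwise ReLU: y = max(0, x).
// Standalone illustration of the formula above, not the layer's
// real implementation, which dispatches to a compute backend.
fn relu(input: &[f32]) -> Vec<f32> {
    input.iter().map(|&x| x.max(0.0)).collect()
}

fn main() {
    let activations = relu(&[-1.5, 0.0, 2.3]);
    assert_eq!(activations, vec![0.0, 0.0, 2.3]);
    println!("{:?}", activations);
}
```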
Structs
- ReLU: ReLU Activation Layer