flax.linen.activation.relu6#
flax.linen.activation.relu6 = <jax._src.custom_derivatives.custom_jvp object>
Rectified Linear Unit 6 activation function.
Computes the element-wise function
\[\mathrm{relu6}(x) = \min(\max(x, 0), 6)\]

except under differentiation, where we take:

\[\nabla \mathrm{relu6}(0) = 0\]

and

\[\nabla \mathrm{relu6}(6) = 0\]

Parameters
x – input array