flax.linen.activation.relu

flax.linen.activation.relu = <jax._src.custom_derivatives.custom_jvp object>

Rectified linear unit activation function.

Computes the element-wise function:

\[\mathrm{relu}(x) = \max(x, 0)\]

except that under differentiation we take:

\[\nabla \mathrm{relu}(0) = 0\]

For more information, see Numerical influence of ReLU’(0) on backpropagation (Bertoin et al., NeurIPS 2021).
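
This convention is observable directly, as in the minimal sketch below (assuming flax.linen.relu is the usual alias of jax.nn.relu, which this page documents):

>>> import jax
>>> import flax.linen as nn
>>> float(jax.grad(nn.relu)(0.0))  # derivative at exactly 0 is defined as 0
0.0
>>> float(jax.grad(nn.relu)(2.0))  # derivative is 1 for positive inputs
1.0
>>> float(jax.grad(nn.relu)(-2.0))  # derivative is 0 for negative inputs
0.0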

Parameters

x – input array
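
A small usage sketch, assuming the conventional import flax.linen as nn:

>>> import jax.numpy as jnp
>>> import flax.linen as nn
>>> nn.relu(jnp.array([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]))
Array([0. , 0. , 0. , 0. , 0.5, 1. , 2. ], dtype=float32)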