ReLU: Rectified Linear Unit
$\mathrm{ReLU}(x) = \max(0, x)$

where $x$ denotes the input to the neuron.
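The definition above can be sketched in a few lines of NumPy; the function name `relu` is just an illustrative choice here:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs map to 0,
    # non-negative inputs pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # negatives zeroed, positives kept
```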
Variants:
- Leaky ReLU
- Parametric ReLU
- Concatenated ReLU (CReLU)
- Gaussian Error Linear Unit (GELU)
- etc.
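Two of the variants above can be sketched as follows; this is a minimal illustration, assuming the common default leak of $\alpha = 0.01$ for Leaky ReLU and the widely used tanh approximation for GELU:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keeps a small slope (alpha) for negative inputs,
    # so the unit is never completely "dead" below zero.
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # GELU via the common tanh approximation:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))
```

Parametric ReLU has the same form as Leaky ReLU, except that `alpha` is a learned parameter rather than a fixed constant.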