
ReLU: Rectified Linear Unit

Dec 4, 2024

$\mathrm{ReLU}(x) = \max(0, x)$

where $x$ denotes the input to the neuron.
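
As a minimal sketch, ReLU is a one-liner in NumPy (the function name here is illustrative, not from a particular library):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Element-wise max(0, x): negative inputs become 0, positive pass through.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]
```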

Variants:

  • Leaky ReLU
  • Parametric ReLU
  • Concatenated ReLU (CReLU)
  • Gaussian Error Linear Unit (GELU)
  • etc.
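
A rough NumPy sketch of two of these variants (hyperparameter values are illustrative; note that Parametric ReLU has the same form as Leaky ReLU, except that $\alpha$ is learned during training rather than fixed):

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Leaky ReLU: a small fixed slope alpha for x < 0 instead of a hard zero.
    # Parametric ReLU (PReLU) uses this same form with alpha as a learned parameter.
    return np.where(x > 0, x, alpha * x)

def gelu(x: np.ndarray) -> np.ndarray:
    # GELU, tanh approximation (Hendrycks & Gimpel, 2016):
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))  # ~[-0.02  -0.005  0.      1.5  ]
print(gelu(x))        # ~[-0.0454 -0.1543  0.      1.3996]
```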

References:

  • Rectifier (neural networks) - Wikipedia: https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
 