# Rectified linear unit (ReLU)
Last edited: 2026-02-05
The rectified linear unit (ReLU) is an activation function that cuts off the lower end of the identity function at zero:
$$(x)^{+} = \max(0, x).$$
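The definition above can be sketched as an element-wise operation; a minimal NumPy version (the function name `relu` is an assumption, not from the source) is:

```python
import numpy as np

def relu(x):
    """Element-wise rectified linear unit: max(0, x)."""
    # np.maximum broadcasts the scalar 0 against x, zeroing out
    # all negative entries while passing positive ones through.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))
```

Negative inputs map to 0 and non-negative inputs are returned unchanged, matching $\max(0, x)$.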