# Rectified linear unit (ReLU)

Last edited: 2024-01-20


The rectified linear unit (ReLU) is an activation function that passes positive inputs through unchanged and clips negative inputs to zero, i.e. it cuts the lower end off the identity function:

$$\operatorname{ReLU}(x) = (x)^{+} = \max(0, x).$$
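
A minimal sketch of this definition in Python, using NumPy; the function name `relu` here is just an illustrative choice, not a reference to any particular library API:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are clipped to zero; positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```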