Rectified linear unit (ReLU)
The rectified linear unit (ReLU) is an activation function that cuts the lower end off the identity:
$$(x)^{+} = \max(0,x).$$
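A minimal sketch of this definition in plain Python (the function name `relu` is an illustrative choice, not from the source):

```python
def relu(x):
    """Rectified linear unit: returns x if x > 0, else 0."""
    return max(0.0, x)

# Negative inputs are clipped to zero; positive inputs pass through unchanged.
print(relu(-2.0))  # 0.0
print(relu(1.5))   # 1.5
```

In deep-learning libraries this is typically applied element-wise over an array or tensor rather than to a single scalar.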