# Activation function
Last edited: 2024-01-20
An activation function $a: \mathbb{R} \rightarrow \mathbb{R}$ is applied in a perceptron after the weighted sum of the inputs, so the output is $a(\mathbf{w}^\top \mathbf{x} + b)$. It is the non-linear term: without it, any stack of perceptrons collapses into a single linear map. Classic activation functions are (see the sketch after this list):
- Identity function,
- Binary step,
- Sigmoid function,
- Rectified linear unit (ReLU), or
- Hyperbolic tangent (tanh).
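
A minimal NumPy sketch of these five functions, applied element-wise; the function names are my own:

```python
import numpy as np

def identity(x):
    # Identity: passes the pre-activation through unchanged (a linear "activation").
    return x

def binary_step(x):
    # Binary step: 1 if the weighted sum reaches the threshold 0, else 0.
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Sigmoid: squashes the real line into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def tanh(x):
    # Hyperbolic tangent: squashes the real line into (-1, 1).
    return np.tanh(x)
```

Used in a perceptron, each of these would wrap the weighted sum, e.g. `relu(np.dot(w, x) + b)`.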