Activation function
machine-learning
An activation function $a: \mathbb{R} \rightarrow \mathbb{R}$ is applied in a perceptron after the weighted sum of the inputs; it is the non-linear term of the computation. Classic activation functions are (see the sketch after this list)
- Identity function,
- Binary step,
- Sigmoid function,
- Rectified linear unit (ReLU), or
- Hyperbolic tangent (tanh).
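A minimal sketch of these activations and a single perceptron step, assuming NumPy; the function and variable names (`perceptron`, `binary_step`, `w`, `b`) are illustrative, not taken from any particular library.

```python
import numpy as np

# Classic activation functions a: R -> R, applied elementwise.
def identity(z):
    return z

def binary_step(z):
    return np.where(z >= 0.0, 1.0, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def tanh(z):
    return np.tanh(z)

def perceptron(x, w, b, activation=sigmoid):
    """Weighted sum of the inputs, followed by the activation a(z)."""
    z = np.dot(w, x) + b      # weighted sum
    return activation(z)      # a(z)

# Example: a perceptron with two inputs and a ReLU activation.
x = np.array([0.5, -1.0])
w = np.array([0.8, 0.2])
print(perceptron(x, w, b=0.1, activation=relu))
```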