# Backward propagation of errors (Backpropagation)

Last edited: 2026-02-05

This is the algorithm used to compute the gradients that gradient descent needs when training neural networks. There are two stages to this calculation:

1. **Forward pass**: feed the input through the network, caching the activation of every layer.
2. **Backward pass**: propagate the error from the output layer back toward the input, applying the chain rule layer by layer to get the gradient of the loss with respect to each weight.

For this to work, all the perceptrons need a differentiable activation function (e.g. sigmoid or tanh); the chain rule cannot be applied through a non-differentiable step function.
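The two stages can be sketched as follows. This is a minimal illustration, assuming a small 2-4-1 network with sigmoid activations, a mean-squared-error loss, and the XOR problem as training data; none of these choices come from the note itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(a):
    # Derivative of sigmoid written in terms of its output: a * (1 - a).
    # This is why a differentiable activation is required.
    return a * (1.0 - a)

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])  # XOR targets (illustrative data)

# Randomly initialized weights for a 2-4-1 network.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

lr = 0.5
losses = []
for _ in range(5000):
    # Stage 1: forward pass, caching each layer's activations.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    losses.append(float(np.mean((a2 - y) ** 2)))

    # Stage 2: backward pass, chain rule applied layer by layer.
    delta2 = (a2 - y) * sigmoid_prime(a2)         # output-layer error
    delta1 = (delta2 @ W2.T) * sigmoid_prime(a1)  # hidden-layer error

    # Gradient-descent update using the gradients just computed.
    W2 -= lr * (a1.T @ delta2) / len(X)
    b2 -= lr * delta2.mean(axis=0, keepdims=True)
    W1 -= lr * (X.T @ delta1) / len(X)
    b1 -= lr * delta1.mean(axis=0, keepdims=True)
```

After training, `losses` should be decreasing, showing that the gradients computed by the backward pass do drive the loss down.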