AI Glossary

Backward Pass (Backpropagation)

The phase of neural network training where gradients are computed by propagating the error signal backward from the output layer to the input layer.

How It Works

After the forward pass computes predictions and loss, the backward pass applies the chain rule of calculus to compute the gradient of the loss with respect to every parameter. These gradients indicate how to adjust each weight to reduce the loss.
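To make the chain rule concrete, here is a minimal sketch of one forward and backward pass for a single linear neuron with squared-error loss (the neuron, the inputs, and the `backward_pass` helper are illustrative assumptions, not part of any framework):

```python
# Toy example (assumption): one linear neuron y_hat = w*x + b
# with squared-error loss L = (y_hat - y)^2.

def backward_pass(x, y, w, b):
    # Forward pass: compute the prediction and the loss
    y_hat = w * x + b
    loss = (y_hat - y) ** 2

    # Backward pass: chain rule from the loss back to each parameter
    dloss_dyhat = 2 * (y_hat - y)   # dL/dy_hat
    dloss_dw = dloss_dyhat * x      # dL/dw = dL/dy_hat * dy_hat/dw
    dloss_db = dloss_dyhat * 1.0    # dL/db = dL/dy_hat * dy_hat/db
    return loss, dloss_dw, dloss_db

loss, dw, db = backward_pass(x=2.0, y=3.0, w=0.5, b=0.0)
# Moving each parameter opposite its gradient reduces the loss:
lr = 0.1
w_new, b_new = 0.5 - lr * dw, 0.0 - lr * db
```

With these values the prediction is 1.0, the loss 4.0, and the gradients point the parameters toward the target: after one step the new prediction (`w_new * 2.0 + b_new`) already equals 3.0.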

Automatic Differentiation

Modern frameworks (PyTorch, TensorFlow, JAX) implement backpropagation via reverse-mode automatic differentiation: they record a computational graph during the forward pass and traverse it in reverse to compute gradients.
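The idea can be illustrated with a toy scalar autodiff engine (a simplified sketch for exposition; the `Value` class and its methods are assumptions, not any framework's actual API): each operation records its inputs and a local gradient rule during the forward pass, and `backward` replays the graph in reverse, accumulating gradients by the chain rule.

```python
# Toy reverse-mode automatic differentiation (illustrative, not a real API).

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda out: None  # leaves have no gradient rule

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn(out):
            self.grad += out.grad          # d(a+b)/da = 1
            other.grad += out.grad         # d(a+b)/db = 1
        out._backward = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn(out):
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = backward_fn
        return out

    def backward(self):
        # Topologically sort the recorded graph, then apply the
        # chain rule node by node in reverse order.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0  # d(out)/d(out) = 1
        for v in reversed(order):
            v._backward(v)

# Usage: gradients of out = w*x + b with respect to w, x, and b
w, x, b = Value(0.5), Value(2.0), Value(0.0)
out = w * x + b
out.backward()
# w.grad is 2.0 (= x), x.grad is 0.5 (= w), b.grad is 1.0
```

Production frameworks follow the same recipe, but with tensors instead of scalars, a much larger set of differentiable operations, and heavy optimization of the graph traversal.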


Last updated: March 5, 2026