

The Chain Rule

Sometimes a function is built from other functions, like a nested machine. In a neural network, for example, the final output depends on many layers of computation stacked on top of one another. The chain rule tells you how to compute the derivative of such a composite function.

The chain rule: If y = f(u) and u = g(x), then dy/dx = (dy/du) × (du/dx).
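You can check this formula numerically. The sketch below picks an arbitrary pair of functions for illustration (u = g(x) = x² and y = f(u) = sin(u) are our own choices, not from the text) and confirms that the product (dy/du) × (du/dx) matches the derivative of the composite computed directly:

```python
import math

def d(fn, t, h=1e-6):
    """Central-difference estimate of fn'(t)."""
    return (fn(t + h) - fn(t - h)) / (2 * h)

# Illustrative choice of inner and outer functions
g = lambda x: x ** 2        # u = g(x)
f = lambda u: math.sin(u)   # y = f(u)

x = 1.5
u = g(x)
dy_du = d(f, u)                    # derivative of the outer function at u = g(x)
du_dx = d(g, x)                    # derivative of the inner function at x
dy_dx = d(lambda t: f(g(t)), x)    # derivative of the composite, directly

print(dy_du * du_dx, dy_dx)  # the product matches the direct derivative
```

Any smooth pair of functions would work here; the point is that the two numbers agree to within the finite-difference error.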

Simple Analogy: Assembly Line

Imagine a factory where raw materials go through machine A, then machine B. The chain rule allows you to calculate how a change in raw materials affects the final product by multiplying the effect of each machine in sequence.

Why the Chain Rule Is Crucial for AI

  • Backpropagation: The algorithm that trains neural networks is an application of the chain rule. It propagates errors backwards through the layers.
  • Deep learning: A deep neural network is a composition of many functions (layers). The chain rule lets us compute the gradient of the loss with respect to early layers.
  • Automatic differentiation: Libraries like TensorFlow and PyTorch use the chain rule to compute gradients automatically.
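To make the automatic-differentiation point concrete, here is a minimal sketch of forward-mode autodiff using dual numbers. This is a toy illustration of the general idea, not how TensorFlow or PyTorch are implemented internally; each arithmetic operation carries the chain rule along with the value:

```python
class Dual:
    """Minimal dual number for forward-mode automatic differentiation.
    Every arithmetic operation also propagates the derivative via the chain rule."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule applied to the derivative parts
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

    def __pow__(self, n):
        # Chain rule: d/dx (v^n) = n * v^(n-1) * dv/dx
        return Dual(self.value ** n, n * self.value ** (n - 1) * self.deriv)

# Differentiate y = (3x + 2)^2 at x = 1; seed deriv = 1 to track dx/dx.
x = Dual(1.0, 1.0)
y = (3 * x + 2) ** 2
print(y.value, y.deriv)  # 25.0 30.0, since dy/dx = 18x + 12 = 30 at x = 1
```

Real autodiff libraries mostly use reverse mode (backpropagation) rather than this forward mode, but both reduce to systematically applying the chain rule operation by operation.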

Simple Example

Let y = (3x + 2)². Set u = 3x + 2, so that y = u².
Then dy/du = 2u and du/dx = 3, so dy/dx = 2u × 3 = 6(3x + 2) = 18x + 12.
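A quick numerical sanity check of this worked example: the chain-rule answer 18x + 12 should agree with a finite-difference estimate of the derivative at any point.

```python
def y(x):
    return (3 * x + 2) ** 2

def dy_dx(x):
    # Chain-rule result from the text: 2u * 3 with u = 3x + 2
    return 18 * x + 12

# Central-difference check at a few sample points
h = 1e-6
for x in [-1.0, 0.0, 2.5]:
    numeric = (y(x + h) - y(x - h)) / (2 * h)
    assert abs(numeric - dy_dx(x)) < 1e-4

print("chain-rule derivative matches the numeric estimate")
```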

Chain Rule in Neural Networks

A neural network with one hidden layer computes output = activation(W₂ · activation(W₁·x + b₁) + b₂). To update W₁, the earliest layer's weights, we apply the chain rule step by step, multiplying the gradients from the later layers back through to the earlier ones.
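The backward pass can be sketched with scalars. The sketch below uses sigmoid as the activation and made-up numbers for the weights and input (all hypothetical); it computes the gradient of the output with respect to w1 as a product of local derivatives, then verifies it against a finite-difference estimate:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Scalar stand-in for the network in the text (hypothetical numbers):
# output = sigmoid(w2 * sigmoid(w1 * x + b1) + b2)
x, w1, b1, w2, b2 = 0.5, 0.8, 0.1, -1.2, 0.3

# Forward pass, saving intermediate values for the backward pass
z1 = w1 * x + b1
h = sigmoid(z1)
z2 = w2 * h + b2
out = sigmoid(z2)

# Backward pass: multiply local derivatives from the output back to w1
dout_dz2 = out * (1 - out)   # sigmoid'(z2)
dz2_dh = w2
dh_dz1 = h * (1 - h)         # sigmoid'(z1)
dz1_dw1 = x
dout_dw1 = dout_dz2 * dz2_dh * dh_dz1 * dz1_dw1

# Finite-difference check of the same gradient
eps = 1e-6
def forward(w1_):
    return sigmoid(w2 * sigmoid(w1_ * x + b1) + b2)
numeric = (forward(w1 + eps) - forward(w1 - eps)) / (2 * eps)
print(dout_dw1, numeric)  # the two estimates agree closely
```

This is exactly backpropagation in miniature: each factor in dout_dw1 is the derivative of one step of the computation, chained together right to left.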

Intuition for Beginners

Think of the chain rule as: "The total effect = effect of first step × effect of second step × ..."


Two-Minute Drill
  • The chain rule computes derivatives of nested (composite) functions.
  • Formula: dy/dx = (dy/du) × (du/dx).
  • Backpropagation is the chain rule applied to neural networks.
  • Essential for training deep learning models.

Need more clarification?

Drop us an email at career@quipoinfotech.com