Autograd Visualizer

See how automatic differentiation works by watching a computation graph evaluate forward (computing values) and backward (computing gradients). This is the same algorithm that powers backpropagation in neural networks.

Computation is a graph. Autodiff is just two traversals: forward for values, backward for gradients.

Expression syntax: operators + - * / and **, functions tanh() and relu(), variables a-z. Enter values for the inputs a, b, and c.
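
For example, the expression tanh(a * b + c) becomes a small graph: a multiply node feeding an add node feeding a tanh node. One hypothetical way to write that graph down in Python (the node names and fields are illustrative, not the visualizer's actual data model):

    # Hypothetical graph for tanh(a * b + c): each entry records the node's
    # operation and the nodes it reads from (leaf variables have no inputs).
    graph = {
        "a":   {"op": "input", "inputs": []},
        "b":   {"op": "input", "inputs": []},
        "c":   {"op": "input", "inputs": []},
        "mul": {"op": "*",     "inputs": ["a", "b"]},
        "add": {"op": "+",     "inputs": ["mul", "c"]},
        "out": {"op": "tanh",  "inputs": ["add"]},
    }

Every expression built from the supported operators and functions reduces to nodes like these, each knowing only its own operation and its inputs.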

Computation Graph

Press play or step forward to begin the forward pass; the status readout shows whether the visualizer is ready, running the forward pass, or running the backward pass.

Node Inspector

Click a node in the graph to inspect it.

How It Works

Forward pass: Evaluate each node in topological order (leaves first, then operations). Each node computes its value from the values of its inputs.
Backward pass: Start from the output with gradient = 1. Traverse the nodes in reverse topological order, accumulating gradients with the chain rule: each node multiplies the incoming gradient by its local derivative and adds the result to each input's gradient.
The key insight: The gradient at any node tells you how much the output would change if you slightly changed that node's value. This is exactly what we need for gradient descent; the sketch below walks through both passes and a single descent step.
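
A minimal, self-contained sketch of both passes in Python, assuming a tiny Node class and a hand-built graph for tanh(a * b + c); the names and structure are illustrative, not the visualizer's implementation:

    import math

    class Node:
        # One vertex in the computation graph (illustrative sketch only).
        def __init__(self, op=None, inputs=(), value=None):
            self.op = op              # None for leaf variables, else "+", "*", "tanh", ...
            self.inputs = list(inputs)
            self.value = value        # set on leaves, filled in by the forward pass
            self.grad = 0.0           # filled in by the backward pass

    def topo_order(output):
        # Depth-first post-order: every node appears after all of its inputs.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent in node.inputs:
                    visit(parent)
                order.append(node)
        visit(output)
        return order

    def forward(output):
        # Forward pass: evaluate nodes leaves-first so inputs are always ready.
        for node in topo_order(output):
            if node.op == "+":
                node.value = node.inputs[0].value + node.inputs[1].value
            elif node.op == "*":
                node.value = node.inputs[0].value * node.inputs[1].value
            elif node.op == "tanh":
                node.value = math.tanh(node.inputs[0].value)
        return output.value

    def backward(output):
        # Backward pass: seed the output with gradient 1, then walk the graph
        # in reverse topological order, applying the chain rule at each node.
        output.grad = 1.0
        for node in reversed(topo_order(output)):
            if node.op == "+":            # d(x+y)/dx = 1,  d(x+y)/dy = 1
                node.inputs[0].grad += node.grad
                node.inputs[1].grad += node.grad
            elif node.op == "*":          # d(x*y)/dx = y,  d(x*y)/dy = x
                node.inputs[0].grad += node.inputs[1].value * node.grad
                node.inputs[1].grad += node.inputs[0].value * node.grad
            elif node.op == "tanh":       # d tanh(x)/dx = 1 - tanh(x)^2
                node.inputs[0].grad += (1.0 - node.value ** 2) * node.grad

    # Build tanh(a * b + c), run both passes, then take one gradient-descent step.
    a, b, c = Node(value=2.0), Node(value=-1.0), Node(value=0.5)
    out = Node("tanh", [Node("+", [Node("*", [a, b]), c])])
    print(forward(out))            # tanh(2 * -1 + 0.5), roughly -0.905
    backward(out)
    print(a.grad, b.grad, c.grad)  # sensitivity of the output to each input
    a.value -= 0.1 * a.grad        # one gradient-descent update on a leaf value

The last line is the gradient-descent connection from the key insight above: once backward has filled in a.grad, stepping a.value against its gradient nudges the output downhill.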