PyTorch autograd (loss.backward)

Software / App

The built-in automatic differentiation mechanism in PyTorch. The lecturer argues for replacing calls to loss.backward with a manually written, tensor-level backward pass, so that the chain-rule computations autograd performs become explicit for pedagogical and debugging purposes.
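A minimal sketch of the idea, in plain Python with no torch dependency (the variable names and the example loss are illustrative, not from the lecture): the chain-rule arithmetic that loss.backward would perform automatically is written out by hand for loss = (w*x + b - t)**2 and checked against a finite-difference estimate, mirroring how a manual backward pass can be validated against autograd's .grad values.

```python
def loss(w, b, x, t):
    # Squared error of a one-parameter linear model (illustrative example).
    return (w * x + b - t) ** 2

w, b, x, t = 0.5, -1.0, 3.0, 2.0

# Forward pass, keeping the intermediate value needed for backward.
y = w * x + b            # prediction
L = (y - t) ** 2         # scalar loss

# Manual backward pass (chain rule), starting from the loss.
dL_dy = 2.0 * (y - t)    # d/dy of (y - t)^2
dL_dw = dL_dy * x        # dy/dw = x
dL_db = dL_dy * 1.0      # dy/db = 1

# Numerical check, analogous to comparing manual gradients with .grad.
eps = 1e-6
dw_num = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
db_num = (loss(w, b + eps, x, t) - loss(w, b - eps, x, t)) / (2 * eps)

print(abs(dL_dw - dw_num) < 1e-4, abs(dL_db - db_num) < 1e-4)  # → True True
```

In PyTorch itself, the same check would compare each hand-derived gradient tensor against the tensor's .grad attribute after a single loss.backward call.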

Mentioned in 1 video