PyTorch autograd (loss.backward)

Tool / Product
Mentioned in 1 video

PyTorch's built-in automatic differentiation engine, invoked via loss.backward(); the lecturer argues for replacing that call with a manually written tensor-level backward pass, for pedagogical and debugging reasons.
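A minimal sketch of the idea (not the lecturer's exact code): compute the gradient of a mean-squared-error loss through a single matrix multiply once with autograd and once by hand with the chain rule, then check that the two agree. All variable names here are illustrative.

```python
import torch

torch.manual_seed(0)
x = torch.randn(4, 3)                  # inputs
w = torch.randn(3, 2, requires_grad=True)  # parameters
y = torch.randn(4, 2)                  # targets

# Forward pass
pred = x @ w
loss = ((pred - y) ** 2).mean()

# 1) Autograd backward
loss.backward()
auto_grad = w.grad.clone()

# 2) Manual tensor-level backward pass
# loss = mean((pred - y)^2) over pred.numel() elements,
# so dloss/dpred = 2 * (pred - y) / pred.numel()
dpred = 2.0 * (pred - y) / pred.numel()
# pred = x @ w, so by the chain rule dloss/dw = x^T @ dloss/dpred
manual_grad = x.T @ dpred

print(torch.allclose(auto_grad, manual_grad, atol=1e-6))
```

Writing the second half by hand is the point: once each intermediate gradient is an explicit tensor, it can be inspected, printed, and compared against autograd, which is exactly the debugging workflow the lecturer advocates.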