Backpropagation
The fundamental algorithm for training multi-layer neural networks. LeCun, Hinton, and Rumelhart (with collaborators) each developed it, or recognized its importance, independently at around the same time.
Videos Mentioning Backpropagation

Jay McClelland: Neural Networks and the Emergence of Cognition | Lex Fridman Podcast #222
Lex Fridman
An optimization mechanism for training neural networks, developed by David Rumelhart and Geoffrey Hinton (among others). It backpropagates error signals through the layers to adjust connection weights, and was initially called the generalized delta rule.
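The generalized delta rule can be sketched on a tiny two-layer sigmoid network: compute an error "delta" at the output, propagate it back through the weights to get a hidden-layer delta, and adjust each weight in proportion to its delta and its input activity. Everything here (network sizes, learning rate, random data) is illustrative, not from the podcast.

```python
import numpy as np

# Minimal sketch of the generalized delta rule on a hypothetical
# 2-layer sigmoid network with squared-error loss.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input
t = np.array([1.0])              # target
W1 = rng.normal(size=(4, 3))     # input -> hidden weights
W2 = rng.normal(size=(1, 4))     # hidden -> output weights
lr = 0.1                         # illustrative learning rate

h = sigmoid(W1 @ x)              # hidden activations
y = sigmoid(W2 @ h)              # output
loss_before = 0.5 * float(np.sum((y - t) ** 2))

# Output-layer delta: error times sigmoid derivative.
delta2 = (y - t) * y * (1 - y)
# Hidden-layer delta: output delta backpropagated through W2.
delta1 = (W2.T @ delta2) * h * (1 - h)

# Delta-rule updates: change each weight by -lr * delta * input activity.
W2 -= lr * np.outer(delta2, h)
W1 -= lr * np.outer(delta1, x)

y_new = sigmoid(W2 @ sigmoid(W1 @ x))
loss_after = 0.5 * float(np.sum((y_new - t) ** 2))
```

One small step along the backpropagated gradient reduces the squared error, which is the whole point of the rule.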

Ilya Sutskever: Deep Learning | Lex Fridman Podcast #94
Lex Fridman
A fundamental algorithm used to train neural networks by calculating the gradient of the loss function with respect to the weights in the network.
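The "gradient of the loss with respect to the weights" in that description can be made concrete on a toy linear model, where the analytic (backprop) gradient is easy to write down and can be checked against a finite-difference estimate. The model and all names are illustrative assumptions.

```python
import numpy as np

# Sketch: for y = W x and squared-error loss, backprop gives the
# analytic gradient dL/dW = (y - t) x^T. We check one entry against
# a central finite-difference estimate.

def loss(W, x, t):
    y = W @ x
    return 0.5 * float(np.sum((y - t) ** 2))

rng = np.random.default_rng(1)
W = rng.normal(size=(2, 3))
x = rng.normal(size=3)
t = rng.normal(size=2)

# Analytic gradient from backprop.
grad = np.outer(W @ x - t, x)

# Finite-difference estimate of dL/dW[0, 0].
eps = 1e-6
Wp = W.copy(); Wp[0, 0] += eps
Wm = W.copy(); Wm[0, 0] -= eps
num = (loss(Wp, x, t) - loss(Wm, x, t)) / (2 * eps)
```

This gradient-checking pattern is a standard way to validate a hand-written backprop implementation.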

MIT AGI: Building machines that see, learn, and think like people (Josh Tenenbaum)
Lex Fridman
An algorithm used in deep learning for training neural networks, with origins tracing back to psychological journals.

Deep Learning Basics: Introduction and Overview
Lex Fridman
The fundamental algorithm used for training artificial neural networks by adjusting weights based on errors, discussed as a key historical and practical aspect.

Deep Learning State of the Art (2019)
Lex Fridman
A core algorithm in deep learning for training neural networks, though one speaker suggests it may be fundamentally flawed and in need of a revolutionary replacement.

deeplearning.ai's Heroes of Deep Learning: Yann LeCun
DeepLearningAI
The fundamental algorithm for training multi-layer neural networks. LeCun, Hinton, and Rumelhart (with collaborators) each developed it, or recognized its importance, independently at around the same time.

MIT 6.S094: Introduction to Deep Learning and Self-Driving Cars
Lex Fridman
An algorithm innovation crucial for training neural networks, especially deep ones.

Torch Tutorial (Alex Wiltschko, Twitter)
Lex Fridman
A specific case of reverse-mode automatic differentiation widely used for training neural networks.
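The "reverse-mode automatic differentiation" view can be sketched in a few lines: record each operation and its local derivatives on a graph, then sweep backward applying the chain rule. Backpropagation is this procedure specialized to neural-network loss functions. The `Var` class below is a hypothetical toy, not Torch's API.

```python
# Toy reverse-mode autodiff: each Var remembers its parents and the
# local derivative of the op that produced it; backward() does a
# reverse topological sweep accumulating chain-rule contributions.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(out):
    out.grad = 1.0
    order, seen = [], set()
    def topo(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                topo(p)
            order.append(v)
    topo(out)
    for v in reversed(order):
        for p, local in v.parents:
            p.grad += local * v.grad

# Example: f(a, b) = a * b + a, so df/da = b + 1 and df/db = a.
a, b = Var(2.0), Var(3.0)
f = a * b + a
backward(f)
```

Frameworks like Torch implement the same idea at scale, with tensors in place of scalars.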