Adam optimizer
Algorithm
An optimization algorithm for deep learning that combines momentum with per-parameter adaptive learning rates, mentioned as a way to address issues like getting stuck in saddle points during gradient descent.
Mentioned in 2 videos
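The update rule behind Adam can be sketched in a few lines. This is a minimal NumPy illustration (not from either video): it keeps an exponential moving average of the gradient (`m`, momentum) and of the squared gradient (`v`, adaptive scaling), applies bias correction, and divides the step by the gradient's running magnitude. The hyperparameter defaults follow the commonly cited values; names like `adam_step` are illustrative.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-2,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters theta given gradient grad at step t (1-based)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: smoothed gradient (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: smoothed squared gradient
    m_hat = m / (1 - beta1 ** t)              # bias correction for zero-initialized m
    v_hat = v / (1 - beta2 ** t)              # bias correction for zero-initialized v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Demo: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([1.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 501):
    grad = 2.0 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

Because the step is normalized by the gradient's running magnitude, the effective step size stays close to `lr` even where raw gradients are tiny, which is one reason Adam is less prone to stalling in flat regions and saddle points than plain gradient descent.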
Videos Mentioning Adam optimizer

MIT 6.S094: Recurrent Neural Networks for Steering Through Time
Lex Fridman
Cited as a way to avoid getting stuck in saddle points during gradient descent.

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 1: Overview, Tokenization
Stanford Online
An optimization algorithm used in neural networks, mentioned as part of the development lineage of modern language models.