Adam
Algorithm
A popular optimization algorithm for training deep learning models, mentioned in the context of hyperparameter tuning.
Mentioned in 2 videos
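
Since both mentions concern hyperparameter tuning, here is a minimal sketch of how Adam is typically configured in PyTorch. The model, data, and values shown are placeholders (PyTorch's defaults), not recommendations from either video.

```python
import torch

model = torch.nn.Linear(10, 1)

# lr, betas, and eps are Adam's main tunable hyperparameters;
# the values below are PyTorch's defaults.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999), eps=1e-8)

# One standard training step: compute a loss, backpropagate, update the weights.
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```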
Videos Mentioning Adam

Jeremy Howard: fast.ai Deep Learning Courses and Research | Lex Fridman Podcast #35
Lex Fridman
Referenced as a popular optimizer for training deep learning models during a discussion of hyperparameter tuning.

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 2: PyTorch (einops)
Stanford Online
An optimization algorithm that combines momentum and adaptive learning rates, mentioned in contrast to Adagrad and relevant for Assignment 1.
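
As a reference for the contrast the lecture draws, here is a minimal sketch of the Adam update: unlike Adagrad, which accumulates all past squared gradients, Adam keeps exponential moving averages of both the gradient (momentum) and the squared gradient (per-parameter adaptive learning rate). The function and variable names are illustrative, not taken from the lecture or Assignment 1.

```python
import torch

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update combining momentum (m) and adaptive scaling (v)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: EMA of gradients (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: EMA of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction for zero-initialised moments
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (v_hat.sqrt() + eps)
    return param, m, v

# Toy usage: minimise f(x) = x^2 starting from x = 5.
x = torch.tensor(5.0)
m, v = torch.zeros_like(x), torch.zeros_like(x)
for t in range(1, 201):
    grad = 2 * x                              # gradient of x^2
    x, m, v = adam_step(x, grad, m, v, t, lr=0.1)
print(x)  # close to 0
```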