Chinchilla
A DeepMind research paper that presented a revised, more accurate formulation of scaling laws for language models, influencing how practitioners optimize models for inference budgets and context windows.
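The paper's core finding can be sketched numerically. A minimal illustration, assuming the common approximations that training compute is C ≈ 6·N·D (N parameters, D tokens) and that the compute-optimal ratio is roughly 20 tokens per parameter (the function name and constants here are illustrative, not from the paper verbatim):

```python
def chinchilla_optimal(compute_flops: float, tokens_per_param: float = 20.0):
    """Return (params, tokens) that balance C = 6*N*D under D = k*N."""
    # Substituting D = k*N into C = 6*N*D gives C = 6*k*N^2,
    # so N = sqrt(C / (6*k)) and D = k*N.
    n_params = (compute_flops / (6.0 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Roughly Chinchilla-70B's training budget:
params, tokens = chinchilla_optimal(5.76e23)
print(f"{params:.2e} params, {tokens:.2e} tokens")  # ~6.9e10 params, ~1.4e12 tokens
```

This reproduces the paper's headline result in rough terms: for Chinchilla's compute budget, the optimal configuration is about 70B parameters trained on about 1.4T tokens, far more data per parameter than earlier scaling-law guidance suggested.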
Videos Mentioning Chinchilla

Training Llama 2, 3 & 4: The Path to Open Source AGI — with Thomas Scialom of Meta AI
Latent Space
A scaling-law paper that emphasized the importance of training tokens over model size, shaping LLM training strategies.

Open Source AI is AI we can Trust — with Soumith Chintala of Meta AI
Latent Space
A scaling-law result that balances training and inference costs, mentioned in the context of LLaMA's development.

Oriol Vinyals: Deep Learning and Artificial General Intelligence | Lex Fridman Podcast #306
Lex Fridman
A 70-billion-parameter language-only model developed by DeepMind, part of its lineage of animal-named models, later reused as the language backbone of Flamingo.

Cursor Team: Future of Programming with AI | Lex Fridman Podcast #447
Lex Fridman
A DeepMind research paper that presented a revised, more accurate formulation of scaling laws for language models, influencing how practitioners optimize models for inference budgets and context windows.