Retrieval-Augmented Generation
A technique for improving LLM responses with external data, discussed as potentially suboptimal and likely to become obsolete for personalized memory.
Videos Mentioning Retrieval-Augmented Generation

State of the Art: Training 70B LLMs on 10,000 H100 clusters
Latent Space
Described as "the world's simplest agent," giving models the ability to retrieve data from external contexts or databases, bridging unstructured and structured data.
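The "simplest agent" framing can be sketched as a model whose only tool is a lookup over an external store. The store contents and the keyword-overlap scoring below are illustrative assumptions, not anything taken from the episode.

```python
# Retrieval as a minimal "agent": before answering, look up the query
# in an external store and return any matching records as context.
import re

# Hypothetical store mixing structured keys with unstructured text.
DOCUMENTS = {
    "gpu": "H100 clusters interconnect thousands of GPUs for large-scale training.",
    "rag": "Retrieval-augmented generation grounds model answers in external data.",
    "agents": "Agents call external tools, such as retrievers, before answering.",
}

def retrieve(query: str, store: dict[str, str]) -> list[str]:
    """Return stored texts whose words (or keys) overlap with the query."""
    query_words = set(re.findall(r"[a-z0-9]+", query.lower()))
    hits = []
    for key, text in store.items():
        doc_words = set(re.findall(r"[a-z0-9]+", text.lower())) | {key}
        if query_words & doc_words:
            hits.append(text)
    return hits

context = retrieve("how do agents use rag?", DOCUMENTS)
```

The retrieved texts would then be placed in the model's context window, which is the sense in which retrieval "bridges" unstructured text and structured storage.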

Open Source AI is AI we can Trust — with Soumith Chintala of Meta AI
Latent Space
An approach to fine-tuning LLMs by generating synthetic training data from retrieved documents, bridging the paradigm gap between retrieval and fine-tuning.
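The idea above can be sketched as follows: instead of retrieving documents at inference time, convert them into fine-tuning examples up front. In practice an LLM would write the question for each document; the crude template below is a hypothetical stand-in for that generation step.

```python
# Turn retrieved documents into synthetic prompt/completion pairs
# suitable for fine-tuning, so the knowledge ends up in the weights
# rather than in the context window at inference time.

def to_finetune_examples(docs: list[str]) -> list[dict[str, str]]:
    examples = []
    for doc in docs:
        # Crude stand-in for an LLM-written question about the document.
        topic = doc.split()[0]
        examples.append({
            "prompt": f"What do our documents say about {topic}?",
            "completion": doc,
        })
    return examples

dataset = to_finetune_examples([
    "RAG grounds answers in retrieved passages.",
    "Fine-tuning bakes retrieved knowledge into the weights.",
])
```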

Why AI Agents Don't Work (yet) - with Kanjun Qiu of Imbue
Latent Space
A technique that enhances language models by retrieving information from external knowledge bases before generating a response. Considered helpful for user preferences but not sufficient for complex scientific memory.
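The retrieve-then-generate pipeline described here can be sketched in a few lines: embed the corpus, rank documents against the query by cosine similarity, and prepend the best match to the prompt. The bag-of-words "embedding" stands in for a real embedding model, and the corpus is illustrative.

```python
# Minimal RAG sketch: retrieve the most similar document for a query,
# then build the augmented prompt that would be sent to the LLM.
import math
import re
from collections import Counter

CORPUS = [
    "Fine-tuning updates model weights on task-specific examples.",
    "Retrieval-augmented generation fetches documents at query time.",
    "Few-shot prompting places worked examples in the context window.",
]

def embed(text: str) -> Counter:
    """Toy bag-of-words vector; a real system would use an embedding model."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def build_prompt(query: str, corpus: list[str], k: int = 1) -> str:
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    context = "\n".join(ranked[:k])
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("What does retrieval-augmented generation do?", CORPUS)
```

Everything after retrieval is ordinary prompting, which is why the episode treats RAG as helpful for surfacing user preferences but not, on its own, a full memory system.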

The End of Finetuning — with Jeremy Howard of Fast.ai
Latent Space
Considered an inefficient hack compared to fine-tuning; described as the current equivalent of few-shot learning.

Bee AI: The Wearable Ambient Agent
Latent Space
In this episode, RAG is discussed as potentially suboptimal for personalized memory and likely to become obsolete for that use case.