GPT-3

OpenAI

2020 transformer-based large language model

Mentioned in 88 videos
Released
2020
Published
May 28, 2020
Developer
OpenAI
License
proprietary license

Videos Mentioning GPT-3

Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization | Lex Fridman Podcast #368

Lex Fridman

A predecessor to GPT-4, mentioned in a thought experiment about removing consciousness discussions from training data. It was also noted for being well-calibrated with probabilities before reinforcement learning with human feedback (RLHF) degraded this ability.

The Digital Multiverse: A Conversation with David Auerbach (Episode #319)

Sam Harris

An earlier OpenAI large language model, discussed in Auerbach's book.

Noam Brown: AI vs Humans in Poker and Games of Strategic Negotiation | Lex Fridman Podcast #344

Lex Fridman

A large language model that, along with GPT-2, underscored the rapid advancements in AI, providing context for the ambitious goals of the Diplomacy AI project.

Balaji Srinivasan: How to Fix Government, Twitter, Science, and the FDA | Lex Fridman Podcast #331

Lex Fridman

Cited as an example of a serious step-up in AI capabilities.

Alien Debate: Sara Walker and Lee Cronin | Lex Fridman Podcast #279

Lex Fridman

A large language model, discussed as an example of AI that is improving at 'fooling' humans but is limited by its resource-constrained substrate and inability to generate true novelty or cross-domain connections.

Scott Aaronson: Computational Complexity and Consciousness | Lex Fridman Podcast #130

Lex Fridman

A large language model developed by OpenAI, noted for its impressive capability to generate human-like text, poems, and essays, but still having limitations in logical reasoning and arithmetic.

AI and the Future of Law: The 10 Year "Overnight" Success Story

Y Combinator

A large language model from OpenAI that CaseText utilized, enabling them to develop their core product, Co-Counsel.

How Scaling Laws Will Determine AI's Future | YC Decoded

Y Combinator

A successor to GPT-2, significantly larger and more capable, marking a pivotal moment in the era of scaling laws for LLMs.

AI Startup Founders Debate the Creation of Artificial General Intelligence

Y Combinator

Mentioned as an earlier model that one speaker already considered to be AGI.

AI Expert Warns: “This Is The Last Mistake We’ll Ever Make” - Tristan Harris

Chris Williamson

An earlier version of OpenAI's language model, capable of writing full essays.

Marc Andreessen introspects on Death of the Browser, Pi + OpenClaw, and Why "This Time Is Different"

Latent Space

Mentioned as the model accessible through AI Dungeon, previously deemed too dangerous for general use by OpenAI.

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 1: Overview, Tokenization

Stanford Online

A massive language model trained by OpenAI that demonstrated emergent behavior like in-context learning.
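
The in-context learning noted here works by concatenating labeled demonstrations ahead of an unlabeled query; the model then completes the pattern with no gradient updates. A minimal sketch of how such a few-shot prompt is assembled (the sentiment-labeling task and formatting are illustrative, not from the lecture):

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labeled demonstrations, then the unlabeled query.

    A model like GPT-3 completes the final 'Sentiment:' line by imitating
    the demonstrated pattern -- that completion is in-context learning.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}\n")
    lines.append(f"Review: {query}\nSentiment:")
    return "\n".join(lines)
```

The prompt string itself is the whole mechanism: no weights change between queries.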

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 3: Architectures

Stanford Online

Used as an example of a model trained with GeLU activation, and later as a benchmark for sequential transformer blocks.
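
For reference, GeLU weights each input by the standard normal CDF, and the tanh form is the common fast approximation. A minimal sketch of both (plain-Python floats for clarity; real implementations operate on tensors):

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GeLU: x * Phi(x), where Phi is the standard normal CDF.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Common tanh approximation of GeLU.
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

Unlike ReLU, GeLU is smooth and nonzero for small negative inputs, which is part of why it became the default in GPT-style transformer blocks.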

Is AI About to Automate Every Office Job? (Not a Chance)

Cal Newport

An early large language model that produced reasonable but inconsistent stories.

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 9: Scaling Laws

Stanford Online

Mentioned as a large model from the era of Kaplan's scaling laws, which favored training very large models; it is also noted as undertrained relative to later compute-optimal models like Chinchilla.
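
The undertraining point is easy to quantify from public figures: GPT-3 has about 175B parameters and was trained on roughly 300B tokens, while Chinchilla's rule of thumb calls for roughly 20 training tokens per parameter:

```python
# Publicly reported GPT-3 training figures.
gpt3_params = 175e9   # parameters
gpt3_tokens = 300e9   # training tokens

tokens_per_param = gpt3_tokens / gpt3_params  # roughly 1.7

# Chinchilla's compute-optimal rule of thumb: ~20 tokens per parameter.
chinchilla_ratio = 20.0
optimal_tokens = gpt3_params * chinchilla_ratio  # 3.5 trillion tokens
```

By this yardstick GPT-3 saw about a tenth of the compute-optimal token count for its size, which is the sense in which it was "undertrained."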

⚡️ Competing with ChatGPT and Sierra, building a $10M ARR company — Yasser Elsaid, Founder, Chatbase

Latent Space

An early large language model from OpenAI that Yasser Elsaid experimented with, realizing the potential to add custom data.
