Colossus (Supercomputer)
Product
A supercomputer built by Elon Musk's xAI, containing 200,000 Nvidia H100 GPUs, used for training Grok 3.
Mentioned in 2 videos
Videos Mentioning Colossus (Supercomputer)

AI CEOs Keep Talking… But Should We Believe Them? | Cal Newport
Cal Newport
A supercomputer built by Elon Musk's xAI, containing 200,000 Nvidia H100 GPUs, used for training Grok 3.

Jensen Huang: NVIDIA - The $4 Trillion Company & the AI Revolution | Lex Fridman Podcast #494
Lex Fridman
A supercomputer built by xAI in Memphis with 200,000 GPUs, rapidly constructed in about four months, cited as an example of Elon Musk's efficient systems-engineering approach.