Colossus (Supercomputer)
Product
A supercomputer built by Elon Musk's xAI, containing 200,000 Nvidia H100 GPUs, used to train Grok 3.
Mentioned in 2 videos
Videos Mentioning Colossus (Supercomputer)

AI CEOs Keep Talking… But Should We Believe Them? | Cal Newport
Cal Newport
A supercomputer built by Elon Musk's xAI, containing 200,000 Nvidia H100 GPUs, used to train Grok 3.

Jensen Huang: NVIDIA - The $4 Trillion Company & the AI Revolution | Lex Fridman Podcast #494
Lex Fridman
A supercomputer built by xAI in Memphis with 200,000 GPUs, rapidly constructed in about four months, cited as an example of Elon Musk's efficient systems-engineering approach.