The Alignment Problem

Book

A book by Brian Christian that examines AI safety questions from the perspective of someone outside AI research.

Mentioned in 4 videos
