The Alignment Problem

Book

A book by Brian Christian that examines AI safety questions from the perspective of an outside observer rather than an AI researcher.

Mentioned in 4 videos