Existential Risk: A Conversation with Toby Ord (Episode #208)

Sam Harris
Science & Technology · 3 min read · 65 min video
Jun 24, 2020
TL;DR

Toby Ord discusses existential risks, from natural disasters to AI, and the urgent need for humanity to navigate the current 'precipice' to secure its long-term future.

Key Insights

1. Humanity faces significant existential risks, primarily from anthropogenic sources such as advanced technology, whose power has dramatically outpaced our wisdom.

2. Effective altruism encourages focusing resources on the most impactful ways to do good, extending beyond immediate suffering to long-term future well-being.

3. Our innate moral biases often neglect risks that are distant in space and time, making it crucial to consciously counteract these intuitions.

4. Natural existential risks, such as asteroid impacts, are statistically far less likely to cause extinction in the next century than human-caused risks.

5. The current era, termed 'the precipice', is a critical window in which humanity's increased power demands a commensurate increase in foresight and caution.

6. The potential loss of humanity's entire future, not just current lives, makes existential risks uniquely significant, demanding urgent attention and action.

THE PRECIPICE: DEFINING EXISTENTIAL RISK

Toby Ord introduces the concept of existential risk, defining it not just as threats to humanity but as risks that could cause irreversible collapse or extinction, thereby permanently closing off humanity's vast potential future. He likens the current era to standing on a precipice, a period of heightened risk due to humanity's rapidly increasing power, which has outpaced its wisdom. This precarious situation demands urgent attention, as failure could mean an eternal loss of all future good.

EFFECTIVE ALTRUISM AND MORAL BIASES

Ord, a proponent of effective altruism (EA), discusses its core principle: maximizing positive impact. EA challenges innate moral biases, such as the tendency to prioritize those near in space and time over distant future generations or strangers. By analyzing data, EA identifies interventions that are orders of magnitude more effective, urging a focus on 'doing good' over merely 'feeling good'. This is crucial for directing resources where they will have the greatest impact, whether in global poverty or existential risk mitigation.

NATURAL VERSUS ANTHROPOGENIC RISKS

The discussion distinguishes between natural and human-caused existential risks. While natural events like asteroid impacts or supernovas pose a threat, their per-century likelihood is statistically very low; even the more probable natural risks, such as supervolcanic eruptions and naturally arising pandemics, remain far less likely than human-caused risks over the next century. Ord emphasizes that our understanding of these natural risks has advanced significantly, allowing for better estimation and mitigation efforts.

THE ELEVATED THREAT OF HUMAN-MADE RISKS

Anthropogenic risks, stemming from human progress and technology, are presented as far more significant and immediate. These include nuclear war, advanced artificial intelligence (AI), pandemics that are exacerbated by global interconnectedness, and engineered biological threats. Ord argues that our rapidly advancing technological capabilities, from AI to biotechnology, have created the potential for self-destruction that was unimaginable even a few decades ago, making this the central concern.

THE STRATEGIC IMPORTANCE OF THE 'PRECIPICE' ERA

Ord frames the current period, the 'precipice', as a historically unique and brief window—potentially a few centuries—where humanity possesses immense power but insufficient wisdom to manage it safely. This era is characterized by the paradox of progress: advancements that improve lives also create new, profound risks. The urgency is amplified by the fact that a single failure could permanently extinguish humanity's future potential, unlike failures in earlier periods.

THE COSMIC SIGNIFICANCE OF HUMAN SURVIVAL

The conversation highlights the potentially cosmic significance of humanity's survival. If humans are alone in the universe or represent the only locus of intelligent, moral reasoning, then ensuring our long-term future becomes paramount. This perspective suggests that preserving humanity is not just about preventing suffering for current and future generations, but about safeguarding the potential for untold future achievements, from art and discovery to the very amplification of goodness in the cosmos.

NAVIGATING THE FUTURE: ACTION AND PERSPECTIVE

To navigate the precipice, Ord advocates for a shift in perspective, focusing on the long-term future and the maximization of good. This involves conscious efforts to counteract our cognitive biases and to dedicate resources and attention to the most impactful interventions. The concept of 'earning to give' through high-paying careers, and career advice from organizations like 80,000 Hours, are presented as practical ways individuals can contribute. Ultimately, the goal is to ensure humanity's survival and flourishing for millennia to come.

Comparison of Existential Risk Sources

Data extracted from this episode

Risk Source | Estimated Probability (Next Century) | Potential Impact
Asteroid impact (1 km+) | 1 in 120,000 (astronomical) | Significant, but unlikely to be extinction-level
Asteroid impact (extinction-level) | approx. 1 in 1,000,000 | Human extinction
Supernova from a nearby star | 1 in 1,000,000,000 | Existential catastrophe
All natural risks combined | < 1 in 200 per century (likely << 1 in 2,000) | Extinction
Anthropogenic risks | 1 in 6 | Existential catastrophe
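The headline comparison in the table can be made concrete with a line of arithmetic, using the episode's own per-century figures (a minimal sketch; the variable names are mine, and the natural-risk bound is the episode's upper estimate):

```python
# Compare the per-century probabilities quoted in the episode:
# all natural existential risks combined vs. human-caused risks.

natural_all = 1 / 2000      # upper-bound estimate for all natural risks combined
anthropogenic = 1 / 6       # Ord's estimate for anthropogenic existential risk

# How many times more likely is an anthropogenic catastrophe?
ratio = anthropogenic / natural_all
print(f"Anthropogenic risk is roughly {ratio:.0f}x the combined natural risk")
# → Anthropogenic risk is roughly 333x the combined natural risk
```

On these numbers, human-caused risks dominate the total by more than two orders of magnitude, which is why Ord treats them as the central concern of the precipice era.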

Common Questions

What is an existential risk?

Existential risk refers to threats that could cause human extinction or the permanent collapse of civilization, leading to the irretrievable loss of humanity's future potential.

