Key Moments
Existential Risk: A Conversation with Toby Ord (Episode #208)
Toby Ord discusses existential risks, from natural disasters to AI, and the urgent need for humanity to navigate the current 'precipice' to secure its long-term future.
Key Insights
Humanity faces significant existential risks, primarily from anthropogenic sources like advanced technology, which have dramatically outpaced our wisdom.
Effective altruism encourages focusing resources on the most impactful ways to do good, extending beyond immediate suffering to long-term future well-being.
Our innate moral biases often neglect risks that are distant in space and time, making it crucial to consciously counteract these intuitions.
Natural existential risks, such as asteroid impacts, are statistically far less likely to cause extinction in the next century compared to human-caused risks.
The current era, termed 'the precipice', is a critical window where humanity's increased power requires a commensurate increase in foresight and caution.
The potential loss of humanity's entire future, not just current lives, makes existential risks uniquely significant, demanding urgent attention and action.
THE PRECIPICE: DEFINING EXISTENTIAL RISK
Toby Ord introduces the concept of existential risk, defining it not just as threats to humanity but as risks that could cause irreversible collapse or extinction, thereby permanently closing off humanity's vast potential future. He likens the current era to standing on a precipice, a period of heightened risk due to humanity's rapidly increasing power, which has outpaced its wisdom. This precarious situation demands urgent attention, as failure could mean an eternal loss of all future good.
EFFECTIVE ALTRUISM AND MORAL BIASES
Ord, a proponent of effective altruism (EA), discusses its core principle: maximizing positive impact. EA challenges innate moral biases, such as the tendency to prioritize those near in space and time over distant future generations or strangers. By analyzing data, EA identifies interventions that are orders of magnitude more effective, urging a focus on 'doing good' over merely 'feeling good'. This is crucial for directing resources where they will have the greatest impact, whether in global poverty or existential risk mitigation.
NATURAL VERSUS ANTHROPOGENIC RISKS
The discussion distinguishes between natural and human-caused existential risks. Natural events such as asteroid impacts, supernovas, supervolcanoes, and naturally arising pandemics do pose a threat, but their combined likelihood per century is very low compared to human-caused risks over the next century — a conclusion supported by the fact that humanity has already survived roughly 2,000 centuries of such hazards. Ord emphasizes that our understanding of these natural risks has advanced significantly, allowing for better estimation and mitigation efforts.
THE ELEVATED THREAT OF HUMAN-MADE RISKS
Anthropogenic risks, stemming from human progress and technology, are presented as far more significant and immediate. These include nuclear war, advanced artificial intelligence (AI), pandemics exacerbated by global interconnectedness, and engineered biological threats. Ord argues that rapidly advancing capabilities, from AI to biotechnology, have created a potential for self-destruction that was unimaginable even a few decades ago, making anthropogenic risk the central concern.
THE STRATEGIC IMPORTANCE OF THE 'PRECIPICE' ERA
Ord frames the current period, the 'precipice', as a historically unique and brief window—potentially a few centuries—where humanity possesses immense power but insufficient wisdom to manage it safely. This era is characterized by the paradox of progress: advancements that improve lives also create new, profound risks. The urgency is amplified by the fact that a single failure could permanently extinguish humanity's future potential, unlike failures in earlier periods.
THE COSMIC SIGNIFICANCE OF HUMAN SURVIVAL
The conversation highlights the potentially cosmic significance of humanity's survival. If humans are alone in the universe or represent the only locus of intelligent, moral reasoning, then ensuring our long-term future becomes paramount. This perspective suggests that preserving humanity is not just about preventing suffering for current and future generations, but about safeguarding the potential for untold future achievements, from art and discovery to the very amplification of goodness in the cosmos.
NAVIGATING THE FUTURE: ACTION AND PERSPECTIVE
To navigate the precipice, Ord advocates for a shift in perspective, focusing on the long-term future and the maximization of good. This involves conscious efforts to counteract our cognitive biases and to dedicate resources and attention to the most impactful interventions. The concept of 'earning to give' through high-paying careers, and career advice from organizations like 80,000 Hours, are presented as practical ways individuals can contribute. Ultimately, the goal is to ensure humanity's survival and flourishing for millennia to come.
Comparison of Existential Risk Sources
Data extracted from this episode
| Risk Source | Estimated Probability (Next Century) | Potential Impact |
|---|---|---|
| Asteroid Impact (1 km+) | 1 in 120,000 | Significant, but unlikely to be extinction-level |
| Asteroid Impact (Extinction Level) | Approx. 1 in 1,000,000 | Human extinction |
| Supernova from nearby star | 1 in 1,000,000,000 | Existential catastrophe |
| Natural Risks (All combined) | < 1 in 200 per century (likely << 1 in 2000) | Extinction |
| Anthropogenic Risks | 1 in 6 | Existential catastrophe |
Common Questions
What is existential risk?
Existential risk refers to threats that could cause human extinction or the permanent collapse of civilization, leading to the irretrievable loss of humanity's future potential.
Mentioned in this video
People Referenced
●Sam Harris: Host of the Making Sense podcast, discussing existential risk and effective altruism.
●Derek Parfit: Philosopher and Toby Ord's thesis advisor, influential in discussions on population ethics and the long-term future.
●Peter Singer: Philosopher and influential figure in effective altruism, known for his work on global poverty and ethics.
●Toby Ord: Philosopher at Oxford University, author of 'The Precipice: Existential Risk and the Future of Humanity', and expert on existential risk.
●Nick Bostrom: Philosopher and influential thinker on existential risk and superintelligence, a previous guest on the podcast.
●Edmund Burke: Political theorist whose idea of an intergenerational partnership is invoked to emphasize humanity's responsibility to the future.
Organizations
●Against Malaria Foundation: A highly effective charity recommended by GiveWell, focused on distributing insecticide-treated bed nets.
●Oxford University: Toby Ord's academic affiliation, where he works as a philosopher specializing in ethics.
●GiveWell: A non-profit charity evaluator that recommends effective, evidence-based giving opportunities.
●Giving What We Can: An organization co-founded by Toby Ord that encourages members to pledge at least 10% of their income to effective charities.
●80,000 Hours: An organization that provides research and advice on how people can maximize their positive impact through their careers.
●Several organizations, unnamed in the summary, that Toby Ord has advised on global issues, global health, poverty, and high-level policy.
Concepts
●The Enlightenment: Historical period of intellectual and philosophical advancement, used as an analogy for the current 'precipice' period.
●Consequentialism: Ethical theories that judge the morality of an action based on its outcomes.
●Effective Altruism: A movement focused on using evidence and reason to determine the most effective ways to benefit others.
●Utilitarianism: A family of ethical theories that advocate for actions that maximize overall well-being or happiness.
Books and Studies Cited
●'The Precipice: Existential Risk and the Future of Humanity': Toby Ord's book, which explores existential risks and humanity's long-term future.
●'Famine, Affluence, and Morality': A seminal paper by Peter Singer arguing for a moral obligation to assist those in poverty.
●'Reasons and Persons': A major work by Derek Parfit on personal identity, ethics, and population theory.