How can we save the world? (with Toby Ord)
Guest Toby Ord discusses "The Precipice," humanity's current precarious era of escalating power and existential risks like nuclear war, engineered pandemics, and AI. He argues for urgent global prioritization to reduce these risks and achieve "existential security" for humanity's vast future.
Deep Dive Analysis
13 Topic Outline
Defining 'The Precipice' and Humanity's Precarious Moment
Natural Risks: Asteroids, Supervolcanoes, and Pandemics
Anthropic Reasoning and Observer Selection Effects
Current Technological Risks: Nuclear War and Environmental Destruction
Distinguishing Existential from Catastrophic Risks
Future Technological Risks: Engineered Pandemics
Future Technological Risks: Nanotechnology and Proliferation
Climate Change as a Potential Existential Risk
Achieving Existential Security and Global Coordination
Resource Allocation for Existential Risk Mitigation
Differential Technological Development for Protection
Understanding Long-termism and Future Generations
Balancing Near-Term Urgency with Long-Term Impact
8 Key Concepts
The Precipice
The current historical period in which humanity's escalating power, particularly through technologies like nuclear weapons, has created the capacity for self-destruction, threatening our entire long-term future. It is a precarious moment: our task is to bring this risky period to an end.
Existential Risk
A risk that permanently destroys humanity's long-term potential, whether through human extinction, an unrecoverable collapse of civilization, or an inescapable totalitarian regime. These outcomes are irreversible: they destroy our entire future, not just the present.
Global Catastrophic Risks
Risks that cause immense disaster, often defined as killing 10% or more of the global population. Unlike existential risks, they do not necessarily involve the permanent, unrecoverable destruction of humanity's long-term potential.
Anthropic Selection Effects
Observer selection effects mean we can only find ourselves in a time and place where we could exist, which can bias our estimates of how likely certain events are. Handled carefully, this reasoning can still be used to infer that certain highly destructive natural events must be rare if we've survived this long.
Anthropic Shadow
The possibility that an apparent lull in catastrophic events, such as an unusual recent absence of large asteroid collisions, reflects observer selection rather than genuinely low risk: if such a lull were necessary for intelligent life to arise, we would observe one however common such events really are. It is a way of using indirect evidence to account for selection effects.
Existential Risk Factor
Something that is not an existential risk itself but, if it occurs, increases the probability of an existential risk. An example is a great power war, which could heighten the chances of various global catastrophes or existential threats.
Differential Technological Development
The strategy of influencing the order in which technologies are developed, specifically by accelerating protective or defensive technologies while potentially slowing down or carefully managing risky ones. The goal is to gain an advantage in mitigating future threats.
Long-termism
A moral framework that emphasizes taking the long-term future of humanity seriously, recognizing that current actions have profound and lasting effects on future generations. It considers the vast potential of future lives and aims to safeguard humanity's potential over immense timescales.
12 Questions Answered
What is 'The Precipice'?
It is the current precarious period in human history where our escalating technological power, like nuclear weapons, has given us the ability to destroy ourselves and our entire future.
What is the largest known natural existential risk?
Supervolcanic eruptions are considered the largest known natural risk: asteroid impacts and supernovae/gamma-ray bursts are incredibly unlikely, and natural pandemics are a complicated case because human activity, such as global travel and dense populations, mediates their spread.
How do anthropic selection effects shape our risk estimates?
We can only exist in a time and place conducive to life, meaning our observation of past events is biased. Handled carefully, this reasoning can still be used to infer that certain highly destructive natural events must be rare if we've survived this long.
How do existential risks differ from catastrophic risks?
Existential risks permanently destroy humanity's long-term potential (e.g., extinction, unrecoverable collapse), while catastrophic risks cause immense harm (e.g., killing 10% of the population) but don't necessarily end our long-term future.
Could climate change be an existential risk?
While a runaway greenhouse effect like Venus's is currently thought to be physically impossible for Earth, extreme warming could lead to a collapse of civilization from which humanity might not recover, thus posing an existential threat.
What are the main current and future technological risks?
Current risks include nuclear war and extreme climate change, while the main future risks on the horizon are engineered pandemics and advanced artificial intelligence.
How could nanotechnology become an existential risk?
Nanotechnology could lead to self-replicating nanomachines that outcompete natural systems (the gray goo scenario) or, more plausibly, vastly empower individuals or small groups to create weapons of mass destruction, leading to global instability.
What would a world with 'existential security' look like?
It would be a world where humanity universally prioritizes its continued survival, with institutions and norms built to keep existential risks low, making it difficult for any group or nation to threaten global destruction. Space settlement alone is not a complete solution, because many risks are correlated across settlements.
How much should humanity spend on existential risk reduction?
Currently, humanity spends significantly less on existential risk reduction than on trivial things like ice cream. A substantial increase in funding, perhaps up to a few percent of world product, is needed, primarily for research and for building up the relevant communities.
What is differential technological development?
It is the strategy of accelerating the development of protective technologies (e.g., metagenomic sequencing for rapid pathogen identification) while slowing down or carefully managing risky ones, to improve humanity's overall safety.
What is long-termism, and why should we care about it?
Long-termism is a moral perspective that takes humanity's vast potential future seriously, recognizing that our actions today have profound, long-lasting consequences for billions of future lives. We should care because our generation is uniquely positioned to safeguard or destroy this immense potential.
Why are near-term existential risks especially urgent?
Existential risks that could strike in the near future (e.g., the next 10-20 years) are urgent because if our generation doesn't address them, no one else can. This 'near-term long-termism' is highly impactful and feels like a duty, whereas influencing the world thousands of years from now is much harder to predict.
15 Actionable Insights
1. Embrace Long-Term Perspective
Adopt a long-termist perspective in moral and policy decisions, seriously considering the profound and lasting impacts of current actions on future generations of humanity.
2. Prioritize Wisdom Growth
Actively work to increase collective wisdom and governance capabilities at a pace that matches or exceeds technological advancement to prevent global catastrophe.
3. Reduce Existential Risk Probabilities
Actively work to reduce the per-unit-time probability of existential catastrophes, aiming for consistently declining or extremely low rates to ensure humanity’s long-term survival.
4. Rapidly Lower Existential Risk
Prioritize and implement strategies to reduce existential risk as quickly as possible, as current levels are unsustainable for long-term survival.
5. Achieve Existential Security
Work towards a state of “existential security” by making global risk reduction a top priority, establishing robust institutions, and developing norms to maintain low levels of existential risk permanently.
6. Address Urgent Existential Risks
Recognize the immediate urgency of addressing existential risks that could materialize within the next few decades, as failure to act now means no future generation will have the opportunity to do so.
7. Navigate Dangerous Era Safely
Dedicate a substantial portion of humanity’s collective effort and focus to navigating the current dangerous period, aiming to survive and overcome present existential threats.
8. Prioritize Existential Risk Reduction
For those adopting a long-termist ethical perspective, the most effective current approach is to focus on reducing existential risks until this area is thoroughly addressed and resourced.
9. Increase Existential Risk Funding
Increase global spending on existential risk reduction to at least match the amount spent on ice cream annually, as current allocations are disproportionately low given the stakes.
10. Invest in Risk Research
Direct funding towards building research communities and enhancing the academic credibility of existential risk studies, as this area is currently under-researched and under-resourced.
11. Prevent Great Power War
Prioritize efforts to avoid great power wars, as the risk of such conflicts significantly increases overall existential risk, potentially more than many individual specific threats.
12. Accelerate Protective Technologies
Prioritize and accelerate the development of defensive or protective technologies to mitigate risks and enhance global security before dangerous technologies proliferate.
13. Identify Defensive Technologies
Apply a “reading vs. writing” or “information gathering vs. action” lens across various technological fields to identify and prioritize the development of protective and defensive technologies over those that expand offensive capabilities.
14. Advance Metagenomic Sequencing
Invest in and accelerate the development and deployment of metagenomic sequencing technology, which can rapidly identify unknown pathogens and serve as a powerful defense against engineered pandemics and bioweapons.
15. Focus on Existential Risk
When addressing existential risk, narrow your focus specifically on threats that permanently destroy humanity’s long-term potential, rather than broadly addressing all forms of unmitigated disasters, to make effective progress.
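Insights 3-5 above turn on a simple piece of arithmetic: even a small but constant per-century risk compounds toward near-certain eventual catastrophe, whereas a risk rate driven steadily toward zero leaves a substantial chance of surviving indefinitely. A minimal sketch of that compounding (the specific rates below are hypothetical, chosen only for illustration, not figures from the episode):

```python
# A minimal sketch of how per-century existential risk compounds.
# All rates below are hypothetical and for illustration only.

def survival_probability(risks):
    """Probability of surviving every century in the sequence `risks`,
    where each entry is that century's chance of existential catastrophe."""
    p = 1.0
    for r in risks:
        p *= (1.0 - r)
    return p

centuries = 50

# Constant 10% risk per century: survival decays exponentially.
constant = [0.10] * centuries

# Risk halved each century (existential security): total risk stays bounded.
declining = [0.10 * 0.5 ** t for t in range(centuries)]

print(f"constant risk:  {survival_probability(constant):.3f}")
print(f"declining risk: {survival_probability(declining):.3f}")
```

Under the constant rate, survival over 50 centuries is below 1%; under the declining rate it stays above 80%, which is why the insights stress permanently lowering the rate rather than merely surviving each individual threat.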
5 Key Quotes
Our wisdom needs to grow at least as fast as our technology. Otherwise, we're in deep trouble.
Carl Sagan (as quoted by Spencer Greenberg)
Humanity spends more each year on ice cream than we do on existential risk reduction.
Toby Ord
If we don't deal with the risks of the next 30 years, say, no one else is going to be able to.
Toby Ord
An existential risk is a risk to humanity's long-term potential, that it would be permanently destroyed.
Toby Ord
If you've only got power that can wreak havoc on a local scale, then you only need to be able to kind of govern at this local level. If you have power that can wreak kind of global havoc, then the kind of interconnectedness of different countries and some ability to kind of manage on this world scale becomes essential.
Toby Ord