Given everything I know, I put the existential risk this century at around one in six: Russian roulette.
Toby Ord, The Precipice
1. New series: Exaggerating the risks
Many effective altruists think that humanity faces high levels of existential risk, risks of existential catastrophes involving “the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development”.
My series “Existential risk pessimism and the time of perils” looks at what would follow if that were true.
But is it true? Just why are we so convinced that humanity is on the brink of doom?
In this series, Exaggerating the risks, I look at some places where claimed levels of existential risk may be exaggerated.
2. From catastrophic risk to existential risk
Effective altruists care deeply about catastrophic risks, risks “with the potential to wreak death and destruction on a global scale”. So do I. Catastrophes are not hard to find. The world is emerging from a global pandemic. There are ongoing genocides throughout the world. And nuclear saber-rattling is on its way to becoming a new international sport. Identifying and stopping potential catastrophes is an effort worth our while.
But effective altruists are also deeply concerned about existential risks, risks of existential catastrophes involving “the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development”. Most catastrophic risks are not existential risks. A hundred million deaths would not pose an existential risk. Nor, in many scenarios, would a billion deaths. Existential risks are literal risks of human extinction, or of the permanent destruction of our potential to develop as a species.
Many authors give alarmingly high estimates of the level of current existential risk. Toby Ord puts the risk of existential catastrophe by 2100 at “one in six: Russian roulette”. The Astronomer Royal Martin Rees gives a 50% chance of civilizational collapse by 2100. And participants at the Oxford Global Catastrophic Risk Conference in 2008 estimated a median 19% chance of human extinction by 2100.
These are striking numbers, and you might expect them to be backed by solid research. But the deeper we dig into existing discussions, the harder it becomes to support these numbers.
I will argue in a separate series that many of the most familiar threats, such as asteroid impacts and nuclear war, are unlikely to lead to existential catastrophe. Effective altruists know this. As a result, they rest their case on a number of less familiar risks. However, I will argue, these risks are often exaggerated. That is the project of this series.
In Part 2 of this series, I’ll look at a discussion of climate risk by Toby Ord.
In the meantime, let me know what you think about current levels of existential risk. What are you worried about? Are there any risks that you would like me to discuss?