
What is Existential Risk?

Existential risk refers to threats that could annihilate intelligent life or drastically curtail its potential. These dangers, ranging from nuclear annihilation to rogue AI, pose profound questions about our future. Understanding them is crucial for safeguarding humanity's legacy. How might we mitigate such risks to ensure a thriving tomorrow? Join us as we examine the paths to a safer world.
Michael Anissimov

An existential risk is a disaster so great that it either wipes out all of humanity or permanently cripples us. These may be natural disasters, or man-made disasters of an intentional or accidental nature. An existential risk may have been around for a long time, or only a few decades, or perhaps it lies in our future. Examples of existential risk include large asteroid strikes, nuclear war, and rogue Artificial Intelligence.

The concept of existential risk was first articulated in its current form by Dr. Nick Bostrom, an Oxford philosopher. He uses a risk chart similar to the following to explain existential risks:

                     Intensity of risk
Scope of risk    Negligible      Manageable          Terminal
Global           El Niño         deforestation       existential risk
Local            thunderstorm    economic downturn   hurricane
Personal         papercut        sprained ankle      you are shot

Existential risks are global in scope and terminal, or at least near-terminal, in intensity. A highly contagious virus with a 99.9% fatality rate, to which no one is immune, would be one example of an existential risk.
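Bostrom's chart amounts to a simple two-axis classification, which can be sketched in a few lines of code. The function and category names below are illustrative only and do not come from Bostrom's paper:

```python
# Toy sketch of the scope/intensity chart described above.
# Category names are illustrative, not taken from Bostrom's paper.

SCOPES = ["personal", "local", "global"]
INTENSITIES = ["negligible", "manageable", "terminal"]

def is_existential(scope: str, intensity: str) -> bool:
    """A risk counts as existential only when it is both global
    in scope and terminal in intensity."""
    return scope == "global" and intensity == "terminal"

print(is_existential("global", "terminal"))    # engineered pandemic -> True
print(is_existential("local", "terminal"))     # hurricane -> False
print(is_existential("global", "manageable"))  # deforestation -> False
```

The point of the sketch is that neither axis alone suffices: a hurricane is terminal for its victims but local, and deforestation is global but manageable; only the global-and-terminal corner of the chart qualifies.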

A contagious virus is an example of an existential risk.

Bostrom points out that our minds and institutions are ill-equipped to think about existential risk, because we have never experienced one – if we had, we would not be here to think about it. Like a child who doesn't know a stove is hot until he touches it, we have little experience with catastrophes on this scale. The Black Death of medieval Europe and the Spanish flu of the World War I era offer a taste of what an existential disaster might be like: each killed tens of millions of people, and victims could go from apparently healthy to dead within days, sometimes hours.

In his canonical paper on the topic, Bostrom lists about a dozen existential risks and categorizes them based on their severity and recoverability. Some of the most plausible ones are listed here:

  • genetically engineered viruses
  • nanotechnological arms races
  • catastrophic nuclear war
  • out-of-control self-replicating robotics
  • superintelligent AI indifferent to humans
  • physics disaster in a particle accelerator
  • supervolcano eruption that blocks out the sun

Because existential risks are so severe and irreversible, possible countermeasures are worth brainstorming and implementing. Even if the chance of a given threat materializing is small, the immense stakes demand a serious prevention program. For human-originating threats, countermeasures include sophisticated observation-and-alert systems and regulation of technologies that could be turned to mass destruction. Countries suspected of possessing weapons of mass destruction have even been invaded by others worried about the long-term consequences, as the 2003 invasion of Iraq demonstrated.
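The "small probability, immense stakes" argument can be made concrete with a toy expected-value calculation. Every number below is invented purely for illustration; none comes from the article or from Bostrom:

```python
# Toy expected-loss comparison. All figures are made up for illustration,
# not estimates from any source.

def expected_loss(annual_probability: float, lives_at_stake: float) -> float:
    """Expected lives lost per year = probability of the event x stakes."""
    return annual_probability * lives_at_stake

# A familiar disaster: 1% yearly chance, 100,000 lives at stake.
ordinary = expected_loss(0.01, 1e5)

# A hypothetical existential event: 0.01% yearly chance, ~8 billion lives.
existential = expected_loss(0.0001, 8e9)

# Despite a probability a hundred times smaller, the expected loss is
# hundreds of times larger -- the core of the case for countermeasures.
print(existential > ordinary)  # True
```

Under these made-up numbers, the rarer event dominates the expected loss, which is why even low-probability existential threats can warrant serious prevention spending.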

Michael Anissimov

Michael is a longtime AllTheScience contributor who specializes in topics relating to paleontology, physics, biology, astronomy, chemistry, and futurism. In addition to being an avid blogger, Michael is particularly passionate about stem cell research, regenerative medicine, and life extension therapies. He has also worked for the Methuselah Foundation, the Singularity Institute for Artificial Intelligence, and the Lifeboat Foundation.

Learn more...
