Existential risks are risks that threaten the continued existence of humanity.

Classification according to probability and danger

Very low-probability

  1. killer natural virus
  2. alien invasion
  3. asteroid impact
  4. simulation shuts down
  5. gamma ray burst
  6. supervolcano eruption
  7. black hole impact

Wouldn’t kill everyone, but still worth preventing

  1. nuclear holocaust
  2. runaway climate change
  3. repressive global dictatorship

The really important ones

  1. superintelligence (not just AI, but superhumans too)
  2. deliberate misuse of nanotech (arms race, nanoweapons)
  3. accidental misuse of nanotech
  4. killer artificial virus
  5. antimatter holocaust?
  6. particle accelerator disaster

Ways to counteract

  1. friendly superintelligence
  2. nanofactory restrictions
  3. universal sousveillance
  4. ocean habitat
  5. subterranean habitat
  6. antarctic habitat
  7. space habitat

Grid of risk

Superweapons      Biotechnology                  Nanotechnology   Cognitive technology
Nuclear weapons   Hybrid/designed diseases       Arms races       Neural implants
Solar weapons     Full-fledged hyperdisease      'Grey goo'       Wireheading
Kinetic weapons   Artificial life (nanobiotech)  Thermal limit    Superintelligence

Source: Comprehensive List of Existential Risks, Part 2 by Michael Anissimov
