Existential risks are risks that threaten the continued existence of humanity as a whole.
Classification according to probability and danger
Very low probability
- killer natural virus
- alien invasion
- asteroid impact
- simulation shuts down
- gamma ray burst
- supervolcano eruption
- black hole impact
Wouldn’t kill everyone, but still worth preventing
- nuclear holocaust
- runaway climate change
- repressive global dictatorship
The really important ones
- superintelligence (not just AI, but enhanced humans too)
- deliberate misuse of nanotech (arms race, nanoweapons)
- accidental misuse of nanotech
- killer artificial virus
- antimatter holocaust?
- particle accelerator disaster
Ways to counteract
- friendly superintelligence
- nanofactory restrictions
- universal sousveillance
- ocean habitat
- subterranean habitat
- antarctic habitat
- space habitat
"Grid of risk"
| Superweapons | Biotechnology | Nanotechnology | Cognitive technology |
|---|---|---|---|
| Nuclear weapons | Hybrid/designed diseases | Arms races | Neural implants |
| Solar weapons | Full-fledged hyperdisease | ’Grey goo’ | Wireheading |
| Kinetic weapons | Artificial life (nanobiotech) | Thermal limit | Superintelligence |
Source: Comprehensive List of Existential Risks, Part 2 by Michael Anissimov