1. The Vulnerable World Hypothesis (2019)

By Nick Bostrom

Abstract:
Scientific and technological progress might change people’s capabilities or incentives in ways that would destabilize civilization. For example, advances in DIY biohacking tools might make it easy for anybody with basic training in biology to kill millions; novel military technologies could trigger arms races in which whoever strikes first has a decisive advantage; or some economically advantageous process may be invented that produces disastrous negative global externalities that are hard to regulate. This paper introduces the concept of a vulnerable world: roughly, one in which there is some level of technological development at which civilization almost certainly gets devastated by default, i.e. unless it has exited the ‘semi-anarchic default condition’. Several counterfactual historical and speculative future vulnerabilities are analyzed and arranged into a typology. A general ability to stabilize a vulnerable world would require greatly amplified capacities for preventive policing and global governance. The vulnerable world hypothesis thus offers a new perspective from which to evaluate the risk-benefit balance of developments towards ubiquitous surveillance or a unipolar world order.

Read the full paper:
https://nickbostrom.com/papers/vulnerable.pdf

More episodes at:
https://radiobostrom.com/

---

Outline:

(00:20) Abstract

(01:45) Policy Implications

(03:48) Is there a black ball in the urn of possible inventions?

(09:05) A thought experiment: easy nukes

(23:21) The vulnerable world hypothesis

(35:44) Typology of vulnerabilities

(35:54) Type-1 (‘easy nukes’)

(41:26) Type-2a (‘safe first strike’)

(52:27) Type-2b (‘worse global warming’)

(59:21) Type-0 (‘surprising strangelets’)

(01:10:11) Achieving stabilization

(01:11:28) Technological relinquishment

(01:20:10) Preference modification

(01:30:26) Some specific countermeasures and their limitations

(01:39:18) Governance gaps

(01:42:09) Preventive policing

(01:55:53) Global governance

(02:01:32) Discussion

(02:24:08) Conclusions

(02:29:16) References

(02:29:25) Author Information

Subscribe to Radio Bostrom

New to Bostrom? Subscribe to the introduction episode, "An Introduction to Nick Bostrom", and start there for a deep dive into his ideas, including existential risk, the ethics of AI, transhumanism, and wise philanthropy.
