With very advanced technology, a very large population of people living happy lives could be sustained in the accessible region of the universe. For every year that development of such technologies and colonization of the universe is delayed, there is therefore an opportunity cost: a potential good, lives worth living, is not being realized. Given some plausible assumptions, this cost is extremely large. However, the lesson for utilitarians is not that we ought to maximize the pace of technological development, but rather that we ought to maximize its safety, i.e. the probability that colonization will eventually occur.
Read the full paper:
More episodes at:
An Introduction to Nick Bostrom
Start here for a deep dive into his ideas, including: existential risk, the ethics of AI, transhumanism, and wise philanthropy. Listen and subscribe →