By Tristan Greene
Oxford philosopher and founding director of the Future of Humanity Institute Nick Bostrom's latest research paper seems to indicate our species could be on a collision course with a technology-fueled supervillain.
Will a deranged lunatic soon have the capabilities to take the entire world hostage? Can our nation's leaders do anything to stop this inevitable tragedy? Will the caped crusader rescue his sidekick before the Joker's sinister trap springs?
In the paper, titled "The Vulnerable World Hypothesis," Bostrom posits that the whole of human technological achievement can be viewed as a giant urn filled with balls that we pull out each time we invent something. Some of the balls, says Bostrom, are white (good), most are gray (neutral), but so far none have been black (apparently capable of eradicating civilization, think Pandora's box). Bostrom says:
What if there is a black ball in the urn? If scientific and technological research continues, we will eventually reach it and pull it out. Our civilization has a considerable ability to pick up balls, but no ability to put them back into the urn. We can invent but we cannot un-invent. Our strategy is to hope that there is no black ball.
That's a terrible strategy. And that's probably why Bostrom's put his considerable mental faculties to work on the new paper, a work in progress that explores some concepts that can help us think about the possibility of a technological black ball, and the different forms that such a phenomenon could take.
Put succinctly, Bostrom's ultimate reckoning for the Vulnerable World Hypothesis (VWH) is:
If technological development continues then a set of capabilities will at some point be attained that make the devastation of civilization extremely likely, unless civilization sufficiently exits the semi-anarchic default condition.
Anyone else catch the episode of the 1998 science fiction anthology TV show The Outer Limits called "Final Exam"? In it, a college student wreaks havoc on those who've wronged him when he demonstrates he's discovered cold fusion, and can make nukes with it. The plot centers on the inevitability of technology, showing that even if we stop one evil genius from discovering and using something horrible, someone else will figure it out.
Bostrom points this out in his paper: if we assume there's at least one black ball in the urn, then we must also assume someone's going to pull it out one day. He predicts this could play out in a number of ways. "Easy nukes," even worse climate change, and a WarGames-style paradigm, where the world's superpowers realize that whoever strikes first will be the sole survivor, are among the scenarios hypothesized in the paper.
But the scariest part isn't how we'll all be destroyed; it's what we'll have to do to prevent it from happening. Bostrom outlines four potential possibilities for achieving stabilization, or ensuring we don't wipe ourselves out with our own technology. They're terrifying:
- Restrict technological development.
- Ensure that there does not exist a large population of actors representing a wide and recognizably human distribution of motives.
- Establish extremely effective preventive policing.
- Establish effective global governance.
In other words, all we need to do is stop Google, get everyone in agreement on our collective morals, create a ubiquitous surveillance state, and establish a one-world government.
It's worth pointing out that Bostrom isn't endorsing the view as correct. He's a philosopher; philosophers deal in possibilities and probabilities, and there's nothing proving his hypothesis is right. Though, as he puts it, "it would seem to me unreasonable, given the available evidence, to be at all confident that VWH is false."
And, as for the scary list above, Bostrom advises weighing the pros against the cons:
A threshold short of human extinction or existential catastrophe would appear sufficient. For instance, even those who are highly suspicious of government surveillance would presumably favour a large increase in such surveillance if it were truly necessary to prevent occasional region-wide destruction. Similarly, individuals who value living in a sovereign state may reasonably prefer to live under a world government given the assumption that the alternative would entail something as terrible as a nuclear holocaust.
If you're in the mood to face our species' mortality head on, you can read the entire paper here on Bostrom's website. It's a work in progress, but it's a fascinating portrayal of our imminent doom. And well worth the horrifying read.
This article was previously published on The Next Web.