Stephen Hawking’s final writings predict that a race of superhumans will take over, having surpassed the rest of humanity through genetic engineering.
Hawking makes no apologies in Brief Answers to the Big Questions, which will be published on Oct. 16 and excerpted in the UK’s Sunday Times (paywall).
Hawking issues a dire warning about the importance of regulating AI, noting that “in the future, AI may develop its own will, one that is at odds with ours.” A possible arms race over autonomous weapons should be halted before it begins, he writes, asking what would happen if a crash involving such weapons occurred on a par with the 2010 stock-market Flash Crash. He goes on:
In summary, the development of superintelligent artificial intelligence could be either the best or the worst thing ever to happen to humanity. The real danger posed by AI is not malice but competence. A superintelligent AI will be extremely good at achieving its goals, and if those goals conflict with ours, we’re in serious trouble.
You’re probably not a malicious ant-hater who deliberately steps on ants, but if you’re in charge of a hydroelectric green-energy project and an anthill sits in the area to be flooded, too bad for the ants. Let us not put humanity in the ants’ position.
The grim future of the planet, gene editing, and superhumans
The bad news is that nuclear war or environmental catastrophe will “cripple Earth” at some point in the next 1,000 years. By then, however, “our inventive people will have discovered a method to escape Earth’s surly ties and therefore survive the tragedy.” The planet’s other species are unlikely to survive.
The people who do manage to flee Earth will almost certainly be new “superhumans” who have mastered gene-editing technologies such as CRISPR. They will get there by defying laws against genetic engineering, enhancing their memory, disease resistance, and life expectancy, he claims.
Hawking is unusually enthusiastic about this last point, stating, “There is no time to wait for Darwinian evolution to improve our intelligence and character.”
Once such superhumans exist, major political problems will arise with unimproved humans, who will be unable to compete. Presumably they will die out or become unimportant. In their place will be a race of self-designing beings that improve themselves ever faster. If the human species manages to redesign itself, it will likely spread out and colonize other planets and stars.
Intelligent life in space
Hawking acknowledges that there are various possible explanations for why intelligent life has not been discovered, or has not visited Earth. His predictions here are less audacious, but his preferred explanation is that humans have “missed” other forms of intelligent life.
Is there a God? Hawking says no.
The question is whether God chose how the universe began for reasons we cannot comprehend, or whether it was determined by a law of science. I believe the second. You may call the laws of science “God” if you like, but it would not be a personal God that you could meet and question.
The Earth’s greatest dangers
The first threat is an asteroid impact, like the one that wiped out the dinosaurs, against which, Hawking says, “we have no protection.” Climate change is a more immediate concern. “An increase in ocean temperature would result in the melting of the ice caps and the release of significant amounts of carbon dioxide,” Hawking says. “Both of these impacts have the potential to transform our climate into that of Venus, which has a surface temperature of 250 degrees Celsius.”