The renowned British scientist Stephen Hawking died in 2018, but he’s still talking to us. His parting words to humanity have just been released, in the form of recorded excerpts from his last book, Brief Answers to the Big Questions. Here’s what he has to say.
First, Hawking warned, we have to stop ignoring climate change or we will be doomed. “A rise in ocean temperature would melt the ice caps and cause the release of large amounts of carbon dioxide,” he said. “Both effects could make our climate like that of Venus with a temperature of 250°C.”
Hawking was not optimistic that humanity will have the collective will to organize the meaningful solution required to stop the devastation a warming planet would bring. He saw the election of Donald Trump, and the administration’s subsequent actions, as pushing environmental destruction even more rapidly toward the point of no return. He suggested that our only hope for survival, or at least for a few of us to survive, is to colonize a different planet. Elon Musk’s aerospace company SpaceX expects to begin that very process, with the first shipment of human colonists arriving on Mars by 2024. We shall see how that goes.
As for the rest of the species that live on Earth, Hawking said the odds are not in their favor, and we could ultimately follow them to extinction. “We are in danger of destroying ourselves by our greed and stupidity. We cannot remain looking inwards at ourselves on a small and increasingly polluted and overcrowded planet,” he said in an interview with Larry King in 2010. When King talked with him again in 2016, Hawking said, “We certainly have not become less greedy or less stupid. Six years ago, I was worrying about pollution and overcrowding. They have gotten worse since then.”
OK, doomed. So what else did he want us to know about? Killer robots. Hawking believed that artificial intelligence is probably going to turn out badly. Machines with superhuman intelligence will be able to destroy humans with weapons that “we cannot even understand,” he wrote. “It’s tempting to dismiss the notion of highly intelligent machines as mere science fiction, but this would be a mistake, and potentially our worst mistake ever.”
At the Web Summit technology conference in Lisbon, Portugal, in November 2017, Hawking said, “Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy.”
Hawking believed that a mercenary AI force is the inevitable outcome of increasing machine learning capabilities, combined with human fallibility and our inability to create responsible boundaries for our most dangerous creations. In 2015, he was one of thousands of scientists who signed an open letter asking the international community to ban autonomous weapons.