Tech Billionaires Need to Stop Trying to Make the Science Fiction They Grew Up on Real
Today’s Silicon Valley billionaires grew up reading classic American science fiction. Now they’re trying to make it come true, embodying a dangerous political outlook
This was written by Charles Stross, a sci-fi author whose work I tend to like. I kinda understand why he has reached the conclusion that he has… a *lot* of his work is heavily Lovecraftian, with the universe laden with horrible, horrible things. If you believe that the universe truly is filled with cosmic horrors just waiting at the edges for some fool to go poking at them, then of course you’ll want to prevent people from pushing forward. You will, instead, live by this quote from Lovecraft himself (from “The Call of Cthulhu”):
“The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the light into the peace and safety of a new dark age.”
But the thing is… those horrible things are out there anyway. You can’t hide from them. They’ll come and getcha. If we avoid colonizing space because “imperialism” is wrong, then we’ll simply be stepped on by the next imperialist species over. If you are afraid of the consequences of AI – and there are valid concerns – taking that tech away from Our Guys and leaving it in the hands of the likes of the Chinese Communists *guarantees* that some form of AI is going to come along and take a giant dump on us. If you want to stop “eugenics” because there’s been some bad history and because it could maybe lead to bad things, you’re stranding us in the reality we now face: ongoing dysgenics that is *already* screwing society.
In particular, the argument against space colonization is just vacuous and insane. The benefits are damn near infinite. The risks are comparatively minimal. If over the next millennium we lose a million habitats to a hard learning curve, taking with them a trillion lives… it will be a small price to pay to bring life to a trillion worldlets just within this single solar system.
Yeah, sci-fi provides warnings of potential bad futures. But it also provides innumerable examples of futures we *want* to bring about. Focusing solely on the dystopias of sci-fi is black-pilled doomerism at its worst. For every “1984” or “Brave New World” or “Star Trek Discovery” that shows horrible worlds filled with horrible people living in horrible societies, there are “2001” and “Star Trek” and “Star Trek the Next Generation” and “Stargate SG-1” and “The Orville” and even “The Expanse.” The thing is… “bad” always sells better than “good,” because “bad” tends to have more interesting drama. Imagine any sort of plotline. What’ll be more interesting, or at least easier to write interestingly: the story with some sort of villain or disaster, or the one where there aren’t such antagonists? A movie about, say, an architect designing and building his dream building, whatever it might happen to be, will almost certainly have competitors trying to sabotage it, or bureaucrats grinding it down, or local activists trying to stop it, or earthquakes, storms, floods, fires, asteroid impacts or alien invasions trying to trash it. So the fact that sci-fi – like *every* literary genre – includes Very Bad Things from time to time is no reason to avoid trying to see the best of sci-fi brought to life, any more than heartbreak and rivals in romance stories are reasons to avoid trying to find love.