
We all know the story. Whether we’ve seen it at the movies, heard it on the radio, or read it in a book, we’ve all been exposed at one point or another to a variation on the same dystopian nightmare: the attack of the machines.

One version plays out far away, in some cold, dark corner of the galaxy, when a space station’s mainframe computer eventually decides that its human cargo has become a nuisance to be gotten rid of. An epic battle ensues, man versus machine, and the humans realize they’ve put themselves at a huge disadvantage. Not only are the computers faster and smarter, they have also taken control of every aspect of their masters’ lives. Eating, breathing, sleeping, even thinking: all were gradually handed over to the all-powerful, ever-present computers for the sake of convenience. Now that the space travelers have become completely dependent on their own creations, however, they’ve also become redundant, unnecessary, obsolete.

This scenario, set in a remote corner of the universe and filmed on a soundstage in Hollywood, may sound like a joke, but Richard Dooling, in his latest nonfiction book Rapture for the Geeks: When AI Outsmarts IQ, isn’t laughing (too hard). Instead, Dooling, a novelist and screenwriter best known for his National Book Award-finalist novel White Man’s Grave, has written an engaging romp that takes a serious yet satirical look at the prospect of a world run and ruled by computers.

This matters more than you might think. Google and NASA announced this past week that they are teaming up with futurist inventor Ray Kurzweil to found Singularity University on NASA’s Silicon Valley campus. The institute takes its name from a concept known as “the Singularity”: the point in time at which computer intelligence surpasses human intelligence. In his controversial 2005 book The Singularity Is Near, Kurzweil argues that by 2040 “our civilization will be billions of times more intelligent . . . [and that] by the end of this century, the nonbiological portion of our intelligence will be trillions of trillions of times more powerful than unaided human intelligence.” Kurzweil’s supreme confidence in the progress of technology has led him to believe that “we will reverse engineer the human brain and upload our consciousness to machines, all before 2040.” As Dooling puts it, Kurzweil “is probably the smartest man alive who truly believes in his own immortality.” And this man was just given millions of dollars by some of the most reputable institutions in the country so that his vision of the future might be advanced.

Still, for most of us, the idea that computers could take over the world is easy to shake. There is, after all, a reason these stories play out in remote corners of the universe and distant futures. It’s hard to imagine that a computer, one that can’t even figure out that an email with the subject line “X@nax 4 Free” is spam, is capable of taking over anything.

Dooling, however, is quick to point out that a technological takeover wouldn’t necessarily involve one big, malevolent machine with a creepy voice and a quick temper. Instead, computers could do the equivalent of ruling the world if humans became completely dependent on them for their own survival:

Eventually, a stage may be reached at which the decisions necessary to keep [society] running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won’t be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

Dooling doesn’t reveal until later that this scary prediction was made by Theodore Kaczynski, who, “in addition to being the barking-mad Unabomber in an aluminum-foil hat, was also a bloodhound when it came to scenting all of the future terrors of technology.” Kaczynski certainly was a madman, but as G.K. Chesterton once pointed out, “the madman is not the man who has lost his reason. The madman is the man who has lost everything except his reason.”

We already have a world swayed by the ebb and flow of the stock market, bound by the limits of our electric grids, and tethered to the nearest network connection. And Kaczynski, in his own twisted but logical way, realized that such a world will be increasingly dependent on, and therefore vulnerable to, the complicated technologies that tie the whole system together. This is the attack of the machines that we don’t need a science-fiction movie to imagine, because, in a sense, it’s already begun.

We catch the first glimpses of the onslaught in our children, who can’t imagine a world without cell phones, microwaves, PCs, and Internet connections. For them, these modern-day marvels are no longer mere conveniences, things they appreciate because they know what life was like without them. Instead, these technologies have become the default, the way life is.

And that’s just cell phones and microwave ovens, things we could, in theory, go without. What about those technologies we have actually come to depend on for our survival? If the sophisticated tractors that can work hundreds of thousands of acres, or the trains and trucks that transport all of our food, were to stop working, we’d realize rather quickly how machines have, quite literally, taken over our lives.

But there is also another version of the attack of the machines narrative, and it’s one that requires a greater stretch of the imagination. It goes something like this: Computer technology has been progressing at an alarming rate ever since someone had a bright idea and strung a few vacuum tubes together. Computations that once took days and a whole room of equipment can now be performed in fractions of a second on a chip smaller than a postage stamp.

In fact, computer technology has progressed so quickly that something known as Moore’s Law (every two years you can get twice as much circuitry running at twice the speed for the same price) has yet to break down, more than forty years after Intel co-founder Gordon Moore first made his famous prediction.
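To get a feel for how quickly that rule compounds, here is a back-of-the-envelope sketch of the arithmetic (mine, not Dooling’s), assuming the two-year doubling period holds exactly over the forty-year span:

```python
# A rough sketch of the doubling arithmetic behind Moore's Law as
# paraphrased above: circuitry doubles every two years at constant price.
# The forty-year span and exact two-year period are illustrative
# assumptions, not figures taken from Dooling's book.

YEARS = 40            # roughly the span since Moore's 1965 prediction
DOUBLING_PERIOD = 2   # years per doubling

doublings = YEARS // DOUBLING_PERIOD   # 40 / 2 = 20 doublings
multiplier = 2 ** doublings            # 2^20

print(f"{doublings} doublings in {YEARS} years")
print(f"Circuitry for the same price grows by a factor of {multiplier:,}")
# Output:
# 20 doublings in 40 years
# Circuitry for the same price grows by a factor of 1,048,576
```

Twenty doublings in forty years works out to roughly a millionfold increase, which is why the curve feels less like steady progress and more like a runaway train.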

This constant advance has led some scientists, filled with an overwhelming faith in technology and progress, to believe that, at some point in time, computer technology will surpass human intelligence. That point is known as the Singularity, and it will completely reverse the human-computer, master-slave relationship. Once computers obtain super-human intelligence, they’ll be able to progress exponentially on their own, without the need for human guidance or control. Basically, computers will take over the world, because they’re just plain smarter than we are.

This is the version of the attack of the machines that might raise eyebrows, but not too many alarms. We all know that computers are getting faster every time we turn around. We don’t need a mathematical model to see that our cell phones, laptops, and televisions date themselves quicker than women’s fashion. And it’s also not surprising that computers can store and recall more information in their circuits than we can in our brains, we who consistently forget to take out the trash or pay the cable bill.

In short, we already know that computers are doing incredible things, performing calculations that a human being couldn’t dream of doing on his own. But comparing human intelligence to computer intelligence is a matter, not of degree, but of kind. Even if computers could interact with us in a way that made us think we were talking to someone, and not something (a milestone known as passing the Turing Test), we would still be talking to a computer, just one with incredibly sophisticated hardware and software.

In other words, consciousness and free will can’t be programmed into a machine, because, just as the word program implies, when you program something, you tell it exactly what to do. There are limits to science and technology, and, however reluctant an overly confident scientific community is to admit it, creating consciousness and free will is one of those limits.

So what are we to make of these scenarios? Do we pack our bags, turn off the lights, head for the hills, and wait for our dependence on technology to finally catch up with us? Or should we all start taking self-defense classes, preparing for the day when machines have finally had enough of us and decide to go on a rampage?

Both scenarios miss the point. Humans have depended on technology for survival ever since we planted our first seeds and picked up our first clubs. It’s a fact of life. Technology can be used to do a lot of good or a lot of evil, but how it is used ultimately depends on human agents.

In the end, Dooling’s Rapture for the Geeks is an engaging and funny look at some of our most eccentric theories about the future. But, more important, Dooling also gives these theories the historical context they deserve. Using examples from Faust to Freud and from Ovid to Oppenheimer, Dooling shows that our modern-day futurists are really part of a long human tradition: “the instinct to always and everywhere overreach.” As Oppenheimer wrote after the first atomic bomb was detonated:

We thought of the legend of Prometheus, of that deep sense of guilt in man’s new powers, that reflects his recognition of evil, and his long knowledge of it. We knew that it was a new world, but even more we knew that novelty itself was a very old thing in human life, that all our ways are rooted in it.

So instead of worrying about the day when computers try to take over the world, we ought to worry about the day when humans, using the latest technology, attempt to do the same. As our lives become more and more integrated with and dependent on technology, and as computers gain the potential to advance both good and evil, that’s the attack of the machines that won’t have to be filmed on a backlot in Hollywood.

It will unfold in our own backyards.

Ryan Sayre Patrico is a junior fellow at First Things.

References

Rapture for the Geeks: When AI Outsmarts IQ by Richard Dooling

The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil

