Reading: The Missing Link

Computers may be the next evolutionary step. But do we want them to direct our future?

After decades of undergraduate "physics for poets" classes, the gulf between literature and science is as wide as ever. The division of literary intellectuals and scientists into two hostile, mutually uncomprehending camps has been a topic for debate ever since C.P. Snow first called attention to the rift in his 1959 book The Two Cultures. Perhaps nothing more is at work than a mundane principle well-known to physicists and poets alike, but never discussed on Nova specials: science is boring and hard. To master it beyond the level of pastiche you have to memorize thousands of chemical formulas, immerse yourself in abstruse mathematics, take apart a fruit fly cell by cell. It's no accident that ever since the Manhattan Project, Big Science has been a creature of the military, the one institution perfectly suited to joining the most tedious of means to the most elephantine of ends. Literature, on the other hand--reading novels, picking out the Christ figures, limning the homoerotic subtexts--now that's fun.

Yet many see the divide as a world-historical paradigm clash. Snow blamed the literati; their willful ignorance of science was just the resentment of ancien regime aestheticism toward the confident industrial technocracy of tomorrow. But to the literati, the scientist's reductionist mindset can seem unsophisticated, even juvenile. Consider Richard Feynman, the Nobel Prize-winning physicist and author of the best-selling pop-science manifesto Surely You're Joking, Mr. Feynman. Feynman's memoir captivated the public with its portrait of the Nobel laureate as bongo-pounding beatnik--smoking grass, sitting in with a samba band in Rio, telling off the stuffed shirts at the Royal Academy. But beneath the hipster stylings you find a shallow thinker, whose philosophy of life boiled down to the happy aphorism "You don't have to be responsible for the world that you're in" (this from a man who helped build the atom bomb). If Feynman had the brilliance to discern simple laws beneath the muddle of physical reality, he also had the immaturity to oversimplify the social and psychological realms. All his life he retained the 14-year-old's conviction that mastering adulthood is a matter of learning a secret code, of deducing a few hidden, arbitrary mechanisms. Hence his strange preoccupations--with cracking safes, with uncovering the tricks of psychics and other "fakers," and always, always with picking up chicks (his secret: don't buy them drinks until they promise to sleep with you)--all puzzles to be unraveled just as he had deciphered the mysteries of quantum electrodynamics. For Feynman, life was an eternal adolescence, an eternal flight from responsibility, an eternal quest for an equation to illuminate what his lack of intuition and empathy hid from him.

Many contemporary debates--evolutionists versus creationists, techno-utopians versus neo-Luddites--follow the fault lines separating the "two cultures," and Snow's thesis still has its enthusiasts. One such is John Brockman, a self-employed literary agent and columnist for Wired magazine. In his new collection of interviews, The Third Culture, Brockman declares that the war is over and the scientists have won. The reading public, he insists, has tired of the facile verities of art and literature, and hungers for deeper insights into the human condition--insights that only science can provide. He celebrates an emerging breed of scientists who fill the void by writing their very own books, addressing a mass readership without the mediation of the disdainful literary clique. These "third culture thinkers"--researchers in evolutionary biology, cosmology, cognitive science, and artificial intelligence--are the true "public intellectuals" of our day, the creators of "a new set of metaphors to describe ourselves, our minds, the universe, and all the things we know in it." According to Brockman, today's cutting-edge science is relevant, popular, and, most importantly, profitable.

Brockman has a lot riding on his claim. He's a major purveyor of these texts, having made a career out of matching up scientists with publishers eager to cash in on the pop-science boomlet. It's a demographically tempting strategy, aimed at a middlebrow audience of intellectually insecure but financially upscale professionals. But while his schemes have netted Brockman many fat commissions, they have paid off less handsomely for the publishers who bankroll them. According to Eric Konigsberg's recent article in the New Republic, many of Brockman's clients are notorious for handing in unreadable manuscripts that rarely manage to recoup their hefty advances. Despite their intellectual pretensions, pop-science books must bow to the vagaries of the mass market, where celebrity worship goes a lot farther than good writing or potent metaphors. This lesson is dramatically underscored by the genre's monster hit, physicist Stephen Hawking's A Brief History of Time. Like Feynman, Hawking scaled the best-seller lists thanks to an elaborately contrived personality cult, selling a million hardcover copies of a dull catalog of space-time oddities mostly on the strength of his own peculiar anticharisma. Confined to a wheelchair, communing with the primal forces of the universe yet unable to contact the outside world except by computer, Hawking mesmerized the public with his performance as a wizened Internet oracle.

Such are the imponderables of fame in a popular culture that thinks of DNA only as the nemesis that pursued O.J. through the tabloids. Most scientists, of course, never achieve the kind of crossover success that Hawking and Feynman enjoyed. To judge by the tone of The Third Culture, their frustration has curdled into a paranoid vision of themselves as a dissident samizdat. Instead of blaming their marginality on public apathy or their own lack of animal magnetism, they scapegoat another equally marooned sector of the intelligentsia, conjuring up what paleontologist Stephen Jay Gould calls "a conspiracy among literary intellectuals to think they own the intellectual landscape."

We never learn exactly who the conspirators are. Brockman doesn't name names, or go beyond vague imprecations against "traditional intellectuals" who "dismiss science" and won't admit that "a 1950s education in Freud, Marx, and modernism" doesn't get much mileage on the information superhighway. Unlike Snow, who criticized some actual modernists like Yeats, Eliot, and Lawrence, Brockman steers clear of specifics. He can hardly do otherwise. There is no cohesive literary movement nowadays of the sort that Snow castigated; and just as modernism was at least as infatuated with technology as repelled by it, today's elite "intellectual landscape" as a whole--academics, pundits, politicians, and, yes, many novelists--obsesses over science and technology, in tones ranging from the resigned to the messianic. Granted, your average English-lit professor may not have much to say about mass extinctions or plate tectonics. But to find an authentically antiscience voice today one must turn to the Unabomber's pastoralist ramblings, as far from modernism as they are from literature.

Brockman's critique is so tangential to real scientific and literary discourse because it's dragged along by a larger agenda. As rootless info-capitalism triumphs over ideology, it requires a new paradigm to justify its rule, and the "third culture" has one made to order. It's actually a rather old idea--Darwinian natural selection--dressed up with an attractive cybername--"complex adaptive systems." Darwin's great idea--to eliminate the divine central planner from creation--seems fresher than ever in the deregulatory 90s; simply update an old trope like "survival of the fittest" with an information-age vocabulary, and you have Gingrichism in a nutshell. That's why Brockman's "scientific" metaphors must do battle with generic signifiers of the second wave, as if words like "modernism," "Freud," and "Marx" might suppress insurgent third wave words like "complexity," "Darwin," and "Bill Gates."

In its new guise, evolutionary theory has colonized regions of science far beyond its original territory of snail collections and dinosaur bones. The structure of atoms, the workings of the mind, the arrangements of galaxies, the intricacies of human society--all are examples of complex adaptive systems, shaped through random mutation and competition. It works like a downsizing virtual corporation, breaking up the world's most complicated and grandiose aspects and outsourcing them to ephemeral coalitions. The cosmic organizing principle is the invisible hand.

Those seeking a careful exposition of contemporary science should look elsewhere. The book is structured as a series of short summaries of research by individual scientists, followed by even briefer comments and rejoinders by colleagues. Some (notably the psychologist Steven Pinker) present an engaging account of their work in a few pages. Many others bog down in vaporous musings on the splendor of nature, empty logrolling ("He's one of my favorite physicists"), or personal recrimination ("That quote captures his arrogance"). With all the hand waving, the airy conjectures, and the flurries of backslapping and backstabbing, reading this book is like being trapped in a faculty lounge.

The section on standard evolutionary biology offers the only sophisticated discussion of evolution's philosophical underpinnings. Ultra-Darwinists like Richard Dawkins and George C. Williams evoke a Platonic distinction between the ideal and the material, between genetic blueprints that persist through the aeons and their fleeting instantiation in plants and animals, soon to wither into ghostly fossils. Evolution is just the gene's struggle to make more copies of itself by manipulating the material world around it. Theirs is a cannily cyber-friendly view. Dawkins, in his new book River Out of Eden, likes to compare DNA to elegantly engineered computer code: "Life is just bytes and bytes and bytes of digital information" flowing from parents to offspring just as a computer file flits back and forth between Web site, disk drive, and hard copy. Fittingly, the hottest field in evolution nowadays is "artificial life"--computer programs that mimic natural selection and "evolve" new digital life forms. Dawkins describes his epiphany with an early artificial-life program in his 1986 book The Blind Watchmaker: "With a wild surmise, I began to breed, generation after generation, from whichever child looked most like an insect....I still cannot conceal from you my feeling of exultation as I first watched these exquisite creatures emerging before my eyes. I distinctly heard the triumphal opening chords of 'Also Sprach Zarathustra' (the 2001 theme) in my mind. I couldn't eat, and that night 'my' insects swarmed behind my eyelids as I tried to sleep."
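For readers who have never seen such a program, here is a minimal sketch in Python of the mutate-and-select loop Dawkins describes--not his actual biomorph software, and with an arbitrary numeric "target" standing in for the breeder's eye judging which child looks most like an insect; the genome, brood size, and mutation rate are all invented for illustration.

```python
import random

# A toy cumulative-selection loop in the spirit of Dawkins's biomorph
# experiment (not his actual program). Each genome is a list of integers;
# every generation a brood of mutated children is produced, and the child
# closest to an arbitrary target -- standing in for "whichever child looked
# most like an insect" -- becomes the parent of the next generation.

TARGET = [3, -1, 4, 1, -5, 9, 2, -6]   # hypothetical ideal "insect"

def mutate(genome, rate=0.3):
    """Copy the genome, nudging each gene up or down with probability `rate`."""
    return [g + random.choice([-1, 1]) if random.random() < rate else g
            for g in genome]

def resemblance(genome):
    """Negative distance from the target: higher means 'more insect-like'."""
    return -sum(abs(g - t) for g, t in zip(genome, TARGET))

parent = [0] * len(TARGET)
for generation in range(200):
    children = [mutate(parent) for _ in range(12)]   # one brood per generation
    parent = max(children, key=resemblance)          # the breeder keeps the best

print(parent)   # after 200 generations the genome has drifted toward TARGET
```

Nothing in the loop "knows" the goal; each generation merely keeps the best of a handful of random variations, and that blind accumulation is the whole of the exhilarating trick.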

Here is evolution unshackled from the plodding feet of carbon-based creatures, racing ahead on wings of silicon.

As the evolutionary metaphor roams further afield it begins to untether itself from reality. The section on psychology includes Steven Pinker's convincing account of the genetic basis of human language, but also the philosopher Daniel Dennett's bizarre theory of language as a Darwinian competition between hundreds of mental "demons" who propose utterances at random, with the most appropriate sentence winning out. The discussion finally floats into outer space when natural selection is proposed as an explanation for the origin of the universe. You may have thought that there was only one universe to select from, but these theoreticians believe there are really multitudes of universes, which give birth to yet more universes by way of bangs big and small. Lee Smolin argues that one world might spawn another through a black hole, thereby causing the laws of physics to mutate unpredictably; universes that form the most black holes thus leave the most offspring. Alan Guth even speculates that physicists might in principle create a universe "in their own backyard," a notion that moved Wired magazine's Rudy Rucker to ponder the Big Question: "Might this lead to a Silicon Valley industry?"

Sterile conjectures like these hardly add up to a Star Trek episode, let alone a zeitgeist. The Third Culture merely reconfigures Darwin's hoary conceptual apparatus, elevating the law of the jungle to a bloodless abstraction. What's new is the rhetoric. Those harsh opposites, survival and extinction, give way to neutral, holistic conceits like complexity and information theory. Natural selection, once the scourge of the unfit, becomes a harmonizer that transforms chaos into intelligible order. This flaccid terminology appeals to a New Age, antiscientistic sensibility--after all, what could be further from "reductionism" than "complexity"? But it also signals a crucial shift in methodology, an abandonment of analysis in favor of exhaustive description by way of computer simulation, the preferred tool of postmodern science. Here the medium truly is the message--in Brockman's words, "As we create tools, we recreate ourselves in their image." The important thing isn't the ideas, it's the simulations themselves, modeling everything from the primordial soup to the stock market, ever bigger, ever faster, insinuating ever more hypnotically that deep truths lie buried in the entrails of the computer. Biologist Christopher Langton even suggests that "the universe as we know it is an artifact in a computer in a more 'real' universe."

A metaphor must mobilize a social base if it is to launch a cultural revolution. In this sense, the world-as-information-processor version of evolution really does have revolutionary potential, since it expresses the authentic consciousness of a small but potent vanguard of cyber-entrepreneurs, entertainment executives, and media-lab professors. Let's call it the Wired Paradigm. Back in 1959, Snow characterized scientific culture as leftish, proletarian, and pragmatically focused on the welfare of the common man; sadly, the Wired Paradigm has reversed these correlations. Physicist Lee Smolin denounces artists and writers for being "caught in the trap of Nietzsche, playing with death and violence and negativity," but we need only turn a few pages to find a vision of the future far bleaker than any Nietzsche ever imagined. Consider the musings of physicist-turned-stock-market-speculator J. Doyne Farmer, who explains that "humans aren't the endpoint" of evolution: "By the year 2025...we're likely to have computers whose raw processing power exceeds that of the human brain. Also, we're likely to have more computers than people. It's difficult to realistically imagine a world of cyberintelligences and superintelligent humanlike beings. It's like a dog trying to imagine general relativity. But I think such a world is the natural consequence of adaptive complex systems....As for myself, I'm just going along, trying to stay sane, raise my children, and make a living."

In other words, humanity will survive, if it does survive, only as an appendage of machine intelligence. Maybe Smolin views this prospect as a totally positive one. If, as Brockman claims, history shows that "only a small number of people have done the serious thinking for everybody else," he and his friends can be forgiven for identifying with the "superintelligent humanlike beings" who will inherit the earth. The scenario is an intoxicating distillation of the elitism and will to power that pervade the discourse of techno-futurism; it's made all the more compelling because the shape of things to come is supposed to be "unimaginable," a complete break with past human experience, hence beyond the reach of historical explanation or political critique.

Farmer's sunny nihilism, his Strangelovian prophecies, and his knee-jerk disclaimer of responsibility ("I'm just going along...") are all typical of the public intellectuals showcased here. So is the premise that the forced march of microprocessors into every corner of life is somehow natural, spontaneous, and self-organizing, as if there were some deep urge on the part of silicate deposits to dope themselves with gallium and assemble into circuit boards. This stance ignores the institutional provenance of computers, the midwifery of huge military and corporate bureaucracies, the long-articulated desire of managers to replace recalcitrant human beings with fully programmable workers and consumers. The Third Culture instead urges an attitude of befuddled passivity in the face of technological change; it ends up pleading with us not even to try to understand science.

Evolution has long been invoked to buttress the most retrograde essentialisms of race and sex, to prove that every social pathology from date rape to the cold war is written into our genes. As The Third Culture shows, Darwin's legacy also includes a diverging tendency to embrace change for the sake of change, to reject the possibility of permanence and stability in human affairs, to celebrate--as Richard Dawkins puts it--"the evolution of evolvability." But Darwinism's conflicting impulses are linked by their common denial of human agency; neither biological determinism nor nihilism leaves room for people to consciously conform their actions to a larger conception of the world. What makes natural selection such a good account of biological forms--the insight that conscious design plays no role in evolution--is precisely what makes it so inadequate an account of society. Nature doesn't work from a master plan, but people do. We always seek some vantage point--a religion, a politics, an aesthetic--from which to discover a pattern to shape our seemingly haphazard lives. The ambiguous task of human consciousness is to hold a course between order and chaos, one eye drawn to heaven, the other fixed on our stumbling feet. Charting that course--through the most complex of all terrains, human civilization--has been the business of poets, prophets, and revolutionaries throughout history. Someday the scientists may catch up.

The Third Culture by John Brockman, Simon and Schuster, $27.50.

