"Brain Chips and Other Dreams of the Cyber-Evangelists"

Chronicle of Higher Education, June 3, 2005

At times, I confess, I yearn for a brain chip. Dissatisfied with the sluggish, aging, three-pound lump of neurons that nature bequeathed me, I fantasize that a surgeon has drilled a hole in my cranium and installed a Neuromorphic Adaptive Quantum Nanoprocessor in my cortex. Its features would include a Wi-Fi Internet linkup and an artificial-pundit program customized to reflect my rhetorical and intellectual idiosyncrasies.

Instead of agonizing over this essay, I'd let my brain chip do the work. I'd mentally specify the essay's topic, target audience, word count, and tone (settings: mildly skeptical to viciously snarky). My brain chip would scour cyberspace for relevant readings, distill the mass of data and opinion into a nifty 2,000-word essay, and beam it to my editor--all in less time than it takes my "real" self to type this period. "I" could finally make a decent living as a freelancer.

Brain chips are only one of many technologies that could allow us to transcend our natural limits, but they appeal to those who consider genetic or pharmaceutical enhancement too subtle and slow. Think of the difference between the films Gattaca, whose genetically souped-up characters resembled supermodels with high IQ's, and The Matrix, in which everyone sported brain jacks. Brain chips could, in principle, allow us to download digitized knowledge of kung fu or helicopter navigation directly into our memory banks, as characters do in The Matrix. We could also control our computers and toaster ovens with our thoughts; communicate with other chip-equipped people, not in our current tedious, one-word-at-a-time fashion but broadband; and exchange virtual fluids with ultratalented "sexbots."

Such sci-fi scenarios are imminent, if we are to believe recent books like Digital People, Citizen Cyborg, I, Cyborg, and Flesh and Machines. The tone of the books varies from sober to silly, but their perspectives overlap enough to form a distinct genre, which we might call cyber-evangelism. The basic theme is that science is on the verge of bringing about an astounding merger of machine and man. I say "man" advisedly: All the authors are men, and their infatuation with technology has a male cast. The major disagreement among the authors concerns how far we will go in embracing what Sidney Perkowitz, a physicist at Emory University and author of Digital People, calls "neurobionics." Some cyber-evangelists believe that we will eventually abandon our flesh-and-blood selves and become entirely artificial--like Hollywood starlets, but even more so.

Not everyone is thrilled by the prospect of cyborgs. Those fuddy-duddies on President Bush's Council on Bioethics have fretted that the capacity to download textbooks directly into the human brain could undermine students' work ethic. (Oh, the horror.) What if someone hacks into your brain chip to read your thoughts, or to control you, as in the recent remake of The Manchurian Candidate? And won't neurobionics deepen the gap between haves and have-nots?

The bioethicist James Hughes, of Trinity College, nonetheless contends that the benefits of neurobionics far outweigh the risks. We could minimize potential problems, he argues in Citizen Cyborg, by establishing a benign, global government that made brain chips available to everyone and regulated their use. To ensure that cyborgs behaved, for example, the government would test them for moral decency; those who failed would have "morality chips" installed.

Hughes is executive director of the World Transhumanist Association, whose members favor transcendence of our biological limits. Transhumanists enjoy debating issues like cryonic preservation: After you die, should you have your whole body frozen for revival after science has solved the problem of death--or will your head alone suffice?

Hughes also proposes equipping dolphins and monkeys with brain chips so that we can communicate with them. You would think someone who entertains such notions would be a fun guy, and perhaps Hughes is in person. But Citizen Cyborg has the deadly earnestness of an Al Gore white paper on toxic waste. Hughes wants us to take this cyborg stuff very, very seriously.

Those who find Hughes too dry may prefer the flamboyant--albeit relentlessly self-aggrandizing--authorial persona of Kevin Warwick, a professor of cybernetics at the University of Reading who has transformed himself into a kind of neurobionic performance artist. In I, Cyborg, a masterpiece of naïve, unwittingly comic narration, Warwick recounts how in 2002 he persuaded a surgeon to implant a chip in his forearm and another chip in the forearm of his hapless wife, Irena. As the surgeon pushed the chip into Irena's incision, "she remained brave," Warwick recalls, "shrieking on a couple of occasions when it was particularly painful."

After the implantations, when Warwick made a fist, his chip picked up the minute electrical surge in his arm and sent a signal to his wife's chip, which buzzed her. She then flexed her hand, and he felt "a beautiful, sweet, deliciously sexy charge."

Of course the Warwicks could have achieved an equivalent intimacy with vibrating cellphones; the fact that the chips were embedded in their bodies made no functional difference. Warwick nonetheless calls his stunt "the most incredible scientific project imaginable, one that is sure to change, incalculably, humankind and the future." We must begin asking ourselves, Warwick says, how to "deal with the possibility of superhumans." The real question, another British scientist remarked in Discover magazine, is whether Warwick is a buffoon, who actually believes his own hype, or merely a charlatan.

Unlike Warwick, Ray Kurzweil is an accomplished authority in the fields of computer science and artificial intelligence; his many inventions include the first computer-based reading machine for the blind. But his worldview is if anything even wackier than Warwick's. In his manifesto The Age of Spiritual Machines, Kurzweil predicts that within a couple of decades, computers will become fully conscious and autonomous, and will begin rapidly evolving in unpredictable directions. Borrowing a term that refers to black holes and other phenomena that strain physics theories to the breaking point, Kurzweil calls that event "the singularity."

Rather than passively allowing machines to leave us in the cognitive dust, we will have the option of digitizing our personalities and "uploading" them into computers, where we can live forever as software programs. That vision has been spelled out previously by others--notably the roboticist Hans Moravec, in Mind Children: The Future of Robot and Human Intelligence (Harvard University Press, 1988) and Robot: Mere Machine to Transcendent Mind (Oxford University Press, 1999)--but Kurzweil's faith is especially fervent.

In his recent Fantastic Voyage: Live Long Enough to Live Forever, written with a physician, Kurzweil advises us how to stay alive until uploading becomes possible. His regimen calls for exercising and meditating; eating organic vegetables and meat; drinking alkaline water (to keep the blood from being acidic); taking nutritional supplements (Kurzweil swallows 250 pills a day); and, of course, having injections of pineal cells culled from hydroponically grown fetuses (just kidding).

Halfway into Flesh and Machines, by Rodney A. Brooks, director of the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology, I thought common sense might finally prevail. Brooks points out that--notwithstanding the precipitous increase in computer power over the past few decades--AI has in many respects been a failure. No existing computer remotely resembles HAL, the smooth-talking, homicidal machine that was by far the most interesting, complex character in the film 2001. Computers cannot even recognize faces in natural settings, which we can do effortlessly. Our fastest, most sophisticated machines still lack some mysterious, fundamental quality--which Brooks calls "juice"--that biological systems possess.

So where does Brooks's candid acknowledgement of his field's shortcomings leave him, vis-à-vis uploading and other neurobionic scenarios? He thinks uploading will eventually be possible, just not soon enough for Kurzweil or anyone else alive today. "I think we are all going to die eventually," Brooks boldly ventures.

But like Kevin Warwick, he believes that within a decade or two we will transform ourselves with brain-machine interfaces. "We will be superhuman in many respects. And through our thought-mediated connections to cyberspace, we will have access to physical control of our universe, just with our thoughts."

What neither Brooks nor any other cyber-evangelist considers in any depth is the assumption implicit in all their scenarios: that the brain is a digital computer. According to that view, the minute "action potentials" emitted by individual nerve cells are analogous to the electrical pulses that represent information in computers, and just as computers operate according to a machine code, so action potentials are arranged according to a "neural code." Given the right interface and knowledge of the neural code, brains and computers should be able to communicate as easily as iMacs and PC's.

If a neural code exists, however, neuroscientists have no idea what it is. They cannot explain how the brain achieves even rudimentary feats of cognition, like my ability to recall Neo's final battle with Agent Smith in The Matrix. Such cognition may depend not only on action potentials but also on other processes at larger or smaller scales. No one knows.

Moreover, my brain almost certainly represents Neo with a pattern of activity quite unlike yours. Not only is each person's code probably idiosyncratic, the product of his or her unique biology, but our individual codes may also constantly evolve in response to new experiences. For all those reasons, some neuroscientists suspect that uploading, downloading, telepathic conversations, and other scenarios that involve precise reading and manipulation of thoughts may never be possible--no matter how far brain-chip technology advances.

That view is corroborated by the slow progress of research on so-called neural prostheses, which replace or supplement capacities lost because of damage to the nervous system. Artificial retinas, light-sensitive chips that mimic the eye's signal-processing ability and stimulate the optic nerve or visual cortex, have been tested in a handful of blind subjects, but most have been able to see nothing more than phosphenes, or bright spots. A few paralyzed patients have learned to control a computer cursor "merely by thinking," as the media invariably put it--the control is not telekinetic but comes via implanted electrodes that pick up the patients' neural signals--but communicating that way remains slow and unreliable.

The only truly successful neural prosthesis is the artificial cochlea. More than 50,000 hearing-impaired people have been equipped with those devices, which restore hearing by feeding signals from an external microphone to the auditory nerve. But as Michael Chorost makes clear in his memoir, Rebuilt: How Becoming Part Computer Made Me More Human, artificial cochleas are far from perfect. Hard of hearing since childhood, Chorost was getting by with conventional hearing aids when he suddenly went totally deaf in 2001. In Rebuilt--which is by far the most original, honest, and authoritative book I've read on human-machine interfaces--he recounts how he was fitted with an artificial cochlea and learned to live with it.

Although Chorost was grateful for the device, which restored some semblance of normality to his social life, he notes that it is a crude simulacrum of our innate auditory system. Artificial cochleas generally require a breaking-in period, during which technicians tweak the device's settings to optimize its performance. With that assistance, the brain learns how to make the most of the peculiar, artificial signals. Even then, the sound quality is often poor, especially in noisy settings. Chorost still occasionally relies on lip reading and contextual guessing to decipher what someone is saying to him. Some people are never able to use artificial cochleas, for reasons that are not clearly understood.

Chorost's experience leaves him both impressed with the ingenuity of scientists and cognizant of how little they really know about how the brain works. He thus deplores the "overweening techno-optimism" underlying the predictions of Warwick and others that neurobionics will eventually give us supernatural capacities. "We are a long way," he writes, "from understanding our own brains well enough to implant devices in them to enhance our mental functioning." Chorost suspects that the prophecies of Warwick et al. have less to do with science than with the perennial human desire to transcend the loneliness and pain of the human condition.

Indeed, now and for the foreseeable future, cyber-evangelism is best understood as an escapist, quasi-religious fantasy, which reflects an oddly dated, Jetsons-esque faith in scientific progress and its potential to cure all that ails us. Even those cyber-evangelical books published well after September 11, 2001, and the end of the dot-com boom echo the hysterical techno-optimism of the late 1990s. At their best, they raise some diverting questions: Would you rather live in a pleasant virtual world, or in an unpleasant real one? Would cyber-sex satisfy you? Would we still be recognizably human if we were immortal, or had IQ's over 1,000, or were immune to pain?

But I felt my cognitive-dissonance alarm clanging whenever I reminded myself of the issues that preoccupy most mature adults these days: terrorism, overpopulation, poverty, environmental degradation, AIDS and other diseases, and all the pitfalls of ordinary life.

I try to forget this vale of tears myself now and then by reading books like William Gibson's Neuromancer or watching movies like The Matrix. But I also try not to confuse science fiction with science.
