
THE HUGE CULTURAL authority science has acquired over the past century imposes large duties on every scientist. Scientists have acquired the power to impress and intimidate every time they open their mouths, and it is their responsibility to keep this power in mind no matter what they say or do. Too many have forgotten their obligation to approach with due respect the scholarly, artistic, religious, humanistic work that has always been mankind's main spiritual support. Scientists are (on average) no more likely to understand this work than the man in the street is to understand quantum physics. But science used to know enough to approach cautiously and admire from outside, and to build its own work on a deep belief in human dignity. No longer.

DAVID GELERNTER is a professor of computer science at Yale. His book Subjectivism: The Mind from Inside will be published by Norton later this year.

BELITTLING HUMANITY. Today science and the "philosophy of mind"—its thoughtful assistant, which is sometimes smarter than the boss—are threatening Western culture with the exact opposite of humanism. Call it roboticism. Man is the measure of all things, Protagoras said. Today we add, and computers are the measure of all men.

Many scientists are proud of having booted man off his throne at the center of the universe and reduced him to just one more creature—an especially annoying one—in the great intergalactic zoo. That is their right. But when scientists use this locker-room braggadocio to belittle the human viewpoint, to belittle human life and values and virtues and civilization and moral, spiritual, and religious discoveries, which is all we human beings possess or ever will, they have outrun their own empiricism. They are abusing their cultural standing. Science has become an international bully.

Nowhere is its bullying more outrageous than in its assault on the phenomenon known as subjectivity. Your subjective, conscious experience is just as real as the tree outside your window or the photons striking your retina—even though you alone feel it. Many philosophers and scientists today tend to dismiss the subjective and focus wholly on an objective, third-person reality—a reality that would be just the same if men had no minds. They treat subjective reality as a footnote, or they ignore it, or they announce that, actually, it doesn't even exist.

If scientists were rat-catchers, it wouldn't matter. But right now, their views are threatening all sorts of intellectual and spiritual fields. The present problem originated at the intersection of artificial intelligence and philosophy of mind—in the question of what consciousness and mental states are all about, how they work, and what it would mean for a robot to have them. It has roots that stretch back to the behaviorism of the early 20th century, but the advent of computing lit the fuse of an intellectual crisis that blasted off in the 1960s and has been gaining altitude ever since.

BULLYING NAGEL. The modern "mind fields" encompass artificial intelligence, cognitive psychology, and philosophy of mind. Researchers in these fields are profoundly split, and the chaos was on display in the ugliness occasioned by the publication of Thomas Nagel's Mind & Cosmos in 2012. Nagel is an eminent philosopher and professor at NYU. In Mind & Cosmos, he shows with terse, meticulous thoroughness why mainstream thought on the workings of the mind is intellectually bankrupt. He explains why Darwinian evolution is insufficient to explain the emergence of consciousness—the capacity to feel or experience the world. He then offers his own ideas on consciousness, which are speculative, incomplete, tentative, and provocative—in the tradition of science and philosophy.

Nagel was immediately set on and (symbolically) beaten to death by all the leading punks, bullies, and hangers-on of the philosophical underworld. Attacking Darwin is the sin against the Holy Ghost that pious scientists are taught never to forgive. Even worse, Nagel is an atheist unwilling to express sufficient hatred of religion to satisfy other atheists. There is nothing religious about Nagel's speculations; he believes that science has not come far enough to explain consciousness and that it must press on. He believes that Darwin is not sufficient.

The intelligentsia was so furious that it formed a lynch mob. In May 2013, the Chronicle of Higher Education ran a piece called "Where Thomas Nagel Went Wrong." One paragraph was notable:

Whatever the validity of [Nagel's] stance, its timing was certainly bad. The war between New Atheists and believers has become savage, with Richard Dawkins writing sentences like, "I have described atonement, the central doctrine of Christianity, as vicious, sadomasochistic, and repellent. We should also dismiss it as barking mad...." In that climate, saying anything nice at all about religion is a tactical error.

It's the cowardice of the Chronicle's statement that is alarming—as if the only conceivable response to a mass attack by killer hyenas were to run away. Nagel was assailed; almost everyone else ran.

THE KURZWEIL CULT. The voice most strongly associated with what I've termed roboticism is that of Ray Kurzweil, a leading technologist and inventor. The Kurzweil Cult teaches that, given the strong and ever-increasing pace of technological progress and change, a fateful crossover point is approaching. He calls this point the "singularity." After the year 2045 (mark your calendars!), machine intelligence will dominate human intelligence to the extent that men will no longer understand machines any more than potato chips understand mathematical topology. Men will already have begun an orgy of machinification—implanting chips in their bodies and brains, and fine-tuning their own and their children's genetic material. Kurzweil believes in "transhumanism," the merging of men and machines. He believes human immortality is just around the corner. He works for Google.

Whether he knows it or not, Kurzweil believes in and longs for the death of mankind. Because if things work out as he predicts, there will still be life on Earth, but no human life. To predict that a man who lives forever and is built mainly of semiconductors is still a man is like predicting that a man with stainless steel skin, a small nuclear reactor for a stomach, and an IQ of 10,000 would still be a man. In fact we have no idea what he would be.

Each change in him might be defended as an improvement, but man as we know him is the top growth on a tall tree in a large forest: His kinship with his parents and ancestors and mankind at large, the experience of seeing his own reflection in human history and his fellow man—those things are the crucial part of who he is. If you make him grossly different, he is lost, with no reflection anywhere he looks. If you make lots of people grossly different, they are all lost together—cut adrift from their forebears, from human history and human experience. Of course we do know that whatever these creatures are, untransformed men will be unable to keep up with them. Their superhuman intelligence and strength will extinguish mankind as we know it, or reduce men to slaves or dogs. To wish for such a development is to play dice with the universe.

Luckily for mankind, there is (of course) no reason to believe that brilliant progress in any field will continue, much less accelerate; imagine predicting the state of space exploration today based on the events of 1960-1972. But the real flaw in the Kurzweil Cult's sickening predictions is that machines do just what we tell them to. They act as they are built to act. We might in principle, in the future, build an armor-plated robot with a stratospheric IQ that refuses on principle to pay attention to human beings. Or an average dog lover might buy a German shepherd and patiently train it to rip him to shreds. Both deeds are conceivable, but in each case, sane persons are apt to intervene before the plan reaches completion.

SUBJECTIVITY. Subjectivity is your private experience of the world: your sensations; your mental life and inner landscape; your experiences of sweet and bitter, blue and gold, soft and hard; your beliefs, plans, pains, hopes, fears, theories, imagined vacation trips and gardens and girlfriends and Ferraris, your sense of right and wrong, good and evil. This is your subjective world. It is just as real as the objective physical world.

This is why the idea of objective reality is a masterpiece of Western thought—an idea we associate with Galileo and Descartes and other scientific revolutionaries of the 17th century. The only view of the world we can ever have is subjective, from inside our own heads. That we can agree nonetheless on the observable, exactly measurable, and predictable characteristics of objective reality is a remarkable fact. I can't know that the color I call blue looks to me the same way it looks to you. And yet we both use the word blue to describe this color, and common sense suggests that your experience of blue is probably a lot like mine. Our ability to transcend the subjective and accept the existence of objective reality is the cornerstone of everything modern science has accomplished.

But that is not enough for the philosophers of mind. Many wish to banish subjectivity altogether. "The history of philosophy of mind over the past one hundred years," the eminent philosopher John Searle has written, "has been in large part an attempt to get rid of the mental"—i.e., the subjective—"by showing that no mental phenomena exist over and above physical phenomena."

Why bother? Because to present-day philosophers, Searle writes, "the subjectivist ontology of the mental seems intolerable." That is, your states of mind (your desire for adventure, your fear of icebergs, the ship you imagine, the girl you recall) exist only subjectively, within your mind, and they can be examined and evaluated by you alone. They do not exist objectively. They are strictly internal to your own mind. And yet they do exist. This is intolerable! How in this modern, scientific world can we be forced to accept the existence of things that can't be weighed or measured, tracked or photographed—that are strictly private, that can be observed by exactly one person each? Ridiculous! Or at least, damned annoying.

And yet your mind is, was, and will always be a room with a view. Your mental states exist inside this room you can never leave and no one else can ever enter. The world you perceive through the window of mind (where you can never go—where no one can ever go) is the objective world. Both worlds, inside and outside, are real.

The ever astonishing Rainer Maria Rilke captured this truth vividly in the opening lines of his eighth Duino Elegy, as translated by Stephen Mitchell: "With all its eyes the natural world looks out/into the Open. Only our eyes are turned backward.... We know what is really out there only from/the animal's gaze." We can never forget or disregard the room we are locked into forever.

brain that "embodies" it. Yet the brain's structure is different from the mind's. The brain is a dense tangle of neurons and other cells in which neurons send electrical sig- nals to other neurons downstream via a wash of neurotransmitter chemicals, like beach bums splash- ing each other with bucketfuls of water.

Two wholly different struc- tures, one embodied by the other— this is also a precise description of computer software as it relates to computer hardware. Software has its own structure and laws (soft- ware being what any "program" or "application" is made of—any email program, web search engine, photo album, iPhone app, video game, anything at all). Software consists of lists of instructions that are given to the hardware—to a digital computer. Each instruction specifies one picayune operation on the numbers stored inside the computer. For example: Add two numbers. Move a number from one place to another. Look at some number and do this if it's 0.

Large lists of tiny instruc- tions become complex mathemati- cal operations, and large bunches of those become even more so- phisticated operations. And pretty

Your mind is, was, and will always

be a room wit h a

view.

Your mental states exist inside this room

you can never leave and no one else can ever enter. The world you perceive through the

window of mind

(wher e you

ca n

never go—where no one can ever go) is the objective world. Both worlds, inside and outside, are real.

soon your application is sending spacemen hurtling across your screen firing lasers at your avatar as you pelt the aliens with tennis balls and chat with your friends in Idaho or Algiers while sending notes to your girlfriend and keeping an eye on the comic-book news. You are swimming happily within the rich coral reef of your software "environment," and tfce tiny in- structions out of which the whole thing is built don't matter to you at all. You don't know them, can't see them, are wholly unaware of them.

The gorgeously varied reefs called software are a topic of their own—just as the mind is. Software and computers are two different topics, just as the psy- chological or phenomenal study of mind is different from brain physiology^ Even so, software cannot-ex- ist without digital computers, just as minds cannot exist without brains.

THE BRAIN AS COMPUTER. The dominant, main- stream view of mind nowadays among philosophers and many scientists is computationalism, also known as cognitivism. This view is inspired by the idea that minds are to brains as software is to computers. "Think of the brain," writes Daniel Dennett of Tufts University in his influential 1991 Consciousness Explained, "as a computer." In some ways this is an apt analogy. In oth- ers, it is crazy. At any rate, it is one of the intellectual milestones of modern times.

How did this "master analogy" become so influ- ential? Consider the mind. The mind has its own struc- ture and laws: It has desires, emotions, imagination; it is conscious. But no mind can exist apart from the

That is why today's mainstream view of mind is based on exactly this analogy: Mind is to brain as software is to computer. The mind is the brain's software—this is the core idea of computationalism. Of course computationalists don't all think alike. But they all believe in some version of this guiding analogy.

Drew McDermott, my colleague in the computer science department at Yale University, is one of the most brilliant (and in some ways, the most heterodox) of computationalists. "The biological variety of computers differs in many ways from the kinds of computers engineers build," he writes, "but the differences are superficial." Note here that by biological computer, McDermott means brain.

McDermott believes that "computers can have minds"—minds built out of software, if the software is correctly conceived. In fact, McDermott writes, "as far as science is concerned, people are just a strange kind of animal that arrived fairly late on the scene.... [My] purpose ... is to increase the plausibility of the hypothesis that we are machines and to elaborate some of its consequences."

John Heil of Washington University describes cognitivism this way: "Think about states of mind as something like strings of symbols, sentences." In other words: a state of mind is like a list of numbers in a computer. And so, he writes, "mental operations are taken to be 'computations over symbols.'" Thus, in the cognitivist view, when you decide, plan, or believe, you are computing, in the sense that software computes.

BESMIRCHING CONSCIOUSNESS. But what about consciousness? If the brain is merely a mechanism for thinking or problem-solving, how does it create consciousness? Most computationalists default to the Origins of Gravy theory set forth by Walter Matthau in the film of Neil Simon's The Odd Couple. Challenged to account for the emergence of gravy, Matthau explains that, when you cook a roast, "it comes." That is basically how consciousness arises too, according to computationalists. It just comes.

In Consciousness Explained, Dennett lays out the essence of consciousness as follows: "The concepts of computer science provide the crutches of imagination we need to stumble across the terra incognita between our phenomenology as we know it by 'introspection' and our brains as science reveals them to us." (Note the chuckle-quotes around introspection; for Dennett, introspection is an illusion.) Specifically: "Human consciousness can best be understood as the operation of a 'von Neumannesque' virtual machine." Meaning, it is a software application (a virtual machine) designed to run on any ordinary computer. (Hence von Neumannesque: the great mathematician John von Neumann is associated with the invention of the digital computer as we know it.)

Thus consciousness is the result of running the right sort of program on an organic computer also called the human brain. If you were able to download the right app on your phone or laptop, it would be conscious, too. It wouldn't merely talk or behave as if it were conscious. It would be conscious, with the same sort of rich mental landscape inside its head (or its processor or maybe hard drive) as you have inside yours: the anxious plans, the fragile fragrant memories, the ability to imagine a baseball game or the crunch of dry leaves underfoot. All that just by virtue of running the right program. That program would be complex and sophisticated, far more clever than anything we have today. But no different fundamentally, say the computationalists, from the latest video game.

THE FLAWS. But the master analogy—between mind and software, brain and computer—is fatally flawed. It falls apart once you mull these simple facts:

1. You can transfer a program easily from one computer to another, but you can't transfer a mind, ever, from one brain to another.

2. You can run an endless series of different programs on any one computer, but only one "program" runs, or ever can run, on any one human brain.

3. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me.

4. Computers can be erased; minds cannot.

5. Computers can be made to operate precisely as we choose; minds cannot.

There are more. Come up with them yourself. It's easy.
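Points 1, 3, and 4 can be made concrete with a trivial sketch of my own (the "program state" below is invented for illustration): a program's entire state can be copied elsewhere, printed out exactly, or wiped, in a few lines.

```python
# A trivial, hypothetical illustration of points 1, 3, and 4 above:
# program state can be copied wholesale, read off exactly, and erased.
import copy
import json

state = {"score": 42, "level": 3, "inventory": ["laser", "tennis ball"]}

transferred = copy.deepcopy(state)     # point 1: the "program" moves elsewhere intact
print(json.dumps(state, indent=2))     # point 3: its precise state can be read off at any time
state.clear()                          # point 4: and it can be erased completely
print(state, transferred)              # {} alongside the untouched copy
```

No analogous few lines exist for a mind.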

There is a still deeper problem with computationalism. Mainstream computationalists treat the mind as if its purpose were merely to act and not to be. But the mind is for doing and being. Computers are machines, and idle machines are wasted. That is not true of your mind. Your mind might be wholly quiet, doing ("computing") nothing; yet you might be feeling miserable or exalted, or awestruck by the beauty of the object in front of you, or inspired or resolute—and such moments might be the center of your mental life. Or you might merely be conscious. "I cannot see what flowers are at my feet,/Nor what soft incense hangs upon the boughs.... Darkling I listen...." That was drafted by the computer known as John Keats.

Emotions in particular are not actions but states of being. And emotions are central to your mental life and can shape your behavior by allowing you to compare alternatives to determine which feels best. Jane Austen, Persuasion: "He walked to the window to recollect himself, and feel how he ought to behave." Henry James, The Ambassadors: The heroine tells the hero, "no one feels so much as you. No—not any one." She means that no one is so precise, penetrating, and sympathetic an observer.

Computationalists cannot account for emotion. It fits as badly as consciousness into the mind-as-software scheme.

THE BODY AND THE MIND. And there is (at least) one more area of special vulnerability in the computationalist worldview. Computationalists believe that the mind is embodied by the brain, and the brain is simply an organic computer. But in fact, the mind is embodied not by the brain but by the brain and the body, intimately interleaved. Emotions are mental states one feels physically; thus they are states of mind and body simultaneously. (Angry, happy, awestruck, relieved—these are physical as well as mental states.) Sensations are simultaneously mental and physical phenomena. Wordsworth writes about his memories of the River Wye: "I have owed to them/In hours of weariness, sensations sweet,/Felt in the blood, and felt along the heart/And passing even into my purer mind ..."

Where does the physical end and the mental begin? The resonance between mental and bodily states is a subtle but important aspect of mind. Bodily sensations bring about mental states that cause those sensations to change and, in turn, the mental states to develop further. You are embarrassed, and blush; feeling yourself blush, your embarrassment increases. Your blush deepens. "A smile of pleasure lit his face. Conscious of that smile, [he] shook his head disapprovingly at his own state." (Tolstoy.) As Dmitry Merezhkovsky writes brilliantly in his classic Tolstoy study, "Certain feelings impel us to corresponding movements, and, on the other hand, certain habitual movements impel to the corresponding mental states.... Tolstoy, with inimitable art, uses this convertible connection between the internal and the external."

All such mental phenomena depend on something like a brain and something like a body, or an accurate reproduction or simulation of certain aspects of the body. However hard or easy you rate the problem of building such a reproduction, computing has no wisdom to offer regarding the construction of human-like bodies—even supposing that it knows something about human-like minds.

I cite Keats or Rilke, Wordsworth, Tolstoy, Jane Austen because these "subjective humanists" can tell us, far more accurately than any scientist, what things are like inside the sealed room of the mind. When subjective humanism is recognized (under some name or other) as a school of thought in its own right, one of its characteristics will be looking to great authors for information about what the inside of the mind is like.

To say the same thing differently: Computers are information machines. They transform one batch of information into another. Computationalists often describe the mind as an "information processor." But feelings are not information! Feelings are states of being. A feeling (mild wistfulness, say, on a warm summer morning) has, ordinarily, no information content at all. Wistful is simply a way to be.

Let's be more precise: We are conscious, and consciousness has two aspects. To be conscious of a thing is to be aware of it (know about it, have information about it) and to experience it. Taste sweetness; see turquoise; hear an unresolved dissonance—each feels a certain way. To experience is to be some way, not to do some thing.

The whole subjective field of emotions, feelings, and consciousness fits poorly with the ideology of computationalism, and with the project of increasing "the plausibility of the hypothesis that we are machines."

Thomas Nagel: "All these theories seem insufficient as analyses of the mental because they leave out something essential." (My italics.) Namely? "The first-person, inner point of view of the conscious subject: for example, the way sugar tastes to you or the way red looks or anger feels." All other mental states (not just sensations) are left out, too: beliefs and desires, pleasures and pains, whims, suspicions, longings, vague anxieties; the mental sights, sounds, and emotions that accompany your reading a novel or listening to music or daydreaming.

FUNCTIONALISM. How could such important things be left out? Because functionalism is today's dominant view among theorists of the mind, and functionalism leaves them out. It leaves these dirty boots on science's back porch. Functionalism asks, "What does it mean to be, for example, thirsty?" The answer: Certain events (heat, hard work, not drinking) cause the state of mind called thirst. This state of mind, together with others, makes you want to do certain things (like take a drink). Now you understand what "I am thirsty" means. The mental (the state of thirst) has not been written out of the script, but it has been reduced to the merely physical and observable: to the weather, and what you've been doing, and what actions (take a drink) you plan to do.

But this explanation is no good, because "thirst" means, above all, that you feel thirsty. It is a way of being. You have a particular sensation. (That feeling, in turn, explains such expressions as "I am thirsty for knowledge," although this "thirst" has nothing to do with the heat outside.)
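To see how little the functionalist account asks for, consider a deliberately crude sketch of my own (the class, the thresholds, and the behavior are invented, not drawn from any philosopher's text): the functionalist is satisfied once "thirst" has been specified by its typical causes and typical effects. Nothing in the sketch says, or could say, what thirst feels like.

```python
# A deliberately crude, hypothetical rendering of the functionalist picture:
# "thirst" defined entirely by what tends to cause it and what it disposes
# the agent to do. The numbers and names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Agent:
    hours_in_heat: float = 0.0
    hours_since_drink: float = 0.0
    actions: list = field(default_factory=list)

    @property
    def thirsty(self) -> bool:
        # Typical causes: heat, hard work, not drinking.
        return self.hours_in_heat > 2 or self.hours_since_drink > 6

    def act(self) -> None:
        # Typical effect: the state disposes the agent to take a drink.
        if self.thirsty:
            self.actions.append("take a drink")
            self.hours_since_drink = 0.0

agent = Agent(hours_in_heat=3.0)
agent.act()
print(agent.actions)   # ['take a drink'] -- and nothing anywhere feels thirsty
```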

And yet you can see the seductive quality of functionalism, and why it grew in prominence along with computers. No one knows how a computer can be made to feel anything, or whether such a thing is even possible. But once feeling and consciousness are eliminated, creating a computer mind becomes much easier. Nagel calls this view "a heroic triumph of ideological theory over common sense."

Some thinkers insist otherwise. Experiencing sweetness or the fragrance of lavender or the burn of anger is merely a biochemical matter, they say. Certain neurons fire, certain neurotransmitters squirt forth into the interneuron gaps, other neurons fire and the problem is solved: There is your anger, lavender, sweetness.

There are two versions of this idea: Maybe brain activity causes the sensation of anger or sweetness or a belief or desire; maybe, on the other hand, it just is the sensation of anger or sweetness—sweetness is certain brain events in the sense that water is H2O.

But how do those brain events bring about, or translate into, subjective mental states? How is this amazing trick done? What does it even mean, precisely, to cross from the physical to the mental realm?

THE ZOMBIE ARGUMENT. Understanding subjective mental states ultimately comes down to understanding consciousness. And consciousness is even trickier than it seems at first, because there is a serious, thought-provoking argument that purports to show us that consciousness is not just mysterious but superfluous. It's called the Zombie Argument. It's a thought experiment that goes like this:

Imagine your best friend. You've known him for years, have had a million discussions, arguments, and deep conversations with him; you know his opinions, preferences, habits, and characteristic moods. Is it possible to suppose (just suppose) that he is in fact a zombie? By zombie, philosophers mean a creature who looks and behaves just like a human being, but happens to be unconscious. He does everything an ordinary person does: walks and talks, eats and sleeps, argues, shouts, drives his car, lies on the beach. But there's no one home: He (meaning it) is actually a robot with a computer for a brain. On the outside he looks like any human being: This robot's behavior and appearance are wonderfully sophisticated.

No evidence makes you doubt that your best friend is human, but suppose you did ask him: Are you human? Are you conscious? The robot could be programmed to answer no. But it's designed to seem human, so more likely its software makes an answer such as, "Of course I'm human, of course I'm conscious!—talk about stupid questions. Are you conscious? Are you human, and not half-monkey? Jerk."

So that's a robot zombie. Now imagine a "human" zombie, an organic zombie, a freak of nature: It behaves just like you, just like the robot zombie; it's made of flesh and blood, but it's unconscious. Can you imagine such a creature? Its brain would in fact be just like a computer: a complex control system that makes this creature speak and act exactly like a man. But it feels nothing and is conscious of nothing.

Many philosophers (on both sides of the argument about software minds) can indeed imagine such a creature. Which leads them to the next question: What is consciousness for? What does it accomplish? Put a real human and the organic zombie side by side. Ask them any questions you like. Follow them over the course of a day or a year. Nothing reveals which one is conscious. (They both claim to be.) Both seem like ordinary humans.

So why should we humans be equipped with consciousness? Darwinian theory explains that nature selects the best creatures on wholly practical grounds, based on survivable design and behavior. If zombies and humans behave the same way all the time, one group would be just as able to survive as the other. So why would nature have taken the trouble to invent an elaborate thing like consciousness, when it could have got off without it just as well?

Such questions have led the Australian philosopher of mind David Chalmers to argue that consciousness doesn't "follow logically" from the design of the universe as we know it scientifically. Nothing stops us from imagining a universe exactly like ours in every respect except that consciousness does not exist.

Nagel believes that "our mental lives, including our subjective experiences" are "strongly connected with and probably strictly dependent on physical events in our brains." But—and this is the key to understanding why his book posed such a danger to the conventional wisdom in his field—Nagel also believes that explaining subjectivity and our conscious mental lives will take nothing less than a new scientific revolution. Ultimately, "conscious subjects and their mental lives" are "not describable by the physical sciences." He awaits "major scientific advances," "the creation of new concepts" before we can understand how consciousness works. Physics and biology as we understand them today don't seem to have the answers.

On consciousness and subjectivity, science still has elementary work to do. That work will be done correctly only if researchers understand what subjectivity is, and why it shares the cosmos with objective reality.

Of course the deep and difficult problem of why consciousness exists doesn't hold for Jews and Christians. Just as God anchors morality, God's is the viewpoint that knows you are conscious. Knows and cares: Good and evil, sanctity and sin, right and wrong presuppose consciousness. When free will is understood, at last, as an aspect of emotion and not behavior—we are free just insofar as we feel free—it will also be seen to depend on consciousness.

THE IRON ROD. In her book Absence of Mind, the novelist and essayist Marilynne Robinson writes that the basic assumption in every variant of "modern thought" is that "the experience and testimony of the individual mind is to be explained away, excluded from consideration." She tells an anecdote about an anecdote. Several neurobiologists have written about an American railway worker named Phineas Gage. In 1848, when he was 25, an explosion drove an iron rod right through his brain and out the other side. His jaw was shattered and he lost an eye; but he recovered and returned to work, behaving just as he always had—except that now he had occasional rude outbursts of swearing and blaspheming, which (evidently) he had never had before.

Neurobiologists want to show that particular personality traits (such as good manners) emerge from particular regions of the brain. If a region is destroyed, the corresponding piece of personality is destroyed. Your mind is thus the mere product of your genes and your brain. You have nothing to do with it, because there is no subjective, individual you. "You" are what you say and do. Your inner mental world either doesn't exist or doesn't matter. In fact you might be a zombie; that wouldn't matter either.

Robinson asks: But what about the actual man Gage? The neurobiologists say nothing about the fact that "Gage was suddenly disfigured and half blind, that he suffered prolonged infections of the brain," that his most serious injuries were permanent. He was 25 years old and had no hope of recovery. Isn't it possible, she asks, that his outbursts of angry swearing meant just what they usually mean—that the man was enraged and suffering? When the brain scientists tell this story, writes Robinson, "there is no sense at all that [Gage] was a human being who thought and felt, a man with a singular and terrible fate."

Man is only a computer if you ignore everything that distinguishes him from a computer.


THE CLOSING OF THE SCIENTIFIC MIND. That science should face crises in the early 21st century is inevitable. Power corrupts, and science today is the Catholic Church around the start of the 16th century: used to having its own way and dealing with heretics by excommunication, not argument.

Science is caught up, also, in the same educational breakdown that has brought so many other proud fields low. Science needs reasoned argument and constant skepticism and open-mindedness. But our leading universities have dedicated themselves to stamping them out—at least in all political areas. We routinely provide superb technical educations in science, mathematics, and technology to brilliant undergraduates and doctoral students. But if those same students have been taught since kindergarten that you are not permitted to question the doctrine of man-made global warming, or the line that men and women are interchangeable, or the multiculturalist idea that all cultures and nations are equally good (except for Western nations and cultures, which are worse), how will they ever become reasonable, skeptical scientists? They've been reared on the idea that questioning official doctrine is wrong, gauche, just unacceptable in polite society. (And if you are president of Harvard, it can get you fired.)

Beset by all this mold and fungus and corruption, science has continued to produce deep and brilliant work. Most scientists are skeptical about their own fields and hold their colleagues to rigorous standards. Recent years have seen remarkable advances in experimental and applied physics, planetary exploration and astronomy, genetics, physiology, synthetic materials, computing, and all sorts of other areas.

But we do have problems, and the struggle of subjective humanism against roboticism is one of the most important.

The moral claims urged on man by Judeo-Christian principles and his other religious and philosophical traditions have nothing to do with Earth's being the center of the solar system or having been created in six days, or with the real or imagined absence of rational life elsewhere in the universe. The best and deepest moral laws we know tell us to revere human life and, above all, to be human: to treat all creatures, our fellow humans and the world at large, humanely. To behave like a human being (Yiddish: mensch) is to realize our best selves.

No other creature has a best self. This is the real danger of anti-subjectivism, in an age where the collapse of religious education among Western elites has already made a whole generation morally wobbly. When scientists casually toss our human-centered worldview in the trash with the used coffee cups, they are re-smashing the sacred tablets, not in blind rage as Moses did, but in casual, ignorant indifference to the fate of mankind.

A world that is intimidated by science and bored sick with cynical, empty "postmodernism" desperately needs a new subjectivist, humanist, individualist worldview. We need science and scholarship and art and spiritual life to be fully human. The last three are withering, and almost no one understands the first.

The Kurzweil Cult is attractive enough to require opposition in a positive sense; alternative futures must be clear. The cults that oppose Kurzweilism are called Judaism and Christianity. But they must and will evolve to meet new dangers in new worlds. The central text of Judeo-Christian religions in the tech-threatened, Googleplectic West of the 21st century might well be Deuteronomy 30:19: "I summon today as your witnesses the heavens and the earth: I have laid life and death before you, the blessing and the curse; choose life and live!—you and your children." The sanctity of life is what we must affirm against Kurzweilism and the nightmare of roboticism.

Judaism has always preferred the celebration and sanctification of this life in this world to eschatological promises. My guess is that 21st-century Christian thought will move back toward its father and become increasingly Judaized, less focused on death and the afterlife and more on life here today (although my Christian friends will dislike my saying so). Both religions will teach, as they always have, the love of man for man—and that, over his lifetime (as Wordsworth writes at the very end of his masterpiece, The Prelude), "the mind of man becomes/A thousand times more beautiful than the earth/On which he dwells."

At first, roboticism was just an intellectual school. Today it is a social disease. Some young people want to be robots (I'm serious); they eagerly await electronic chips to be implanted in their brains so they will be smarter and better informed than anyone else (except for all their friends who have had the same chips implanted). Or they want to see the world through computer glasses that superimpose messages on poor naked nature. They are terrorist hostages in love with the terrorists.

All our striving for what is good and just and beautiful and sacred, for what gives meaning to human life and makes us (as Scripture says) "just a little lower than the angels," and a little better than rats and cats, is invisible to the roboticist worldview. In the roboticist future, we will become what we believe ourselves to be: dogs with iPhones. The world needs a new subjectivist humanism now—not just scattered protests but a growing movement, a cry from the heart.