What Makes Us Moral

If the entire human species were a single individual, that person would long ago have been declared mad. The insanity would not lie in the anger and darkness of the human mind—though it can be a black and raging place indeed. And it certainly wouldn’t lie in the transcendent goodness of that mind—one so sublime, we fold it into a larger “soul.” The madness would lie instead in the fact that both of those qualities, the savage and the splendid, can exist in one creature, one person, often in one instant.

We’re a species that is capable of almost dumbfounding kindness. We nurse one another, romance one another, weep for one another. Ever since science taught us how, we willingly tear the very organs from our bodies and give them to one another. And at the same time, we slaughter one another. The past 15 years of human history are the temporal equivalent of those subatomic particles that are created in accelerators and vanish in a trillionth of a second, but in that fleeting instant, we’ve visited untold horrors on ourselves—in Mogadishu, Rwanda, Chechnya, Darfur, Beslan, Baghdad, Pakistan, London, Madrid, Lebanon, Israel, New York City, Abu Ghraib, Oklahoma City, an Amish schoolhouse in Pennsylvania—all of the crimes committed by the highest, wisest, most principled species the planet has produced. That we’re also the lowest, cruelest, most blood-drenched species is our shame—and our paradox.

The deeper that science drills into the substrata of behavior, the harder it becomes to preserve the vanity that we are unique among Earth’s creatures. We’re the only species with language, we told ourselves—until gorillas and chimps mastered sign language. We’re the only one that uses tools, then—but that’s if you don’t count otters smashing mollusks with rocks or apes stripping leaves from twigs and using them to fish for termites.

What does, or ought to, separate us then is our highly developed sense of morality, a primal understanding of good and bad, of right and wrong, of what it means to suffer not only our own pain—something anything with a rudimentary nervous system can do—but also the pain of others. That quality is the distilled essence of what it means to be human. Why it’s an essence that so often spoils, no one can say.

Morality may be a hard concept to grasp, but we acquire it fast. A preschooler will learn that it’s not all right to eat in the classroom, because the teacher says it’s not. If the rule is lifted and eating is approved, the child will happily comply. But if the same teacher says it’s also O.K. to push another student off a chair, the child hesitates. “He’ll respond, ‘No, the teacher shouldn’t say that,’” says psychologist Michael Schulman, co-author of Bringing Up a Moral Child. In both cases, somebody taught the child a rule, but the rule against pushing has a stickiness about it, one that resists coming unstuck even if someone in authority countenances it. That’s the difference between a matter of morality and one of mere social convention, and Schulman and others believe kids feel it innately.

Of course, the fact is, that child will sometimes hit and won’t feel particularly bad about it either—unless he’s caught. The same is true for people who steal or despots who slaughter. “Moral judgment is pretty consistent from person to person,” says Marc Hauser, professor of psychology at Harvard University and author of Moral Minds. “Moral behavior, however, is scattered all over the chart.” The rules we know, even the ones we intuitively feel, are by no means the rules we always follow.

Where do those intuitions come from? And why are we so inconsistent about following where they lead us? Scientists can’t yet answer those questions, but that hasn’t stopped them from looking. Brain scans are providing clues. Animal studies are providing more. Investigations of tribal behavior are providing still more. None of this research may make us behave better, not right away at least. But all of it can help us understand ourselves—a small step up from savagery perhaps, but an important one.

The Moral Ape

The deepest foundation on which morality is built is the phenomenon of empathy, the understanding that what hurts me would feel the same way to you. And human ego notwithstanding, it’s a quality other species share.

It’s not surprising that animals far less complex than we are would display a trait that’s as generous of spirit as empathy, particularly if you decide there’s no spirit involved in it at all. Behaviorists often reduce what we call empathy to a mercantile business known as reciprocal altruism. A favor done today—food offered, shelter given—brings a return favor tomorrow. If a colony of animals practices that give-and-take well, the group thrives.

But even in animals, there’s something richer going on. One of the first and most poignant observations of empathy in nonhumans was made by Russian primatologist Nadia Kohts, who studied nonhuman cognition in the first half of the 20th century and raised a young chimpanzee in her home. When the chimp would make his way to the roof of the house, ordinary strategies for bringing him down—calling, scolding, offers of food—would rarely work. But if Kohts sat down and pretended to cry, the chimp would go to her immediately. “He runs around me as if looking for the offender,” she wrote. “He tenderly takes my chin in his palm … as if trying to understand what is happening.”

You hardly have to go back to the early part of the past century to find such accounts. Even cynics went soft at the story of Binti Jua, the gorilla who in 1996 rescued a 3-year-old boy who had tumbled into her zoo enclosure, rocking him gently in her arms and carrying him to a door where trainers could enter and collect him. “The capacity for empathy is multilayered,” says primatologist Frans de Waal of Emory University, author of Our Inner Ape. “We share a core with lots of animals.”

While it’s impossible to measure empathy directly in animals, in humans it’s another matter. Hauser cites a study in which spouses or unmarried couples underwent functional magnetic resonance imaging (fMRI) as they were subjected to mild pain. They were warned each time before the painful stimulus was administered, and their brains lit up in a characteristic way, signaling mild dread. They were then told that they were not going to feel the discomfort but that their partner was. Even when they couldn’t see their partner, the brains of the subjects lit up precisely as if they were about to experience the pain themselves. “This is very much an ‘I feel your pain’ experience,” says Hauser.

The brain works harder when the threat gets more complicated. A favorite scenario that morality researchers study is the trolley dilemma. You’re standing near a track as an out-of-control train hurtles toward five unsuspecting people. There’s a switch nearby that would let you divert the train onto a siding. Would you do it? Of course. You save five lives at no cost. Suppose a single unsuspecting man were on the siding? Now the mortality score is 5 to 1. Could you kill him to save the others? What if the innocent man were on a bridge over the track and you had to push him onto it to stop the train?

Pose these dilemmas to people while they’re in an fMRI scanner, and the brain scans get messy. Using a switch to divert the train toward one person instead of five increases activity in the dorsolateral prefrontal cortex—the place where cool, utilitarian choices are made. Complicate things with the idea of pushing the innocent victim, and the medial frontal cortex—an area associated with emotion—lights up. As these two regions do battle, we may make irrational decisions. In a recent survey, 85% of subjects who were asked about the trolley scenarios said they would not push the innocent man onto the tracks—even though they knew that refusing to do so condemned the five people down the line to a hypothetical death. “What’s going on in our heads?” asks Joshua Greene, an assistant professor of psychology at Harvard University. “Why do we say it’s O.K. to trade one life for five in one case and not others?”

How We Stay Good

Merely being equipped with moral programming does not mean we practice moral behavior. Something still has to boot up that software and configure it properly, and that something is the community. Hauser believes that all of us carry what he calls a sense of moral grammar—the ethical equivalent of the basic grasp of speech that most linguists believe is with us from birth. But just as syntax is nothing until words are built upon it, so too is a sense of right and wrong useless until someone teaches you how to apply it.

It’s the people around us who do that teaching—often quite well. Once again, however, humans aren’t the ones who dreamed up such a mentoring system. At the Arnhem Zoo in the Netherlands, de Waal was struck by how vigorously apes enforced group norms one evening when the zookeepers were calling their chimpanzees in for dinner. The keepers’ rule at Arnhem was that no chimps would eat until the entire community was present, but two adolescents grew willful, staying outside the building. The hours it took to coax them inside caused the mood in the hungry colony to turn surly. That night the keepers put the delinquents to bed in a separate area—a sort of protective custody to shield them from reprisals. But the next day the adolescents were on their own, and the troop made its feelings plain, administering a sound beating. The chastened chimps were the first to come in that evening. Animals have what de Waal calls “oughts”—rules that the group must follow—and the community enforces them.

Human communities impose their own oughts, but they can vary radically from culture to culture. Take the phenomenon of Good Samaritan laws that require passersby to assist someone in peril. Our species has a very conflicted sense of when we ought to help someone else and when we ought not, and the general rule is, Help those close to home and ignore those far away. That’s in part because the plight of a person you can see will always feel more real than the problems of someone whose suffering is merely described to you. But part of it is also rooted in our tribal past, when the welfare of your own tribe was essential to your survival but the welfare of an opposing tribe was not—and might even be a threat.

In the 21st century, we retain a powerful remnant of that primal dichotomy, which is what impels us to step in and help a mugging victim—or, in the astonishing case of Wesley Autrey, New York City’s so-called Subway Samaritan, jump onto the tracks in front of an oncoming train to rescue a sick stranger—but allows us to decline to send a small contribution to help the people of Darfur. “The idea that you can save the life of a stranger on the other side of the world by making a modest material sacrifice is not the kind of situation our social brains are prepared for,” says Greene.

Throughout most of the world, you’re still not required to aid a stranger, but in France and elsewhere, laws now make it a crime for passersby not to provide at least the up-close-and-personal aid we’re good at giving. In most of the U.S., we make a distinction between an action and an omission to act. Says Hauser: “In France they’ve done away with that difference.”

But you don’t need a state to create a moral code. The group does it too. One of the most powerful tools for enforcing group morals is the practice of shunning. If membership in a tribe is the way you ensure yourself food, family and protection from predators, being blackballed can be a terrifying thing. Religious believers as diverse as Roman Catholics, Mennonites and Jehovah’s Witnesses have practiced their own forms of shunning—though the banishments may go by names like excommunication or disfellowshipping. Clubs, social groups and fraternities expel undesirable members, and the U.S. military retains the threat of discharge as a disciplinary tool, even grading the punishment as “other than honorable” or “dishonorable,” darkening the mark a former service person must carry for life.

Sometimes shunning emerges spontaneously when a society of millions recoils at a single member’s acts. O.J. Simpson’s 1995 acquittal may have outraged people, but it did make the morality tale surrounding him much richer, as the culture as a whole turned its back on him, denying him work, expelling him from his country club, refusing him service in a restaurant. In November his erstwhile publisher, who was fired in the wake of her and Simpson’s disastrous attempt to publish a book about the killings, sued her ex-employer, alleging that she had been “shunned” and “humiliated.” That, her former bosses might well respond, was precisely the point.

“Human beings were small, defenseless and vulnerable to predators,” says Barbara J. King, biological anthropologist at the College of William and Mary and author of Evolving God. “Avoiding banishment would be important to us.”

Why We Turn Bad

With so many redundant moral systems to keep us in line, why do we so often fall out of ranks? Sometimes we can’t help it, as when we’re suffering from clinical insanity and behavior slips the grip of reason. Criminal courts are stingy about finding such exculpatory madness, requiring a disability so severe, the defendant didn’t even know the crime was wrong. That’s a very high bar that prevents all but a few from proving the necessary moral numbness.

Things are different in the case of the cool and deliberate serial killer, who knows the criminality of his deeds yet continues to commit them. For neuroscientists, the iciness of the acts calls to mind the case of Phineas Gage, the Vermont railway worker who in 1848 was injured when an explosion caused a tamping iron to be driven through his prefrontal cortex. Improbably, he survived, but he exhibited stark behavioral changes—becoming detached and irreverent, though never criminal. Ever since, scientists have looked for the roots of serial murder in the brain’s physical state.

A study published last year in the journal NeuroImage may have helped provide some answers. Researchers working through the National Institute of Mental Health scanned the brains of 20 healthy volunteers, watching their reactions as they were presented with various legal and illegal scenarios. The brain activity that most closely tracked the hypothetical crimes—rising and falling with the severity of the scenarios—occurred in the amygdala, a deep structure that helps us make the connection between bad acts and punishments. As in the trolley studies, there was also activity in the frontal cortex. The fact that the subjects themselves had no sociopathic tendencies limits the value of the findings. But knowing how the brain functions when things work well is one good way of knowing where to look when things break down.

Fortunately, the overwhelming majority of us never run off the moral rails in remotely as awful a way as serial killers do, but we do come untracked in smaller ways. We face our biggest challenges not when we’re called on to behave ourselves within our family, community or workplace but when we have to apply the same moral care to people outside our tribe.

The notion of the “other” is a tough one for Homo sapiens. Sociobiology has been criticized as one of the most reductive of sciences, describing the behavior of all living things—humans included—as nothing more than an effort to get as many genes as possible into the next generation. The idea makes sense, and all creatures can be forgiven for favoring their troop over others. But such bias turns dark fast.

Schulman, the psychologist and author, works with delinquent adolescents at a residential treatment center in Yonkers, New York, and was struck one day by the outrage that swept through the place when the residents learned that three of the boys had mugged an elderly woman. “I wouldn’t mug an old lady. That could be my grandmother,” one said. Schulman asked whom it would be O.K. to mug. The boy answered, “A Chinese delivery guy.” Explains Schulman: “The old lady is someone they could empathize with. The Chinese delivery guy is alien, literally and figuratively, to them.”

This kind of brutal line between insiders and outsiders is evident everywhere—mobsters, say, who kill promiscuously yet go on rhapsodically about “family.” But it has its most terrible expression in wars, in which the dehumanization of the outsider is essential for wholesale slaughter to occur. Volumes have been written about what goes on in the collective mind of a place like Nazi Germany or the collapsing Yugoslavia. While killers like Adolf Hitler or Slobodan Milosevic can never be put on the couch, it’s possible to understand the xenophobic strings they play in their people.

“Yugoslavia is the great modern example of manipulating tribal sentiments to create mass murder,” says Jonathan Haidt, associate professor of psychology at the University of Virginia. “You saw it in Rwanda and Nazi Germany too. In most cases of genocide, you have a moral entrepreneur who exploits tribalism for evil purposes.”

That, of course, does not take the stain of responsibility off the people who follow those leaders—a case that war-crimes prosecutors famously argued at the Nuremberg trials and a point courageous people have made throughout history, whether by sheltering Jews during World War II or by refusing to murder their Sunni neighbors when a militia leader ordered them to.

For grossly imperfect creatures like us, morality may be the steepest of all developmental mountains. Our opposable thumbs and big brains gave us the tools to dominate the planet, but wisdom comes more slowly than physical hardware. We surely have a lot of killing and savagery ahead of us before we fully civilize ourselves. The hope—a realistic one, perhaps—is that the struggles still to come are fewer than those left behind.

Author: Jeffrey Kluger
Source: Time
