Documenting the Coming Singularity

Wednesday, May 30, 2007

Evolutionary Morality

Doing good is its own reward. So said a wise man. We've long known that being good feels good, not from brain scans, but from subjective experience. Science can now add credence to what we sensed all along, according to some interesting experimental evidence reported in the Washington Post:
The results were showing that when the volunteers placed the interests of others before their own, the generosity activated a primitive part of the brain that usually lights up in response to food or sex. Altruism, the experiment suggested, was not a superior moral faculty that suppresses basic selfish urges but rather was basic to the brain, hard-wired and pleasurable.

Their 2006 finding that unselfishness can feel good lends scientific support to the admonitions of spiritual leaders such as Saint Francis of Assisi, who said, "For it is in giving that we receive." But it is also a dramatic example of the way neuroscience has begun to elbow its way into discussions about morality and has opened up a new window on what it means to be good.
It's not too difficult to imagine how this hardwiring came to be. For genes to be passed on, an individual has to survive to breeding age, be relatively healthy, and be at least somewhat attractive to the opposite sex. A violent, selfish troublemaker would be more likely to be ostracized, and perhaps killed, by others in his tribe.

The fact is, societies are more successful when their members are nice to each other. To be sure, socialization of children by adults will either reinforce and channel our natural tendency to treat others well or suppress it. Equally certain is that the tendency to do good can be overpowered by another hardwired imperative: survival.

One of the most interesting concepts to arise from this research, however, is that this wiring didn't begin with humans. It seems to be much older than our species.
Grafman and others are using brain imaging and psychological experiments to study whether the brain has a built-in moral compass. The results -- many of them published just in recent months -- are showing, unexpectedly, that many aspects of morality appear to be hard-wired in the brain, most likely the result of evolutionary processes that began in other species.

No one can say whether giraffes and lions experience moral qualms in the same way people do because no one has been inside a giraffe's head, but it is known that animals can sacrifice their own interests: One experiment found that if each time a rat is given food, its neighbor receives an electric shock, the first rat will eventually forgo eating.

What the new research is showing is that morality has biological roots -- such as the reward center in the brain that lit up in Grafman's experiment -- that have been around for a very long time.
The saying, "Do unto others as you would have them do unto you," presupposes the existence of empathy. We can imagine what another individual will experience as a result of our actions, whether suffering or comfort. Research along these lines leads science into realms usually reserved for religion.
The more researchers learn, the more it appears that the foundation of morality is empathy. Being able to recognize -- even experience vicariously -- what another creature is going through was an important leap in the evolution of social behavior. And it is only a short step from this awareness to many human notions of right and wrong, says Jean Decety, a neuroscientist at the University of Chicago.
Discussion of topics like these will certainly lead to controversy.
The research enterprise has been viewed with interest by philosophers and theologians, but already some worry that it raises troubling questions. Reducing morality and immorality to brain chemistry -- rather than free will -- might diminish the importance of personal responsibility. Even more important, some wonder whether the very idea of morality is somehow degraded if it turns out to be just another evolutionary tool that nature uses to help species survive and propagate.

Moral decisions can often feel like abstract intellectual challenges, but a number of experiments such as the one by Grafman have shown that emotions are central to moral thinking. In another experiment published in March, University of Southern California neuroscientist Antonio R. Damasio and his colleagues showed that patients with damage to an area of the brain known as the ventromedial prefrontal cortex lack the ability to feel their way to moral answers.

When confronted with moral dilemmas, the brain-damaged patients coldly came up with "end-justifies-the-means" answers. Damasio said the point was not that they reached immoral conclusions, but that when confronted by a difficult issue -- such as whether to shoot down a passenger plane hijacked by terrorists before it hits a major city -- these patients appear to reach decisions without the anguish that afflicts those with normally functioning brains.

Such experiments have two important implications. One is that morality is not merely about the decisions people reach but also about the process by which they get there. Another implication, said Adrian Raine, a clinical neuroscientist at the University of Southern California, is that society may have to rethink how it judges immoral people.

Psychopaths often feel no empathy or remorse. Without that awareness, people relying exclusively on reasoning seem to find it harder to sort their way through moral thickets. Does that mean they should be held to different standards of accountability?

"Eventually, you are bound to get into areas that for thousands of years we have preferred to keep mystical," said Grafman, the chief cognitive neuroscientist at the National Institute of Neurological Disorders and Stroke. "Some of the questions that are important are not just of intellectual interest, but challenging and frightening to the ways we ground our lives. We need to step very carefully."

Joshua D. Greene, a Harvard neuroscientist and philosopher, said multiple experiments suggest that morality arises from basic brain activities. Morality, he said, is not a brain function elevated above our baser impulses. Greene said it is not "handed down" by philosophers and clergy, but "handed up," an outgrowth of the brain's basic propensities.
Once again, it seems that science is pushing up against the doctrines and creeds of religion.



Spaceman Spiff said...

I think this is fascinating stuff. It is worth noting that it was the Greeks who separated the body and soul, rather than the Jews, for whom the two were intimately bound. I think to a Judeo-Christian mind, the idea of moral maladies being intimately tied to the physical body would not be surprising in the least.

Also within Judeo-Christian tradition, we have the idea of the fall. Even the most Arminian Christian has to admit that within a Christian framework, our free will only goes so far, and the fallenness of our upbringing, of our spirits, and of our very bodies all ultimately mean a very limited final conception of free will.

On the other hand, no Christian or Jew should be surprised that concern for others is hardwired into our brains, since we also believe we are created in God's image.

So I don't believe this should be controversial. To "reduce moral decisions to brain chemistry" would be begging the question. We can surely say brain chemistry is intimately bound up with moral decision making, and yet that doesn't show anything about where chemistry or brains or morals or anything else comes from in the first place. There may be nothing deeper than chemistry, and morals simply flow from that. On the other hand, there may be nothing deeper than morals, and perhaps chemistry simply flows from that.

StephUF said...

spaceman spiff is a smart dude