Spartanburg Herald Journal, SC
02/07/06
When Death Is on the Docket, the Moral Compass Wavers
By BENEDICT CAREY
New York Times
Burl Cain is a religious man who believes it is only for God to say when a
person's number is up. But in his job as warden and chief executioner at the
Louisiana State Penitentiary in Angola, Mr. Cain is the one who gives the order to
start a lethal injection, and he has held condemned inmates' hands as they died.
He does it, he said in an interview, because capital punishment "is the law of the
land."
"It's something we do whether we're for it or against it, and we try to make the
process as humane as possible," he said, referring to himself and others on the
execution team.
But he concedes, "The issue is coping, how we cope with it."
Common wisdom holds that people have a set standard of morality that never
wavers. Yet studies of people who do unpalatable things, whether by choice, or
for reasons of duty or economic necessity, find that people's moral codes are
more flexible than generally understood. To buffer themselves from their own
consciences, people often adjust their moral judgments in a process some
psychologists call moral disengagement, or moral distancing.
In recent years, researchers have determined the psychological techniques most
often used to disengage, and for the first time they have tested them in people
working in perhaps the most morally challenging job short of soldiering: staffing a
prison execution team.
The results of this and other studies suggest that a person's moral judgment can
shift quickly, in anticipation of an unpalatable act, or slowly and unconsciously.
Moral disengagement "is where all the action is," said Albert Bandura, a
professor of psychology at Stanford and an expert on the psychology of moral
behavior. "It's in our ability to selectively engage and disengage our moral
standards, and it helps explain how people can be barbarically cruel in one
moment and compassionate the next."
The crude codes of behavior that evolved to hold early human societies together,
the taboos against killing and against stealing, would have been psychologically
suffocating if people did not have some way to let themselves off the hook in
extreme situations, some experts argue. Survival sometimes required brutal acts;
human sacrifice was commonplace, as were executions.
The innate human ability to disconnect morally has made it hard for researchers
to find an association between people's stated convictions and their behavior:
preachers can commit sexual crimes; prostitutes may live otherwise exemplary
lives; well-trained soldiers can commit atrocities.
Investigators can, however, identify the precise kinds of thoughts that allow people
to do things that defy their personal codes of ethics.
Now, psychologists at Stanford have shown that prison staff members who work
on execution teams exhibit high levels of moral disengagement, and that the closer
they are to the killing, the higher their level of disengagement goes. The
trailblazing research grew out of a high school project.
In the late 1990's, Michael Osofsky, then a teenage student in New Orleans,
began interviewing prison guards at the penitentiary in nearby Angola. His father,
a psychiatrist who consulted with the prison, collaborated, as did the warden, Mr.
Cain.
By the time Mr. Osofsky graduated from Stanford in 2003, he had conducted in-depth interviews with 246 prison workers from penitentiaries, including Angola, in
three states. They included guards who administer the lethal shots, counselors
who provide support during the execution, members of the strap-down team, and
guards not involved in executions. The people on the execution teams "come
together, do the execution, then go back to their regular jobs" in the prison, Mr.
Osofsky, now on a fellowship in Asia, said in a telephone interview. "They never
really talked about this part of their job, even with their families; even with each
other."
Working with Mr. Cain, Dr. Bandura and Philip Zimbardo, another Stanford
psychologist, Mr. Osofsky administered a moral disengagement scale to the
execution team members and the guards not on the execution team.
This questionnaire asked workers to rate how much they approved or
disapproved of 19 statements, including: "The Bible teaches that murders must
be avenged: life for a life, eye for an eye"; "Nowadays the death penalty is done
in ways that minimize the suffering"; and "Because of the nature of their crimes,
murderers have lost the right to live."
In an analysis of the answers published late last year in the journal Law and
Human Behavior, the psychologists reported that members of the execution team
were far more likely than guards not on the team to agree that the inmates had
lost important human qualities; to cite the danger that "they can escape and kill
again"; and to consider the cost to society of caring for violent criminals.
The team members were also more likely than other guards to favor religious
support for the sentence: an eye for an eye.
"You have to sanctify lethal means: this is the most powerful technique" of
disengagement from a shared human moral code, said Dr. Bandura, who has
expressed serious moral reservations about capital punishment. "If you can't
convince people of the sanctity of the greater cause, they are not going to carry
the job out as effectively."
Execution teams are organized so as to divide the grisly tasks, enhancing what
researchers call a diffusion of responsibility. A medical technician provides the
lethal drugs; a team of guards straps the inmate down, with each guard securing
only one part of the body; another guard administers the drugs. "No one person
can say he is entirely responsible for the death," Mr. Osofsky said.
Firing squads draw on this same idea. Everyone in the squad fires, but no one
can be sure whose shot was deadly.
The level of disengagement, as measured by the scale, was about as high in
prison workers who participated in one execution as in those who had been party
to more than 15, the study found. This suggests that, while the job may get
easier over time, "moral disengagement is an enabler, rather than merely the
result of performing repeated executions," the authors conclude.
The pattern was strikingly different in members of the execution support staff,
particularly the counselors working with the families of inmates and victims.
These staff members were highly morally engaged when they first joined the
execution staff, deeply sympathetic to everyone involved, including the
condemned. "I'm in a helping profession, but there isn't a damn thing I can do for
these guys," one of them said to Mr. Osofsky. "I hate it, but I do it. I am required
to do it."
That ambivalence seemed to affect the counselors' moral judgment over time,
the study found. After they had been involved in 10 executions, the counselors'
scores on the disengagement scale almost matched the executioners'.
The finding stands as a caution to the millions of people who work in the service
of organizations whose motives they mistrust, psychologists say: shifts in moral
judgment are often unconscious, and can poison the best instincts and
intentions.
"This really gets at the idea of people working in corporate structures that are
involved in selling, say, weapons or tobacco, and saying, 'Well, I just keep the
books,' " when they disapprove of the business, said Susan Ravenscroft, a
professor of accounting at Iowa State University in Ames who has studied
business ethics.
Moral distancing can also be seen in the language of war, politics and corporate
scandal. Pilots euphemistically "service a target" rather than bomb it; enemies
are dehumanized as "gooks," "hajis" or infidels. Politicians and chief executives
facing indictments deflect questions about ethical lapses by acknowledging that
"mistakes were made," or that they were "out of the loop."
These remarks reflect internal methods of self-protection, as well as public
evasions, research suggests.
Yet it is in the mundane corner-cutting of everyday life that moral disengagement
may be most common and insidious, and least conscious.
In a 2004 study, professors at Iowa State University and the University of
Arkansas tested the moral judgment of 47 college students who had cheated on
a take-home exam, a complex accounting problem.
Many of the students had found a solution to the problem posted online by another
professor, who was unaware it was part of an exam, and reproduced that solution
as their own, though it used techniques they had not yet learned. Others had
clearly collaborated, which their professor had explicitly forbidden. Another 17
students had not cheated, as far as their teacher could determine.
The professor threw out the test scores and got permission from the students to
ask about their behavior. The cheaters' scores on a standard test of moral
judgment did not correlate at all with their level of plagiarism or collaboration. If
anything, it was the most dishonest male students who scored highest on the
morals test.
"Clearly, this is not what you want to find in a test of moral judgment," said Dr.
Ravenscroft, a co-author of the study, with Charles Shrader of Iowa State and
Tim West of the University of Arkansas.
Only by conducting in-depth interviews with students about their behavior did the
researchers begin to see clear, familiar patterns. One was displacing the blame:
"I think it's hard for people not to look at the answer manual if it's available," said
one student. "Maybe you should have taken the problem off so people wouldn't
be tempted."
Another was justifying the behavior by comparison: "I really don't consider
working with another person that unethical," one student commented. "Taking
and copying answers from the key was highly unethical." Many students
"rationalized cheating behavior as a necessary defense to the cheating of
others," the researchers concluded in their analysis, to appear this year in the
Business and Professional Ethics Journal. "Yet in an extreme example of moral
exclusion, none of the students discussed this impact on others."
Recognizing these kinds of selfish evasions in oneself is hardly proof of moral
collapse, psychologists say. Rather, they say, moral disengagement is evidence
that a sound moral sensibility is trying to assert itself, warning against a situation
it finds suspect. As a rule people don't like to cheat or lie, studies find, and they
are extremely reluctant to inflict pain on others, no matter the circumstances.
And moral engagement is dynamic. Once people stop doing what is consciously
or unconsciously upsetting them, the research suggests, they engage their
conscience more fully.
That is, if they have the luxury to walk away.
"I remember the one execution I attended, there was this strange heaviness in
the air all day," Mr. Osofsky said. "These guards you knew were somber and
detached, keeping to themselves. This wasn't something they gloried in or looked
forward to at all. They didn't really seem like themselves."