Site update

Since I have been really terrible at updating the blog (but pretty good at keeping up with the Facebook posts), I've added the widget below so that Facebook cross-posts to the blog.

You shouldn't need to join Facebook; you can just click on the links in the widget to access the articles. If you have any problems or comments, please mail me at arandjel 'AT'

Wednesday, February 23, 2011

Questioning morality, right and wrong

hmmm.....I feel like this article sort of works on the premise that the morality we start off with is "correct" (i.e., the references to "behaving badly") and that we become increasingly bankrupt over time...but isn't it instead that the morality imposed on us is so full of shit that living/experience allows us to recognize it as such, and the modifications we undertake grant us the ability to live a life not so out of tune with our wants, desires AND logic....? -MA

From the Daily Beast
The Science of Why We Cheat

A lack of morality can lead to bad behavior—but can behaving badly make us lose our morals? Casey Schwartz on how lying, cheating and stealing warps our sense of right and wrong.

Before you hack into your boyfriend’s email account, or sleep with the married guy, or overstate your billable hours, take note: Telling yourself it is “just this once” is an unlikelier story than ever before.

Or at least that is the conclusion of intriguing new research that examines the way our actions influence our beliefs, reversing the traditional direction of cause and effect. In their study, published in the current issue of Personality and Social Psychology Bulletin, Lisa Shu and her colleagues at Harvard University found that behaving badly actually altered their subjects’ sense of right and wrong.

Humans are invested in seeing themselves as ethical creatures. We want to believe in the rightness of our own conduct, to see our lives as a series of mostly well-intentioned decisions. And it appears that we'll go to great lengths to feel that way, even if it means warping our own sense of morality to suit our needs.

The famous psychologist Albert Bandura coined the term “moral disengagement” to capture the process by which people pervert their own sense of right and wrong in order to give in to a questionable temptation.

Yes I know he’s married, but it’s OK to sleep with him, the logic of moral disengagement goes, because, insert excuse here: I can’t stand his wife. If not with me, it would be with somebody else. This is his moral dilemma, not mine. The institution of marriage is a meaningless concept.

The options are many.

Moral disengagement essentially allows people to behave in ways that, at another moment, in a different mood, that same person would never consider. For years, research has shown again and again that moral disengagement influences how people will behave in a given situation. But now, in a chicken-and-egg twist, Shu and her team have shown that it works both ways: How people behave influences the moral beliefs they have about their behavior. Moral disengagement is the result of unethical behavior, they have now shown, not just the cause.

Shu’s research is based on a string of four related studies, each using a different group of undergraduates as subjects. In one, 138 subjects were asked to read an academic honor code that reinforced in their minds the idea that cheating is wrong. Then they were given a set of math problems to solve, and an envelope of cash that they would be rewarded from, according to how many problems they answered correctly. The subjects were divided into two conditions: one where it was possible for them to cheat by misreporting their own scores, and a control condition where their scores were tallied by a proctor in the room. Perhaps not surprisingly, some of the subjects in the first group, who were allowed to report their own scores, inflated those scores in order to get more cash.

Afterward, they were given a questionnaire to fill out that they’d also been given at the beginning of the study, consisting of questions designed to measure moral engagement with a focus on cheating. Shu and her colleagues developed this measure themselves, and tested its validity in other circumstances before using it for the current research. The results? Those subjects who had cheated on the math problems demonstrated a greater degree of moral disengagement in their responses the second time they filled out the questionnaire.

What's more, Shu found that the students who had cheated also had a harder time remembering the academic honor code that they’d been given to read before the task, compared to those subjects who hadn’t cheated. Shu calls this phenomenon “motivated forgetting,” citing it as yet another strategy we deploy to avoid the disquieting recognition that we’ve done something wrong.

In fact, what initially led Shu to this research was her sense that beliefs and values are not fixed, stable traits that we tote with us like a wheelie bag everywhere we go. On the contrary, she believes we bend or break them according to circumstance.

“It didn’t seem intuitive to me that our beliefs never change,” Shu said. “But what really led me to the question was the debate, both in academia and in the business world, about how much of people’s dishonest behaviors and bad actions is due to the situation, versus who that person is and how their upbringing was.”

Shu notes that the “permissive environment” she created in the lab, by giving one group of subjects the opportunity to cheat, produced a greater likelihood of cheating, which in turn produced a shift in the way the cheaters thought about cheating.

On the bright side, Shu found that if participants did something as simple as sign their names to the honor code, rather than just passively read it, they were less likely to cheat on the math problems they were given to solve.

As a whole, Shu and her colleagues’ study is further reason to doubt that people have an unbudging, ingrained ethical compass guiding their every action. Indeed, ignoring that compass seems to make us forget we have it at all, at least temporarily.

The implications of Shu’s findings align with the existing research and paint a troubling picture of how morality can easily spiral out of our grip without us even noticing. If both things are true—that attitude influences action and action influences attitude—it becomes easier to understand scenarios of runaway transgressions. You do something you know isn't good, you talk yourself out of feeling bad about it, you become more likely to do it again—and, having done it again, you’re back to telling yourself it doesn’t matter, it’s no big deal, it was just this once…

And just like that, you’ve done nothing wrong.
