Monday, July 27, 2009

• We like to keep our beliefs consistent with our experience and actions.


The painter Philip Ernst, father of Max Ernst, while painting a picture of his garden, omitted a tree that spoiled the composition; then, overcome with remorse at this offense against realism, he cut down the tree (Tuchman 353).


# Inconsistency is stressful, and we work hard to reduce it. After studying end-of-the-world movements, and building on Fritz Heider’s work, Festinger and colleagues proposed a condition they called “cognitive dissonance”: the tension we feel when one belief contradicts another belief, our actions, or our personal experience. We try to reduce that tension in any of several ways. Other writers put it this way:

For example, cigarette smoking is dissonant with the awareness that 'smoking causes illness.' The smoker might change the behavior and try to quit. Or she might reject the cognition that 'smoking is bad.' She could persuade herself that she will quit later on ('after these exams'). She could emphasize the benefits of smoking ('A cigarette helps me relax'). Or she could decide she doesn't want a long life, anyhow ('It will be shorter, but sweeter'). In all of these cases, the smoker is motivated to reduce dissonance because the behavior, smoking, is out of kilter with the smoker's knowledge of the dangers of that behavior.
In an actual study of people who had gone to a clinic to quit smoking, those who later relapsed . . . . most of them lowered their perceptions of the health risk of smoking (Wade and Tavris 1993, 349-50).

Where I live, some smokers defend their habit with the fervor of martyrs and the language of the right-wing backlash against our multicultural society: “Now it’s PC to pick on smokers. The government has no right to interfere. This is supposed to be the land of the free.”

Festinger’s insight has been explored by many psychologists since he and his colleagues wrote When Prophecy Fails in 1956. A lot of research supports the idea, but some critics think it’s too general. How can an outsider tell if someone’s ideas are truly dissonant? As Atkinson et al. point out, “his or her opinions may simply be inconsistent with the ideological framework of the investigator. Inconsistency may be in the eye of the beholder” (710). For example, polls show that the same people either support welfare or hate it, depending on how the question is asked (709).

A third fallacy we need to avoid is assuming that people normally work out their beliefs, attitudes, and desires in an ideologically consistent fashion that reflects an internally coherent and nonparadoxical world-view. They generally do not. Most people, it appears, carry on in life with outlooks and belief systems containing significant complexity, paradox, multivocality, ambivalence, inconsistency, and sometimes confusion. By assuming people's views are internally consistent and well-ordered, observers think they can use known information about some views of evangelicals to make reliable inferences about other views about which they do not have data. But how people think about one issue may be logically inconsistent with how they think about another. It is impossible to determine which is their 'real' position (C. Smith 11).

So conflicting ideas don't always cause the stress of cognitive dissonance. In fact, we are fairly accomplished compartmentalizers (see below).

Likewise, it’s hard to judge whether a particular behavior stems from the feeling of cognitive dissonance. When I saw the frantic celebrations at the militarily insignificant capture of Saddam Hussein, coupled with spiteful, foam-flecked attacks on the anti-war unbelievers, was I wrong to conclude that these were expressions of temporary relief from the vast Persian gulf between what was promised for Iraq (cheering crowds, flourishing businesses, constitutional conventions, truckloads of captured bioweapons and nuclear warheads) and the reality (continued grinding poverty; widespread, murderous, and effective resistance; and, in the basements of secret government labs, several tubes of Preparation H)?

From an on-line poll by my local newspaper, the Knoxville News-Sentinel, Dec. 18, 2003:
• “Isn’t it fun watching the radical left hiding their sadness with the capture of Saddam?”
• “Boy, you lefties hate it when anything good happens.”
• “ . . . tremendous victory for the Iraqi people and our troops. Persistence-- and not heeding the cowards who want us to cut and run.”

Maybe they were just being exuberant.


# Small steps can lead to lifetime commitments. Many times we enter into a course of action through a series of small steps; but once we’re committed, we find it very hard to back out. Our justifications for the first steps make the next ones easier; sometimes we need to take the next steps to justify the first. Once you start a war, expanding it may be the only way to “prove” it’s worth the cost.

In the 1960s, Stanley Milgram conducted a classic series of experiments about obedience. Volunteers were asked to teach a series of words to “learners” (really, people in cahoots with Milgram), and if the learners didn’t remember, the volunteers were told to give them electric shocks. The shock machine was clearly labelled from 0 to 450 volts. Although many volunteers were reluctant to hurt the learners, the experimenter insisted. Little by little, the volunteers turned up the voltage until the learners were screaming in pretended pain. “In order to break off, [the volunteers] must suffer the guilt and embarrassment of acknowledging that they were wrong to begin at all. And the longer they put off quitting, the harder it is to admit their misjudgment in going as far as they have. It is easier to continue. Imagine how much less obedience there would be if subjects had to begin by giving the strongest shocks first” (Atkinson 739).

Kelman writes of “sanctioned massacres,”

Authorization processes create a situation in which people become involved in an action without considering its implications and without really making a decision. Once they have taken the initial step, they are in a new psychological and social situation in which the pressures to continue are powerful. As Lewin (1947) has pointed out, many forces that might originally have kept people out of a situation reverse direction once they have made a commitment (once they have gone through the ‘gate region’) and now serve to keep them in the situation. For example, concern about the criminal nature of an action, which might originally have inhibited a person from becoming involved, may now lead to deeper involvement in efforts to justify the action and to avoid negative consequences (17-18).

Any good bureaucracy understands these principles and uses them to train its staff. Clerks start by covering up the boss' little "accounting errors"; cops start taking "Christmas gifts" from shopowners on the beat; soldiers look the other way while platoonmates beat the hell out of civilians. Later on these small collaborations grow into big crimes; and through long practice in petty crime, the rapes and murders come easier. This is also part of the theory behind the Giuliani crackdown on petty crime in New York, where it was applied to observers as well as offenders. The idea was that if people saw that they or others could get away with minor offenses, if they thought no one cared, it would encourage them to dare bigger crimes.

This is not to claim that dipping our toes in a behavior requires us to dive in completely a moment later; on our own, we can perfectly well choose our level of involvement. Rather, it’s a lot easier to push us when we’re hovering on the brink of desperation.

We like to think we can’t be caught by such shabby tricks, that we’re immune to outside influence. That very confidence, however, can make us easy pickin’s. Tavris and Aronson offer the example of drug companies wooing doctors:

‘If a pharmaceutical company wants to give us pens, notepads, calendars, lunches, honoraria or small consulting fees, why not? We can’t be bought by trinkets and pizzas.’ According to surveys, physicians regard small gifts as being ethically more acceptable than larger gifts. . . . The evidence shows, however, that most physicians are influenced even more by small gifts than by big ones. . . . The reason Big Pharma spends so much on small gifts is well known to marketers, lobbyists, and social psychologists: Being given a gift evokes an implicit desire to reciprocate (52).

But once reciprocity kicks in, self-justification will follow . . . . Once you take the gift, no matter how small, the process starts. You will feel the urge to give something back, even if it’s only, at first, your attention, your willingness to listen, your sympathy for the giver. Eventually, you will become more willing to give your prescription, your ruling, your vote. Your behavior changes, but, thanks to blind spots and self-justification, your view of your intellectual and professional integrity remains the same (53).

This process helps explain why some officeholders sell themselves so cheaply, handing out billions to lobbyists for little more than the cost of tickets to Las Vegas or a few hours with prostitutes.

Incremental commitment can go in a liberating direction, too; once we start helping people publicly, we’re more inclined to continue (Wade, Tavris 1993, 664-5). That's why education through action is such a central part of the democratic project. Citizens who attend a public hearing may speak at the next one, and may help turn out neighbors for the third. We feel our way into standing for justice.


# We try to bring our beliefs in line with our actions, when we take responsibility for those actions. What happens when it’s our own actions that contradict our beliefs? During the Korean war, Americans were disturbed to learn that thousands of captured U.S. soldiers had collaborated with their Chinese and North Korean jailers, even to the point of condemning the U.S. In word and deed they turned their backs on their patriotic upbringing. This was different from the World War II experience, and the authorities imputed to the communists a sinister new weapon: “brainwashing,” the ancestor of the “cult programming” model of the 1970s. The Frankenheimer film The Manchurian Candidate (1962) depicts a commie plot to assassinate U.S. leaders by the hand of a brainwashed ex-POW. But changing the loyalty of U.S. soldiers turned out to be less mysterious than originally thought. Cummins is worth quoting here at length:

Psychologists were particularly interested in the unsettling success the Chinese achieved in getting them to change their beliefs about their own country’s role in the war. They came to believe, for example, that the U.S. had engaged in germ warfare and had been the initial aggressors in starting the war. There was also an extremely high rate of collaboration with the enemy on the part of the POWs, ranging from benign activities such as running errands voluntarily to serious behaviors such as turning in fellow prisoners who tried to escape.
The surprising thing about these belief changes and collaborations is that they occurred despite the absence of severe coercion. . . .
One tactic used by the Chinese was to hold political essay contests. The prizes were kept exceedingly small --a few cigarettes, a bit of fruit-- but the contests and prizes were sufficient to evoke interest from prisoners living in such desolate circumstances. In order to win the prize, the essay had to contain elements of a pro-Communist stand, even a token nod in that direction. In this way, the winning POW would obtain the prize only by providing support --however small-- for the enemy’s cause. While salting an essay with a few token statements in favor of communism might have seemed harmless to the POW under the circumstances, the important thing was that it was there in black-and-white and in his own handwriting. He could hardly deny having written it later. So how does he justify his behavior when the essay is trotted out later and shown to his fellow prisoners, his family, or the American press? By appealing to the cigarette or piece of fruit he was offered in return? Does such a paltry inducement justify ‘aiding’ the enemy in their propaganda campaign? . . .
As [Robert] Cialdini goes on to say, we accept inner responsibility for our behavior when we think we have chosen to perform it in the absence of strong outside pressures. A strong reward or threat of punishment constitute such strong outside pressures. This implies that in order to change people’s beliefs, you must accomplish two things. First, you must get them to behave in a way that is inconsistent with their beliefs. Second, you must make them take responsibility for their behavior. Faced with an inconsistency between their beliefs and their actions, most people will change their beliefs to bring them in line with their actions.
As unintuitive as this theoretical framework seems, it has been successfully tested repeatedly (19-20).

For example, Festinger and Carlsmith paid subjects either $1 or $20 to tell others that a boring experiment was fun. “Afterward, the experimenters asked the subjects how they felt about the experiment. They found that the subjects who had been paid $20 to lie about the experiment continued to believe that it had been boring. Those who had been paid $1, however, came to believe that the experiment had actually been fairly enjoyable. These people had less reason to tell the lie, so they experienced more dissonance when they did so. To justify their lie, they had to believe that they had actually enjoyed the experiment” (Kasschau 467).

In another experiment, the researcher forbade a bunch of little boys from playing with a particular toy robot, but threatened only half the kids with punishment. Then he left the playroom and watched the kids through a one-way mirror. Most kids did not play with the forbidden toy. Weeks later the experiment was repeated, with the same kids but a different adult on hand, and no threats or prohibitions. This time, “Of the boys who had received a threat of punishment six weeks previously, 77 percent played with the robot. Of those who had not received the threat, only 33 percent played with the robot. Why this dramatic difference?

“The results are readily interpretable using cognitive dissonance theory. Boys in the first group didn’t play with the toy during the first session because they had been threatened with punishment. They didn’t own their own behavior, the person who threatened them did. . . . The boys in the second group, however, . . . were not threatened with punishment. They willingly chose not to play with the robot. They owned their actions in a way that the other boys did not, and these actions contradicted their belief that the robot was the preferred toy. During the second session, their attitude toward the robot emerged and was found to have undergone significant change: The robot was not the toy most preferred to play with” (Cummins 23).

The principle that taking responsibility leads to commitment is one reason we ask people to take loyalty oaths in public, confess their sins, or witness to the heathen. Frykholm notes that sharing religious convictions may have no effect on the listener, but powerfully confirms the faith of the testifier (159). When the Israeli Army drafts people, it sends a booklet listing the penalties for shirkers, and a letter stating: “Draftees are called to the flag according to the law but come as volunteers” (Ezrahi 157)-- you don’t have a choice, but you are expected to commit heart and hands as if you do.

I wonder if some left intellectuals in the 1930s, understanding that belief follows action, initiated the process for themselves. Guilt-wracked for their privileges in a suffering world but only too aware of the fearful costs of resistance, perhaps those very fears led them to take the great leap of faith into Soviet communism. Maybe they weren’t sure they could keep up the good fight unless they burned their bridges behind them. It would be interesting to know if people who came to the CP by this route were any more or less willing than other members to accept and excuse Stalin’s later purges and invasions. And maybe we’re seeing a similar process among some religious activists today-- thirsting for salvation, they settle for terrorism.

Politicians aim to bind us even more strongly when they organize scapegoating rituals and mass atrocities, from the biblical stoning of women who slept with someone who didn’t own them, to lynch mobs in the gracious Old South, to Mao’s Cultural Revolution and the mass rapes in Bosnia. If you can even chant a slogan while your neighbor’s being tortured, you are complicit in the crime and have all the more reason to defend it as sacred policy. As Grossman writes,

the killer must violently suppress any dissonant thought that he has done anything wrong. Further, he must violently attack anyone or anything that would threaten his beliefs. . . . By ensuring that their men participate in atrocities, totalitarian leaders can also ensure that for these minions there is no possibility of reconciliation with the enemy. Trapped in their logic and their guilt, those who commit atrocities see no alternatives other than total victory or total defeat in a great Götterdämmerung (210).

It’s a good way to demonize the victims, too: “‘Many subjects harshly devalue the victim as a consequence of acting against him,’ wrote Milgram (1974). ‘Such comments as, “He was so stupid and stubborn he deserved to get shocked,” were common’” (Wade, Tavris 1993 p. 649).

Of course there are all sorts of less dramatic ways we can be Good Germans. A stupid joke here, a handful of gold fillings there-- what’s the harm? I guess we shouldn’t worry about kids having to say the Pledge of Allegiance --they have little choice-- but what about the adults? Too often, some thoughtless remark --whatever is most convenient to say at the time-- rears up later on as a major commitment we feel we have to back up. Sheds a different light on patriotic essay contests, doesn’t it?

Sometimes --many times-- the victims themselves assume the blame the powerful cast their way. After all, people being starved, raped and murdered face perhaps the most intense cognitive dissonance of all, and the need to reconcile self-respect with brutal treatment by the powerful. This “internalized oppression” may be the ugliest outcome of authoritarian society short of (but leading to) mass murder. What Stalin did with hundreds of communists, forcing them to confess to imaginary crimes before he murdered them, capitalist and religious systems of thought do to millions of women, children, low income people, and people of color. I was struck by Mike Davis’ description of the child witches of Kinshasa, though it’s really no different in character from what a lot of people experience every day. People are so desperately sick, hungry and afraid in that African city that they have accused thousands of children of witchcraft as an excuse to kick them out of their overburdened families.

Witch children, like possessed maidens in seventeenth-century Salem, seem to hallucinate the accusations against them, accepting their role as sacrificial receptacles for family immiseration and urban anomie. One boy told photographer Vincen Beeckman:

I’ve eaten 800 men. I make them have accidents, in planes or cars. I even went to Belgium thanks to a mermaid who took me all the way to the port of Antwerp. Sometimes I travel by broomstick, other times on an avocado skin. At night, I’m 30 and I have 100 children. My father lost his job as an engineer because of me-- then I killed him with the mermaid. I also killed my brother and sister. I buried them alive. I also killed all of my mother’s unborn children (197-8).

I'm stupid, I'm a sinner, I can't be trusted, I don't deserve your love . . . . Tell me what to do!


# Bloody hands: we can't admit we were wrong. The more time, money and blood we spend to achieve a goal, the more reluctant we’ll be to admit we made a mistake. Psychologists call it “justification of effort”--we must be right, because the alternative would damage our pride. “This explains why hazing, whether in social clubs or the military, turns new recruits into loyal members” (Wade & Tavris 1993 p. 351).

If we buy an expensive bottle of wine, or choose a more expensive school, or contribute to church or a political candidate, we are thereby cementing our choices and making it less likely we’ll change our minds. It’s not just pride, however. If we destroy the entire food supply to make ready for the Ancestors, we might understandably be reluctant to admit a mistake that’s likely to kill us. And if we send a lot of teens to die in Asia, instead of saying, Oops, sorry, we better stop the slaughter, we’re more likely to prolong the killing so we don't have to admit we killed all those kids for nothing.

In some cases, our fervor results in burning our bridges behind us. When true believers condemn their neighbors, when captured U.S. POWs condemn their country, when President Halliburton punishes one-time allies, when some devotees physically attack unbelievers, they’ve made it much harder to change course. In effect, they’re rolling double-or-nothing-- they must win everything or lose everything. There’s almost no way to back down.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To me truth is precious . . . . I should rather be right and stand alone than to run with the multitude and be wrong. . . . The holding of the views herein set forth has already won for me the scorn and contempt and ridicule of some of my fellowmen. I am looked upon as being odd, strange, peculiar. . . . But truth is truth and though all the world rejects it and turns against me, I will cling to the truth still.
-- Charles Silvester de Ford, of Fairfield, WA, in his pamphlet explaining that the world is flat (Engle 144).
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
