2C. THAT’S MY STORY AND I’M STICKING TO IT: Reconciling behavior and belief.
Cherished lies.
“Hope is not a delusion.”
Investing in our stories.
• Our model makes us feel successful.
• We adopt the bosses’ Story.
• We like to keep our beliefs consistent with our experience and actions.
# Inconsistency is stressful, and we work hard to reduce it.
# Small steps can lead to lifetime commitments.
# We try to bring our beliefs in line with our actions, when we take responsibility for those actions.
# Bloody hands: we can't admit we were wrong.
# Stand and be counted: defending our public positions.
# Explanations become commitments.
# Inventing the self.
• The Self is a kind of explanation.
• The Ambassador.
# The Stockholm Syndrome: victims come to identify with their abusers.
• Sweeping back the sea: the endless task of justifying injustice.
• The Green Zone: thinking inside the boxes.
That’s not my story, but I won’t tell you what is: silent resistance.
Action and belief in context of the group.
* * * * * * * * * * * *
2C. THAT’S MY STORY AND I’M STICKING TO IT: Reconciling behavior and belief.
Cherished lies.
The root cause of the occupation's paralysis may have been the cloud of cognitive dissonance that seems to have fogged in Rumsfeld and other senior officials at this time [mid-2003]. They were not finding what they expected: namely, strong evidence of intensive efforts to develop and stockpile nuclear bombs. Meanwhile, they were finding what they had not expected: violent and widespread opposition to the U.S. military presence (Ricks 168).
The defense secretary's vulnerability wasn't that he made errors, it was that he seemed unable to recognize them and make adjustments. . . . As Baghdad was looted, the defense secretary seemed to freeze (169).
There’s no great mystery as to why Halliburton might tell the world Iraq bought uranium from Niger while holding evidence that the claim was false. The mystery is, rather, why it came to believe its own lies; and even more bizarre, why so many of us should take its belief in its own lies as a demonstration of sincerity. Between processing errors and calculated lies there’s a domain of dangerous self-delusion.
All stories are self-serving-- that’s why we tell them. But why do we lie to ourselves when it makes solving problems so much more difficult? “He’ll change,” says the battered wife. “My son died defending freedom,” says the grieving dad of Pfc. Jones, another teen subcontractor blown to smithereens while stealing oil for Halliburton. “I wish Stalin were back,” laments the hungry pensioner. Gay priests-- what could they possibly be thinking? Our capacity to believe in the face of contrary experience is awesome, and well-documented.
Experiments with college students have shown, for example, that once students develop opinions based on information provided by the experimenters, and are later told the information is false, they persist in their opinions even while acknowledging the falseness of the underlying data. In one experiment, students continued to believe in the psychic powers of an amateur magician even as he himself denied them, and even though they agreed that most psychics were fakes.
“The most frequent justification cited was prior belief in psychic phenomena, and in these cases subjects typically made mention only of prior belief. That is, they seldom mentioned anything at all about the particular case they were supposed to be judging. For instance, the complete justification of one subject who estimated that [the performer] was psychic was: ‘My justification for the answer is I am a Christian and I feel strongly that ESP or anything dealing with that is of Satan. Yes, I believe it could happen, but I, being a Christian, will have no part of it’” (Singer & Benassi 62). This student’s comment hints at the value to him of the magician’s performance: it validates the student’s beliefs in magic and in present evil, and gives him a chance to promote his religion.
No group can claim a monopoly on self-delusion. Scientists, sometimes happy to be celebrated for a Mr. Spock-like air of detachment and objectivity, have had their own little episodes-- Blondlot and his N-Rays, visible only to the select; Lowell and the canals of Mars; the “cold fusion” of Pons and Fleischmann; Margaret Mead’s invention of Free Love in the South Seas. Psychology and biology seem to attract more than their share of carelessness and fraud-- Spencer’s Social Darwinism, Nazi racism, Soviet biology, U.S. creationism, psychic research, intelligence studies in many lands.
For the most part, however, such errors persist for political reasons. Blondlot was sustained for a time by the nationalism of French colleagues, eager to steal a march on their German and British rivals. Pons and Fleischmann were pushed into publishing by officials at the University of Utah, who saw the prospect of raking in millions of dollars in grants and royalties. But even in these cases, the errors were discredited by other scientists within a matter of months.
Much more dangerous and long-lasting has been pseudo-science sponsored by powerful authoritarians-- tobacco companies’ cancer research, the campaign to replace science with Christianity in the schools, concentration camp “medical experiments,” and so on. Burt’s fraudulent intelligence studies were accepted for decades, at least in part because he used them explicitly to support the dominant myth that “the wide inequality in personal income is largely, though not entirely, an indirect effect of the wide inequality in innate intelligence” (Gould 1981 MoM). Racists transformed Yerkes’ flawed intelligence testing into immigration laws that ultimately closed the U.S. to Jews fleeing Hitler (Gould 1981 MoM 231-233). Scientists who disagreed with Stalin’s pet agronomist Lysenko lost their jobs, went to jail, or were simply shot.
But these examples do not really pertain to my point. I know already that it’s easy to believe what’s in our self-interest. Again, what I’m interested in here is how we end up with beliefs that do not help us. How do we invest so much in an idea that we hold onto it whatever the cost?
For even where we would seem to have little to gain, sometimes we persist in our beliefs despite contradictory evidence (“disconfirmation,” in researcher parlance), evidence which we ourselves accept. For example, researchers at the University of Virginia gave 132 undergraduates a fake study showing that firefighters’ attitudes toward risk related to their success on the job (Smith 140). The topic was chosen specifically as one the students were not likely to care much about, or have personal beliefs about. After reading the article the students were asked to state their opinions and write an explanation. Afterward, half the students were told the information was made up, and were asked again to give their opinions. Even though they now understood that the underlying information was wrong, all the students who had staked out a strong position in favor of the discredited information stuck to their initial opinions, “arguing that the fabricated data are (and always were) irrelevant . . . . ‘I hold my position because I believe in it. Even if the “article” is fictitious, the reasons I stated are the reasons I believe.’ . . . ‘My reasons are the same as before. I used logical reasoning from the start’” (119). Thus in one casual leap of faith these folks have broken free from the oppressive confines of reality. “This research . . . demonstrated that people often cling tenaciously to their beliefs, even in the face of total evidentiary discrediting” (M.J. Smith 123).
Aren’t we silly? Adorable, really, with all our little quirks and eccentricities. Or, just as I knew all along, all the people around me are idiots. There’s a tendency to treat these errors ironically, with a roll of the eyes and a whatcha-gonna-do? shrug. Talk show hosts and columnists-- like the News of the Weird guy, whose contemptuous scribblings I see in every issue of the liberalish local weekly in my town-- all make their living by making fun of people. The trouble is, we are not stupid. So how do we keep hanging on to stupidities?
Misreading the world can be costly to all of us -- the folks who burned their eyes looking into the sun for miracles at Mother Cabrini Shrine in 1992 (Nickell 199); the sick kids whose parents pray instead of getting them to a doctor; the chemical workers who sabotage community environmental initiatives so they won’t have to admit poisoning themselves and their families; generations of women and men abused and murdered by institutional sexism; and (here you can add your own observations).
Perhaps the most dramatic examples of faith in the face of reality are the end-of-the-world folks. Norman Cohn’s The Pursuit of the Millennium describes many episodes in medieval Europe where people suffered extreme hardship to bring about the Second Coming. In the German town of Münster, for example, thousands of Anabaptists suffered starvation and slaughter in an effort to establish a communist theocracy.
In the course of the British conquest of South Africa, the Xhosa people, having been defeated in war, were swept by a prophecy that the ancestors would return and a great wind would blow the British into the sea. But to prepare the way, the people would have to destroy their grain stocks and slaughter all their cattle. Many did so, but some resisted. The date first prophesied passed without the resurrection:
But when the ‘moon of wonders and dangers’ failed to bring forth the prophecy’s promise, anticipation and joy turned into disillusion and anguish. [King] Kreli issued orders to suspend the Cattle Killing and sent urgent messages to Mhlakaza.
If the delusion had ended at this point, it would not have been the tragedy it became, for the slaughter was not yet wholesale and large numbers of animals remained. It was a curiosity of the phenomenon that this disappointment actually fueled the prophecy and led to its much greater impact in the months to come.
Mhlakaza and Nongquase had an explanation for what came to be known as the First Disappointment of August 16, 1856. The prophecy failed to materialize, they said, because some of the people had sold their cattle rather than slaughtering them (Bulgatz 168).
Kreli resumed the slaughter. Believers started attacking unbelievers and destroyed the remaining cattle. Two more resurrection dates came and went, and 20,000 to 40,000 people were murdered or starved to death before the movement fell apart.
(The Lysenko episode in Russia was a more recent, top-down parallel to the Mhlakaza movement. Having helped cause murderous famine with their policies of collectivizing peasant land, the Communists were desperate to reverse the agricultural disaster. So they elevated Lysenko and other hucksters to the top of Soviet science, on the basis of their claims that they could double or triple farm production. That millions of peasants starved did not shake Stalin's faith in these miracle workers. Gratzer 182ff.)
Festinger, Riecken and Schachter describe a less gruesome event, the Millerite movement of the 1840s. In 1818 William Miller, a New England farmer, prophesied the end of the world in 1843. Despite some opposition by mainstream churches, a lot of people took him seriously, and by 1840 the movement had thousands of adherents, mobilized through camp meetings and religious newspapers and tracts. As 1843 approached, many sold or gave away their belongings in preparation for the Second Coming. The usual succession of dates and disappointments took place. Festinger notes, however, that the disappointments, far from alienating the faithful, actually intensified their enthusiasm.
The two partial disconfirmations (April 23, 1843, and the end of the calendar year 1843) and one complete and unequivocal disconfirmation (March 21, 1844) served simply to strengthen conviction that the Coming was near at hand and to increase the time and energy that Miller’s adherents spent trying to convince others:
‘Perhaps not so much from the preaching and writing of [Rev. Samuel] Snow, as from a deep conviction that the end of all things could not be far away, some of the believers in Northern New Hampshire, even before the summer began, failed to plow their fields because the Lord would surely come “before another winter.”’ (Festinger 19).
Devotees demanded another date: “It is interesting that it was the insistence of the ordinary members of the Millerite movement that the October date be accepted. The leaders of the movement resisted and counseled against it for a long time but to no avail” (20). It was the failure of the world to end on October 22, 1844 that finally broke the movement, although some of these believers went on to become Seventh Day Adventists, a highly successful church.
*~*~*~*~*~*~*~*~*~*~*~*~*~*~* Hope is not a delusion. *~*~*~*~*~*~*~*~*~*~*~*~*
Highlander co-founder Myles Horton used to say that political movements rise out of hope, not from desperation. We don’t need guarantees to be able to take risks, but we do need a sense of possibility. Sometimes we need a fallback story, too, to rationalize failure.
Hope doesn’t always reflect a realistic assessment of the situation. Albert Hirschman figures that we’d rarely take on hard tasks if we knew all the risks involved. But, he writes, our underestimation of the difficulties balances our underestimation of our own capability, so it’s just as well we can’t foresee all the obstacles. This is true of course on a personal level, as any parent can attest.
After reviewing years of research, Shelley E. Taylor and Jonathan D. Brown (1988) concluded that well-being virtually depends on illusions of ‘overly positive self-evaluations, exaggerated perceptions of control or mastery, and unrealistic optimism.’ These ‘positive illusions,’ Taylor (1989) argues, are not only characteristic of human thought, but also necessary for the usual criteria of mental health: the ability to care about others, the ability to be contented, the ability to work productively . . . . Positive illusions are especially useful when people are threatened with illness, crisis, or attacks on their self-esteem. This strategy is adaptive and healthy, because it allows people to respond with hope for the future (Wade & Tavris 1993, p. 552).
Groups and movements need hope too. Scott talks of the key role of rumors in peasant and slave insurrections: as absurd as they were, in many cases tales that the king had abolished feudal taxes sparked widespread resistance to the lords’ demands. “The people have been told that the king wishes every man to be equal, that he wants neither bishops nor lords; no more rank; no more tithes or seigneurial rights. And so these poor misguided people believe they are exercising their rights and obeying their king” (1990 p. 146). And when Haitians heard that the king had abolished slavery, which was not true, they carried out history’s only successful slave revolt.
Scott doesn’t tell us what happened when slaves and peasants figured out whose side the king was really on. Perhaps they simply denied contrary evidence, even as they were being slaughtered and crucified by the king’s vassals. Maybe the belief in the king’s authorization was important less as a primary motivator than as a way to move potential recruits from unrest to uprising, and to initiate action in some kind of coordinated or at least coincident fashion. “It is not a simple matter to determine the proportions of wish fulfillment and willful misunderstanding that went into these utopian readings. What is certain, however, is that like Russian peasants interpreting the czar’s wishes, their interpretations were very much in line with their interests” (146).
It seems some ideologies did evolve to the point of shifting loyalties from the real king to a leader of the uprising in the guise of “the true emperor.” It would be tempting to blame the ultimate failure of almost all of these revolts on their unrealistic assessments and expectations-- this is the central thesis of Hobsbawm’s Primitive Rebels-- but it might be just as true that folk mythology, false rumors and the historically absurd “We Shall Overcome” were among the best organizing tools available.
Which raises the question, can mass movements do without illusions? Can we mobilize tens of thousands of people, get them to take incredible risks in confronting a very powerful criminal class, on the basis of the scary truth that we can have no guarantee of the outcome? Later I’ll touch on the roles of self-interest and expectation in shaping our decisions. I think there are two main dimensions of the problem. First, folks are certainly capable of assessing the risks, and sometimes they do indeed take the dangerous course. Second, those moments of being willing to stick our necks out happen at different times with different folks; to succeed, people have to make that decision together. Armies spend years training soldiers to be ready for unified combat; slave culture and peasant culture and working class culture offer some of that training, but more urgently teach folks how to survive from day to day under unspeakable tyranny.
My guess is that truer information makes better tools than less true information does. Rebels need to be able to trust each other, and it’s too easy to see through lies (especially those that contradict the king’s story). To fight effectively we need to know the enemy and ourselves with utmost clarity. We need to be able to absorb defeat without being demoralized by the shock. And when we win, we have to be ready to govern without lies.
*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*