Monday, July 27, 2009

• Like selective perception, social roles simplify our lives by limiting information and options.

In 1973 psychologist David Rosenhan placed several volunteers, including himself, in twelve mental hospitals in different communities. The volunteers checked in complaining of hearing voices.

Once in the hospital, the pseudopatients immediately stopped faking any symptoms. All behaved normally and ‘sanely.’ Nevertheless, they were kept in the hospital from 7 to 52 days, an average of 19 days. When they were released, it was with the same diagnosis, now in ‘remission.’
Although this study is usually used to illustrate problems of psychiatric diagnosis, it also shows the impact on behavior of being in the role of patient or attendant. Once the pseudopatients were diagnosed as schizophrenic and assigned the role of patient, the hospital staff regarded everything they did as further signs of emotional disorder. For example, all the pseudopatients took frequent notes about their experiences. Several nurses noted this act in the records without asking them what they were writing about. One nurse wrote ‘patient engages in writing behavior,’ as if writing were an odd thing to do.
. . . . In all 12 hospitals, the staff avoided eye contact and conversation with the patients as much as possible. . . . Of 1,283 attempts to talk to nurses and attendants, only half of one percent succeeded. The usual response to a pseudopatient’s effort to communicate was the staff member’s equal effort to avoid discussion. The result was often a bizarre exchange like this one:
PSEUDOPATIENT: Pardon me, Dr X. Could you tell me when I am eligible for grounds privileges?
PSYCHIATRIST: Good morning, Dave. How are you today? [moves off without waiting for a response]
. . . . Rosenhan himself observed actual patients who were beaten by staff members for trying to talk to them (Wade, Tavris 1993 p. 645).

Well-known school-based prejudice role-plays and prisoner/jailer simulations (W,T 1993 644) confirm how powerfully our adopted roles influence what we see and how we behave.


• How many angels can dance on the head of a pin?-- drawing conclusions from poor or incomplete information.

One of the reasons why intelligent adults persist in their follies, [psychologists Daniel Kahneman and Amos Tversky] believe, is that when confronted with a puzzle to solve, they are immoderately inclined to make sense of ‘worthless’ information. They seem unable to discard rubbishy data, which means that they fail to learn that other, higher-quality data are the key to an intelligent solution of the puzzle (Jeremy Campbell 234).

Have you ever caught sight of the back of someone’s head, just a glimpse, and hurried up to greet them, only to find it’s not your friend at all? Or maybe you’ve been waiting for a friend to call, and the phone rings, and you pick it up and start talking, only to find the person on the other end is not the person you expected. We see the tag end of something, it matches a pattern we know, and we think we see the pattern itself-- a bit like the way the tail of a morphine molecule fits an endorphin receptor. A lot of times we’re right, but sometimes we’re way off base. “In Center Harbor, Maine, local legend has it that veteran newscaster Walter Cronkite was sailing into port one day when he heard a small crowd on shore shouting ‘Hello, Walter . . . Hello, Walter.’ Pleased, he waved and took a bow. Only when he ran aground did he realize what they had really been shouting: ‘Low water . . . Low water’” (W,T 186).

We often have to make decisions without having enough information. Being born story-tellers, almost literally, we can fashion explanations and decisions from the flimsiest materials. As mentioned above, our schemas and expectations help us fill in the gaps. But sometimes we end up seeing what’s not really there: monsters in the closet, Jesus in a ketchup stain. “What a person holds to be true about the world can affect the interpretation of ambiguous sensory signals” (W,T 1993 185). We may have all sorts of personal reasons to delude ourselves, but a lot of times we make mistakes simply because we don’t sort good information from bad, or because we try to draw big conclusions from very limited data.

Usually we can check our first impressions right away. That person on the street who looks like your friend-- you walk up next to him and look at his face to see if it’s really who you thought.

But imagine this: you see someone walking ahead of you, you think it's your friend, you go up for a closer look, and it doesn't look like your friend at all. Then you realize, he's had plastic surgery. Fred! you call out, How ya? Fred looks at you, startled. He doesn't recognize you. You give him a big sloppy smack on the lips. Now he looks kind of annoyed. Then you realize, Fred has been kidnapped by the Illuminati and brainwashed to assassinate the Archduke. So you clonk him over the head with the butt of your machine pistol, push him into the limo and bundle him off to the safe house.

That's how Dick Cheney operates. You can learn how to do it, too. Glom onto the tiniest fragment of information. I recommend those sticky strips that pick up lint. All you need is a smidge; don't bother looking for more. Instead, starting with that nanobyte of data, concoct a grand fantasy and start to act on it.

Sometimes the information is just bad. Maybe it's self-serving, like the testimony about Saddam's nuclear program. Hey, I got a great source. Utterly reliable. I know, because I pay him for every word.

A lot of trashy data is no better than a little. You could hire ten liars. That wouldn't make their lies any truer.

Sometimes we have no source, and cannot judge the quality of the information, or the difference between the original version and the one we hear. A guy in line at the convenience store said his chiropractor heard her son's boyfriend got it off the internet . . . .

Sometimes we treat speculation as documented fact. "It could have happened this way" comes to mean, "It must have happened this way." There is a whole literature built on just this device. Can the shrinking ice caps be sinking into the Hollow Earth? Could streetlights be secret transmitters for the New World Order? Was that really Hillary Clinton in Jakarta, or an alien replica? Could Moses be Sesom spelled backwards? Probably more often we see speculation as news; much cheaper for the talking heads to tell us what they think (based on what other talking heads have said) than to dig up and verify actual firsthand information.

So those are some of the ways we get poor quality information. An overlapping category might be called incomplete information-- so fragmentary and removed from context that it shouldn't be used, but we use it anyway. We can misuse anecdotes and numbers with equal ease.

Most of us prefer to tell and listen to anecdotes --short stories about a situation or event-- because our worldly mind thrives on detail that reminds us of our own personal experience. We can solve the relationship problems of Carey, Sam and Mei much more easily than a logic puzzle with participants A, B and C. No wonder politicians and reporters alike prefer “human interest” stories to facts and numbers that might actually help us make good decisions; anecdotes are easy to understand, even when they’re false, and cheap to collect. Testifyin' is as popular in community education circles as in religious gatherings, and is often scripted, though it's not supposed to be. At the same time, because we need details to compare situations and understand cause and effect, a more formal version, the case study, is the central teaching method of many classes in business and law.

But anecdotes can help us only if they are traceable, accurate, and understood in context. If I ask my mom who she thinks is going to win the election, how the mortgage meltdown has affected her, or when Global Warming will arrive in Tennessee, her answers won't help me much unless I understand her expertise, her assumptions and how representative she is. (See the discussion of statistics below, under "We've got to trust . . . .") Think about what a task it would be to gauge Iraqi attitudes toward the U.S. invasion. Who are you going to ask? How obligated do they feel to tell you the truth? How can you compare what they say now to what they might have thought 2 years ago? We get plenty of news stories based on a couple interviews, but they give us very little to act on. It would be like picking up rocks by the railroad tracks, and trying to guess from them the local geology. It doesn't work. They are taken out of context.

The usual way to put anecdotes in context is to back them up with statistics, and that works pretty well. Mom belongs to this set of people, and 90% of them agree with her. Anecdotes plus numbers can give us depth and breadth of understanding. But just adding up anecdotes, with no way to judge how representative they are, does not help us.
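
Here's a toy sketch of that last point, with numbers I've simply invented (none of this comes from the sources cited here): if the stories all come from one unrepresentative circle, piling up more of them just repeats the same mistake, while even a small random sample lands near the truth.

```python
import random

random.seed(42)

# Hypothetical population of 10,000; in truth only 30% agree with Mom.
population = [True] * 3000 + [False] * 7000

# Collecting anecdotes from Mom's circle, where 90% agree with her...
moms_circle = [random.random() < 0.9 for _ in range(20)]

# ...versus a random sample of the same size from the whole population.
random_sample = random.sample(population, 20)

print("Mom's circle agrees:", sum(moms_circle) / 20)     # around 0.9 -- misleading
print("Random sample agrees:", sum(random_sample) / 20)  # around 0.3 -- near the truth
```

Twenty anecdotes or two hundred, it makes no difference: more stories from the same circle just repeat its bias with greater conviction.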

Numbers tossed about in isolation suffer the same drawbacks as anecdotes alone: they can easily be taken out of context. Under former Ford exec Robert McNamara, the U.S. conducted the war in Viet Nam by the numbers: mostly inputs, like how many thousands of tons of bombs dropped, and also outputs, such as General Westmoreland's famous "body counts." You had to be on the ground to see that massive bombing did not stop the flow of supplies to the Viet Cong, and that "enemies killed" very often meant whole villages obliterated, children incinerated.

So the main thing is to understand how one bit of information relates to all the rest. What we can do with any single datum is ask a bunch of questions. We'd be foolish to act on it in isolation.

(This is a good place for me to explain what I think I'm doing with all the anecdotes and examples in these pages. I am not trying to prove my points but to illustrate them, and to raise questions. This is nothing like a scientific investigation. Neither the ideas nor the data are new. Rather than break new ground I hope to be clear enough so that readers can recognize these patterns in their own experience-- and fashion them into tools we can use.)

Acting on bad information can get to be a habit. Our mistakes cascade into crisis, and push us even harder to make uninformed, ill-considered decisions. Some people have made so many mistakes that they give up on all but the simplest decisions, and make impulsive choices a way of life. They come to rely on psychics, astrologers, and superstitious rules to guide them-- a fundamentally authoritarian way to see the world.


• It’s hard to see or anticipate massive “threshold effects”.

The world mostly changes bit by bit-- we get hungry, we get a little older, we make a little more or less money, the kid wins a scholarship, the strip mall engulfs a couple miles more of countryside, computer corps nibble away at our privacy, politicians kill a few more people. Or something as simple and profound as the partner waking up in A Mood. You can tell by the tone of his voice; and by now you know how to respond. So we are used to continuous feedback from our surroundings, and to adjusting our course as we get new information. If it feels like we're going in the wrong direction, we count on being able to change course.

We are the ultimately adaptable species; we adjust incrementally to incremental changes. We may not like the changes, but we tighten our belts or depend more on the family or write a letter to Congress. The classic business cycles of the textbooks are examples of self-adjusting social systems on a large scale: businesses make big profits, they hire more workers, the labor pool gets smaller, businesses have to raise wages to attract additional workers, their profits fall, they lay off workers, their profits increase, they hire more workers, and so on. The Deciders make big mistakes when they don't take into account our ability to respond to changes, as when Halliburton grabbed the oil fields of Iraq, and discovered that the Iraqi people are not the bunch of meek, dispirited, manipulable, resourceless sheep it had supposed.
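
Just to make the loop visible, here's a toy simulation (every coefficient is my own invention, not an economic model anyone actually uses): profits drive hiring, a tighter labor pool drives wages up, higher wages eat the profits, and the whole thing settles toward an equilibrium instead of running away.

```python
# Toy business cycle: profit -> hiring -> wages -> profit.
# All the numbers are made up; the point is the negative feedback.

employment = 0.90   # fraction of the labor force employed
wage = 1.00         # average wage, in arbitrary units

for year in range(12):
    profit = 1.5 - wage - 0.5 * employment   # wages and hiring eat into profit
    employment += 0.10 * profit              # good profits -> more hiring
    wage += 0.20 * (employment - 0.95)       # tight labor market -> higher wages
    print(f"year {year:2d}: employment={employment:.3f} "
          f"wage={wage:.3f} profit={profit:+.3f}")
# The numbers oscillate and damp out: a self-adjusting system.
```

Run it and the swings shrink year by year, which is the textbook story; the trouble comes, as we'll see, when the swings stop damping.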

Not being particularly logical people --thank goodness!-- rarely do we set a course and prepare for every contingency and hold to it no matter what. Instead, we set out in a general direction with high hopes there will be something on the other side. We can’t know exactly what we’ll encounter, so we are always taking in new information and trimming our sails to fit the changing winds of this stormy world. We can usually compensate for contrary weather.

But there are some changes that are too sudden and massive to adjust to, and so strongly one-directional they can’t be rolled back in any meaningful time span. Moreover, because these are changes in quality, not just quantity, we get little warning, or can’t recognize the warnings we do see. Our minds operate well on the assumption that tomorrow will be slightly but not vastly different from today, which is usually the case.

One name for these big changes is “threshold effects”; more recently I've seen the TV heads talk of "tipping points". One prof I had gave the example of water in a teakettle: you turn on the burner, and the water gets hotter and hotter and hotter and then, pretty quickly, it starts becoming something quite different, steam. Now, just add the frogs of the gruesome urban legend, and that’s us: floating complacently in the pot, feeling a little warmer, maybe, sort of cozy, and we don’t realize what’s happening until it’s too late.
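
The kettle even makes a nice back-of-the-envelope calculation (the burner wattage is my invention; the constants for water are standard physics): the burner pours in heat at a perfectly steady rate, and nothing in that smooth input announces the moment the water stops being water.

```python
# A steady 1 kW burner under 1 kg of water: smooth input, sudden change of kind.
# c = 4186 J/(kg*C) is water's specific heat; 2.26e6 J/kg its heat of vaporization.

c, latent = 4186.0, 2.26e6
mass, power = 1.0, 1000.0       # kg of water, watts from the burner
temp, steam = 20.0, 0.0         # starting temperature (C), kg boiled off

for minute in range(5, 65, 5):
    energy = power * 60 * 5     # joules delivered in each 5-minute step
    if temp < 100.0:
        needed = mass * c * (100.0 - temp)   # heat left before boiling
        if energy < needed:
            temp += energy / (mass * c)
            energy = 0.0
        else:
            temp, energy = 100.0, energy - needed
    steam = min(mass, steam + energy / latent)   # past 100 C, heat makes steam
    print(f"minute {minute:2d}: {temp:5.1f} C, {steam*1000:4.0f} g boiled away")
```

For the first ten minutes the readout is reassuringly incremental-- a few degrees warmer each step-- and then the same steady input starts emptying the kettle.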

Scientists have begun to understand some of the thresholds we may be approaching (or have crossed, unwittingly), such as global climate change and the sixth mass extinction in the planet’s history. If these changes are under way, there’s no going back. But it’s hard to grasp what the computer models tell us. Changes this massive run counter to our experience and possibly the very structure of our minds. The world’s temperature has increased only a couple degrees in the last century, that’s not so much, is it? It’s a shame Halliburton turned its back on the Kyoto treaty, but surely we have a couple more centuries to figure out what to do. It’s a shame they’re cutting down the last forests, but I know! let’s send away for seedlings and have a tree-planting campaign.

What’s missing here, of course, is the understanding that, after a certain point, some trends go only one way for a very long time: that once the oceans are saturated with carbon, all the additional carbon goes into the atmosphere, so the greenhouse effect will accelerate even if we reduce our use of fossil fuels; once the ice-caps have melted, it will be an awfully big job to refreeze them; once we wipe out a supremely complex forest eco-system, a hundred or a thousand newly planted trees cannot resurrect it.

Social systems have thresholds, too, and we understand even less about them. How do business booms turn into speculative bubbles, and then depressions? What triggers the self-perpetuating spiral of accusation that brings a society to burn tens of thousands of women as witches? When do generations of ethnic rivalry cross over into seemingly unstoppable genocide, as in Rwanda, and India in the late 1940s? At what point do hunger, corruption and brutality begin to spawn mass movements of mass murder, as with the Khmer Rouge in Cambodia and Shining Path in Peru?

Sometimes we miscalculate the tipping points. Years ago the Club of Rome (a disco club, I think it was, you had to wear a toga and olive leaves to get in) predicted that, based on known oil reserves and increasing rates of consumption, we’d run out of oil in a few decades. That hasn’t happened; the oil companies have discovered new reserves and new ways to reach them; we just have to pay more. There have been similar short-term improvements in global food supply; despite dire predictions, the world is now feeding twice as many people as when I was born.

The fact that the crisis hasn't arrived doesn't guarantee it's not lurking around the corner. The actual total of oil hasn’t changed; it takes millions of years to make the stuff; we will run out (sometime before or after Florida slips beneath the Atlantic). Likewise, fertile land and clean water are in finite supply, and we are spoiling them at rates that threaten the global food supply. In this regard we are fortunate that complex systems, evolved over long periods of time, have numerous redundancies and negative feedback loops that dampen change. For instance, global warming may end up making Europe colder-- something to do with Greenland glaciers breaking up and cutting off the Gulf Stream. In this case one catastrophic threshold effect would weaken another.

At some point, however, no matter how complex and resilient a system might be, many incremental changes can add up to a major phase change. You turn up the heat, and the water keeps being water-- until it’s not. It's just hard for us to notice the accumulations until it's too late. As Diamond points out, “Perhaps the commonest circumstance under which societies fail to perceive a problem is when it takes the form of a slow trend concealed by wide up-and-down fluctuations. The primary example in modern times is global warming” (2005 p. 425). Apparently major climate and environmental changes have taken place relatively quickly in the past; just not in our personal pasts, and we are ill-equipped to imagine them, or to act before it’s too late.
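
Diamond's "slow trend concealed by wide fluctuations" is easy to reproduce with invented numbers (this sketch is mine, not his data): bury a small steady rise under big year-to-year swings and any short stretch looks like noise, while over a century the series walks away from where it started.

```python
import random

random.seed(1)

# Invented series: a slow trend (+0.02 degrees/year) under big swings (+/- 1.0).
temps = [0.02 * year + random.uniform(-1.0, 1.0) for year in range(100)]

def decade_mean(start):
    return sum(temps[start:start + 10]) / 10

# Comparing one decade to the next looks like noise...
print("decade 1 vs decade 2:", round(decade_mean(10) - decade_mean(0), 2))

# ...while the first and last decades belong to different climates.
print("first vs last decade:", round(decade_mean(90) - decade_mean(0), 2))
```

The decade-to-decade difference is swamped by the swings; the century-long difference is roughly two full degrees, and no single year ever announced it.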

With the examples I mention here, there are clear paths from small changes at the beginning to large ones later on, even if we expect the changes to take place more slowly than they really do. There's a contrary mistake we make sometimes, assuming a false or automatic continuum from one situation to another. We killed a lot of Asians in the 1960s, supposedly to prevent the nations of Asia from falling "like dominos" to the Soviet empire. Saddam Hussein had a nuclear weapons program in the 1980s, so he was about to drop atom bombs on the U.S. in 2003. Some anti-abortion activists see abortion as the first step leading to infanticide, and then to euthanizing old people. The "slippery slope" is a common term in this kind of thinking. Sometimes there is a single continuous path from the first step to a particular destination. Sometimes we can only justify the costs of our commitments by pointing to an inflated danger way down a mythical slippery slope.

Sometimes we may underestimate risks because we assume a threshold that's not really there. The EPA sets standards for radiation doses or chemicals in the water supply based on research about the hazards. So many parts per million of this chemical can be expected to cause so many cancers or kidney failures. While most of these studies can only give us gross probabilities, and have barely explored what happens when many chemicals interact, they usually show that the risks of a toxin are proportional to the amounts-- increase the amount by 100%, and presumably you get about 100% more cancers. Taking into consideration the health benefits and economic costs of raising or lowering the standards, the EPA then sets limits on how much poison it will allow in the air and water. We commonly take EPA standards to mean that poisons below the allowed limits are safe. They are not. They are less dangerous than larger amounts, that's all.
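
In code, that proportional assumption looks like this (the slope and the limit are numbers I made up; the proportional model itself is the standard "linear no-threshold" assumption, which real toxicology only sometimes supports): the risk line passes through zero only at zero dose, so "below the limit" means less risk, not none.

```python
# Linear no-threshold sketch: risk proportional to dose, with no "safe" floor.
# The slope is invented: 2 extra cancers per 100,000 people per ppb of toxin.

SLOPE = 2.0  # excess cancers per 100,000 people, per part-per-billion

def excess_cancers(dose_ppb, population=100_000):
    """Expected extra cancers under a proportional (no-threshold) model."""
    return SLOPE * dose_ppb * population / 100_000

limit = 5.0  # a made-up regulatory limit, in ppb
for dose in (limit, limit / 2, limit / 10):
    print(f"{dose:.1f} ppb -> {excess_cancers(dose):.1f} extra cancers per 100,000")
# 2.5 ppb is "legal" but still implies about 5 extra cancers per 100,000 people.
```

Half the legal limit still buys you half the cancers; the only dose with zero expected harm, under this model, is zero.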
