Groupthink


Groupthink is a type of thought exhibited by group members who try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas. Groupthink may cause groups to make hasty, irrational decisions, in which individual doubts are set aside for fear of upsetting the group’s balance. The term is usually applied pejoratively, with hindsight, after a bad decision has produced its results.



Origin

The term was coined in 1952 by William H. Whyte in Fortune:

Groupthink being a coinage — and, admittedly, a loaded one — a working definition is in order. We are not talking about mere instinctive conformity — it is, after all, a perennial failing of mankind. What we are talking about is a rationalized conformity — an open, articulate philosophy which holds that group values are not only expedient but right and good as well.

Irving Janis, who did extensive work on the subject, defined groupthink as:

A mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action. [1]

The word groupthink was intended to be reminiscent of Newspeak words such as "doublethink" and "duckspeak", from George Orwell's Nineteen Eighty-Four.

Causes of groupthink

  • Highly cohesive groups are much more likely to engage in groupthink. The more cohesive the group, the less willing members are to raise questions that might break that cohesion.
  • The group isolates itself from outside experts. To make a well-informed decision, the group needs to invite qualified experts who can help weigh the possible risks.
  • Strong, directive leadership encourages groupthink, because the leader is more likely to promote his or her own solution.

Social psychologist Clark McCauley identified three conditions under which groupthink occurs:

  • Directive leadership.
  • Homogeneity of members' social background and ideology.
  • Isolation of the group from outside sources of information and analysis.

Symptoms of groupthink

To make groupthink testable, Irving Janis devised eight symptoms that are indicative of groupthink (1977):

  1. An illusion of invulnerability, which creates excessive optimism and encourages risk taking.
  2. Collective rationalization: discounting warnings that might challenge the group’s assumptions.
  3. An unquestioned belief in the group’s morality, causing members to ignore the consequences of their actions.
  4. Stereotyped views of out-groups and enemy leaders.
  5. Direct pressure on members of the group who disagree to conform.
  6. Self-censorship of ideas that deviate from the apparent group consensus.
  7. An illusion of unanimity, in which silence is interpreted as agreement.
  8. Mindguards: self-appointed members who shield the group from dissenting opinions and information.

Classic cases of groupthink

Most classic cases of groupthink come from government; the presidential cabinet and NASA have been studied most closely. Such groups operate under extremely high stress and directive leadership, making it easy for them to slip into groupthink. In the aftermath of the Space Shuttle Challenger disaster, NASA brought in sociologists to examine how its groups had failed to prevent the disaster (Giddens 114-15). The sociologists concluded that individual fears had been suppressed in order to meet the launch deadline.

Space Shuttle Challenger disaster (1986)

The Space Shuttle Challenger disaster is a classic case of groupthink. Challenger broke apart shortly after liftoff on January 28, 1986 (Vaughan 33). The launch had originally been scheduled for January 22, but a series of problems pushed the date back, and scientists and engineers throughout NASA were eager to get the mission underway. The day before the launch, an engineer raised a concern about the O-rings in the booster rockets. Several conference calls were held to discuss the problem, and the decision was made to go ahead with the launch.

The group involved in the Challenger decision displayed several of the symptoms of groupthink. It ignored warnings that contradicted the group’s goal of getting the launch off as soon as possible, and that goal proved a fatal mistake. The group also suffered from an illusion of invulnerability: up to that point, NASA had an almost spotless safety record. Its members failed to fully examine the risks of their decision, treating them as unimportant. A further factor suppressing the few engineers who were "going against the grain" and "sounding the alarm" was outside pressure: all eyes were on NASA not to delay the launch, and Congress was considering large new funding for NASA amid the publicity surrounding the Teacher in Space program. These misjudgments led to the deaths of all seven crew members and left a lasting mark on NASA’s near-perfect safety record.

Bay of Pigs invasion (1961)

Another closely studied case of groupthink is the 1961 Bay of Pigs invasion (Giddens 109). The plan was to train a group of Cuban exiles to invade Cuba and spark a revolution against Fidel Castro’s communist regime.

The plan was fatally flawed from the beginning, but none of President Kennedy’s top advisers spoke out against it. Kennedy’s advisers also exhibited the main characteristics of groupthink: they had all been educated at the country's top universities, which made them a very cohesive group, and they were all afraid to speak out against the plan because they did not want to upset the president. The President's brother, Robert Kennedy, took on the role of a "mindguard", telling dissenters that raising objections was a waste of time because the President had already made up his mind.[2]

Ultimately, the failure of the Bay of Pigs invasion has been attributed largely to the influence of groupthink, which insulated the decision-making process from criticism.


Preventing groupthink

According to Irving Janis, decision-making groups are not necessarily doomed to groupthink, and there are several ways to prevent it. Janis devised seven guidelines for preventing groupthink (209-15):

  1. Leaders should assign each member the role of “critical evaluator”. This allows each member to freely air objections and doubts.
  2. Higher-ups should not express an opinion when assigning a task to a group.
  3. The organization should set up several independent groups, working on the same problem.
  4. All effective alternatives should be examined.
  5. Each member should discuss the group's ideas with trusted people outside of the group.
  6. The group should invite outside experts into meetings. Group members should be allowed to discuss with and question the outside experts.
  7. At least one group member should be assigned the role of devil’s advocate. This should be a different person for each meeting.

By following these guidelines, groups can reduce the risk of groupthink. After the Bay of Pigs fiasco, John F. Kennedy sought to avoid groupthink during the Cuban Missile Crisis.[3] During meetings, he invited outside experts to share their viewpoints and allowed group members to question them carefully. He also encouraged group members to discuss possible solutions with trusted people within their separate departments, and he even divided the group into various sub-groups in order to partially break up group cohesion. Kennedy was deliberately absent from the meetings, so as to avoid pressing his own opinion. Ultimately, the Cuban Missile Crisis was resolved peacefully, thanks in part to these measures.

Criticism

There has been much criticism of groupthink theory. Robert S. Baron contends that recent investigation and testing have not been able to support the connection between certain antecedents and groupthink (219-253). This may simply be because groupthink theory is very difficult to test in a laboratory using scientific methods: it is impossible to recreate in the lab the conditions under which important government groups work, or the levels of stress and pressure experienced by high-level government officials with the future of an entire nation hanging in the balance. Ahlfinger and Esser came to the same conclusion (40), stating after their study that better methods of testing Janis' symptoms were needed. Baron also contends that the groupthink model applies to a far wider range of groups than Janis originally concluded; this remains to be tested. However, it can be speculated that many people who have worked in a group setting can identify some of the symptoms of groupthink[citation needed].

References

  1. ^ Janis, Irving L. Victims of Groupthink. Boston: Houghton Mifflin Company, 1972, page 9.
  2. ^ Janis, Irving L. Ibid., page 41.
  3. ^ Janis, Irving L. Ibid., pages 148-149.

Baron, R. S. (2005). "So Right It's Wrong: Groupthink and the Ubiquitous Nature of Polarized Group Decision Making." In Zanna, Mark P. (Ed.), Advances in Experimental Social Psychology, Vol. 37 (pp. 219-253). San Diego: Elsevier Academic Press.

Giddens, Anthony, Mitchell Duneier, and Richard P. Appelbaum. Essentials of Sociology. New York: W.W. Norton & Company, 2006.

McCauley, Clark. "The Nature of Social Influence in Groupthink: Compliance and Internalization." Journal of Personality and Social Psychology 57 (1989): 250-260.

Richardson Ahlfinger, Noni, and James K. Esser. "Testing the Groupthink Model: Effects of Promotional Leadership and Conformity Predisposition." Social Behavior and Personality (2001): 31-42.

Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press, 1996.

Further reading

Surowiecki, James. The Wisdom of Crowds.
