Is 'Groupthink' Part of the Problem with NASA Again?

LEAD STORY-DATELINE: The New York Times, 23 March 2003.

In a news piece, John Schwartz and Matthew Wald note that a body of research that is getting more and more attention points to the ways smart people working collectively can be dumber than the sum of their brains. The issue came into sharp focus in Houston in February 2003 at the first public hearing of the board investigating the Feb. 1 Columbia disaster. Henry McDonald, a former director of the NASA Ames Research Center, testifying before the board, said that officials at the space agency want to do the right thing but cannot always get the facts they need. Investigators are questioning the quick analysis by Boeing engineers that NASA used to decide early in the Columbia mission that falling foam did not endanger the shuttle, even though the foam strike is now considered one of the leading candidates for the cause of the craft's breakup.

Because the engineers directly connected to the process were satisfied that the foam was not a risk, they did not pass the results of their discussions up the line, even though those results suggested the material could cause catastrophic damage. But other engineers who had been consulted became increasingly concerned and frustrated. The shuttle investigation may conclude that NASA did nothing wrong. But if part of the problem turns out to be the culture of decision making at NASA, it could bring renewed attention to group dynamics and to terms such as groupthink, an ungainly term coined in 1972 by Irving Janis, a Yale psychologist and a pioneer in the study of social dynamics. He called groupthink "a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action." It is the triumph of concurrence over good sense, and authority over expertise.

It would not be the first time the term has been applied to NASA. Janis, who died in 1990, cited the phenomenon after the loss of Challenger and its crew in 1986. The official inquiry into that disaster found "a serious flaw in the decision-making process leading up to the launch." Worries about the O-rings circulated within the agency for months before the accident, but "NASA appeared to be requiring a contractor to prove that it was not safe to launch, rather than proving it was safe." "As you go up the chain, you're generally asked harder and harder questions by people who have more and more control over your future," said David Lochbaum, a nuclear engineer at the Union of Concerned Scientists. The group answering the questions then tends to agree upon a single answer, and to be reluctant to admit it when they don't have a complete answer.

Continue here for full article:

Is 'Groupthink' Part of the Problem with NASA Again? The process, also blamed for the events that led to the 1986 Challenger disaster, puts members' desire to agree ahead of possible alternatives.
By John Schwartz and Matthew Wald; The Grand Rapids Press, Grand Rapids, Mich.; March 23, 2003

HOUSTON -- At NASA, it really is rocket science, and the decision-makers really are rocket scientists. 

But a body of research that is getting more and more attention points to the ways smart people working collectively can be dumber than the sum of their brains. 

The issue came into sharp focus in Houston last month at the first public hearing of the board investigating the Feb. 1 Columbia disaster. Henry McDonald, a former director of the NASA Ames Research Center, testifying before the board, said that officials at the space agency want to do the right thing, but cannot always get the facts they need. 

In fact, NASA's databases are out of date. For example, the agency cannot easily pull together its data on damage to the shuttle from previous flights and search it for trends and warning signs. 

NASA officials are working to return the shuttle to orbit as early as fall, with plans to quickly correct any flaws in the system uncovered by the board investigating the Columbia accident. 

Investigators also are questioning the quick analysis by Boeing engineers that NASA used to decide early in the Columbia mission that falling foam did not endanger the shuttle, even though the foam strike is now considered one of the leading candidates for the cause of the craft's breakup. 

Because the engineers directly connected to the process were satisfied that the foam was not a risk, they did not pass the results of their discussions up the line, even though those results suggested the material could cause catastrophic damage. But other engineers who had been consulted became increasingly concerned and frustrated. 

"Any more activity today on the tile damage, or are people just relegated to crossing their fingers and hoping for the best?" asked a landing gear specialist, Robert H. Daugherty, in a Jan. 28 e-mail message to an engineer at the Johnson Space Center, just days before the shuttle disintegrated on Feb. 1. 

The shuttle investigation may conclude that NASA did nothing wrong. But if part of the problem turns out to be the culture of decision making at NASA, it could bring renewed attention to group dynamics and to terms such as groupthink, an ungainly term coined in 1972 by Irving Janis, a Yale psychologist and a pioneer in the study of social dynamics. 

He called groupthink "a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action." It is the triumph of concurrence over good sense, and authority over expertise. 

It would not be the first time the term has been applied to NASA. Janis, who died in 1990, cited the phenomenon after the loss of Challenger and its crew in 1986. 

The official inquiry into the Challenger disaster found that the direct cause was the failure of an O-ring seal on the right solid-rocket booster, which caused the shuttle to explode 73 seconds after launch. 

But the commission also found "a serious flaw in the decision-making process leading up to the launch." Worries about the O-rings circulated within the agency for months before the accident, but "NASA appeared to be requiring a contractor to prove that it was not safe to launch, rather than proving it was safe." 

Groupthink, Janis said, was not limited to NASA. He found it in the bungled Bay of Pigs invasion of Cuba and the escalation of the Vietnam War. It can be found, he said, whenever institutions make difficult decisions. 

David Lochbaum, a nuclear engineer at the Union of Concerned Scientists, has studied nuclear plants where problems have gone uncorrected because of internal communications failures and poor oversight. His list includes the Davis-Besse plant near Toledo, Ohio, where in March 2002 technicians discovered that rust had eaten a hole the size of a football nearly all the way through the reactor vessel head. Only luck prevented what might have become an American Chernobyl. 

"As you go up the chain, you're generally asked harder and harder questions by people who have more and more control over your future," Lochbaum said. The group answering the questions then tends to agree upon a single answer, and to be reluctant to admit it when they don't have a complete answer. 

It is only common sense that large institutions should try to make sound decisions, said John Seely Brown, a former researcher at Xerox and a co-author of "The Social Life of Information." But it can be bewilderingly hard to do in practice. 

"Often it takes tremendous skill in running a brainstorming session," Brown said. "Every once in a while, the random way-out idea needs to have more of a voice." 

Copyright Grand Rapids Press Mar 23, 2003

TALKING IT OVER AND THINKING IT THROUGH!

  1. Decisions that affected Columbia may well have been the result of groupthink among technical experts and managers at NASA. However, external forces may also have affected their decisions. What were some of those external forces?

  2. In this case, groupthink has a negative connotation, something you wouldn't want your company to do. But why does it occur so frequently? Why is it easier for members of a group to quietly agree to a course of action that may be imperfect than it is to reach a "perfect" decision that may only appeal to a minority of the group?

  3. According to the article, how does NASA's structure make it easier for groupthink to occur?

  4. Given what is now known about how Columbia disintegrated, does it seem as if groupthink played a role in the outcome?

SOURCES:

Schwartz, John, and Matthew Wald. "Is 'Groupthink' Part of the Problem with NASA Again?" The New York Times, 23 March 2003.

- Pedagogy provided by Roland J. Kushner, Kushner Management Advisory Services