May 2008
Interview : Diane Vaughan
Sociologist, Columbia University

Diane Vaughan is an American sociologist who has devoted most of her career to topics as different as tension in private life and deviance in organizations. Her work attracted wide attention when she published The Challenger Launch Decision in 1996, a book that unveiled a "normalization of deviance" within NASA that led to the space shuttle Challenger disaster in 1986.

After the Columbia crash of 2003, Diane Vaughan was invited to join the CAIB, the Columbia Accident Investigation Board, for which she demonstrated that the space agency had not learned from the first accident and had replicated its risk acceptance and its slide toward the normalization of hazardous operations. She is known for outstanding works ranging from turning points in intimate relations (Uncoupling, 1986, research begun as a doctoral dissertation) to deviance in organizations (Controlling Unlawful Organizational Behavior, 1983, a book written as a quasi-thriller in which her investigation of the over-prescription of a Revco tranquilizer led to the unveiling of organized false billing of the Welfare Department). After the recent years devoted to analysis of the Space Shuttle accidents, she is now focusing on matters related to air traffic control. Formerly a professor at Boston College, she is today Professor of Sociology and International and Public Affairs at Columbia University, New York, where she welcomes us for this interview.

Diane Vaughan, thank you for this interview at Columbia University during the AMCF meeting. In the title of one of your papers you indicate that you became « Public Sociologist by Accident »... Some kind of a joke?

Diane Vaughan : I became a sociologist long before the space shuttle accidents, but what I did was not what I expected to do. I « accidentally » took a course in sociology in college and then read a book by William Foote Whyte entitled « Street Corner Society » [devoted to the life of Italian immigrants in a district of Boston], and at the end there was a long appendix on his research methods. I found it fascinating. So I « embarked » on sociology, and what I did in Boston [for Revco, NASA, etc.] was nothing other than what he did with the Italian neighborhood. I could not believe it was work, because it looked like so much fun. The title of the piece you refer to, « Public Sociologist by Accident », describes how I began to take my social research outside the academy, making my results public to other audiences. Any situation where you talk about your research outside academia is « public sociology ». So « Public Sociologist by Accident » refers to two of my works that generated a lot of public attention and brought me, unexpectedly, into public sociology. One is on Challenger, the other Uncoupling.

Taking research outside the academy... We understand that topics come from the outside and that the conclusions of the research are returned to the public. What are the main concepts you have discovered and try today to explain to a broad public ?

Diane Vaughan : I’m interested in the dark side of organizations : how things go wrong - mistakes, misconduct, disaster. Research indicates that troubles come not only from individual failures but also from organizational failures. The end of a relationship is an example of this, because a relationship is the smallest organization we can set up. In that research, I traced, using interviews, how relationships come apart. It was a gradual transition, not a sudden break, where one person begins leaving the relationship socially and psychologically before the other. By the time the person being left behind realizes something is seriously wrong, the first person is already gone in so many ways that the relationship is difficult to save. From the public response to that project, I realized that sociology had an important message to deliver, because it could explain how people live their lives as well as how organizations do their duties by showing similar patterns. For example, common concepts explain misconduct by organizations, the deterioration of intimate relationships, and accidents and disasters. In each of my three books, there was a common pattern : a long incubation period filled with early warning signs that were either missed, misinterpreted, or ignored. Concepts common to all are structural secrecy, the normalization of deviance, and signals - missed signals, weak signals, routine signals. All of these are common in failures of all sorts. Primarily, the work has introduced the idea of how deviance becomes normalized in different kinds of organizations.

So we get to the core of your research work and your main contribution to the field : deviance and the acceptance of deviance. Might you develop this concept for our readers, most of whom are involved in management consulting ?

Diane Vaughan : The Challenger case is a good example. Initially, it appeared to be a case of individuals - NASA managers - under competitive pressure who violated rules about launching the shuttle in order to meet the launch schedule. It was the violation of rules in pursuit of organization goals that made it seem like misconduct to me. But after getting deeper into the data, it turned out the managers had not violated rules at all, but had actually conformed to all NASA requirements. After analysis I realized that people conformed to « other rules » than the regular procedures. They were conforming to the agency’s need to meet schedules and to engineering rules about how to make decisions about risk. These rules were about what were acceptable risks for the technologies of space flight. We discovered that they could set up rules that conformed to basic engineering principles yet allowed them to accept more and more risk. So they established a social normalization of the deviance, meaning that once they accepted the first technical anomaly, they continued to accept more and more with each launch. It was not deviant to them. In their view, they were conforming to engineering and organizational principles. That was the big [discovery]. I concluded it was mistake, not misconduct.

A discovery… a Challenger, plus a Columbia… May we compare the observed behaviour to that of drivers running on empty, for instance... "so far so good" ?

Diane Vaughan : Yes, you succeed by risking a little, you don’t fail, so each time you risk a little more. What is different from your driving example is the role of the organization in urging you along in this risky business.

Deviance, normalization of deviance. What exactly was that normalization of deviance in the case of NASA ?

Diane Vaughan : Social normalization of deviance means that people within the organization become so accustomed to a deviant behaviour that they no longer consider it deviant, despite the fact that they far exceed their own rules for elementary safety. But it is a complex process involving some kind of organizational acceptance. People outside see the situation as deviant, whereas people inside become accustomed to it and do not. The more they do it, the more accustomed they become. For instance, in the Challenger case there were design flaws in the famous « O-rings », although it was considered that by design the O-rings would not be damaged. In fact, they suffered recurrent damage. The first time the O-rings were damaged, the engineers found a solution and decided that the space transportation system could fly with « acceptable risk ». The second time damage occurred, they thought the trouble came from something else. Because in their minds they believed they had fixed the newest trouble, they again defined it as an acceptable risk and just kept monitoring the problem. And as they recurrently observed the problem with no consequences, they got to the point where flying with the flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified as they saw what they had done.

Diane Vaughan
Diane Vaughan in her office at Columbia University New York

You indicate the O-ring problem for Challenger (1986). But in the CAIB report in 2003 you write that it was the same situation for Columbia. However, the technological reason was different that time.

Diane Vaughan :
The causes of the two accidents were identical. In one case they flew with O-ring problems that they considered were not a risk to flight safety, and the more they flew the more they demonstrated that the problem had no consequences. For Columbia, they flew with foam debris that hit the heat shield of the orbiter wing [during lift-off] and removed some tiles from it, and the more they observed such debris hits, the more they considered that they had no safety consequences. Tile damage was defined as a maintenance problem only. Each time, they replaced the heat tiles and the orbiter was ready to go again. So the outcome was that one day, on Columbia’s reentry into the earth’s atmosphere, the orbiter broke into parts... and the flight data indicated that foam debris hits on the wing at the moment of launch had created a large hole in the wing. The heat of reentry started a fire and the shuttle disintegrated. So, as I indicate in papers and in the CAIB report, we have two technical situations, very different in terms of technology but identical in terms of social normalization of deviance. Exactly the same : in both cases, there was a history of early warning signs that were misinterpreted or ignored until it was too late.

Well, space transportation is a dangerous business, and statistics are known to be poorly reliable for « small numbers »; somehow, the number of launches is low compared to the number of commercial flights... So aren’t we passing judgement on tragic events that somehow must happen from time to time... the history of mankind gives some evidence of this dramatic reality ?

Diane Vaughan : Yes, you make a very good point. One of NASA’s failings was to treat space flight as routine and operational, when in fact it was and will continue to be experimental. NASA failures, though rare, are dramatic and have enormous political consequences, as well as costing billions of dollars. So it becomes important, for many reasons, to find out what happened. Blame must be placed so the program can continue. But in these two accidents there are data indicating that other organizations fail for the same reasons. So something is to be gained from research investigating the causes of accidents. In particular, the idea of early warning signs that are missed, but also that elite decisions about resources and goals keep employees at the lower levels moving forward, normalizing deviance time after time.

"Early signs"... this seems to be a concept you are fond of... it appears not only in the « Challenger Launch Decision » but also as early as the publication of « Uncoupling. Turning Points in Intimate Relationships ». Any comments ?

Diane Vaughan : If you think about it, the idea of early warning signs and of missed, weak, and routine signals that are ignored contributes to many kinds of harmful outcomes. In 1978, the sociologist Barry Turner wrote a book called « Man-Made Disasters », in which he examined 85 different accidents and found that they had in common a long incubation period with warning signs that were not taken seriously. Now we read that in the dramatic school shootings in the United States, preceding the moment when the students went into the schools with guns, there were many signals that they were angry, alienated, and planning a dangerous act. When the US was attacked on September 11, 2001, the entire country was shocked and surprised. But the 9/11 investigating commission discovered that the terrorist attack, too, was preceded by a history of early warning signs that were misinterpreted or ignored. Whether talking about failed relationships or terrorist attacks, the ignorance of what is going on is organizational and prevents any attempt to stop the unfolding harm. An important distinction between relationship failures and shuttles exploding, corporate misconduct, and terrorist attacks is politics and the decisions of elites who set problems in organizations in motion.

Challenger blew up in 1986 and you published in 1996, that is, nine to ten years later.

Diane Vaughan : It took about nine years to come to the final version. When I started, it was from the point of view that it was misconduct. I kept looking for « rule violations », but I didn’t find any. In fact, the engineers conformed to rules and in doing so normalized the deviance. After about a year I had to throw everything out and start over. It was a long process to understand why and how they normalized deviance. I relied on thousands of documents in the US National Archives that were placed there by the government investigation. Many of these were engineering documents and memos. Doing the research involved learning engineering and NASA language. In the Challenger case there is this notion of « levels » of « acceptable risk », for example. These were called « Criticality Levels ». Each part on the shuttle had to be classified at a criticality level, indicating its probability of failure. The same for Columbia. NASA’s risk assessment procedures were very complicated. The reality was that every attempt they made to quantify and clarify risk gave them no help, because there were thousands of technical components on the shuttle. Instead of clarifying, it was just overwhelming - for NASA engineers and managers doing the risk assessments, and for me, doing the research !

Was your work very welcome at the time of publishing ?

Diane Vaughan : Yes, beyond anything I had imagined. The « Challenger Launch Decision » was published for the 10th anniversary of the Challenger accident. The timing brought it much more attention than if it had been published earlier. On the anniversary day, it was reviewed by more than 40 newspapers across the nation. I was credited with a lot of discoveries. It was a thick, dense and very technical book, and I never thought it would be read by so many people. I thought it would not sell at all. When I published « Uncoupling », earlier, I tried to write in clear language and I gave a lot of lectures. At that time I had never imagined that the book might be read by so many people, but it is still being read. With the « Challenger Launch Decision » I also tried to write in clear language, but it involved a lot of technical talk. So I was even more surprised at its reception, given its length. The media wanted to interview me, and I was asked to consult with a number of companies and agencies whose work involves risk. It was very exciting.

Was this the start of a new career ? With industrial clients for instance ?

Diane Vaughan : After the publication of the « Challenger Launch Decision », consulting kept me busy for a year. In the first month after publishing I was contacted by organizations perceiving that they had [unmanaged] risks, such as the US Forest Service, which manages the big fires. They were interested in reviewing their organization instead of blaming the firefighters for mistakes. The nuclear industry was also interested, as well as IBM, the submarine service, and hospitals with their nurses and doctors. I felt a responsibility to respond to their requests, but was eager to get back to my research. I then began a project on air traffic control - an organization where people are trained to recognize early warning signs and fix small mistakes so they don’t turn into big catastrophes.


The big black cupboard full of ATC files just behind me seems to testify to the complexity of your new research work on air traffic control... a few years of work, I guess, as for Challenger. Does it mean that sociology is slower than « diagnostics as usual » as done by management consultants ?

Diane Vaughan : As I said, Challenger took me nine years. That is long. So certainly this kind of investigation is far from the common standards, even for an academic researcher. But the data collected by the government was so extensive and technical that it took an exceptionally long time. It is not typical of sociological research. Many sociologists do consult with businesses and agencies on a variety of topics. Many sociologists who specialize in organizational behavior are in business schools, for example.

It seems that it takes time to investigate in a sociological manner (TWA 800, Y2K, 9/11, the Enron-Andersen case, or the Concorde crash). Would this prevent management consultants from integrating a sociological approach into their diagnostics of corporations ?

Diane Vaughan : No. The basic lesson that sociologists bring is that the organization matters. If there are problems, the tendency of corporate or public agency administrators is to blame individuals. However, organization characteristics – cultures, structures, politics, economic resources, their presence or absence, their allocation – put pressure on individuals to behave in deviant ways to achieve organization goals. If you want to fix a problem, you can’t just fire the responsible person. You have to fix the organization, or else the next person to take the job will just experience the same pressures. Like Columbia after Challenger, the harmful behavior persists.

After the Columbia crash upon reentry, the media were somehow very pleased that you had been « at last » integrated into the official investigation board... What can be said on that point ?

Diane Vaughan : After the publication of the « Challenger Launch Decision », many organizations doing risky work contacted me, but NASA never called. When the Columbia accident happened, the publication of the Challenger book and the media coverage at the time of publication made me the expert on NASA shuttle accidents. So now I was doing public sociology again. I was meeting, by phone and television interview, many media representatives for the second time. Many of them had read the book, so they were looking for the organizational explanation. When the Columbia Accident Investigation Board was created to investigate the accident, the CAIB public hearings were televised and attended by the media. I was one of the many people who were called; afterwards, the CAIB asked me to join them in the investigation and the writing of the report. My consulting work with the CAIB then limited what I could say publicly, but I continued to be a source for them.

Finally, if we had to select just one concept from your research work, what would it be ?

Diane Vaughan : It is not the outcome of the work I did that is important, but sociology as a whole, its principles and methods. We see events or outcomes as the result of organizational behaviour and not as individual success or failure. Sociologists look at things with the groups, organizations, communities, social classes, and/or states that can explain individual behavior in mind. People in the US, as well as the courts of justice, see the world as the result of individual failures. They think that if you find the individual responsible, you will solve the problems. But the problem is that if you take the responsible people out, new ones will take the job and social factors will just reproduce the behaviours and replicate the problems. America is a culture where individual achievement is everything. We consider that if you don’t achieve, you are responsible for your own failure. So looking beyond individual responsibility to see what’s going on in the social context - how it works, what the beliefs are, the common culture, the political economy, etc. - is something we sociologists believe explains human behavior. So it is important to target the real root causes when things go wrong, whether we are talking about relationships, shuttle accidents, or terrorist attacks. We want to know why people make the decisions that they do. When it comes to organizational mistake, misconduct, or disaster, the blame usually goes to low-level workers and middle managers. Blaming them works for the organization. It deflects attention from the top administrators who make major decisions about goals and resources that affect organization cultures, and it falls upon the workers, whose actions those decisions affected. I call this « the trickle-down effect ». Showing the connection between elite actions, as they affect organizations, and harmful outcomes shifts our understanding of what to do about the dark side of organizations. And this is at the core of public sociology.
[Even if] it happens to be demonstrated « by accidents ».

Words collected by Bertrand Villeret
Editor in chief

French version :

Short extract :

The social organisation of a mistake

The Challenger disaster was an accident, the result of a mistake. What is important to remember from this case is not that individuals in organizations make mistakes, but that mistakes themselves are socially organized and systematically produced. Contradicting the rational choice theory behind the hypothesis of managers as amoral calculators, the tragedy had systemic origins that transcended individuals, organization, time and geography. Its sources were neither extraordinary nor necessarily peculiar to NASA, as the amoral calculator hypothesis would lead us to believe. Instead, its origins were in routine and taken-for-granted aspects of organizational life that created a way of seeing that was simultaneously a way of not seeing. The normalization of deviant joint performance is the answer to both questions raised at the beginning of this book : why did NASA continue to launch shuttles prior to 1986 with a design that was not performing as predicted ? Why was Challenger launched over the objections of engineers ?

From : The Challenger Launch Decision.  Risky Technology, Culture, and Deviance at NASA
Diane Vaughan, University of Chicago Press, 1996, Chap X, P 394

The Trickle-Down Effect : Policy Decisions, Risky Work and the Challenger Tragedy

California Management Review, Vol 39, N°2, Winter 1997
Published with special agreement from the author. All rights reserved.
Copy not allowed.

To know more :

Books :

Uncoupling. Turning Points in Intimate Relationships
Diane Vaughan, Oxford University Press, 1986

Controlling Unlawful Organizational Behavior
Diane Vaughan, University of Chicago Press, 1983

The Challenger Launch Decision.  Risky Technology, Culture, and Deviance at NASA
Diane Vaughan, University of Chicago Press, 1996.

Papers (limited list) :

History as Cause : Columbia and Challenger
CAIB Report Volume 1, Chapter 8, 195-201,  2003

Organizational Rituals of Risk and Error
Diane Vaughan, in Organizational Encounters with Risk,
Bridget Hutter & Michael Power (Eds.), New York and Cambridge,
Cambridge University Press, 2004

Internet link to list of papers by Diane Vaughan :

Who’s who :
Diane Vaughan

Sociological vocabulary :

A bit of Diane Vaughan’s concepts

Turning Points
The Dark Side of Organizations
Organizational Deviance
Criminology of Organizations
Structural Secrecy
Culture of Risk
Rituals of Risk and Error
Clean-up Work
Normalization of Deviance
Regulation Failure

Photos :
Official academic picture : Courtesy Diane Vaughan
Extract from moving pictures taken at Columbia University
New York :  Copyright B. Villeret, Quantorg 2006

Copyright Quantorg  2008
for ConsultingNewsLine
All rights reserved
Reproduction prohibited
