The Challenger disaster: A tragic lesson in data analysis …

Well-intended engineers correctly interpreted the wrong data.

Excerpted from Everydata: The Misinformation Hidden in the Little Data You Consume Every Day


I’m sure all baby-boomers have a vivid recollection, but for younger readers, here’s some background …

“On the morning of 28 January 1986, the Space Shuttle Challenger, mission 51-L, rose into the cold blue sky over the Cape. To exuberant spectators and breathless flight controllers, the launch appeared normal. Within 73 seconds after liftoff, however, the external tank ruptured, its liquid fuel exploded, and Challenger broke apart.”



What happened?

“The specific failure,” noted the Report of the Presidential Commission on the Space Shuttle Challenger Accident, “was the destruction of the seals that are intended to prevent hot gases from leaking.…”

Investigators quickly focused their attention on a key part of the seals: the rubber O-rings that sat between two sections of the solid rocket motor, the “tang” and the “clevis.”

The O-rings on the Challenger needed to be flexible enough to compress and expand, sometimes within milliseconds.

But O-ring resiliency “is directly related to its temperature… a warm O-ring will follow the opening of the tang-to-clevis gap. A cold O-ring may not.”

In fact, investigators found that a compressed O-ring is five times more responsive at 75 degrees Fahrenheit than at 30 degrees Fahrenheit.

The air temperature at launch was 36 degrees Fahrenheit.

The commission’s report found “it is probable” that the O-rings were not compressing and expanding as needed.

The resulting gap allowed the gases to escape, destroying the Challenger.


So why didn’t engineers stop the launch, given the cold temperatures?


They tried.

“We were concerned the temperature was going to be lower than the 50 or the 53 that had flown the previous January, and we had experienced some… erosion on the O-rings… it wasn’t a major concern, but we said, gee, you know, we just don’t know how much further we can go below the 51 or 53 degrees or whatever it was.  So we were concerned with the unknown.”

In other words— they didn’t have enough data.

Nobody knew what would happen to the O-rings on a day when the temperature was 15 degrees colder than that of any previous launch.

But not having data below 53 degrees was just one of the issues.


The team recognized that they didn’t have data below 53 degrees, and decided to look at all cases where there had been signs of O-ring distress, regardless of temperature.

In this case, the data was limited to only incidents of O-ring thermal distress (defined as O-ring erosion, blow-by, or excessive heating), rather than to the question of actual interest: O-ring performance across all launches.



The night before the disaster— as engineers tried to convince their managers at Thiokol and NASA not to launch— someone pointed out that there had been signs of O-ring distress on a shuttle that was launched at 75 degrees.

It’s true— there had been issues at 75 degrees. And at 70 degrees. And at 63 degrees. In fact, on seven separate missions, there was evidence of O-ring thermal distress.

And if you look at the temperature for these launches, you’ll see that there is no easily recognizable pattern.

Observing this data, you could easily be convinced that temperature does not affect O-ring performance.

Based strictly on this data, the conclusion the scientists and engineers drew was correct: they had accurately read the wrong data.

As the Rogers Commission Report stated, “In such a comparison, there is nothing irregular in the distribution of O-ring ‘distress’ over the spectrum of joint temperatures at launch between 53 degrees Fahrenheit and 75 degrees Fahrenheit.”


But when you conduct a statistical analysis on only a sample of the available data, you can introduce what is known in statistics as a sample selection problem.

Running an analysis on less than the entire data set is not always a problem, but it can lead to mistaken conclusions depending on the question you are trying to answer.

The above comparison only looks at data from 7 out of the 24 space shuttle launches up to that point.

In retrospect, the engineers should have looked at all of the data on O-ring performance — not just cases where there were signs of distress.



When you look at all of the data— including flights with zero incidents— you can see the difference for yourself.

Above 65 degrees, only 3 out of 20 flights had incidents. Below 65 degrees, all 4 flights had incidents.
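The contrast between the two comparisons can be sketched in a few lines (Python here, purely as an illustration; the book does not include code). The per-flight temperatures for the seven distress flights are the values commonly cited from the Rogers Commission data, and the aggregate counts are the ones quoted above:

```python
# Joint temperatures (degrees F) for the seven flights that showed
# O-ring distress -- values commonly cited from the Rogers Commission data.
distress_temps = [53, 57, 58, 63, 70, 70, 75]

# Truncated view: only the distress flights.  Incidents are scattered
# across the whole 53-75 degree range, so no pattern stands out.
print(f"distress flights span {min(distress_temps)}-{max(distress_temps)} F")

# Full view: all 24 launches, including the 17 with zero incidents.
# Counts are the aggregates quoted in the text above.
bands = {"below 65 F": (4, 4), "65 F and above": (3, 20)}
for band, (with_distress, total) in bands.items():
    print(f"{band}: {with_distress}/{total} flights "
          f"= {with_distress / total:.0%} with distress")
```

Only the full view exposes the signal: every flight below 65 degrees had distress, versus a small fraction of the warmer flights. The truncated view, by construction, cannot show an incident rate at all, because every flight in it had an incident.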

This is a classic example of how reliance on a sample of data — even with the best intentions — can contribute to disastrous results.


By focusing only on the flights with O-ring incidents, people were truncating the data set— a fancy way of saying that they weren’t looking at all of the data.

And that error in how the data was analyzed would have significant repercussions.

Because the engineers only looked at launches with evidence of O-ring distress, they missed a vital connection that became obvious when investigators looked at the temperatures for all 24 launches.


Lessons learned:

(1) Don’t project “out-of-sample” … there was no data for flights below 53 degrees.

(2) Don’t sample when all the data points are available … look at all launches, not just the ones with O-ring distress.



Follow on Twitter @KenHoma