Normalizing Deviance

From NASA’s space shuttle to crosswinds in your Cessna, if we think we got away with something, we’ll also think we can get away with it again.


Humans are really good at rationalizing. We do it every time we cut corners, break rules or ignore evidence in pursuit of a successful outcome. We keep doing it because it often has no consequences. The thing about rationalizing, though, is that it can change our behavior. What once seemed wrong starts to feel normal, and outcomes that may be mostly due to good luck start to feel like skill and deep understanding. In aviation, rationalizing can result in disaster.

In The Challenger Launch Decision, a book that examined the shuttle Challenger disaster and its root causes, sociologist and professor Dr. Diane Vaughan coined the term “normalization of deviance.” She defined it as “the gradual process through which unacceptable practice or standards become acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.”

You may recall that NASA officials chose to repeatedly fly the space shuttle despite knowing of a design flaw involving the booster rockets’ O-rings and their behavior in cold weather. The groupthink that led NASA to accept that risk became the defining case of normalized deviance.

Seven years after Vaughan’s book was published, the same mode of failure struck NASA again with the shuttle Columbia. After 22 years of shuttle operations, it was well known that foam blocks breaking off the external tank during launch would strike and often damage the orbiter’s thermal shielding. There was never any serious consequence, so foam strikes came to be viewed as a maintenance issue rather than a safety issue. This second normalization of deviance also ended in catastrophe.

Normalizing

Academic safety experts primarily write about the normalization of deviance as an issue that affects corporate safety cultures and organizations, but it is just as applicable to individuals.

Humans are great at rationalizing our actions and normalizing our own deviations. One of our behavioral norms is to drive safely and follow traffic laws, but sometimes we make exceptions. On a busy interstate, I might decide to drive over the speed limit by rationalizing that it is safer to stay with the flow of traffic (and maybe it is). But when the other lanes become empty, am I still driving above the posted limits? If I am, my SOP of following the speed limit has been subverted. My rationalized excuse is gone, but the behavior remains. I am headed down the slippery slope of normalization of deviance. Worse, I haven’t even asked myself whether I was justified in breaking the law in busy traffic or just in a hurry and looking for an excuse.

In a post-accident safety presentation, “The Cost of Silence: Normalization of Deviance and Groupthink,” NASA’s Chief of Safety and Mission Assurance stated, “There is a natural tendency to rationalize shortcuts under pressure.” And pilots are almost always under some form of pressure. Staying ahead of weather, managing fuel, minimizing headwinds and arriving prior to nightfall are all pressures that might lead us to cut corners in the name of safety. And there’s nothing like a good, juicy rationalization.

Rationalizing

Checklist discipline is an area where individual pilots can develop bad habits. We may be creatures of efficiency and discipline, but we don’t like unnecessary tasks. Checking pitot heat on a CAVU day seems redundant. Checking aircraft lights for a day flight seems unnecessary. It becomes pretty easy to disengage, stop reading the excessive details and subsequently miss critical items. It doesn’t help that checklists can be overburdened with details plugged in by lawyers to protect the manufacturer from liability rather than improve pilot safety.

Checklist discipline is completely lost when pilots start normalizing deviance by blowing past items that appear unnecessary. In a 2014 fatal runway overrun in Bedford, Mass., the crew of a Gulfstream IV must have been so comfortable pencil-whipping their checklist that they missed four (!) separate opportunities to remove the gust lock that remained in place on takeoff. That the gust lock appears on the checklist four times is both a testament to how important it is to disengage it and an explanation of why it may have been ignored: If you do it right the first time on most flights, the other three times seem annoyingly redundant.

Desensitizing ourselves to accept greater risk can be insidious, building for years before disaster strikes. Sidney Dekker, a professor at Griffith University in Brisbane, where he runs the Safety Science Innovation Lab, wrote about this in “The Field Guide to Understanding ‘Human Error.’” He describes the normalization of deviance as “drift.” He says that Murphy’s law is wrong: “What can go wrong usually goes right and then we draw the wrong conclusion: that it will go right again and again, even if we borrow a little more from our safety margins.” NASA came to a similar conclusion: “The lack of bad outcomes reinforces the rightness of trusting past success instead of objectively assessing risk.”

In other words, when outcomes are successful, it reinforces the natural human tendency to focus on the results and assume the steps leading to the outcome were correct. By extension, if a shortcut was taken, it worked. We conclude that it will likely work again.

Luck ≠ Proficiency

Another challenge is discerning where experience, personal growth, learning and gains in proficiency cross the line into deviance. Nearly all of us pilots are continuously learning our limits and those of the aircraft we fly.

Take a pilot landing a Cessna 206 in a 29-knot crosswind at full gross weight. This is well beyond the aircraft’s demonstrated crosswind component, but the pilot has 2,000 hours in 206s, and this particular aircraft has VGs installed, so it has a bit more rudder authority than a stock Cessna. A plane at full gross also has a bit more inertia, so it is less apt to get tossed around by gusts. After the pilot safely lands the plane, he thinks he’s the next incarnation of Bob Hoover and can handle this combo just fine. Was it skill or luck? My guess is that 95 percent of pilots will take the ego stroke before considering that it might have been good luck, that maybe the wheels touched down right as the gusty wind backed down. Rationalization was just reinforced and deviance normalized.

The danger is starting to think, based on a good outcome, that we can do 29-knot crosswinds in any 206 with any load, or without VGs.
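The arithmetic behind that judgment is worth making explicit. As an illustrative sketch (the wind figures are assumptions of mine, not from the scenario above), the crosswind component is the wind speed multiplied by the sine of the angle between the wind and the runway heading:

    crosswind component = wind speed × sin(angle between wind and runway)
    35 knots × sin(55°) ≈ 35 × 0.82 ≈ 29 knots

A 35-knot wind just 55 degrees off the runway can produce that 29-knot crosswind, and a momentary lull or a small shift in wind direction at touchdown changes the number substantially. That variability is exactly what makes luck so easy to mistake for skill.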

Inoculating Yourself

Sidney Dekker says the best practice for avoiding drift, the slide into the normalization of deviance, is to stay chronically uneasy. As pilots, that means maintaining a skeptical and questioning attitude about our own competence and discipline. The problem for us as individuals is that preventing our own deviance from becoming normalized is very difficult when we are the subject of the insidious behavior modification. It is hard to objectively see risks when our history of safe flights desensitizes us. It just reinforces that our decisions, or shortcuts, are safe enough, our skills are up to the task and our past performance provides evidence for a safe outcome.

While NASA’s findings after its shuttle disasters were written for an organization or team, they are pertinent to us as individuals. Below, I have paraphrased and adapted the most important conclusions from NASA:

Beware the illusion of invulnerability. (When NASA engineers raised the possibility of O-ring blow-by, it was said the risk “was true of every other flight we have had.”) If you feel invulnerable, are you really that good all the time, or just lucky? Would a new pilot accept the risk you are about to accept? If not, why not?

Prove to yourself you ARE safe rather than seeking proof that you are NOT. Be skeptical of your own thinking. Are you just assuming you are safe? What is your evidence?

Check your rationalization. What is novel about this flight that you are ignoring? What is not routine? What are the most likely modes of failure? If the weather is changing (which it usually is), assume it will be worse. Instead of approaching your flight with confidence, assume it will go bad. Be prepared to meet your incompetence.

Listen to skeptics. “Are you really flying in this weather?” If others cast doubt on your proposed flight because it makes them uncomfortable, ask yourself why. Are you really that much better than the person questioning your judgment? Is your plane really up to the task? Do they consider things you may have skipped over in your delusion of confidence?

Self-censor. Are you about to violate a FAR, your own personal minimums, or some reasonable limit, for the sake of expediency? Maybe you should reconsider.

Don’t accept silence as agreement. Are you making assumptions or have you actually checked?

Don’t ignore dissent. If the briefer is saying VFR is not advised, maybe you should file or possibly cancel.

Listen to experts. Bounce your plans off others who are more experienced or perhaps more cautious. You might learn something.

I wince a little about all this nervous-Nellie advice to live in doubt and skepticism because I have never enjoyed listening to buzzkill from wet-blanket, fun-police types treating me like a student pilot and standing in the way of my plans. That said, they deserve some consideration because there is plenty of evidence they are a voice of experience and reason and maybe, just maybe, they have a point I am missing.

It would be nice if circumstances leading to accidents were more obvious, if warning bells or lights went off every time we became overly self-satisfied with our skills or complacent in our decisions and actions. But it doesn’t work that way. The best we can do is recognize that as we take shortcuts, or as we accept greater aviation challenges commensurate with our increasing skills, we may also be normalizing some of our deviance. Even if we did just nail that 29-knot crosswind landing, it may not be wholly due to greater skill and proficiency; there may have been a bit of luck involved. None of us are Bob Hoover.


Cockpit Rationalizations

There are a lot of areas common to pilots that are ripe for rationalization. Some examples:

SOP: “Sump the tanks before each flight.”

Rationalization: “I sumped the right one; I’m sure the other one is fine.”


SOP: “Check the oil before each flight.”

Rationalization: “I just changed the oil last week, and I haven’t flown since then.”


SOP: “Calculate fuel burn for the intended trip using most recent weather data.”

Rationalization: “It always takes 20 gallons for this flight; I’ll have enough.” (See the worked example after this list.)


SOP: “Weigh all the payload, and calculate the weight and balance before each flight.”

Rationalization: “It wasn’t out of CG last time with a similar load, so why do the calculation?”


SOP: “Field is IFR, obtain an IFR clearance.”

Rationalization: “I am only 5 miles from the airport; it will be safer to land before it gets worse.”
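To see why that fuel rationalization is fragile, run the numbers. This is a hedged, illustrative example (the trip figures are assumptions of mine, not from the article): suppose the flight is 240 NM at 120 knots true airspeed, burning 10 GPH.

    Still air: 240 NM ÷ 120 knots = 2.0 hours × 10 GPH = 20 gallons
    30-knot headwind: 240 NM ÷ 90 knots ≈ 2.7 hours × 10 GPH ≈ 27 gallons

The flight that “always takes 20 gallons” now needs about a third more fuel, which is exactly why the SOP calls for recalculating with the most recent weather data.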


Aviation Is Ripe for Drift

Normalization of deviance can be found as a root or contributing cause in numerous infamous industrial and transportation accidents, including many aircraft mishaps. Like lining up the holes in the Swiss cheese model or tracing the links in an accident chain, it is relatively easy to identify after the fact, when we already know the outcome and can work backward, asking, “How did we get here?”

Uncovering normalization of deviance and exposing it to light prior to the accident is much more challenging. An organization can hire outside auditors and evaluators. The primary way an individual uncovers the gradual drift from the norm is through critical self-examination.

We pilots are a pretty disciplined lot all in all, and we generally don’t just flagrantly stop following rules and norms. I would venture to say that when we do, we do it slowly, often with good rationale and lots of reinforcement.

That is what makes it so hard to recognize the slippery slope we may be on, or to break that link in the chain and prevent an otherwise-inevitable accident.


This article originally appeared in the December 2019 issue of Aviation Safety magazine.

For more great content like this, subscribe to Aviation Safety!

Mike Hart
Mike Hart is an Idaho-based flight instructor and proud owner of a 1946 Piper J-3 Cub and a Cessna 180. He also is the Idaho liaison to the Recreational Aviation Foundation.


19 COMMENTS

  1. For me, the best example of “Beware the illusion of invulnerability” was Scott Crossfield, who died flying a Cessna 210A into a thunderstorm. After all his years as a rocket pioneer and test pilot, it appears he thought he could handle that storm, which led him to make a very wrong choice.

  2. “normalization of deviance.”
    Great article Mike.
    I first heard the term from a friend of mine, a local pilot and NASA engineer. We had a long, interesting and relevant discussion. Not that I was a risk taker, but looking back over a lifetime of flying for a living . . . well, there were times.

  3. Hoo boy, this principle, summed up in the subheading – “… if we think we got away with something, we’ll also think we can get away with it again” – is prevalent everywhere you care to look. I once worked at a family-owned tech business where ignoring the procedural norms was commonplace. It was even considered somewhat of a virtue, because it cut time off the production processes “as necessary” to meet shipping schedules. As the manufacturing and process engineer, I pushed back against this practice whenever I saw it, but the owner’s sons were the ones who managed production. They knew how much they could skew things toward the edges of the process envelopes and still have a reasonable expectation of success. Yet on a regular basis, product would be returned by customers for noncompliance or even nonfunction due to defects. The owner would personally rub these customers’ shoulders over the phone and pledge that all units would be replaced ASAP. #1 & #2 Sons got a talking-to and nothing else, and it was likely the replacements were rushed through mfg in the same manner as the originals which had been returned. I had to find another job because the situation was unacceptable.

    Aviation has a lot of Stuff that pilots must keep at the forefront of their awareness. Finding an easy way to reduce the mental workload is sometimes welcomed, even when the pilot KNOWS it’s asking for trouble. It’s been said the FARs are written in blood, because someone found the limits of whatever envelope had been tested, and the results were then on display. Seriously – if a procedure has been identified that maximizes the probability of success, why not use it? The mfg business I worked for had a banner on the wall which read “If I don’t have time to do it right, when will I find time to do it over?” Apparently nobody read it. Unfortunately aviation seems to have a very restricted number of available do-overs.

  4. It doesn’t just apply to aviation. If some people are “normalizing deviation” in the rest of their lives, why would they not in aviation? Deviation can be fatal in many areas of our lives – driving, being cavalier when it comes to disease prevention, handling firearms, etc. We need to think more carefully when involved in anything important.

  5. Two thoughts:

    My dad taught me to ride motorcycles, and we rode together often during my formative years. As an inexperienced rider, I often got butterflies before and during a ride–Dekker’s chronic uneasiness. It’s not a particularly comfortable feeling, and I asked my dad once how long it took before it went away. “Son,” he said, “the day I stop feeling that way before riding is the day I stop riding.” It was a perfect answer that’s stuck with me my entire life; I still ride motorcycles today nearly 50 years later, and the butterflies are still with me.

    Regarding the Gulfstream crash at Bedford, MA in 2014, it’s especially poignant that this tragedy resulted from a failure to use a checklist. The first B-17 prototype, known as the Model 299, crashed in October of 1935, killing two of the five occupants. The pilot lost control of the aircraft because he failed to remove the gust lock prior to takeoff. The event almost destroyed Boeing as an aviation company–it had bet all its resources on the B-17, and critics began to think the aircraft was too complex to be operated safely. But out of that tragedy was born an idea that has saved countless lives. A group of Boeing engineers and pilots devised a checklist for pilots to use as a memory aid. Boeing built another 12 aircraft, and its pilots, aided by checklists, flew nearly 2 million miles without incident. This ultimately convinced the U.S. government that despite its complexity, the B-17 could be safely operated by ordinary pilots. As we know, checklist use became mandatory for military aviators, and was soon adopted and mandated by professional commercial operators as well.

    How sad then that a test crew elected not to use a checklist, thereby repeating nearly 80 years later the very same tragedy that prompted its creation.

    • I went to GIV initial in 2016, long enough after the Bedford crash that FSI had all the information on it. It was a huge topic of discussion in the first two days of initial. The crew was FSI trained, and according to the instructors they were highly disciplined in the training environment but were a couple of “corner-cutting cowboys,” as my instructor put it. They were able to download enough information from the plane to see the crew hadn’t done the control check in, I think, the last 60 or so flights. That means that from the time they started up for the accident flight until the time they tried to rotate, they never deflected the controls. Not once: no control check, no spoiler check, no hydraulic system check, nothing.

    • Riding motorcycles is exactly what I thought about when reading that in the article. Not so much during, but before sometimes; a slight nervousness.

      I’m reminded of Commander Charles Lamb (War in a Stringbag) trying to take comfort in the butterflies he had on some early missions (cited as his most unpleasant), in that they were “old friends” he knew from the boxing ring, ones that disappeared once the first bell rang.

  6. I just shared this article with my granddaughter who is just starting her aviation career. Lots of wisdom in this read.

  7. Back in the 1980s I was sitting in the cockpit jump seat of a TWA 727 about to push back for takeoff from LaGuardia. This was the first trip of the month together for this crew. The Captain called for the pre-start checklist, and the flight engineer began reciting it, a list he had probably performed hundreds of times. The Captain turned to him and said, “Please read the checklist; if anybody’s memory is going to fail, I would rather that it be mine.” That statement has stuck with me for all of my flying since, and most of my ground-pounder life as well. It’s why we have checklists, whether in the cockpit, the operating room or on an ocean liner’s bridge.

  8. I have been flying for 55 years or so, and it has been a constant struggle to make myself heed the wisdom of this article. I think a related problem is that of “task continuation bias”, the tendency of a pilot to continue with an original course of action that is no longer viable. The more skilled and experienced a pilot is, the more likely he is to fall victim to this well-recognized psychological problem. I had flown hundreds of airshows, for many years, when I succumbed to this malady, which nearly killed me and destroyed the most beautiful airplane that ever was. Google “Tumbling Bear” for more on the crash if you are interested.

    • Looks like it was covered here as well, just put “tumbling bear” in the search box at the top of this page.

  9. While this has clear application to aviation, it can hurt you in other, more mundane activities. The clearest example is driving. Cars have become so comfortable, quiet, and electronic that it is no longer just about transportation. We need to be entertained as we drive offensively. We have forgotten that even modest superhuman speed is deadly and that IAMSAFE is a factor for how we drive and even whether we drive. Our attentiveness and overall skill have deteriorated just as this article has described: normalization of deviance, as if there are no consequences to bad behavior.
    Be careful out there.

    • Just came back from the beach last night, a 90-minute drive while the sun was setting. Two incidents; luckily no contact, but there could have been. One was an old boy, possibly drunk, who just wandered out of his lane on the freeway in front of us, straddling the white line. A hard brake and a short horn, and he started like a rabbit back into his lane. The other was at a roundabout: a young driver, driving fast and aggressively, very nearly lost it and crashed into the typical mum with kids in the back.
      Deviant behaviour in both cases. One old, probably been driving 50 years; the other young, probably been driving two. In both cases I am sure they drive like that normally.
      Cars are pretty safe, so if things had gone wrong there would probably have been damage to the cars and cuts and bruises to the people. It’s usually not the same in aircraft.

  10. A well-written article and good advice for more than just our flying lives. Having been a safety engineer for more than 40 years, I can recall many cases where accidents were the result of several of the things you covered. One of the worst industrial accidents in this country was caused by a maintenance crew that prided itself on performing a cleaning task on a large chemical reactor faster by taking shortcuts from the prescribed procedure, thus “saving” the company valuable production time. Then one day they got a little too quick, and the reactor dumped several tons of a highly flammable solvent on the ground. The result was a huge explosion and fire that killed more than 20 employees, destroyed $1.2 billion of property and caused another billion dollars in lost production. The sad thing was that the local managers were aware of the crew’s activities and did not put a stop to it. Who’s the real guilty party?
