Confirmation bias is insidious and dangerous – and we all suffer from it.
The detail was a long cross-country navigation exercise. My student Neil had planned a triangular VFR flight of some 200nm, landing at two distant airports before returning to home base. At the end of the day he was hoping for a ‘sign-off’ in his logbook, which would permit him to make the same journey solo, thus fulfilling one of the experience requirements for issue of his PPL. I had studied weather information prior to arriving at the airfield, and I wasn’t feeling 100% optimistic. Conditions at our home base were fine, but a weakish warm front was heading our way from the west, likely to be attended by a lowering of cloud bases. Timing would be tight. However, Neil was anxious to get the trip completed, and current weather along our route was marginally okay. “There’s a good chance we can make it!” he pronounced, and said he was prepared to risk the cost of an uncompleted exercise, if that were to be the outcome. Feeling some teaching moments might ensue, I agreed we could head westward, and if conditions became worse than forecast we could turn about to route back home, outrunning the oncoming front. In our pre-flight briefing I emphasised to Neil that he should take weather-related decisions in the first instance. I wanted him to practise decision-making by himself, without first consulting me. After all, that was something he would also have to do on the day of his long solo flight. We took to the skies.
Part of the route on our first leg brought us overhead a valley with a wide flat floor, about 15nm across, with high mountains on each side. Our first destination airport wasn’t far beyond the end of the valley, but even as we progressed along this portion of the journey I began to see, looking some twenty to thirty miles ahead, that the chances of emerging from the other end at or above our safety altitude were diminishing. A wall of cloud was building up, and years of familiarity with this area and its meteorological peculiarities told me we would only be heading into the soup if we kept pressing onwards. I asked Neil for a decision. He seemed slightly irritated. “Doesn’t look that bad to me…” he suggested. In his tone of voice I could hear his desire to complete the whole journey competing with some better internal counsel warning him to retreat. I said to him: “OK, you make the decision, for now”. I was certain he would eventually have to turn back, but I thought the lesson would be more effective if he had to confront some real consequences. Neil sought confirmation from the airport ATIS, which was still reporting surprisingly good conditions. He settled himself in his seat as if to say ‘I’ll show you’, and at that moment one of those occasional weather phenomena occurred which can prove oh so tempting to the unwary pilot. At the distant end of the valley a bright blue patch suddenly appeared, and the sun brightly lit the land below. It was like a glimpse of Shangri-La, the mythical earthly paradise of Tibetan legend, and Neil pointed to it, looking at me somewhat triumphantly. I shrugged and made a gesture to convey that the decision was still his. He flew another few miles down the valley towards the entrancing vision, a smile on his face as our destination drew ever closer.
And then, suddenly, the brilliantly illuminated landscape was gone, replaced by a curtain of dense grey. There was clearly no longer any way out. Even so, Neil flew a little further, hoping against hope for another bright corridor. None appeared. Again I prompted him for a decision. “Back to base, I suppose”, he said glumly. We did a 180 and headed back along our previous track. As we progressed, to Neil’s surprise and increasing anxiety, the cloud base dropped further, and soon we were at an altitude below the tops of the mountains alongside us. In fact, I too was surprised at how quickly conditions had deteriorated. When flying into oncoming weather, it can be tempting to presume that the conditions behind you will remain the same, providing an escape route. But that is by no means always the case. It started to look as if we couldn’t exit the valley at all. I asked him for another decision. We discussed our diminishing options and it soon became clear that if we could find no way out, the safest course of action available would be to make a precautionary landing in a field on the valley floor. I told him to start preparing for that. I wanted to impress on him the reality of getting stuck in a blind alley with no exit.
Neil had been a victim of a psychological tendency known as confirmation bias, or in common parlance, wishful thinking. It’s a mental trap that has been known for centuries. The English philosopher Francis Bacon (1561–1626) put it succinctly:
The human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.
More recently, confirmation bias has been studied experimentally and shown to be a powerful influence on human decision-making, not least in the field of aviation. A tragic example of its potential for harm was Korean Air flight 801, which crashed on approach to Guam, in Micronesia, in 1997, with more than 200 fatalities. Even though cautioned by ATC, the crew followed an unusable glideslope indication all the way to the ground. The accident report shows they clearly believed it was working. Unfortunately, when unserviceable, the indicator centres in the position one would expect to see during a perfectly flown approach. Once the crew believed they were on the correct trajectory, they ignored numerous warnings giving them contradictory information.
This example resulted in a far more dramatic and tragic outcome than Neil’s questionable decisions during our cross-country flight, but taken together they show that confirmation bias can creep in at all levels of flying. How can we fight against it? There are various strategies suggested in the psychological literature. It is important to remember that human decision-making will never be completely free of error, but here are some ways you can attempt to mitigate confirmation bias:
- Be aware that it exists. Take time to study the phenomenon in informed literature, and develop as deep an understanding of it as you can. It is a factor not just in aviation but in fields as varied as medicine, politics, financial analysis and intelligence gathering; in fact, almost any human endeavour. Accepting that you are vulnerable to it is the essential starting point for developing effective countermeasures. Put your ego aside and be objectively open-minded.
- Listen to other voices. Seek advice with an open frame of mind. This could be from instructors, fellow pilots, ATC, or forecasters. Be ready to listen to their opinions, even if they contradict the course you are yearning to take. Actively seek disagreement, and take in the substance of what you hear, not just the bits supporting your own position. My student Neil was reassured by the alluring ATIS report at our destination, but we still had to traverse the marginal weather in between.
- Err on the side of caution. Legal weather minima, for example, can be the same for a highly experienced pilot as they are for a relative newbie. That does not mean their personal minima should be the same. Work within generous margins given your own level of experience. Again, Neil was tempted by the momentary appearance of blue skies and bright sunshine in the distance, and he allowed that vision to blind him to a slew of contradictory information.
These are just some of the techniques you can employ in dealing with confirmation bias. In the case of our cross-country flight, my willingness to allow the object lesson to progress was due to an ace I had up my sleeve. Our trainer was fully IFR equipped, and I knew that at a pinch we could talk to ATC, change to instrument flight rules, and fly home above the clouds. Indeed, that is what we did, and I wouldn’t have progressed so far in my efforts to provide a powerful lesson for Neil without having that option available. Even then, I was surprised by how quickly conditions behind us had worsened. Don’t assume you can always turn back into better weather.
Neil’s final verdict on the quality of his decision-making? A firm “Must do better!” and a resolve to learn more about confirmation bias and its associated threats to safety.