Sorry to say, but your pilot’s decisions are likely just as irrational as yours and mine

Using cognitive heuristics and shortcuts is useful, but when the stakes are high, whether it be aviation or areas such as medicine, these tendencies need to be countered.

16 May 2016

By Alex Fradera

Flying a plane is no trivial task, but adverse weather conditions are where things get seriously challenging. Tragically, a contributing factor in many fatal accidents is the pilot misjudging the appropriateness of the flying conditions. Now, in a somewhat worrying paper in Applied Cognitive Psychology, Stephen Walmsley and Andrew Gilbey of Massey University have shown that pilots’ judgment of weather conditions, and their decisions on how to respond to them, are coloured by three classic cognitive biases. What’s more, expert flyers are often the most vulnerable to these mental errors.

The researchers first addressed the “anchoring effect”, which is when information we receive early on has an undue influence on how we subsequently think about a situation. Nearly 200 pilots (a mix of commercial, transport, student and private pilots) were given the weather forecast for the day, then shown visual displays of cloud cover and horizontal visibility as they would appear from a cockpit; their task was to quantify these conditions by eye.

The pilots tended to rate the atmospheric conditions as better – higher clouds, greater visibility – when they’d been told earlier that the weather forecast was favourable. Essentially, old and possibly irrelevant information was biasing the judgment they were making with their own eyes. Within the sample were 56 experts with over 1000 hours of experience, and these pilots were especially prone to being influenced by the earlier weather forecast.

Next, hundreds more pilots read about scenarios where a pilot needed to make an unplanned landing. An airstrip was nearby, but the conditions for the route were uncertain. Each participant had to solve five of these landing dilemmas, deciding whether to head for the strip or re-route. For each scenario they were given two statements that were reassuring about heading for the strip (e.g. another pilot had flown the route minutes ago) and one that was problematic (e.g. the visibility was very low). In each case, the participants had to say which piece of information was most important for deciding whether or not to land at the nearby airstrip.

Across the scenarios, the participants showed no real preference for one type of statement over another. This might sound sensible, but actually it’s problematic. When you want to test a hypothesis, like “it seems safe to land”, you should seek out information that could disprove your theory. (No matter how many security guards, alarms and safety certificates a building possesses, if it’s on fire, you don’t go in.) So the pilots should have prioritised the disconfirming evidence over the reassuring statements, but in fact they were just as likely to rely on the reassuring evidence – an example of what’s known as “the confirmation bias”.

In a final experiment more pilot volunteers read decisions that other pilots had made about whether to fly or not and the information they’d used to make their decisions. Sometimes the flights turned out to be uneventful, but other times they resulted in a terrible crash. Even though the pilots in the different scenarios always made their decisions based on the exact same pre-flight information, the participants tended to rate their decision making much more harshly when the flight ended in disaster than when all went well.

It concerns Walmsley and Gilbey that pilots are vulnerable to this error – an example of the “outcome bias” – because pilots who decide to fly in unwise weather and get lucky could be led by this bias to see their decisions as wise, and increasingly discount the risk involved. Note that both the confirmation and outcome experiments also contained an expert subgroup, and in neither case did the experts make better decisions than other pilots.

The use of cognitive heuristics and shortcuts – “thinking fast” in Daniel Kahneman’s memorable phrase – is enormously useful, necessary for helping us surmount the complexities of the world day-to-day. But when the stakes are high, whether it be aviation or areas such as medicine, these tendencies need to be countered. Simply raising awareness that these biases afflict professionals may be one part of the solution. Another may be introducing work processes that encourage slower, more deliberative reasoning. That way, when pilots scan the skies, they might be more likely to see the clouds on the horizon.

Further reading

Walmsley, S., & Gilbey, A. (2016). Cognitive Biases in Visual Pilots’ Weather-Related Decision Making Applied Cognitive Psychology DOI: 10.1002/acp.3225