Category: Decision making

Heads in the sand: Most of us would prefer not to know whether bad things are going to happen

By Alex Fradera

Humans are infovores, hungry to discover, and nothing holds more fascination than the future. Once we looked for answers through divination; now science can forecast significant events, such as the onset of certain hereditary diseases. But the fact that some people choose not to know – even when the information is accessible, and has a bearing on their lives – has encouraged scientists, including Gerd Gigerenzer and Rocio Garcia-Retamero, to try to map out the limits of our appetite for knowledge. Their recent study in Psychological Review suggests that it is a fear of what we might discover – and of wishing that we’d never known – that often drives us to deliberate ignorance.


A five-minute chat with preschoolers about their past or future selves helps them make better decisions

By Christian Jarrett

There could be an Arctic blizzard blowing outside for all little Mary cares. The fact is, she’s hot from running around indoors, and no matter how much you try to explain to Mary that her future self – the one that’s about to go walking in the cold – would really appreciate that she put her coat on, Mary, like most kids aged under five, finds it very difficult to step outside of the present and consider her future needs.

While psychologists have already spent a lot of time demonstrating the limitations of young children’s ability to plan for the future, until now they’ve not looked much at whether it’s possible to target these “prospective abilities”. However, a new study in Developmental Psychology has done that, showing that a mere five-minute chat about their recent past or future selves seems to help preschoolers remember to do things in the future, and to “time travel” mentally, so that they make better decisions about their forthcoming needs.


Risk-taking teens’ brains seem to disregard past bad outcomes

By guest blogger Lucy Foulkes

Adolescents take more risks than adults: they are more likely to binge drink, have casual sex, commit crimes and have serious car accidents. In fact, adolescence is a paradox because it is a time of peak physical fitness, but also the time when people are most likely to be injured or killed in an accident. For this reason, it’s critical to understand what drives teenagers to take more risks. To date, many explanations of teenage risk taking have focused on the positive side of these behaviours: the rewarding “kick” that comes from taking a risk that ends well. Some studies have shown that teenagers experience more of this rewarding feeling, and this contributes to the increased risk taking seen at this age.

Fewer studies have considered how teenagers respond when risks turn out badly. This is important because all our previous experiences, both good and bad, affect our subsequent behaviour. If we make a risky decision like gambling money, and it pays off, it’s more likely we’ll decide to gamble again in the near future. Equally, if we take a gamble and it turns out badly, we’ll probably be a bit more reserved next time. But it turns out that some teenagers don’t respond like this: according to a new study in NeuroImage, some of them do not adjust their behaviour so readily when things go wrong, and this may be linked to a distinct pattern of activation in their brains.


High population density seems to shift us into a future-oriented mindset

By Christian Jarrett

In the UK we’re familiar with the practical implications of increasing population density: traffic jams, longer waits to see a doctor, a lack of available housing. What many of us probably hadn’t realised is how living in a crowded environment could be affecting us at a deeper psychological level, fostering in us a more future-oriented mindset, or what evolutionary psychologists call a “slow life history” strategy.

In their paper in the Journal of Personality and Social Psychology, Oliver Sng at the University of Michigan and his colleagues present a range of evidence showing how this strategy plays out in the more patient ways that we approach our relationships, parenting and economic decisions. In essence, the researchers propose that the presence of greater numbers of other people in close proximity prompts us to invest in the future as a way to compete more effectively.


Psychologists uncover a new self-serving bias – if it’s my theory, it must be true

By Christian Jarrett

If you look at the research literature on self-serving biases, it’s little surprise that critical thinking – much needed in today’s world – is such a challenge. Consider three human biases that you may already have heard of: most of us think we’re better than average at most things (also known as illusory superiority or the Lake Wobegon Effect); we’re also prone to “confirmation bias”, which is favouring evidence that supports our existing views; and we’re susceptible to the “endowment effect”, which describes the extra value we place on things as soon as they are ours.

A new paper in the Quarterly Journal of Experimental Psychology by Aiden Gregg and his colleagues at the University of Southampton extends the list of known biases by documenting a new one that combines elements of the better-than-average effect, confirmation bias and the endowment effect. Gregg’s team have shown that simply asking participants to imagine that a theory is their own biases them to believe in the truth of that theory – a phenomenon that the researchers have called the Spontaneous Preference For Own Theories (SPOT) Effect.


An influential theory about emotion and decision-making just failed a new test

By Christian Jarrett

It’s a common belief that to make optimal decisions we need to be more logical and less emotional, rather like Mr Spock in Star Trek. In fact, much evidence argues against this. Consider the behaviour of patients whose brain damage has made them unusually cold and logical. Rather than this helping them make decisions, they often seem paralysed by indecision.

These patients, who usually have damage to parts of their frontal cortex, also tend to perform poorly on a game that’s used by psychologists to measure risk-taking behaviour: the Iowa Gambling Task. The neurologist and author Antonio Damasio thinks this is because they have lost the ability to incorporate gut instincts – literally, their visceral reactions – into their decision-making, an idea that forms the basis of his Somatic Marker Hypothesis. This hypothesis has been very influential but the evidence supporting it, now gathered over several decades, is nearly all based on research using the Iowa Gambling Task.

In a recent paper in Decision, two British psychologists tested the Somatic Marker Hypothesis in a new context, the Balloon Analogue Risk Task, which involves deciding how far to pump a balloon. They found little evidence to support the central tenet of the Somatic Marker Hypothesis, the idea that our physiological reactions shape our decisions.


Why are some of us better at handling contradictory information than others?

By Alex Fradera

Imagine it: you’re happily surfing through your social media feeds – or what we nowadays call your filter bubble – when some unexpected perspectives somehow manage to penetrate. After you “like” the latest critique of police power, for instance, you come across an article arguing that cracking down on crime can benefit minority neighbourhoods. Or, elbowing its way into a crowd of articles celebrating trickle-down economics, you encounter a study showing higher taxes boost growth. What happens next? In new research in Contemporary Educational Psychology, Gregory Trevors and his colleagues looked at how reading conflicting information can push our emotional buttons, and lead us either towards resistance or a chance to learn.


Contra Kahneman, your mind’s fast and intuitive “System One” is capable of logic

By Christian Jarrett

Nobel-winning psychologist Daniel Kahneman’s international best-selling book is titled Thinking, Fast and Slow in reference to the idea that we have two mental systems: one that makes fast, automatic decisions based on intuition, and a second that is slower, deliberate and logical. A further detail of this “dual processing theory” is that, given time, and if we make the effort, the second system can step in and correct the intuitive, illogical reasoning of the trigger-happy first system.

It’s an elegantly simple model supported by a huge number of studies, but it’s far from perfect. As demonstrated by a new paper in Cognition, it seems that contrary to Kahneman’s caricature of the mind, our intuitive System One is perfectly capable of logic, without the need for any help from System Two. Moreover, it’s actually rather rare for System Two to step in and overrule System One; more common is for System One to find the logical answer all by itself.

Studying “fast chess” to see how decision making varies through the day

By Christian Jarrett

We’ve all had the experience of trying to make a tricky decision through the fog of fatigue, but there’s surprisingly little objective evidence about how time of day affects the way we decide. Perhaps late-day tiredness makes us more rash, as we lack the energy to deliberate. Alternatively, maybe it’s our mid-morning zest that could lead us to be impetuous. Of course, our own chronotype is also likely to come into play – perhaps morning people – “larks” – make better decisions in the morning, whereas evening people – “owls” – make better decisions in the evening.

One place to look for answers is in the data trails left by our online behaviour. For a new paper in Cognition, a team led by María Leone has analysed the moves made by dozens of internet “fast chess” players, some of whom have played tens of thousands of two- or three-minute games, comprising an even greater number of moves. The results suggest that regardless of chronotype, we’re inclined to make progressively faster, less accurate decisions as the day wears on, with the effect plateauing in mid-afternoon.

Sorry to say, but your pilot’s decisions are likely just as irrational as yours and mine

Flying a plane is no trivial task, but adverse weather conditions are where things get seriously challenging. Tragically, a contributing factor to many fatal accidents is pilots misjudging the appropriateness of the flying conditions. Now, in a somewhat worrying paper in Applied Cognitive Psychology, Stephen Walmsley and Andrew Gilbey of Massey University have shown that pilots’ judgment of weather conditions, and their decisions on how to respond to them, are coloured by three classic cognitive biases. What’s more, expert flyers are often the most vulnerable to these mental errors.

The researchers first addressed the “anchoring effect”, which is when information we receive early on has an undue influence on how we subsequently think about a situation. Nearly 200 pilots (a mix of commercial, transport, student and private pilots) were given the weather forecast for the day and then they looked at visual displays that showed cloud cover and horizontal visibility as if they were in a cockpit, and their task was to quantify these conditions by eye.

The pilots tended to rate the atmospheric conditions as better – higher clouds, greater visibility – when they’d been told earlier that the weather forecast was favourable. Essentially, old and possibly irrelevant information was biasing the judgment they were making with their own eyes. Within the sample were 56 experts with over 1000 hours of experience, and these pilots were especially prone to being influenced by the earlier weather forecast.

Next, hundreds more pilots read about scenarios in which a pilot needed to make an unplanned landing. An airstrip was nearby, but the conditions for the route were uncertain. Each participant had to solve five of these landing dilemmas, deciding whether to head for the strip or re-route. For each scenario they were given two pieces of information that supported heading for the strip (e.g. another pilot had flown the route minutes earlier) and one that counted against it (e.g. the visibility was very low). In each case, the participants had to say which piece of information was most important for deciding whether or not to land at the nearby airstrip.

Across the scenarios, the participants showed no real preference for one type of statement over another. This might sound sensible, but actually it’s problematic. When you want to test a hypothesis, like “it seems safe to land”, you should seek out information that disproves your theory. (No matter how many security guards, alarms and safety certificates a building possesses, if it’s on fire, you don’t go in.) So pilots should be prioritising the disconfirming evidence over the others, but in fact they were just as likely to rely on reassuring evidence, which is an example of what’s known as “the confirmation bias”.

In a final experiment more pilot volunteers read decisions that other pilots had made about whether to fly or not and the information they’d used to make their decisions. Sometimes the flights turned out to be uneventful, but other times they resulted in a terrible crash. Even though the pilots in the different scenarios always made their decisions based on the exact same pre-flight information, the participants tended to rate their decision making much more harshly when the flight ended in disaster than when all went well.

It concerns Walmsley and Gilbey that pilots are vulnerable to this error – an example of the “outcome bias” – because pilots who decide to fly in unwise weather and get lucky could be led by this bias to see their decisions as wise, and increasingly discount the risk involved. Note that both the confirmation and outcome experiments also contained an expert subgroup, and in neither case did they make better decisions than other pilots.

The use of cognitive heuristics and shortcuts – “thinking fast” in Daniel Kahneman’s memorable phrase – is enormously useful, necessary for helping us surmount the complexities of the world day-to-day. But when the stakes are high, whether it be aviation or areas such as medicine, these tendencies need to be countered. Simply raising awareness that these biases afflict professionals may be one part of the solution. Another may be introducing work processes that encourage slower, more deliberative reasoning. That way, when pilots scan the skies, they might be more likely to see the clouds on the horizon.


Walmsley, S., & Gilbey, A. (2016). Cognitive biases in visual pilots’ weather-related decision making. Applied Cognitive Psychology. DOI: 10.1002/acp.3225

Further reading
Just two questions predict how well a pilot will handle an emergency
If your plane gets lost you’d better hope there’s an orienteer on board

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
