Category: Decision making

Risk-taking teens’ brains seem to disregard past bad outcomes

By guest blogger Lucy Foulkes

Adolescents take more risks than adults: they are more likely to binge drink, have casual sex, commit crimes and have serious car accidents. In fact, adolescence is a paradox because it is a time of peak physical fitness, but also the time when people are most likely to be injured or killed in an accident. For this reason, it’s critical to understand what drives teenagers to take more risks. To date, many explanations of teenage risk taking have focused on the positive side of these behaviours: the rewarding “kick” that comes from taking a risk that ends well. Some studies have shown that teenagers experience more of this rewarding feeling, and this contributes to the increased risk taking seen at this age.

Fewer studies have considered how teenagers respond when risks turn out badly. This is important because all our previous experiences, both good and bad, affect our subsequent behaviour. If we make a risky decision like gambling money, and it pays off, it’s more likely we’ll decide to gamble again in the near future. Equally, if we take a gamble and it turns out badly, we’ll probably be a bit more reserved next time. But it turns out that some teenagers don’t respond like this: according to a new study in NeuroImage, some of them do not adjust their behaviour so readily when things go wrong, and this may be linked to a distinct pattern of activation in their brains.

Continue reading “Risk-taking teens’ brains seem to disregard past bad outcomes”

High population density seems to shift us into a future-oriented mindset

By Christian Jarrett

In the UK we’re familiar with the practical implications of increasing population density: traffic jams, longer waits to see a doctor, a lack of available housing. What many of us probably hadn’t realised is how living in a crowded environment could be affecting us at a deep psychological level, fostering in us a more future-oriented mindset, or what evolutionary psychologists call a “slow life history” strategy.

In their paper in the Journal of Personality and Social Psychology, Oliver Sng at the University of Michigan and his colleagues present a range of evidence showing how this strategy plays out in the more patient ways that we approach our relationships, parenting and economic decisions. In essence, the researchers are proposing that the presence of greater numbers of other people in close proximity prompts us to invest in the future as a way to compete more effectively.

Continue reading “High population density seems to shift us into a future-oriented mindset”

Psychologists uncover a new self-serving bias – if it’s my theory, it must be true

By Christian Jarrett

If you look at the research literature on self-serving biases, it’s little surprise that critical thinking – much needed in today’s world – is such a challenge. Consider three human biases that you may already have heard of: most of us think we’re better than average at most things (also known as illusory superiority or the Lake Wobegon Effect); we’re prone to “confirmation bias”, favouring evidence that supports our existing views; and we’re susceptible to the “endowment effect”, the extra value we place on things as soon as they are ours.

A new paper in the Quarterly Journal of Experimental Psychology by Aiden Gregg and his colleagues at the University of Southampton extends the list of known biases by documenting a new one that combines elements of the better-than-average effect, confirmation bias and the endowment effect. Gregg’s team have shown that simply asking participants to imagine that a theory is their own biases them to believe in the truth of that theory – a phenomenon that the researchers have called the Spontaneous Preference For Own Theories (SPOT) Effect.

Continue reading “Psychologists uncover a new self-serving bias – if it’s my theory, it must be true”

An influential theory about emotion and decision-making just failed a new test

By Christian Jarrett

It’s a common belief that to make optimal decisions we need to be more logical and less emotional, rather like Mr Spock in Star Trek. In fact, much evidence argues against this. Consider the behaviour of patients whose brain damage has made them unusually cold and logical. Rather than this helping them make decisions, they often seem paralysed by indecision.

These patients, who usually have damage to parts of their frontal cortex, also tend to perform poorly on a game that’s used by psychologists to measure risk-taking behaviour: the Iowa Gambling Task. The neurologist and author Antonio Damasio thinks this is because they have lost the ability to incorporate gut instincts – literally, their visceral reactions – into their decision-making, an idea that forms the basis of his Somatic Marker Hypothesis. This hypothesis has been very influential but the evidence supporting it, now gathered over several decades, is nearly all based on research using the Iowa Gambling Task.

In a recent paper in Decision, two British psychologists tested the Somatic Marker Hypothesis in a new context, the Balloon Analogue Risk Task, which involves deciding how far to pump a balloon. They found little evidence to support the central tenet of the Somatic Marker Hypothesis, the idea that our physiological reactions shape our decisions.
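To get a feel for the trade-off the balloon task sets up, here is a minimal sketch of its logic in Python. The pop schedule, payoff values and strategies are illustrative assumptions, not the parameters used in the study:

```python
import random

def run_bart_trial(pumps_planned, max_pumps=32, cents_per_pump=5):
    """Simulate one balloon: each pump earns money, but the balloon
    may pop at any point, wiping out that balloon's earnings.
    All parameters here are illustrative assumptions."""
    earned = 0
    for pump in range(1, pumps_planned + 1):
        # Uniform pop point over the possible pumps, so the pop risk
        # rises as the balloon inflates.
        if random.random() < 1 / (max_pumps - pump + 1):
            return 0  # popped: this balloon's earnings are lost
        earned += cents_per_pump
    return earned  # cashed out safely

# Compare cautious, middling and reckless pumping strategies.
for pumps in (4, 16, 28):
    mean = sum(run_bart_trial(pumps) for _ in range(10_000)) / 10_000
    print(f"{pumps:2d} pumps: ~{mean:.1f} cents per balloon")
```

Run it and the middling strategy tends to earn the most: pump too little and you leave money on the table, pump too much and the balloon usually pops, which is exactly the tension the task uses to measure risk-taking.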

Continue reading “An influential theory about emotion and decision-making just failed a new test”

Why are some of us better at handling contradictory information than others?

By Alex Fradera

Imagine it: you’re happily surfing through your social media feeds – or what we nowadays call your filter bubble – when some unexpected perspectives somehow manage to penetrate. After you “like” the latest critique of police power, for instance, you come across an article arguing that cracking down on crime can benefit minority neighbourhoods. Or, elbowing its way into a crowd of articles celebrating trickle-down economics, you encounter a study showing higher taxes boost growth. What happens next? In new research in Contemporary Educational Psychology, Gregory Trevors and his colleagues looked at how reading conflicting information can push our emotional buttons, and lead us either towards resistance or a chance to learn.

Continue reading “Why are some of us better at handling contradictory information than others?”

Contra Kahneman, your mind’s fast and intuitive “System One” is capable of logic

By Christian Jarrett

Nobel-winning psychologist Daniel Kahneman’s international best-selling book is titled Thinking, Fast and Slow in reference to the idea that we have two mental systems: one that makes fast, automatic decisions based on intuition, and a second that is slower, deliberate and logical. A further detail of this “dual processing theory” is that given time, and if we make the effort, the second system can step in and correct the intuitive, illogical reasoning of the trigger-happy first system.

It’s an elegantly simple model supported by a huge number of studies, but it’s far from perfect. As demonstrated by a new paper in Cognition, it seems that contrary to Kahneman’s caricature of the mind, our intuitive System One is perfectly capable of logic, without the need for any help from System Two. Moreover, it’s actually rather rare for System Two to step in and overrule System One; more common is for System One to find the logical answer all by itself.

Continue reading “Contra Kahneman, your mind’s fast and intuitive “System One” is capable of logic”

Studying “fast chess” to see how decision making varies through the day

By Christian Jarrett

We’ve all had the experience of trying to make a tricky decision through the fog of fatigue, but there’s surprisingly little objective evidence about how time of day affects the way we decide. Perhaps late-day tiredness makes us more rash, as we lack the energy for considered thought. Alternatively, maybe it’s our mid-morning zest that could lead us to be impetuous. Of course, our own chronotype is also likely to come into play – perhaps morning people – “larks” – make better decisions in the morning, whereas evening people – “owls” – make better decisions in the evening.

One place to look for answers is in the data trails left by our online behaviour. For a new paper in Cognition, a team led by María Leone has analysed the moves made by dozens of internet “fast chess” players, some of whom have played tens of thousands of two- or three-minute games, consisting of an even greater number of moves. The results suggest that regardless of chronotype, we’re inclined to make progressively faster, less accurate decisions as the day wears on, with the effect plateauing in mid-afternoon.

Continue reading “Studying “fast chess” to see how decision making varies through the day”
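For a flavour of what such an analysis involves, here is a minimal sketch in Python using pandas. The table, column names and values are all invented for illustration, not data from the paper:

```python
import pandas as pd

# Hypothetical move-level records; in the real study these came from
# online "fast chess" servers, with accuracy judged against a chess
# engine. Everything below is made up for illustration.
moves = pd.DataFrame({
    "local_hour":   [9, 9, 14, 14, 20, 20],
    "think_time_s": [4.1, 3.8, 2.9, 3.1, 2.5, 2.7],        # seconds per move
    "move_error":   [0.12, 0.10, 0.15, 0.16, 0.19, 0.18],  # engine-rated loss
})

# Average decision speed and accuracy by local hour of the day.
print(moves.groupby("local_hour").agg(
    mean_think_time=("think_time_s", "mean"),
    mean_error=("move_error", "mean"),
))
```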

Sorry to say, but your pilot’s decisions are likely just as irrational as yours and mine

Flying a plane is no trivial task, but adverse weather conditions are where things get seriously challenging. Tragically, a contributing factor to many fatal accidents is when the pilot has misjudged the appropriateness of the flying conditions. Now, in a somewhat worrying paper in Applied Cognitive Psychology, Stephen Walmsley and Andrew Gilbey of Massey University have shown that pilots’ judgment of weather conditions, and their decisions on how to respond to them, are coloured by three classic cognitive biases. What’s more, expert flyers are no less vulnerable to these mental errors, and sometimes more so.

The researchers first addressed the “anchoring effect”, which is when information we receive early on has an undue influence on how we subsequently think about a situation. Nearly 200 pilots (a mix of commercial, transport, student and private pilots) were given the weather forecast for the day, then looked at visual displays that showed cloud cover and horizontal visibility as if they were in a cockpit; their task was to quantify these conditions by eye.

The pilots tended to rate the atmospheric conditions as better – higher clouds, greater visibility – when they’d been told earlier that the weather forecast was favourable. Essentially, old and possibly irrelevant information was biasing the judgment they were making with their own eyes. Within the sample were 56 experts with over 1000 hours of experience, and these pilots were especially prone to being influenced by the earlier weather forecast.

Next, hundreds more pilots read about scenarios where a pilot needed to make an unplanned landing. An airstrip was nearby, but the conditions for the route were uncertain. Each participant had to solve five of these landing dilemmas, deciding whether to head for the strip or re-route. For each scenario they were given two statements that supported heading for the strip (e.g. another pilot had flown the route minutes ago) and one that counted against it (e.g. the visibility was very low). In each case, the participants had to say which piece of information was most important for deciding whether to land at the nearby airstrip or not.

Across the scenarios, the participants showed no real preference for one type of statement over another. This might sound sensible, but actually it’s problematic. When you want to test a hypothesis, like “it seems safe to land”, you should seek out information that could disprove it. (No matter how many security guards, alarms and safety certificates a building possesses, if it’s on fire, you don’t go in.) So pilots should be prioritising the disconfirming evidence, but in fact they were just as likely to rely on the reassuring kind – an example of what’s known as “confirmation bias”.

In a final experiment, more pilot volunteers read about decisions other pilots had made about whether to fly or not, and the information they’d used to make them. Sometimes the flights turned out to be uneventful, but other times they resulted in a terrible crash. Even though the pilots in the different scenarios always made their decisions based on the exact same pre-flight information, the participants tended to rate their decision making much more harshly when the flight ended in disaster than when all went well.

It concerns Walmsley and Gilbey that pilots are vulnerable to this error – an example of the “outcome bias” – because pilots who decide to fly in unwise weather and get lucky could be led by this bias to see their decisions as wise, and increasingly discount the risk involved. Note that both the confirmation and outcome experiments also contained an expert subgroup, and in neither case did the experts make better decisions than the other pilots.

The use of cognitive heuristics and shortcuts – “thinking fast” in Daniel Kahneman’s memorable phrase – is enormously useful, necessary for helping us surmount the complexities of the world day-to-day. But when the stakes are high, whether it be aviation or areas such as medicine, these tendencies need to be countered. Simply raising awareness that these biases afflict professionals may be one part of the solution. Another may be introducing work processes that encourage slower, more deliberative reasoning. That way, when pilots scan the skies, they might be more likely to see the clouds on the horizon.

_________________________________

Walmsley, S., & Gilbey, A. (2016). Cognitive biases in visual pilots’ weather-related decision making. Applied Cognitive Psychology. DOI: 10.1002/acp.3225

Further reading
Just two questions predict how well a pilot will handle an emergency
If your plane gets lost you’d better hope there’s an orienteer on board

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.


What does an ambivalent mood do to your problem-solving skills?

Psychologists have got a pretty good picture of how we’re influenced by the big emotional states. Feeling positive encourages an explorative cognitive style that is risk-tolerant and well suited to the open aspects of creativity, whereas negative emotions make us sensitive to threat and prone to vigilant, focused thinking. But what happens when our emotional states are a mix of the two – when we’re in an ambivalent mood? Appropriately, research to date has been inconsistent, with some work suggesting it sharpens our minds and other work that it distracts us. In a new paper in the Journal of Applied Psychology, researchers at the University of Virginia have tidied up the mixed findings about mixed feelings.

Cristiano Guarana and Morela Hernandez lay out why feeling ambivalent should facilitate decision-making: it sends a strong signal that a situation is complex, and that simple solutions are likely to be unsatisfactory. Consistent with this, past research, including their own, has shown ambivalence can lead to more cognitive flexibility and holistic, comprehensive solutions. But other research has linked ambivalence with poor decision-making. How can we reconcile these findings?

Guarana and Hernandez’s theory is that in a real-life situation it’s not always clear where your emotional states come from, and if you feel ambivalent but haven’t worked out why, you won’t give that complex situation the attention it needs… and worse, you could attribute your feelings to a peripheral situation that will needlessly suck up your attention. For ambivalence to be cognitively advantageous, the state must be tied to its source.

The researchers conducted four experiments to test their explanation, with the final one combining all the clever bits of design into one setup. For this final study, the researchers first prompted their two hundred participants (all employees from a range of organisations, on average 45 years old, and two thirds of them women) to experience feelings of ambivalence by asking them to write a short passage on a personal experience that involved either indifference or ambivalence. Next, the researchers warned half of the participants that the upcoming task could produce mixed reactions, priming them to see that task as the source of their ambivalent feelings.

The main task involved participants reading a scenario about a fraudulent drug trial, in which the researcher added made-up data points so he could release the drug to market. Based on this limited information, the participants then had to judge what happened next: was it more likely that the drug was (a) withdrawn from the market, or (b) withdrawn from the market after killing and injuring patients?

This is a classic conjunction problem in decision-making: the conjunction of two events is never more likely than either event alone, but superficial thinking can lead us to assume the more specific outcome is the more likely one. In fact, the participants gave the wrong answer more often than the right one – unless they had been primed to see the test as a source of their ambivalent feelings, in which case they made the correct choice in two out of three instances.
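The arithmetic behind the conjunction rule is easy to verify for yourself. Here is a minimal sketch with made-up probabilities (the figures are illustrative only):

```python
# Conjunction rule: P(A and B) can never exceed P(A), because the
# joint event is contained within either event alone.
p_withdrawn = 0.30             # P(A): drug withdrawn (made-up figure)
p_harm_given_withdrawn = 0.40  # P(B|A): patients harmed (made-up figure)

p_both = p_withdrawn * p_harm_given_withdrawn  # P(A and B) = 0.12
assert p_both <= p_withdrawn   # holds whatever figures you choose
print(f"P(withdrawn) = {p_withdrawn:.2f}; "
      f"P(withdrawn AND patients harmed) = {p_both:.2f}")
```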

Results from a supplementary task showed how participants thought about the scenario differently when they had been primed to see it as a source of their ambivalent feelings. After responding to the scenario, participants completed word fragments (e.g. DIS___) by filling in the word endings. Some of these fragments could form words related to the drug-trial scenario (e.g. DISEASE), and completing them in this way was taken as a sign that participants were more sensitive to the concepts in the scenario.

Participants in the priming condition produced more scenario-related words, and the more that they did this, the more likely it was that they also reached the correct solution. This is consistent with the idea that the primed participants tied their ambivalent feelings to the drug trial scenario, and that this encouraged them to pay more attention to it. Interestingly, Guarana and Hernandez showed this only applied to participants scoring low on a measure of self-control: people inclined to skirt difficult issues are the ones to benefit from recognising their ambivalence about a situation.

The message is clear: when you’re feeling a mess of conflicted feelings, take a step back and identify where those feelings are coming from. Do so, and you authorise your mind to attend to the situation in the best possible way.

________________________________

Guarana, C. L., & Hernandez, M. (2016). Identified ambivalence: When cognitive conflicts can help individuals overcome cognitive traps. Journal of Applied Psychology.

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.


New review prompts a re-think on what low sugar levels do to our thinking

Glucose. Fuel for our cells, vital for life. But how fundamental is it to how we think?

According to dual-systems theory (best known from Nobel laureate Daniel Kahneman’s work), low blood glucose favours the use of fast and dirty System One thinking over the deliberative, effortful System Two. Similarly, the ego depletion theory of Roy Baumeister sees glucose as a resource that gets used up whenever we resist a temptation.

But the authors of a new meta-analysis published in Psychological Bulletin find these claims hard to swallow. Their review suggests that glucose levels may change our decisions about food, but little else.

Jacob Orquin at Aarhus University and Robert Kurzban at the University of Pennsylvania searched the decision-making literature, finding 36 articles that directly investigated glucose, whether by measuring blood concentration, providing participants with a sugar solution, or via interventions such as wafting food smells, which trigger some glucose production.

The authors pored over the articles and tabulated every effect, its direction as well as its size. They found the effects were very variable, often operating in different directions from study to study. But when the data were organised according to a key factor, a consistent pattern began to emerge. That factor? Food.
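To illustrate that moderator logic, here is a minimal sketch in Python using pandas, assuming a hypothetical table of coded effects (the entries are invented, not data from the 36 articles):

```python
import pandas as pd

# One row per reported effect: positive values mean low glucose
# increased the behaviour. All entries are invented for illustration.
effects = pd.DataFrame({
    "task":   ["payment", "payment", "persistence", "persistence",
               "delay discounting", "delay discounting"],
    "reward": ["food", "non-food"] * 3,
    "effect": [0.45, -0.20, 0.38, -0.25, 0.50, 0.10],
})

# Pooled together, the effects look noisy and contradictory; split by
# whether the reward was food, a consistent pattern emerges.
print("Pooled mean effect:", round(effects["effect"].mean(), 2))
print(effects.groupby("reward")["effect"].mean().round(2))
```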

In payment tasks – involving hypothetical purchases (“how much would you pay for …”) and actual purchases while shopping – low blood glucose did increase people’s willingness to overspend … on food. But it actually made them less willing to spend money on non-food products. When it came to persistence on tasks (such as time spent trying to complete a puzzle), low glucose decreased willingness to work for non-food rewards, but led to more tenacious work towards food-related goals. And when people were given the choice between receiving a small amount now or a large one later, low glucose led to a large bias towards immediate gratification when food was the payoff, compared to a much smaller bias for non-food.

This pattern of results doesn’t fit the notion of glucose as willpower-fuel. It suggests instead that low glucose is a signal that, to ensure future wellbeing, food should be prioritised – by paying more for it, working harder for it, and grabbing a little now rather than taking the promise of more in the future. This signalling account also explains the recent discovery that you don’t need to consume glucose to produce some cognitive effects: simply tasting it, by swishing it around the mouth, is enough. No fuel has been received, but presumably the signalling system is temporarily fooled by the taste receptors.

Kahneman can sleep easy – the findings from this meta-analysis aren’t a blow to his dual-process theory as a whole, merely to the specific claim that glucose has a role in switching between thinking fast and slow. The meta-analysis is a more substantial problem for the claims of ego depletion, which are intimately related to the idea that willpower is a finite resource that depends on glucose.

Based on the prior glucose research and theory, some publications have recommended strategies like eating chocolate before tense marital discussions or stashing emergency Jelly Bellys in the office desk drawer. But according to this meta-analysis, these strategies will yield little benefit; the main implication of being low on glucose is a greater preoccupation with finding something to eat. There’s a lot of strong psychological science out there to help with building everyday habits and making better decisions, so if you’re looking for a dose of something, we recommend you check those out instead.

_________________________________

Orquin, J., & Kurzban, R. (2015). A meta-analysis of blood glucose effects on human decision making. Psychological Bulletin. DOI: 10.1037/bul0000035

Further reading
Labs worldwide report converging evidence that undermines the low-sugar theory of depleted willpower
New research challenges the idea that willpower is a “limited resource”

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
