Category: Cross-cultural

How does the psychology of ownership differ between Western and Eastern cultures?

Michael Jackson’s glove sold for $350,000 at a New York auction in 2009. In India, celebrity possessions are not valued so highly.

By guest blogger Bruce Hood.

Many of us are nostalgic for original, authentic experiences and are prepared to pay for them. For example, not so long ago vinyl records were ubiquitous, but nowadays they are considered collectibles, with some attracting a high price. Even with the most mundane record, there is a tangible, tactile experience to possessing these items that iTunes cannot re-create. It’s not just collectors: most of us prefer to own original items and derive great pleasure from them – a theme explored in Paul Bloom’s highly entertaining 2011 TED talk, “The Origins of Pleasure”.

The psychology of possessions reveals that many of us imbue important items with an integral property or essence that defines their true identity. The origin of such thinking can be traced to Plato’s notion of form, but it still operates today as the intuition that significant things are irreplaceable, even by identical duplicates that are physically indistinguishable from the original.

The concept of essentialism also helps explain our preoccupation with our own stuff. This is the idea that every object is imbued with unique defining characteristics. One essentialist perspective is that our possessions represent who we are, and are even infused with something of us. Clearly some objects are entirely pragmatic and functional, but others form part of an “extended self” (Belk, 1988; pdf). It may be our car, our clothes or the records we collect. A manifestation of the extended self is the endowment effect (pdf), whereby individuals value their personal possessions more than identical objects owned by others. However, the endowment effect and the extended self are not culturally universal. For example, a recent study (pdf) of the Tanzanian Hadza hunter-gatherer tribe revealed that they do not show the endowment effect, possibly because they have so few personal possessions.

Others want to emulate their heroes or make a connection with them in some tangible, material form by owning their personal possessions. Essentialism explains why memorabilia collectors are not always motivated by financial rewards but rather by a passion to establish a tactile connection with the previous owners they admire. One plausible mechanism aligned with essentialism is positive contamination (pdf) – the notion that coming into direct contact with an item, such as a piece of clothing, can transfer some of the previous owner’s essence.

We have been researching authenticity and essentialism in our lab using a duplication scenario. It’s based on a conjuring trick that convinces pre-schoolers that we have a machine that can duplicate objects. In our first study (pdf), we showed that children with sentimental attachment to a teddy bear would not accept an apparent duplicate toy. They also thought that original cups and spoons owned by Queen Elizabeth II were more valuable than identical duplicates even though they reasoned that duplicated silver objects were physically equivalent to originals. In other words, they appreciated the additional value conferred to memorabilia by celebrity association.

In our most recent study conducted via the MTurk platform, we asked Western (mostly US) and Eastern (mostly Indian) adults to estimate the value of four types of collectible: a work of art, a celebrity sweater, a dinosaur bone and a moon rock. We then told them about the machine that can create an identical duplicate and asked them to value the copy. In two studies of over 800 adults we found the same basic pattern. Overall, both cultures think originals are worth more than copies, but the two cultures diverge on the celebrity clothing. Unlike Westerners, the Eastern adults saw the duplicate as not significantly different in value from the original. These results support the hypothesis that individualistic cultures in the West place a greater value on objects associated with unique persons, which explains why the valuation of certain authentic items may vary cross-culturally.
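The comparison logic can be sketched with invented numbers (all figures and the `retention` helper below are hypothetical, purely illustrative of the kind of pattern described, not data from the study): for each collectible, express the duplicate’s value as a fraction of the original’s, and a ratio near 1 marks the one case where a culture treats the copy as equivalent.

```python
# Hypothetical illustration of the valuation comparison (all numbers
# invented, not data from the study). A duplicate/original ratio
# close to 1 means the copy is seen as retaining the original's worth.

valuations = {
    # item: {culture: (mean value of original, mean value of duplicate)}
    "artwork":           {"Western": (1000, 400),  "Eastern": (1000, 450)},
    "celebrity sweater": {"Western": (500, 150),   "Eastern": (500, 480)},
    "dinosaur bone":     {"Western": (2000, 900),  "Eastern": (2000, 850)},
    "moon rock":         {"Western": (3000, 1200), "Eastern": (3000, 1100)},
}

def retention(original, duplicate):
    """Fraction of the original's value that the duplicate keeps."""
    return duplicate / original

ratios = {
    item: {culture: round(retention(*pair), 2)
           for culture, pair in cultures.items()}
    for item, cultures in valuations.items()
}

# With these invented numbers, only the Eastern valuation of the
# celebrity sweater treats the copy as nearly equivalent to the original.
print(ratios["celebrity sweater"])  # {'Western': 0.3, 'Eastern': 0.96}
```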

It’s not that Eastern cultures like India lack celebrities – they are fanatical about their Bollywood stars – but the desire to collect celebrity possessions may not be such a cultural tradition in collectivist societies. Eastern cultures also exhibit essentialist contagion in their rituals and concerns about moral contamination (the caste system being a notable example), but their essentialist concerns are primarily heightened for negative contamination, as opposed to the positive transfer believed to operate in the valuation of celebrity clothing.

It is not clear how the desire for authenticity and essentialism will change as cultural differences increasingly disappear in a digitizing world of accessible duplication and downloads, but I expect that desire for originality will always be at the core of human psychology as a component of self-identity. We are the only species that really seems to care about originals.


Apicella, C., Azevedo, E., Fowler, J., & Christakis, N. (2014). Evolutionary Origins of the Endowment Effect: Evidence from Hunter-Gatherers. SSRN Electronic Journal. DOI: 10.2139/ssrn.2255650

Gjersoe, N., Newman, G., Chituc, V., & Hood, B. (2014). Individualism and the Extended-Self: Cross-Cultural Differences in the Valuation of Authentic Objects. PLoS ONE, 9 (3). DOI: 10.1371/journal.pone.0090787

–further reading–
The Psychology of Stuff and Things.

Post written by Bruce Hood (@ProfBruceHood) for the BPS Research Digest. Hood is Professor of Developmental Psychology in Society at the University of Bristol. He is an elected Fellow of the BPS, the Royal Institution, the Society of Biology and the Association for Psychological Science, and President of the Psychology section of the British Science Association.

Back to the future – Psychologists investigate why some people see the future as being behind them

Speakers of English and many other languages refer to the future as being in front, and the past behind (e.g. “I look forward to seeing you”). This manner of thinking and speaking is so entrenched, we rarely pause to consider why we do it. One influential and intuitive explanation is that humans have an obvious front (the way our heads face), which combined with our tendency to think about time in terms of space, leads us to see ourselves moving forwards into the future, or the future coming towards us. A problem with this account is that there exist cultures and languages – such as the Andean language Aymara – that think and speak of the future as being behind them (and the past in front).

This leads to the proposition that perhaps people’s sense of the location of the past and future is somehow tied to their culture’s linguistic convention. Not so. In a new paper, Juanma de la Fuente and colleagues investigate Moroccan Arabic speakers – these people refer in their language to the future being in front of them (and the past behind), yet in their hand gestures they convey the opposite temporal arrangement. Clearly the ways we speak and think about time can dissociate. Still unanswered then is what leads people to differ in where they locate the past and future.

In the first of several experiments, de la Fuente’s team presented Moroccan Arabic speakers (most were students at the Abdelmalek Essaadi University in Tetouan) and Spanish speakers (students at the University of Granada) with a diagram featuring a human face with one box in front of it, and one behind.  The participants were told that an object had been picked up by the person in the diagram yesterday, or was to be picked up by them tomorrow. The participants’ task in each case was to indicate which box the object was located in.

This test confirmed that, despite speaking of the future as being in front of them, the majority of Moroccan Arabic speakers think of it as being behind. Around 85 per cent of them located tomorrow’s object behind the person in the diagram, compared with just over 10 per cent of the Spanish speakers. De la Fuente’s group think the reason has to do with temporal focus. Their theory – “the temporal-focus hypothesis” – is that people and cultures who focus more on the past tend to locate it in front.

This argument was supported by several further investigations. A “temporal focus questionnaire” (example items included “The young people must preserve tradition” and “Technological advances are good for society”) confirmed that Moroccan Arabic speakers display a greater focus on the past, as compared with Spanish speakers. Within a group of young and old Spanish speakers, meanwhile, the older participants had a greater focus on the past and they more often located the past in front (on a diagram). Among another group of Spanish speakers, those people who were more focused on the past also tended to locate the past in front. Finally, when the researchers primed Spanish speakers to think about their past (by having them write about their childhoods), they were subsequently far more likely to locate the past in front of them (and the future behind).

The researchers said they’d demonstrated “a previously unexplored cross-cultural difference in spatial conceptions of time” and that they’d validated “a new principle by which culture-specific habits of temporal thinking can arise: the temporal-focus hypothesis.”

de la Fuente, J., Santiago, J., Román, A., Dumitrache, C., & Casasanto, D. (2014). When You Think About It, Your Past Is in Front of You: How Culture Shapes Spatial Conceptions of Time. Psychological Science. PMID: 25052830

–further reading–
The surprising links between anger and time perception

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

The voices heard by people with schizophrenia are friendlier in India and Africa than in the US

When a patient with schizophrenia hears voices in their head, is the experience shaped by the culture they live in? Tanya Luhrmann and her colleagues investigated by interviewing twenty people diagnosed with schizophrenia living in San Mateo, California; twenty in Accra, Ghana; and twenty others in Chennai, India. There were similarities across cultures, including descriptions of good and bad voices, but also striking differences.

In San Mateo the interviewees talked about their condition as a brain disease, they used psychiatric diagnostic terms to describe themselves, and their experiences were almost overwhelmingly negative. Fourteen described hearing voices that told them to hurt others or themselves. Eight people didn’t know the identity of their voices and few described having a personal relationship with their voices.

By contrast, in Chennai, the interviewees frequently spoke of their relationships with their voices – that is, they heard the voices of relatives or friends, giving them advice or scolding them. These patients rarely used diagnostic terms, and rarely talked of voices instructing them to commit violence. Instead, distress, when it occurred, usually arose from their voices talking about sex. Nine interviewees described voices that were significantly good – in terms of being playful or entertaining.

In Accra, yet another picture emerged. Most of the interviewees here mentioned hearing God. This isn’t simply a case of this sample being more religious – the interview groups in all three locations were predominantly religious. Half the interviewees in Accra reported that their voice hearing was mostly or entirely positive. Others frequently emphasised the positive. Use of diagnostic labels was rare, as were incitements to violence by voices.

Luhrmann and her team said their most striking finding was that the experiences of voice hearing in the two non-Western samples were less harsh and more “relational” – that is, patients perceived their voices as other people, who could not be controlled. The researchers believe this difference is likely due to Western cultures emphasising independence and individuality – in which case heard voices are experienced as a violation – whereas African and Asian cultures emphasise how each person’s mind is interwoven with others. “We believe that these social expectations about minds and persons may shape the voice-hearing experience of those with serious psychotic disorder,” the researchers said.

These results need to be replicated with larger samples matched more precisely for illness severity, and with more tightly controlled measures (the current study was deliberately qualitative and exploratory). If replicated, the findings would imply the experience of hearing voices in schizophrenia is to some extent malleable, which could have exciting therapeutic implications. Indeed, it’s notable that the outcomes for patients with schizophrenia outside the West, especially in India, are known to be more positive – perhaps because of the way patients relate to their voices. “The harsh violent voices so common in the West may not be an inevitable feature of schizophrenia,” the researchers said.

Luhrmann, T., Padmavati, R., Tharoor, H., & Osei, A. (2014). Differences in voice-hearing experiences of people with psychosis in the USA, India and Ghana: interview-based study. The British Journal of Psychiatry. DOI: 10.1192/bjp.bp.113.139048

–further reading–
What’s it like to hear voices that aren’t there?
The same voices, heard differently?
Psychosis isn’t always pathological

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

It’s time for Western psychology to recognise that many individuals, and even entire cultures, fear happiness

It’s become a mantra of the modern Western world that the ultimate aim of life is to achieve happiness. Self-help blog posts on how to be happy are almost guaranteed popularity (the Digest has its own!). Pro-happiness organisations have appeared, such as Action for Happiness, which aims to “create a happier society for everyone.” Topping it all, an increasing number of governments, including in the UK, have started measuring national well-being (seen as a proxy for “happiness”) – the argument being that this is a potentially more important policy outcome than economic prosperity.

But hang on a minute, say Mohsen Joshanloo and Dan Weijers, writing in the Journal of Happiness Studies – not everyone wants to be happy. In fact, they point out that many people, including in Western cultures, deliberately dampen their positive moods. Moreover, in many nations, including Iran and New Zealand, many people are actually fearful of happiness, tending to agree with questionnaire items like “I prefer not to be too joyful, because usually joy is followed by sadness”.

Looking into the reasons for happiness aversion, Joshanloo and Weijers identify four: believing that being happy will provoke bad things to happen; that happiness will make you a worse person; that expressing happiness is bad for you and others; and that pursuing happiness is bad for you and others. Let’s touch on each of these.

Fear that happiness leads to bad outcomes is perhaps strongest in East Asian cultures influenced by Taoism, which posits that “things tend to revert to their opposite”. A 2001 study asked participants to choose from a range of life-course graphs and found that Chinese people were more likely than Americans to choose graphs that showed periods of sadness following periods of joy. In other cultures, such as those of Japan and Iran, there is a belief that happiness can bring misfortune because it causes inattentiveness. Similar fears are sometimes found in the West, as reflected in adages such as “what goes up must come down.”

Belief that being happy makes you a worse person is rooted in some interpretations of Islam, the reasoning being that it distracts you from God. Joshanloo and Weijers quote the Prophet Muhammad: “were you to know what I know, you would laugh little and weep much” and “avoid much laughter, for much laughter deadens the heart.” Another relevant belief is the idea that being unhappy makes people more creative. Consider this quote from Edvard Munch: “They [emotional sufferings] are part of me and my art. They are indistinguishable from me … I want to keep those sufferings.”

In relation to the overt expression of happiness, a 2009 study found that Japanese participants frequently mentioned that doing so can harm others, for example by making them envious; Americans rarely held such concerns. In Ifaluk culture in Micronesia, meanwhile, Joshanloo and Weijers note that expressing happiness is “associated with showing off, overexcitement, and failure at doing one’s duties.”

Finally, the pursuit of happiness is believed by many cultures and philosophies to be harmful to the self and others. Take as an example this passage of Buddhist text: “And with every desire for happiness, out of delusion they destroy their own well-being as if it were their enemy.” In Western thought, as far back as Epicurus, warnings are given that the direct pursuit of happiness can backfire on the self, and harm others through excessive self-interest. Also, it’s been argued that joy can make the oppressed weak and less likely to fight injustice.

There’s a contemporary fixation with happiness in much of the Western world. Joshanloo and Weijers’ counterpoint is that, for various reasons, not everyone wants to be happy. From a practical perspective, they say this could seriously skew cross-cultural comparisons of subjective well-being. “It stands to reason,” they write, “that a person with an aversion to expressing happiness … may report lower subjective wellbeing than they would do otherwise.” But their concerns go deeper: “There are risks for happiness studies in exporting Western psychology to non-Western cultures without undertaking indigenous analyses, including making invalid cross-cultural comparisons and imposing Western cultural assumptions on other cultures.”

Joshanloo, M., & Weijers, D. (2013). Aversion to Happiness Across Cultures: A Review of Where and Why People are Averse to Happiness. Journal of Happiness Studies, 15 (3), 717-735. DOI: 10.1007/s10902-013-9489-9

–further reading–
What’s the difference between a happy life and a meaningful one?
Other people may experience more misery than you realise

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

What is “Cultural IQ” training and does it really work?

IQ was once the only game in town. Now it rubs shoulders with a gaggle of human ability measures such as Emotional Intelligence, Empathy Quotient, and Rationality Quotient. The increasingly interconnected and diverse world of work has magnified interest in another newcomer: CQ, or cultural intelligence. With it come courses promising to prepare their students to work with colleagues, partners and customers who have different values and norms. A new paper investigates how effective this training really is.

The researchers, led by Jacob Eisenberg, investigated cultural awareness training that had a narrow, academic focus, predicting that its impact should be limited to the more intellectual aspects of CQ: cognitive CQ (spotting trends and gathering explicit knowledge on how cultures work), and metacognitive CQ (awareness of what you do and don’t know about other cultures). The training involved lectures and seminars led by professors in the manner of a traditional academic program; these cost-effective methods reflect those typically used in this educational industry.

The first study involved students (mostly Austrians) on a Study Abroad programme intended to increase their language knowledge, expose them to different cultures, and introduce them to different teaching methods. Students on this programme also completed a 3-day cultural management course. At the end of the course, the students rated themselves significantly improved at cognitive and meta-cognitive elements, but not at the other aspects of CQ, including motivational CQ (the amount of emotional resources put towards cultural sensitivity), and behavioural CQ (adopting behaviours such as appropriate tone of voice or recognising personal space).

This finding was replicated in a second study with a more diverse sample of students from 46 nationalities, who received short slots of training spread over several months as part of their International Management Masters.

Eisenberg’s team predicted that participants who had lived in more countries (with minimum stays of six months) should have higher CQ than their more sheltered peers, but that training should close this gap. This was partially borne out. In Study 1, residence history was more strongly correlated with pre-training than post-training levels of cognitive and metacognitive CQ. Meanwhile, the correlation between residence history and motivational CQ was unchanged by the training, strengthening the hypothesis that academic CQ training influences only cognitive aspects. In Study 2, which showed generally weaker effects (perhaps these more international students had less to gain from the training) these trends didn’t reach significance.

The second study also included a no-training control group, who failed to show the benefits enjoyed by the CQ training group. However, it’s a shame that the matching was poor – the controls were retested after three weeks, whereas for the training group eight weeks on average elapsed before retesting. Perhaps the CQ boost was only found in the training group because they spent five extra weeks on their Masters course, surrounded by students from diverse cultures, before they were finally tested?

Looking across the evidence, there’s a good case for claiming that cognitive elements of cultural intelligence can be selectively developed through academic training, including being conscious of how much you don’t yet know about other cultures. But such training doesn’t seem to help people to actually alter behaviour, nor to maintain an appetite for ambiguous cultural environments, arguably even more vital to adapting to a culture. Methods matter, and if you want people to feel or act differently, traditional teaching seems unlikely to be enough, however convenient it may be for the industry to provide.

Eisenberg, J., Lee, H., Bruck, F., Brenner, B., Claes, M., Mironski, J., & Bell, R. (2013). Can Business Schools Make Students Culturally Competent? Effects of Cross-Cultural Management Courses on Cultural Intelligence. Academy of Management Learning & Education, 12 (4), 603-621. DOI: 10.5465/amle.2012.0022

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Shame, stigma and mother-blaming following miscarriage

Imagine having a miscarriage and keeping it secret because you’d get the blame for your pregnancy loss. We might believe that only happened in the past, but it is a situation faced by countless women every day. And, like miscarriage itself, it remains taboo to talk about.

Miscarriage is a common event: around one in four pregnancies ends this way. Yet worldwide we remain poor at supporting women and their partners during and after miscarriage. This can be particularly acute in communities where access to health services is limited, which in turn can hamper physical and psychological recovery after pregnancy loss.

These issues are sensitively explored by Dellicour and colleagues who invited 90 women from Rarieda District, Nyanza Province, in western Kenya to talk about conception, pregnancy, birth, disability and loss. These focus group discussions included teens, women of childbearing age, pregnant women, Nyamrerwas (traditional birth attendants) and mothers of children born with a congenital abnormality.

Despite awareness of different reasons a woman might miscarry (both accurate and inaccurate), a strong sense of woman-blaming ran throughout the participants’ narratives, which has implications for both disclosing and seeking help during or after pregnancy loss.

Infidelity in particular was linked to miscarriage – participants suggested either that conceiving a child with someone who wasn’t your husband could cause congenital abnormalities or a miscarriage, or that sleeping with someone else while pregnant with your husband’s child could do the same. Other blame-related factors linked to miscarriage included women or their partners not respecting ‘tradition’ (for example, failing to pay a bride price or to build a new home), or being cursed or possessed.

Miscarriage is well known to affect relationships: women and their partners remain uncertain how to communicate together or move on following loss. In cultures where blame and shame are strongly associated with miscarriage this can be compounded, as it is difficult to support your partner if you believe she has lost a baby due to infidelity or being cursed. Miscarriage in such cases is linked to increased familial violence, which may extend to those who’ve had children born with congenital abnormalities – the abnormality being viewed as a sign of the mother’s wrongdoing, and the child as a burden on the family.

Participants in this study talked of women who’ve miscarried being distanced from their families and existing children until they’ve either been ‘cleansed’ by a spiritual healer or the church. Another view was that those who have miscarried should stay away from pregnant women in case they cause them to do the same. All of this results in situations where women who’ve miscarried (and may be in distress) are isolated, stigmatised and prevented from seeking help, if any is available – in many cases it is not.

The paper clearly unpacks the barriers faced by women in rural Kenya and recommends an informational approach to addressing their problems, but it would have been good to also hear more about how these issues might be addressed in practice. I hope the researchers are able to follow this paper up with more practical guidance on how communities, media, charities, NGOs, and healthcare and therapy providers might find ways to unpack and address the blaming, shaming and silencing of women during and after miscarriage.

Dellicour S, Desai M, Mason L, Odidi B, Aol G, Phillips-Howard PA, Laserson KF, & Ter Kuile FO (2013). Exploring risk perception and attitudes to miscarriage and congenital anomaly in rural Western Kenya. PloS one, 8 (11) PMID: 24236185

Post written for the BPS Research Digest by guest host Petra Boynton, Senior Lecturer in International Primary Care Research, University College London and the Telegraph’s Agony Aunt.

Around the world, things look better in hindsight

Human memory has a pervasive emotional bias – and it’s probably a good thing. That’s according to psychologists Timothy Ritchie and colleagues.

In a new study published in the journal Memory, the researchers say that people from diverse cultures experience the ‘fading affect bias’ (FAB), the tendency for negative emotions to fade away more quickly than positive ones in our memories.

The FAB has been studied previously, but most previous research looked at the memories of American college students. Therefore, it wasn’t clear whether the FAB was a universal phenomenon or just a peculiarity of that group.

In the new study, the authors pooled together 10 samples from different groups of people around the world, ranging from Ghanaian students, to older German citizens (who were asked to recollect the fall of the Berlin Wall). In total, 562 people were included.

The participants were asked to recall a number of events in their lives, both positive and negative. For each incident, they rated the emotions that they felt at the time it happened, and then the emotions that they felt in the present when remembering that event.

Ritchie and colleagues found that every cultural group included in the study experienced the FAB. In all of these samples, negative emotions associated with remembered events faded to a greater degree than positive emotions did. Importantly, there was no evidence that this effect changed with people’s age: it seems to be a lifelong phenomenon.
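The measurement logic above is simple enough to sketch with invented ratings (the numbers below are hypothetical, not data from the study): affect fade is the drop from the intensity felt at the time of the event to the intensity felt now, computed separately for positive and negative memories.

```python
# Hypothetical sketch of how a fading affect bias (FAB) score can be
# computed (ratings invented for illustration). Each event carries two
# intensity ratings on a 0-10 scale: felt at the time, and felt now.

def fade(at_the_time, now):
    """How much the emotion has faded since the event."""
    return at_the_time - now

# (intensity at the time, intensity now) for each remembered event
positive_events = [(8, 7), (9, 7), (7, 6)]
negative_events = [(8, 4), (9, 5), (7, 3)]

mean_pos_fade = sum(fade(t, n) for t, n in positive_events) / len(positive_events)
mean_neg_fade = sum(fade(t, n) for t, n in negative_events) / len(negative_events)

# A positive difference means negative affect faded more: the FAB
fab = mean_neg_fade - mean_pos_fade
print(round(mean_pos_fade, 2), mean_neg_fade, round(fab, 2))  # 1.33 4.0 2.67
```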

The authors conclude that our ability to look back on events with rose-tinted spectacles might be important for our mental health, as it could help us to adapt and move on from adversity: ‘We believe that this phenomenon is part of a set of cognitive processes that foster emotion regulation and enable psychological resilience.’

However, the authors admit that their study had some limitations. While the participants were diverse geographically and culturally, they all had to speak fluent English, because all of the testing was carried out in that language. In order to confirm that the FAB is truly universal, it will be important to examine it in other languages. Ritchie and colleagues also note that despite this apparent universality of the phenomenon, ‘We do not intend to imply that the FAB occurs for the same reasons around the world.’

Ritchie, T.D., Batteson, T.J., Bohn, A., Crawford, M.T., Ferguson, G.V., Schrauf, R.W., Vogl, R.J., & Walker, W.R. (2014). A pancultural perspective on the fading affect bias in autobiographical memory. Memory (Hove, England). PMID: 24524255

Post written for the BPS Research Digest by guest host Neuroskeptic, a British neuroscientist who blogs for Discover Magazine.

Studies of SE Asian tribes force a re-think on the psychology of language and smell

“What does coffee smell like?” “What about lemon?” These questions are tricky for English speakers to answer because we tend to describe smells by referring to their typical source. So, an aroma that smells like coffee is described as, well, smelling like coffee. Ditto for lemon or cinnamon or rotten eggs.

The fact is we don’t have abstract words to describe the essence of these odorous experiences. This contrasts with our language for other sensory experiences such as colour. For example, the word “red” describes the first-hand experience of seeing that particular wavelength of light, and the word can be meaningfully applied regardless of the source of the red.

“What does this smell of?” is another difficult question for English speakers. When we’re asked to identify everyday items like coffee or chocolate by their smell, our accuracy is around 50 per cent. “Similar performance with a visual object … would [lead a person to] be diagnosed as aphasic and sent for medical help,” say Asifa Majid and Niclas Burenhult, the authors of a new investigation into the links between language and smell.

These researchers explain how the lack of smell-based terms in English, combined with our poor skills at smell identification, have led generations of scholars to propose that olfaction (the scientific name for the sense of smell) is unimportant to humans. Even great minds like Darwin and Kant have arrived at this conclusion.

But now Majid and Burenhult have uncovered evidence from a Malaysian tribe – the Jahai, who speak Jahai – that shows such conclusions are premature and an over-generalisation. In a smell identification test, 10 members of the Jahai (all men; average age 37) were as precise at naming 12 smells as they were at naming colours. In contrast, 10 age-matched speakers of American English were vague and inconsistent at naming the 12 smells, but excelled as you’d expect at naming colours. What’s more, the Jahai were far more succinct than the English speakers at naming the smells, and 99 per cent of the time they used abstract terms for smells (whereas this was rare for the English speakers).

The lesson from this field trip is that we shouldn’t assume that findings about language and smell from the Western world (especially English speaking cultures) necessarily apply to humanity as a whole. Odour is incredibly important to members of the Jahai, and they use at least 12 abstract smell-based words to discuss plants and animals in their everyday lives. “Jahai speakers show us that olfactory abstraction is possible,” said Majid and Burenhult, “and humans can be adept at talking about smells.”

What is it like to have a rich language for smells? Further insight comes from a study of another South East Asian tribe – the Maniq in Thailand, who speak a language of the same name. The results of this investigation have been shown exclusively to the Research Digest ahead of publication. In the course of three field trips working with co-author Ewelina Wnuk, Asifa Majid first asked 8 Maniq speakers (4 female) to provide examples of items that fit 15 smell terms used in their language. This showed how the same term could be used to describe the same odorous essence attributed to a variety of different objects, locations or activities. For instance, the term caŋə was attributed to food, cooked meat, and white sun, while paʔ ʔ was applied to old shelters, mushrooms and pouring water (among other things).

Next Wnuk and Majid tried to identify the factors underlying this diverse smell-based vocabulary. Members of the Maniq were presented with three of their smell terms at a time and asked to pick the odd one out. Successive trials of this kind suggested that the range of smell terms is best described as existing along two dimensions – pleasantness and dangerousness. Another study found strong consensus between tribe members about the extent to which each odorous term conveyed dangerousness, pleasantness and other descriptors.

As with the Jahai, the richness of smell-based language in Maniq reflects the lives of this tribe. They regularly discuss smells; they avoid bad – dangerous or unpleasant – smells; and they deliberately use and wear appealing odours to bring good health and ward off danger. The attribution of smell terms to their cultural practices is often complex. For instance, the term caŋus, associated with pleasantness and cleanliness, is also applied to the smell of the fruit "kul w", even though it is poisonous. A leaf monkey – a food source for the Maniq – is said to smell of caŋus if it has eaten the poisonous fruit – a warning that eating the monkey will cause sickness.

Research with the Maniq shows again that assumptions about the limited role of smell in our lives, and about our poor ability to describe smell experiences, have likely been premature. By studying exclusively Western samples we underestimate the richness of human experience. "The cultural and linguistic elaboration of smell among the Maniq constitutes compelling evidence against the universal paucity of olfactory terms, the 'weak link' between smell and language, and the general insignificance of olfaction for humans," write Wnuk and Majid.


Majid, A., and Burenhult, N. (2014). Odors are expressible in language, as long as you speak the right language. Cognition.

Wnuk, E., and Majid, A. (in press). Revisiting the limits of language: The odor lexicon of Maniq. Cognition.

–Further reading–
The smell of fear more powerful than previously realised
Humans don’t smell that bad
Humans can track scents like a dog
Mice and humans like the same smells

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Comparing children’s sharing tendencies across diverse human societies

Up until about the age of seven, children across the world show similar levels of sharing behaviour as revealed by their choices in a simple economic game. The finding comes courtesy of Bailey House and his colleagues who tested 326 children aged three to fourteen from six different cultural groups: urban Americans from Los Angeles; horticultural Shuar from Ecuador; horticultural and marine foraging Fijians from Yasawa Island; hunter-gathering Akas from the Central African Republic; pastoral, horticultural Himbas from Namibia; and hunter-gatherer Martus from Australia.

In one game, the children had to choose whether to take two food rewards for themselves or take one and give the other to their partner. When this partner was seated before them, a similar pattern was found across the diverse cultural groups – from age three to about seven or eight, the children grew progressively more selfish. That is, the older the child (up to seven or eight), the more likely they were to keep both treats for themselves.

Intriguingly, aged eight to fourteen, the behaviour of the children varied depending on the culture they belonged to. For instance, the American children showed a sharp up-turn in choosing the selfless option. The Aka showed a similar increase in selflessness, but starting at a slightly later age. The Fijian children, by contrast, became even more likely to choose the selfish option right into early adolescence.

The older children’s choices tended to mirror the behaviour of the adults from their culture on similar games, suggesting they were gradually acquiring the social norms around altruism and reciprocity for their specific society.

The emergence of cultural differences in the older children’s choices only appeared for this costly version of the game, in which giving to another person meant sacrificing their own gain. In a different version, in which they could be generous at no cost to themselves, no such differences emerged across cultures.

Bailey House and his team said their results illustrated how cultural norms interact with children’s developing sense of fairness, which is consistent with theories that emphasise the role of genes and culture in human altruism and how it varies between societies. The results also reinforce past evidence suggesting that cultural norms particularly influence how people choose to behave when altruistic choices are costly to themselves.

“Our findings contribute to ongoing discussions of the processes that underlie both uniformity and diversity in social behaviour across societies,” the researchers said, “and highlight the importance of expanding the scope of developmental studies to encompass a wider range of extant human diversity.” Indeed, psychology has long been criticised for focusing too much on rich, Western participants and this study is to be applauded for its cross-cultural reach.


House BR, Silk JB, Henrich J, Barrett HC, Scelza BA, Boyette AH, Hewlett BS, McElreath R, and Laurence S (2013). Ontogeny of prosocial behavior across diverse societies. Proceedings of the National Academy of Sciences of the United States of America, 110 (36), 14586-91 PMID: 23959869


Findings from the first study to compare the minds of gods

One explanation for the ubiquity of religion is that it fostered advantageous cooperation among our ancestors. The human mind readily develops belief in supervisory god-like entities and these beliefs help promote in us cooperative, moral behaviour. One problem with this account: how come some religions don’t believe in a god or gods with moral concerns? Benjamin Purzycki may have the answer. He argues there’s a difference between explicit, formal theological religious beliefs and people’s religious intuitions. Even among religions that state their gods are unconcerned by human moral behaviour, he predicts there is an automatic bias toward believing that these gods know and care about interpersonal behaviour between people.

To test this moralisation bias theory, Purzycki has conducted what he describes as "the first study to systematically compare the minds of gods." For this he surveyed 88 Christians at the University of Connecticut (including 60 Catholics and 14 Protestants) and 88 ethnic Tyvans living in Southern Siberia.

True to the religious teachings of their faith, the Christians stated initially that their god knows everything. However, when they rated God’s knowledge of 50 moral and non-moral issues (e.g. “God knows if I was helpful to someone”; “… knows what is under my bed”), they showed a clear bias for rating him more knowledgeable and concerned about moral facts than non-moral ones. “In one sitting, students claim both that God knows everything, but knows moral information better than non-moral information,” Purzycki said.

There was a similar contradiction among the more varied answers of the Tyvans. Their religion incorporates elements of Buddhism, shamanism and totemism among other influences. They believe in the existence of Cher eezi spirit masters of different forms – including a woman on a horse, a bull, and a small marmot – that oversee natural resources in specific regions. The Tyvans' explicit teachings state that the Cher eezi are not concerned with people's interpersonal moral behaviour. However, asked to rate their spirit masters' knowledge of 50 issues, the Tyvans showed a consistent bias, rating the spirits' knowledge of and concern for moral facts as greater than for non-moral facts.

This was the case even when the analysis was restricted to those Tyvans who didn't list a single interpersonal behaviour when asked, at the start of the survey, to name things that please or anger their spirit masters. On the other hand, true to their teachings, the Tyvans' survey answers were influenced by geography – they said spirit masters knew and cared more about moral behaviour in their relevant geographical location.

“Despite the world’s religious diversity and cultural models, interpersonal social behaviour is an essential constant in religious cognition,” said Purzycki. “… As such religious systems around the world may indeed be essentially about interpersonal social regulation and monitoring regardless of whether moral concern is explicitly attributed to gods.”

Although Purzycki’s findings are consistent with the idea that regardless of teachings, religious people implicitly see all gods as concerned with how we behave, he wonders about the processes that lead these concerns to be made explicit in some religious teachings. One possibility he says, is that “as societies become more complex [and] individual behaviour more easily hidden from others, concepts of omniscient moralistic high gods may become not only more easily promulgated, but also more salient in individual minds.”


Benjamin Grant Purzycki (2013). The minds of gods: A comparative study of supernatural agency. Cognition DOI: 10.1016/j.cognition.2013.06.010

–Further reading–
More posts on the psychology of god and religion in the Digest archive.
