Many neuro-imaging studies claim to have investigated what happens in the brain when people interact socially. To overcome the awkward fact that participants have to lie entombed in the bore of a large magnet, these studies have used various means to simulate a social interaction. These include having participants watch videos of social interactions, interact with an animated character, or play a game with a human opponent (usually computer-controlled) supposedly located in another room. Such methods score marks for improvisation, but arguably none of them fully captures the dynamic cut and thrust of a real face-to-face social interaction between two people. That’s why Elizabeth Redcay and her colleagues have devised the first ever experimental set-up that allows for live face-to-face interaction (via video link) whilst participants lie inside a brain-imaging magnet.
Participants in this study watched a live video feed of the experimenter. The experimenter in turn had a display showing them a live feed of where the participant was looking. Experimenter and participant then engaged in a series of ‘games’ that required social interaction. For example, in one, the experimenter picked up various toys and the participant had to look in the direction of the appropriately coloured bucket to which the toy belonged. Compared with watching a recording of this same interaction, the live interaction itself triggered increased activation in a swathe of social-cognitive, attention-related and reward processing brain regions.
The second experiment involved the participant identifying which screen quadrant a mouse was hidden in. In the live ‘joint attention’ condition, the experimenter’s gaze direction cued the mouse’s location, and only when both experimenter and participant looked at the correct quadrant did the mouse appear. Compared with a solo condition in which a house symbol cued the mouse’s location, the interactive joint attention condition triggered increased activation in the right superior temporal sulcus and right temporo-parietal junction. The former region has previously been associated with processing socially relevant stimuli such as eye gaze and reaching, whereas the latter is associated with thinking about other people’s thoughts.
Past research using simulations of social interaction has identified the dorso-medial prefrontal cortex as a key area involved in social engagement. The quietness of this region in the current study suggests it may have been the competitive or social judgement elements of previous paradigms, rather than social interaction per se, that led to its activation.
‘Social interaction in the presence of a live person (compared to a visually identical recording) resulted in activation of multiple neural systems which may be critical to real-world social interactions but are missed in more constrained, offline experiments,’ the researchers said.
Redcay’s group said their new set-up would be ideal for studying the social difficulties associated with autistic spectrum disorders (ASD). Attempts to identify the neural bases of these difficulties have previously met with mixed success. ‘A neuroimaging task that includes the complexity of dynamic, multi-modal social interactions may provide a more sensitive measure of the neural basis of social and communicative impairments in ASD,’ the researchers said.
Redcay, E., Dodell-Feder, D., Pearrow, M.J., Mavros, P.L., Kleiner, M., Gabrieli, J.D., & Saxe, R. (2010). Live face-to-face interaction during fMRI: A new tool for social cognitive neuroscience. NeuroImage, 50(4), 1639–1647. PMID: 20096792
Image courtesy of Elizabeth Redcay.