Tuesday, 3 September 2013
A new study has examined this phenomenon in relation to the comments posted to a news-sharing website similar to Reddit.com and Digg.com. Users of the site share links to news stories and others then post their comments on the stories. In turn these comments can be rated positively or negatively, thus encouraging or deterring others from reading them. Such systems promise to marshal the wisdom of the crowd. However, they could also be vulnerable to distortion if raters are influenced not only by the comment in question, but also by the ratings it has already received.
Lev Muchnik and his colleagues tested this possibility experimentally. Collaborating with a news-sharing website, they randomly assigned either a positive or negative first rating, or no rating (control condition), to 101,281 real comments posted over 5 months. This simple manipulation had a significant effect on the way other site users subsequently rated the comments.
An initial positive rating on a comment tended to have a snowball effect, encouraging further positive ratings. The first viewer of a comment that the researchers had rated positively was 32 per cent more likely to add their own positive rating, as compared with the control condition. In contrast, there was no effect on the likelihood of a negative rating being given. Five months later, these effects had accumulated so that comments given an initial positive rating by the researchers ended up with a 25 per cent higher average rating than control comments, and they were more likely to end up with an exceptionally high average rating score. These positive herding effects were found for comments in the politics, culture and business categories of the site, but not in economics, IT, fun or general news.
The situation was different for comments given an initial negative rating by researchers. These were more likely than control comments to receive both positive and negative ratings from other users. These effects cancelled out so that in the long run, comments given an initial negative rating ended up with average ratings that were no different from control comments. This was true across all subject categories.
Muchnik and his colleagues think the effects they observed are due to two underlying mechanisms - opinion change (users with little history of rating a given commenter were more likely than usual to give a positive rating to a comment that the researchers had rated positively), and increased turnout (i.e. seeing that a comment had already been rated tended to encourage users to add their own ratings, positive and negative). These mechanisms combined with a general trend towards positivity on the site - that is, positive ratings were made more often than negative ones overall. "Our findings suggest that social influence substantially biases rating dynamics in systems designed to harness collective intelligence," the researchers said.
The rating of comments on a news-sharing site is a microcosm of the effects of social influence that play out constantly in the way we respond to news and culture through the prism of other people's reactions. It will be interesting to see if future research in other contexts replicates the positive snowball effect reported here, as well as the corrective response to initial negative ratings.
It will also be valuable to examine why the positive snowball effect was found for some topics but not others. In reality, of course, many comments, products and performances really are exceptional, dire or distasteful, and so another avenue of research will be to see how social influence interacts with objective quality to shape people's reactions.
Muchnik, L., Aral, S., & Taylor, S. J. (2013). Social Influence Bias: A Randomized Experiment. Science. DOI: 10.1126/science.1240466
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.