Science in the Media

The Facebook Emotion Experiment

Facebook has been messing with our emotions. Primarily, it seems, by publishing the results of secret experiments where they tried to mess with our emotions. In the month since the social network published its study on “massive-scale emotional contagion”, there has been an online avalanche of comment, analysis and argument, both in the news and in our News Feeds. Reaction has ranged from mildly unnerved to angrily indignant, but now that some of the dust has settled, what actually happened? And (more importantly) what can we say about the science communication?

If you haven’t yet read the paper, you can find it here. In brief: The authors (a Facebook data scientist and two Cornell University academics) claimed in PNAS that they were able to influence the emotions of 689,003 Facebook users by manipulating the emotional content of their News Feeds. The scandal was that none of the participants were told about the experiment.

A study like this raises a lot of ethical questions, which I won’t go into in much detail here. The backlash revolves around the issue of informed consent. Should Facebook have asked users before running this experiment? According to the PNAS editor, the local Institutional Review Board (IRB) waived consent “on the grounds that Facebook apparently manipulates people’s News Feeds all the time”. Some also point out that this kind of testing happens online all the time without our consent, in the form of A/B testing, and that this isn’t much different. But others argue that this was an experiment published in a scientific journal in which the authors deliberately set out to manipulate people’s emotional states, so informed consent is definitely required.

Personally, I think Facebook acted unethically – there’s a difference between using A/B testing to figure out the most popular version of a web page and running an experiment which directly aims to influence people’s emotions. Advertisements and politicians try to influence our emotions all the time, but we are aware of their intentions and we expect it. The fact that these experiments probably happen behind closed doors at other online corporations doesn’t make them ethical either. To me, it seems like Facebook got to have their cake and eat it – they were exempt from requiring informed consent in the first place because they are a private company, but they still got to publish their findings in a scientific journal, where these ethical considerations should apply. Something doesn’t add up.

It’s interesting to look at how the experiment was communicated to the public at different stages. My first exposure to the story was when it was already in the headlines. The results of the study were certainly overblown in a lot of this coverage, and in the process a lot of the science got drowned out. Many of the concerns that emerged initially seem to have been knee-jerk reactions from people who hadn’t read the paper properly (sigh). For one thing, there’s a big difference between what people post and how they actually feel – a point that was neglected by many, including the authors of the study. Compare, for example, a line from this Slate article, “Facebook intentionally made thousands upon thousands of users sad”, or this tweet, which calls it “Facebook’s transmission of anger experiment”, with a public comment from author Adam Kramer: “the result was that people produced an average of one fewer emotional word, per thousand words, over the following week”. Although I’m not defending the experiment, Kramer’s statement puts the results into perspective.

When the study was first published, however, the coverage was very different from the “WTF, Facebook is manipulating our emotions” angle. This early article from New Scientist has a much more banal take on the story. It’s just your common-or-garden technopsychosocial news piece – look at how emotions can be transmitted online, isn’t that neat? There was almost a progression, from polite interest, to slightly creeped out, to full-blown outrage as the story gained traction. This point is made in this article, where the author suggests that the contrast in reactions can be explained by differences in the way the situation was “framed” by different parties – an intriguing idea. As an aside, from a Science Communication perspective, the experiment seems to have undermined the public’s trust in Facebook far more than their trust in the scientists, even though the latter may have helped to design the experiment in the first place rather than just analysing an already existing dataset.

Going back even earlier, to the wording of the actual paper, it’s perhaps possible to see why the whole thing ended in disaster. The use of the phrase “emotional contagion”, which sounds like some kind of Orwellian mind-control tactic, probably didn’t help. It reads like an exercise in emotional manipulation for its own sake, with little concern for the subjects involved. Compare this to the public statement made on Kramer’s Facebook wall: “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.” In fact, the experiment found the opposite result – seeing positive posts led to positive posting.

This certainly seems like a more honourable motive for conducting such a study. I have my doubts as to whether this was really the reason Facebook undertook the research, but what if they had framed the whole experiment in this more altruistic way? Would the reaction have been as violent? I for one don’t think it would, and Facebook seems to regret not doing so. When Kramer and Sheryl Sandberg (Facebook’s Chief Operating Officer) issued apologies, it wasn’t for the research itself, but for how the research was communicated. Not much of an apology if you ask me, and it seems a bit rich to say “we never meant to upset anyone” after running an experiment that tried to induce negative emotions in hundreds of thousands of people. But it does give us some food for thought: given the fallout from this research, would people have cared as much if it had been communicated differently?

Anand Jagatia is currently studying for an MSc in Science Media Production at Imperial College London.
Image credit: Juan Ignacio Sánchez Lara (via Flickr).