Study in which Facebook manipulated the news feeds of almost 700,000 users for psychological research sparks controversy.
A new study of ‘emotional contagion’, which manipulated the Facebook news feeds of 689,003 users for a week in 2012, has been widely criticised over ethical and privacy concerns.
Facebook users who ‘took part’ in the experiment did not give their consent, either before or after their news feeds were manipulated.
In fact, you may have taken part without knowing anything about it.
News feed manipulation
Working with researchers at Cornell University and the University of California, San Francisco, Facebook subtly adjusted the types of stories that appeared in users’ news feeds for that week (Kramer et al., 2014).
Some people saw content that was slightly more emotionally negative, while others saw content that was slightly more emotionally positive.
The researchers then tracked what kinds of status updates those people went on to post.
One of the study’s authors, Jeff Hancock, explained the results:
“People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates.
When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in people’s status updates.”
In other words: positive and negative emotions are contagious online.
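The measurement itself was simple word counting: each status update was scored by the fraction of positive and negative words it contained (the published study used the LIWC word lists for this). Here is a minimal Python sketch of that kind of scoring, using tiny made-up lexicons rather than LIWC, just to show the shape of the analysis:

```python
# Toy illustration of word-count sentiment scoring. The real study used the
# LIWC word lists; these tiny lexicons are invented for demonstration only.

POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def emotion_scores(status: str) -> tuple[float, float]:
    """Return the fraction of positive and negative words in one status update."""
    words = status.lower().split()
    if not words:
        return 0.0, 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return pos / len(words), neg / len(words)

updates = [
    "Had a wonderful day, love this weather!",
    "Feeling sad and lonely tonight.",
]
for text in updates:
    print(text, emotion_scores(text))
```

The experiment then compared these fractions between people whose feeds had some positive content withheld and people whose feeds had some negative content withheld.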
The study echoes one conducted recently by Coviello et al. (2014), which found that positive emotions are more contagious than negative ones.
The earlier study, though, while also conducted on Facebook in a somewhat similar way, did not manipulate users’ news feeds; instead it exploited random variations in the weather to create a natural experiment:
“…they needed something random which would affect people’s emotions as a group and could be tracked in their status updates — this would create a kind of experiment.
They hit upon the idea of using rain, which reliably made people’s status updates slightly more negative.” (Happiness is Contagious and Powerful on Social Media)
Was it right?
There are all sorts of conversations going on about whether or not this experiment was ethically sound.
Media outlets have been scrambling to work out what the various rules have to say about it.
Was the study’s ethical procedure correctly reviewed? Did Facebook break its terms of service?
But let’s just forget the rules for a moment and use our brains:
- Was this study likely to do anyone any harm? Highly unlikely. Psychologists measure the influence of a manipulation using an ‘effect size’. In this study it was d = 0.001. Trust me, this is beyond minuscule (see the quick illustration after this list).
- Should users have been told they’d taken part in an experiment afterwards? Yes, it would have been a nice courtesy — and most people would probably have been fine with it.
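To give a sense of scale: Cohen’s d expresses the difference between two group means in units of their pooled standard deviation, so d = 0.001 means the groups differed by one-thousandth of a standard deviation. A quick back-of-the-envelope calculation in Python, with invented numbers that are not taken from the paper:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: difference between two group means in pooled-SD units."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical numbers: suppose status updates are about 5% positive words on
# average, with a standard deviation of 2 percentage points.
d = cohens_d(mean1=5.002, sd1=2.0, n1=100_000,
             mean2=5.000, sd2=2.0, n2=100_000)
print(round(d, 4))  # 0.001 -- a shift of two-thousandths of a percentage point
```

On those assumed numbers, a d of 0.001 corresponds to a difference in emotional word use far too small for any individual user to notice.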
Big data
People are jumping on this story because of concerns about what others are doing with our data, especially big corporations and governments.
Take Facebook itself: many people don’t realise that Facebook is already manipulating your news feed.
The average Facebook news feed has 1,500 items vying for a spot in front of your eyeballs.
Facebook doesn’t show you everything, so it has to decide what stays and what goes.
To do this it uses an algorithm that manipulates your news feed in ways far less transparent than this experiment.
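Facebook’s actual ranking system is proprietary, but the general idea behind any such filter is straightforward: score each candidate story and keep only the highest-scoring ones. A toy sketch, with entirely made-up signals and weights that are not Facebook’s:

```python
from dataclasses import dataclass

@dataclass
class Story:
    author_affinity: float   # how often you interact with this friend
    engagement: float        # likes and comments the post has attracted
    age_hours: float         # how old the post is

def score(story: Story) -> float:
    """Invented relevance score: favour close friends and fresh, popular posts."""
    return story.author_affinity * 2.0 + story.engagement - 0.1 * story.age_hours

def build_feed(candidates: list[Story], limit: int = 300) -> list[Story]:
    """Keep only the top-scoring stories out of the ~1,500 candidates."""
    return sorted(candidates, key=score, reverse=True)[:limit]
```

The experiment simply tweaked which emotional content survived this kind of filtering for a week; the filtering itself happens every day.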
Professor Susan Fiske of Princeton University, who edited the article for the academic journal it was published in (PNAS), told The Atlantic:
“I was concerned until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”
But should our justifiable concerns about being spied on, manipulated and exploited stop researchers conducting a harmless and valuable psychology experiment?
Final word goes to the study’s lead author, Adam Kramer, a Data Scientist at Facebook, who was moved to apologise:
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.
At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.
We didn’t clearly state our motivations in the paper.”
Image credit: Dimitris Kalogeropoylos