Good news, guys! If you’ve been feeling sadder than usual, you can probably blame Facebook for that. The bad news is that this time, those blues may have been the result of being a guinea pig in its massive, ethically questionable social experiment. The title of the study, ‘Experimental evidence of massive-scale emotional contagion through social networks,’ is not even the creepiest part.
Here’s what happened one fine week in 2012: Facebook data scientists manipulated about 700,000 users’ News Feeds to show specific kinds of content. Some users saw posts with mostly happy, positive language, while others saw posts with sadder words. By week’s end, those users were more disposed to posting happier or sadder content themselves, corresponding to the kind they had been shown. Meaning the study was a huge success! Congrats, Facebook; I hope there’s a Nobel Prize in the category of moral depravity.
Right, so the study apparently proves that “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” Upon first reading that conclusion, I actually felt like it was kind of obvious. Duh, when I see something sad, I feel sad too. ‘Emotional contagion’ sounds like a fancy way of saying ‘empathy.’ But it’s the fact that Facebook now knows they have the power to subliminally control our emotions that’s really scary.
Well, this study at least makes them think they have that power. There seem to be some flaws in the experiment’s design, according to John Grohol, Psy.D., founder of the Psych Central website. He takes issue with the tool used to detect positive and negative words in Facebook posts, which produces rather counterintuitive results. For example, the tool would give the sentence “I am not having a great day” one positive point and one negative point, for the words ‘great’ and ‘not’ respectively. A real human familiar with emotions would probably not give this sentence any positive points.
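To see why that happens, here’s a minimal sketch of how a naive word-list counter of that sort would score a post. The word lists below are tiny, made-up stand-ins for illustration, not the actual dictionaries the study’s tool used:

```python
# Toy word-list sentiment counter, illustrating the flaw Grohol describes.
# The word lists are hypothetical stand-ins, not the real tool's dictionaries.
POSITIVE_WORDS = {"great", "happy", "good", "love"}
NEGATIVE_WORDS = {"not", "sad", "bad", "hate"}  # note: 'not' counted as a negative word


def score(post: str) -> tuple[int, int]:
    """Return (positive_count, negative_count) by naive word matching,
    ignoring context and negation entirely."""
    words = post.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    return pos, neg


print(score("I am not having a great day"))  # → (1, 1): one point each for 'great' and 'not'
```

Because each word is scored in isolation, “not having a great day” registers as partly positive, which is exactly the kind of counterintuitive result Grohol points to.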
The findings of the experiment, valid or not, aren’t as disturbing as Facebook’s legal ability to conduct it. Whenever you agree to those lengthy terms and conditions, you’re consenting to be used in research like this. Of course, the study did have to be approved by an institutional review board, since the experiment involved human subjects. According to the study’s editor, Susan Fiske, the IRB approved the study, “…apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”
Well. That’s just great.