Facebook secretly manipulated the feelings of nearly 700,000 users to study “emotional contagion,” an experiment that prompted anger and put the social network giant on the defensive.
For one week in 2012, Facebook tampered with the algorithm used to place posts into users’ news feeds to study how this affected their mood, all without their explicit consent or knowledge.
The researchers wanted to see if the number of positive or negative words in messages the users read determined whether they then posted positive or negative content in their status updates.
The study, conducted by researchers affiliated with Facebook, Cornell University and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
Word of the study spread, along with anger and disbelief, when the online magazine Slate and The Atlantic’s website wrote about it Saturday.
“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study authors wrote.
“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
While other research has used metadata to study trends, this experiment appears to be unique in that it deliberately manipulated what users saw to see whether they reacted.