Why Academics Are Incensed By Facebook's Emotion-Manipulating Social Experiment


For a long time, Facebook operated under a blunt motto: "Move fast and break things." Acting on this mantra has a tendency to upset Facebook's change-averse user base -- and sometimes, Facebook doesn't even have to break anything.

Recently, New Scientist revealed that Facebook's data team manipulated the news feeds of 689,003 users for one week in January 2012. Using an algorithm, Facebook and researchers at Cornell University either removed all positive posts or all negative posts from a user's news feed.

The joint study was published in the Proceedings of the National Academy of Sciences. The Internet was not pleased. Even though the sample size was relatively small, Slate, for example, called the research "unethical."

Comments on Metafilter filled a spectrum ranging from "meh" to furious: "I don't remember volunteering to participate in this. If I had tried this in graduate school, the institutional review board would have crapped bricks."
