In defense of Facebook’s newsfeed study

[Commentary] Did Facebook overstep its bounds when it ran a secret psychological experiment on a fraction of its users in 2012? That's the question at the heart of the most recent Internet firestorm.

The consensus is that Facebook probably did something wrong. But what, exactly? To say this is one more example of Facebook prioritizing power over privacy is to vastly oversimplify what's going on here.

The reality is that people are objecting for a lot of reasons. Whatever your gut feelings about Facebook, don't give in to them. Yet.

There has been a vigorous yet healthy debate about the convergence of business and academic research, and about whether Facebook acted irresponsibly or unethically with its users' data. To understand these objections, let's unpack some of the charges being lobbed at the social network. Call it a taxonomy of Facebook critiques.

It used people's data for an academic study.

It manipulated people's newsfeeds to make them happy or sad.

The study made it past an institutional review board. How?

The IRB looked at the results of Facebook's data analysis and gave it the green light, but evidently didn't consider how Facebook acquired the data in the first place. Was that an ethical lapse? If Facebook were an arm of the government or a federally funded academic institution, then yes: research conducted in those environments on human subjects requires an IRB's approval. But as a private entity, Facebook isn't legally bound by those requirements, nor, apparently, was the study itself.

People should've been given the opportunity to opt in or out.

It's creepy.
