Who’s afraid of a big bad algorithm?

Author: Nikki Usher
Coverage Type: Commentary

[Commentary] “If you use Facebook to get your news,” Caitlin Dewey wrote recently in The Washington Post, “please—for the love of democracy—read this first.” Dewey’s central claim is that because most millennials now get their news from Facebook, and Facebook’s algorithm dictates what we see based in part on our own biases, millennials in particular will never get a full picture of the news. Even worse, these algorithms are controlled by giant corporations out to make money, and they seem entirely unaccountable to the public. But many current arguments about the dangers of algorithms oversimplify how they work.

First, algorithms are made by people, which makes them more sophisticated than we might assume; that built-in human influence also provides a layer of quality control that critics tend to ignore. Second, a good algorithm will show you what you want to read, but it will also continually refine its suggestions, introducing new content that lies outside your immediate interests.
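To make that second point concrete, here is a minimal, hypothetical sketch of a feed ranker that mostly serves stories matching a reader’s past interests but reserves a few slots for topics the reader has never engaged with. This is not Facebook’s actual system; the story list, topic affinities, and the `explore_rate` parameter are invented for illustration.

```python
import random

def rank_feed(stories, interest_scores, explore_rate=0.2, feed_size=10, seed=None):
    """Build a feed that mostly matches a reader's interests but reserves
    a share of slots for stories outside them (exploration).

    stories: list of (story_id, topic) tuples
    interest_scores: dict mapping topic -> affinity between 0 and 1
    explore_rate: fraction of the feed set aside for unfamiliar topics
    """
    rng = random.Random(seed)
    familiar = [s for s in stories if interest_scores.get(s[1], 0) > 0]
    unfamiliar = [s for s in stories if interest_scores.get(s[1], 0) == 0]

    # Exploit: rank familiar stories by how strongly they match past behavior.
    familiar.sort(key=lambda s: interest_scores[s[1]], reverse=True)

    # Explore: hold back at least one slot for topics the reader never clicks on.
    n_explore = max(1, int(feed_size * explore_rate)) if unfamiliar else 0
    feed = familiar[: feed_size - n_explore]
    feed += rng.sample(unfamiliar, min(n_explore, len(unfamiliar)))
    rng.shuffle(feed)
    return feed


if __name__ == "__main__":
    stories = [("a1", "politics"), ("a2", "sports"), ("a3", "science"),
               ("a4", "politics"), ("a5", "local"), ("a6", "arts")]
    interests = {"politics": 0.9, "sports": 0.4}
    print(rank_feed(stories, interests, feed_size=4, seed=1))
```

Even in this toy version, the reader reliably sees a story from a topic they have shown no prior interest in, which is the kind of deliberate variety the simplest critiques of news algorithms leave out.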

[Nikki Usher is an assistant professor at George Washington University's School of Media and Public Affairs.]

