The Cambridge Analytica Scandal: Facebook, Privacy, and Democracy

You’re reading the Benton Foundation’s Weekly Round-up, a recap of the biggest (or most overlooked) telecommunications stories of the week. The round-up is delivered via e-mail each Friday.

Round-Up for the Week of March 19-23, 2018

Robbie McBeath

Cambridge Analytica -- a political consulting firm that combines data mining, data brokerage, and data analysis with strategic communication for the electoral process -- harvested, without consent, millions of U.S. Facebook users’ profiles to build a powerful software program to influence the 2016 U.S. election. The revelation this week put Facebook in the hot seat, reigniting the debate over how the company handles user privacy.

What Has Been Reported

Over the weekend, a former Cambridge Analytica employee revealed that the company had harvested millions of Facebook profiles and used them to build a powerful software program that developed psychological profiles of voters. Those voters were then targeted with personalized political advertisements designed to predict and influence their choices at the ballot box.

Cambridge Analytica got its data through researcher Aleksandr Kogan, a Russian-American who worked at the University of Cambridge. In 2013, Kogan developed a third-party app (a Facebook quiz called “thisisyourdigitallife”) which not only collected data from the people who took the quiz (270,000), but also exploited a loophole in Facebook’s application programming interface (API) that allowed Kogan to collect data from the quiz takers’ Facebook friends as well (50 million) -- without those friends knowing.

Below are some of the more notable revelations:

  • According to the New York Times, the breach allowed Cambridge Analytica to exploit the private social media activity of a huge swath of the American electorate, developing techniques that underpinned its work on President Trump’s campaign in 2016.

  • In 2014, to prevent abusive apps, Facebook changed its platform to dramatically limit the data apps could access: apps like Kogan's could no longer ask for data about a person's friends unless their friends had also authorized the app. Facebook also required developers to get approval from the company before they could request any sensitive data from people. In theory, these actions would prevent any app like Kogan's from being able to access so much data today.

  • In 2015, Facebook found out that information had been harvested on an unprecedented scale. However, at the time it failed to alert users and took only limited steps to recover and secure the private information of more than 50 million individuals.

  • On March 16, Facebook announced that it was suspending Cambridge Analytica from the platform, pending further information about the misuse of data.

Zuckerberg Responds

For five days, Facebook CEO Mark Zuckerberg remained what many called “deafeningly silent” before finally posting a response on his personal Facebook page. In it, he announced changes to the platform intended to protect users’ data:

First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused as well.

Second, we will restrict developers' data access even further to prevent other kinds of abuse. For example, we will remove developers' access to your data if you haven't used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days.

Third, we want to make sure you understand which apps you've allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you've used and an easy way to revoke those apps' permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.

Not everybody was convinced of Zuckerberg’s sincerity. Nicholas Proferes, writing for Slate, said that Zuckerberg has issued numerous mea culpas before and has them down to a formula: 1) Acknowledge, 2) Diffuse blame, 3) Make the problem manageable, 4) Empower users, and 5) Invoke personal care. This week’s response follows the pattern. When asked if Zuckerberg’s statement satisfied him, Senator Richard Blumenthal (D-CT), who sits on the Senate Judiciary Committee, said, “In no way did it satisfy me.”

In an interview this week, Zuckerberg indicated he would be willing to testify to Congress "if it is the right thing to do." Many in Congress believe it is the right thing to do and called for Zuckerberg to do just that. It was also reported that Christopher Wylie, the Cambridge Analytica whistleblower, plans to give an interview to Democrats on the House Intelligence Committee in response to an invitation from Ranking Member Adam Schiff (D-CA).

Have No Fear, the…

On March 20, the Federal Trade Commission opened an investigation into Facebook, reportedly the most substantial political and legal threat the company has faced yet. The inquiry could result in the U.S. government slapping Facebook with massive fines ($40,000 for each violation -- which, theoretically, could add up to $2 trillion).

At issue for Facebook -- and at the heart of the FTC probe -- is a settlement the company reached with the agency in November 2011, ending an investigation into charges that Facebook deceived users about the privacy protections they are afforded on the site. According to the Washington Post:

Among other requirements, the resulting consent decree mandated that Facebook must notify users and obtain their permission before data about them is shared beyond the privacy settings they have established. It also subjected Facebook to 20 years of privacy checkups to ensure its compliance.

Recently, though, former FTC officials have said that Facebook’s entanglement with Cambridge Analytica may have violated the company's legal agreement with the federal watchdog agency. Whistleblowers contend that Cambridge Analytica collected information about users and their friends under a since-discontinued policy governing third-party apps on Facebook -- and then kept that data even after Facebook asked that it be deleted.

When Facebook signed the FTC consent decree, its yearly revenue was $3.7 billion. In 2017, it was more than 10 times higher.

On March 20, FTC Commissioner Terrell McSweeny issued the following statement:

The FTC takes the allegations that the data of millions of people were used without proper authorization very seriously. The allegations also highlight the limited rights Americans have to their data. Consumers need stronger protections for the digital age such as comprehensive data security and privacy laws, transparency and accountability for data brokers, and rights to and control over their data.

The FTC, however, has been operating with only two commissioners on its five-member panel since last February, as the confirmation process for a new slate has been repeatedly delayed. [See: The Trump FTC and the Internet] [Furthermore, a Bloomberg article from January is headlined: New FTC Nominees Lack Consumer Privacy, Security Experience. Oh boy.]

Privacy and Facebook’s Business Model

Senator Blumenthal said this week, “Facebook bears a responsibility to safeguard privacy. It has a trust and probably a legal as well as moral obligation.” But the social media giant’s business model is built on monetizing personal data for targeted advertising: advertising accounted for more than 98% of Facebook's total revenue in 2017. As David Pierson noted for the Los Angeles Times, “Exploiting Facebook data to influence voters? That’s a feature, not a bug, of the social network.”

In an op-ed in The Guardian, Open Markets Institute fellows Barry Lynn and Matt Stoller made the case that “America’s Facebook problem” has “endangered basic democratic institutions.” They wrote:

What makes Facebook’s apparent mishandling of data so deeply dangerous is the corporation’s power and reach over both the distribution and generation of news and information. Facebook is the leading way that most Americans get their news. In 2017 it accounted for 26% of external traffic referral to the websites of news publishers. In tandem with Google, the figure is 70%.

It is this bottlenecking of the news that makes Facebook such a tempting target for data miners. It is this bottlenecking that has also enabled Facebook to steer readers to certain publishers and away from others, in ways that increase Facebook’s earnings but can seriously weaken or even bankrupt well-established newspapers and magazines.

To make a tough situation worse, Facebook exploits this bottleneck to divert billions in advertising revenues away from trustworthy sources of news into its own coffers…

This immense power comes from Facebook’s status as a de facto monopoly.

Regulatory Proposals

Senator Blumenthal said, "The business models need not be the problem. It's the rules for those business models as to how privacy is protected. Information is made available all the time to marketing firms as well as retailers. And there are rules that apply to how that information is used and how products are marketed to them and the opt-out...I think Congress has a responsibility for making rules and overseeing them."

But what would that regulation look like?

Lynn and Stoller believe that, “Rather than simply carve away some of Facebook’s huge profits, the FTC should immediately move to restructure the corporation to ensure this now essential medium of communication really serves the political and economic interests of American citizens in the 21st century.” They list nine actions the next set of FTC commissioners could take, including imposing strict privacy rules (perhaps using Europe’s new General Data Protection Regulation as a guide), spinning off Facebook’s ad network, establishing a system to ensure the transparency of all political communications on Facebook, and determining whether Facebook violated the 2011 consent decree and, if so, seeking court sanctions.

In Bloomberg, Paul Ford called for a “Digital Protection Agency.” “Its job would be to clean up toxic data spills, educate the public, and calibrate and levy fines,” he wrote.

Washington Post tech columnist Geoffrey Fowler pointed out that, “Aside from a dramatic change of heart from founder Mark Zuckerberg, getting Facebook to reform what data it collects and how it uses it requires destabilizing its business. And that boils down to this: Making Facebook an unreliable or expensive way for marketers to reach us.” Fowler contends this could be done through consumer activism or government action.

Senator Mark Warner (D-VA) has been advocating for the Honest Ads Act, which would improve the transparency of online political advertisements.  But Facebook has been lobbying against it. Stoller and Lynn write:

Facebook’s power also comes from its willingness to stiff-arm policymakers. Despite proof that Facebook was an important tool of Russian hackers seeking to disrupt American democracy, the corporation has aggressively lobbied against the Honest Ads Act, a simple disclosure proposal put forward by Senators Amy Klobuchar (D-MN) and Mark Warner. For years, Facebook has lobbied successfully in state capitals all over the country to block privacy rules, such as on facial recognition technologies.

The Federal Election Commission (FEC) in March proposed new rules that would strengthen disclosure requirements for certain political ads on platforms like Facebook and Google. The commission voted 4-0 to open a 60-day public comment period and scheduled a June 27 public hearing on the matter.


Privacy is a necessary component of self-government. Citizens must be able to communicate, to collect and disseminate information, to learn, and to contemplate without constant public scrutiny.

This week, we learned that the privacy of millions of citizens was violated and used for political gain. Christopher Wylie offered this perspective:

Instead of standing in the public square and saying what you think, and then letting people come and listen to you and have that shared experience as to what your narrative is, you are whispering into the ear of each and every voter, and you may be whispering one thing to this voter and another thing to another voter. We risk fragmenting society in a way where we don’t have any more shared experiences, and we don’t have any more shared understanding. If we don’t have any more shared understanding, how can we be a functioning society?

This story ain't ending this week. Be sure to follow along in Headlines.

Quick Bits

Weekend Reads (resist tl;dr)

ICYMI from Benton

Upcoming Events


By Robbie McBeath.