Is Facebook a 'Bug' in Our Democracy? Part 3
Bug: noun | \ ˈbəg \ Selected definitions:
An error in a computer program or system.
A concealed listening device.
A microorganism (such as a bacterium or virus) especially when causing illness or disease.
An unexpected defect, fault, flaw, or imperfection.
A sudden enthusiasm.
Is it time to recognize that Facebook, and ‘Big Tech’ at large, may be a bug in our democracy? In the wake of the Facebook-Cambridge Analytica revelations, what policy solutions could best help our democratic discourse in the Digital Age?
I. Is Facebook Accountable?
Although this may all seem entirely new, the question of how to handle Facebook's effects is part of a long debate over the media's role in our democracy.
In the early 1940s, for example, the Hutchins Commission was established to explore mounting problems facing the press. Time magazine founder Henry Luce and University of Chicago Chancellor Robert Hutchins feared that “low” journalism -- newspapers and the so-called “pulp press” of mass culture and society -- was inviting government intervention, owing to the rapidly increasing concentration of media power in fewer and fewer hands, the failure of those few to provide adequate service, and the perception of irresponsible behavior by journalists and media owners. Sound at all familiar?
Luce, Hutchins, and most members of the Commission on Freedom of the Press also believed First Amendment freedoms were increasingly threatened by newly formed totalitarian regimes in key global positions.
The Hutchins Commission on Freedom of the Press met 17 times over two years, interviewing 58 witnesses. The Commission’s report, A Free and Responsible Press, was released on March 26, 1947. The panel said freedom of the press implied a negative freedom "from" what it called "external compulsions" but not the "pressures" necessary for robust public discourse. However, the First Amendment also meant the press had a positive freedom -- "for making its contribution to the maintenance and development of a free society." The Commission said newspapers should redefine themselves as "common carriers of public discussion" by providing a:
Truthful, comprehensive account of the day's events in a context which gives them meaning;
Forum for the exchange of comment and criticism;
Means of projecting the opinions and attitudes of the groups in a society to one another; and
Way of reaching every member of the society by the currents of information, thought, and feeling which the press supplies.
Among the report's main findings is the notion that the press plays an important role in the development and stability of modern society and that, as such, a commitment to social responsibility must be imposed on mass media.
“No democracy will indefinitely tolerate concentration of private power, irresponsible and strong enough to thwart the democratic aspirations of the people. If these giant agencies of communication are irresponsible, not even the First Amendment will protect their freedom from government control," the report reads.
Given that Facebook is partly responsible for the erosion of journalism and now has a huge influence on our media ecosystem, should it be held to a commitment of social responsibility? If that answer is “Yes,” Facebook has done a poor job in handling its responsibilities. “Facebook has so far shirked the traditional social responsibilities—lacking as they often are—of news media publishers within a democratic society,” notes University of Pennsylvania Professor Victor Pickard.
II. Catching the Regulatory Bug
5. Bug; n., A sudden enthusiasm.
So what’s the appropriate regulatory response?
Facebook has promised to change some of its practices before, but has not delivered much substantive action. In November 2011, Facebook settled with the Federal Trade Commission over charges that it “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.”
The FTC settlement did not levy any fines at the time and did not accuse Facebook of intentionally breaking the law. But the settlement barred Facebook from making any further deceptive claims, required the company to get consumers' approval before changing the way it shares their data, and required the company to obtain periodic assessments of its privacy practices from independent, third-party auditors for the next 20 years.
Now, in the aftermath of the Cambridge Analytica story, two former FTC officials who crafted the landmark consent decree say the company may have violated that decree when it shared information from tens of millions of users with Cambridge Analytica in 2014. On March 26, 2018, the FTC confirmed it had opened an investigation of Facebook. If the FTC finds violations occurred, the agency could fine Facebook, said former FTC Bureau of Consumer Protection Director David Vladeck. Columbia Law professor and former FTC official Tim Wu said, “Every single violation is punishable by a $40,000 fine. So it could be billions of dollars in damages if the FTC decides to police this very aggressively.”
Facebook’s broken privacy promises are one indication that it cannot be trusted to regulate itself. Facebook’s business model may make self-regulation impossible.
“The problems are central and structural, the predicted consequences of its business model,” said Wu. “Facebook, at its core, is a surveillance machine, and to expect that to change is misplaced optimism.” Wu concludes:
If we have learned anything over the last decade, it is that advertising and data-collection models are incompatible with a trustworthy social media network. The conflicts are too formidable, the pressure to amass data and promise everything to advertisers is too strong for even the well-intentioned to resist.
Could consumer pressure change Facebook? Possibly, though it may not be enough to lead to substantive changes. The sheer size of Facebook makes it hard for consumer protests to be effective. Moreover, University of Virginia Professor Siva Vaidhyanathan notes:
Quitting Facebook lets Google and Twitter off the hook. It lets AT&T and Comcast and its peers off the hook. The dangers of extremist propaganda and hate speech are just as grave on YouTube, which is owned by Google. Russian agents undermining trust in institutions and democracy are even more visible on Twitter. And every major telecommunications firm, as well as Google and Twitter, relies on surveillance systems similar to the one Facebook uses to run targeted advertising. Facebook is bigger and better at all of this than the others, but its problems are not unique.
If the people who care the most about privacy, accountability and civil discourse evacuate Facebook in disgust, the entire platform becomes even less informed and diverse. Deactivation is the opposite of activism.
“Facebook does not care about one among 2.2 billion users. Act as a citizen, not a Facebook user. Demand regulation,” Vaidhyanathan tweeted.
III. Policy Proposals
The Federal Communications Commission balances the commercial interests of media operators with the public interest when considering access to public airwaves, accessibility for people with disabilities, effective communications during emergencies, or the deployment of communication technologies -- like broadband -- to every citizen. The FCC has a history of assigning companies public interest responsibilities to make sure that the public's interests are served.
But the FCC was set up to regulate communications networks, not the companies that operate over the networks. The Federal Trade Commission’s mission is to protect consumers by preventing anticompetitive, deceptive, and unfair business practices. The FTC also enhances informed consumer choice by promoting competition.
FTC enforcement of privacy protections is certainly one way to respond to Facebook. But lawmakers are offering additional proposals to address the issues raised by the Facebook-Cambridge Analytica revelations.
On April 10, Senators Edward Markey (D-MA) and Richard Blumenthal (D-CT) introduced a privacy bill of rights to protect the personal information of American consumers. The Customer Online Notification for Stopping Edge-provider Network Transgressions (CONSENT) Act would require the FTC to establish privacy protections for customers of online edge providers like Facebook and Google. Specifically, the CONSENT Act requires edge providers (like Facebook) to:
Obtain opt-in consent from users to use, share, or sell users’ personal information;
Develop reasonable data security practices;
Notify users about all collection, use, and sharing of users’ personal information; and
Notify users in the event of a breach.
The FTC would be responsible for CONSENT Act enforcement.
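The heart of the CONSENT Act is the shift from opt-out to opt-in: sharing would default to "off" until a user affirmatively grants permission for a stated purpose. A minimal sketch of that default can make the distinction concrete. All names here (`UserRecord`, the purpose strings) are invented for illustration; the bill does not specify any implementation.

```python
# Hypothetical sketch of an opt-in consent model like the one the
# CONSENT Act would require. Sharing defaults to "off"; only an
# explicit grant for a specific purpose permits it -- the reverse
# of today's common opt-out defaults. All names are invented.

class UserRecord:
    def __init__(self, user_id):
        self.user_id = user_id
        self.consents = {}  # purpose -> bool; absence means "never granted"

    def grant(self, purpose):
        """Record an affirmative, purpose-specific opt-in."""
        self.consents[purpose] = True

    def may_share(self, purpose):
        # Opt-in: anything not explicitly granted is denied.
        return self.consents.get(purpose, False)

user = UserRecord("u123")
print(user.may_share("ad_targeting"))   # False: no consent recorded yet
user.grant("ad_targeting")
print(user.may_share("ad_targeting"))   # True only after explicit opt-in
print(user.may_share("data_resale"))    # still False: consent is per purpose
```

The design choice that matters is the default in `may_share`: under opt-out, a missing entry would return `True`; under opt-in, it returns `False`.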
Former Federal Communications Commission Chairman Tom Wheeler is looking at the algorithms of social media companies. “Technology and capitalism have combined to deliver us to a decidedly undemocratic outcome. The internet was once heralded as the great democratizing tool. That vision was smashed by the algorithms of the social media platforms,” he wrote in November 2017. “Algorithms got us into this situation. Algorithms must get us out.”
“Today, the software algorithms that create social media news feeds are black boxes; we have no idea what goes into them or what comes out and why…. Wael Ghonim has proposed opening that input/output information as ‘public interest APIs’ (Application Programming Interface), a common software practice that allows third-party access to information.”
Using an open API, similar to how Google Maps operates, “it becomes possible to build public interest algorithms to monitor and report on the effects of social media algorithms...Openly available to all (and perhaps even a reference app on the social media platforms themselves), a public interest algorithm can provide awareness of and access to the information behind any posting. Such sunlight will not only expose any propaganda, but also will help independent evaluation of the veracity of the information being delivered.”
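To make the idea of a "public interest algorithm" less abstract, here is a minimal sketch of what one might do with the kind of transparency data Ghonim and Wheeler describe. The data format (fields like `targeting` and `funder`) is entirely hypothetical; no platform exposes such an API today, and this is an illustration of the concept, not a real integration.

```python
# Hypothetical "public interest algorithm" operating on feed data
# exposed through an imagined open transparency API. The field names
# ("targeting", "funder") are invented for illustration only.

def flag_undisclosed_political_ads(posts):
    """Return IDs of posts targeted on political criteria that carry
    no funding disclosure -- the kind of sunlight check an open API
    would let independent watchdogs run."""
    POLITICAL_KEYWORDS = {"election", "candidate", "ballot", "voter"}
    flagged = []
    for post in posts:
        targeting = set(post.get("targeting", []))
        is_political = bool(targeting & POLITICAL_KEYWORDS)
        undisclosed = post.get("funder") is None
        if is_political and undisclosed:
            flagged.append(post["id"])
    return flagged

# Example feed data in the assumed format.
feed = [
    {"id": "a1", "targeting": ["voter", "swing-state"], "funder": None},
    {"id": "a2", "targeting": ["sneakers"], "funder": "AcmeShoes"},
    {"id": "a3", "targeting": ["election"], "funder": "Honest PAC"},
]

print(flag_undisclosed_political_ads(feed))  # → ['a1']
```

The point of the sketch is Wheeler's: once the inputs and outputs are visible, even a simple third-party check can expose undisclosed propaganda, and independent evaluators can compete on the quality of such checks.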
An Internet Bill of Rights
Rep. Ro Khanna (D-CA), whose district falls in the heart of Silicon Valley, has called for an Internet Bill of Rights to “ensure net neutrality, protect citizens from warrantless government mass surveillance, and provide consumers with more control over their personal data.” His six principles for the Internet Bill of Rights are:
- Right to Universal Web Access;
- Right to Net Neutrality;
- Right to Be Free From Warrantless Metadata Collection;
- Right to Disclose Amount, Nature, and Dates of Secret Government Data Requests;
- Right to be Fully Informed of Scope of Data Use; and
- Right to be Informed When There Is Change of Control Over Data.
In 2016, the European Union passed the comprehensive General Data Protection Regulation, or GDPR. The details of the law are complex, but the GDPR can be (over)simplified to:
- Companies are required to collect and store only the minimum amount of user data to provide a specific, stated service.
- Companies must use clear and plain language to explain how they will use their users’ personal details. The companies must also provide information about what other kinds of entities users’ data will be shared with.
- Individuals have the right not to be subject to completely automated decisions which significantly affect them.
- People have the right to obtain a copy of the records that companies hold about them.
On April 4, 2018, Facebook CEO Mark Zuckerberg said that Facebook will voluntarily implement the GDPR everywhere. "We're going to make all the same controls and settings available everywhere, not just in Europe."
Facebook’s “de facto monopoly” status has led to a call for divestment. Professor Vaidhyanathan wrote, “The Department of Justice should consider severing WhatsApp, Instagram and Messenger from Facebook, much as it broke up AT&T in 1982. That breakup unleashed creativity, improved phone service and lowered prices. It also limited the political power of AT&T.”
Others see a solution beyond regulating Facebook as a monopoly. Professor Wu has suggested many “alternative” Facebook models to compete with Facebook:
Facebook should become a public benefit corporation. These companies must aim to do something that would aid the public, and board members must also take that public benefit into account when making decisions. Mark Zuckerberg has said that Facebook’s goals are ‘bringing us closer together’ and ‘building a global community.’ Worthy, beautiful goals, but easier said than done when Facebook is also stuck delivering ever-increasing profits and making its platform serve the needs of advertisers.
Wu expanded on this idea:
Another ‘alt-Facebook’ could be a nonprofit that uses that status to signal its dedication to better practices, much as nonprofit hospitals and universities do. Wikipedia is a nonprofit, and it manages nearly as much traffic as Facebook, on a much smaller budget. An ‘alt-Facebook’ could be started by Wikimedia, or by former Facebook employees, many of whom have congregated at the Center for Humane Technology, a nonprofit for those looking to change Silicon Valley’s culture. It could even be funded by the Corporation for Public Broadcasting, which was created in reaction to the failures of commercial television and whose mission includes ensuring access to telecommunications services that are commercial free and free of charge.
Others, such as Professor Zeynep Tufekci, have focused on establishing alternatives to Facebook that change the business model, enabling users to pay for the control over their privacy, or for companies to pay users for access to theirs. Again, Wu looks at how this might work:
One set of Facebook alternatives might be provided by firms that are credibly privacy-protective, for which users would pay a small fee (perhaps 99 cents a month). In an age of ‘free’ social media, paying might sound implausible — but keep in mind that payment better aligns the incentives of the platform with those of its users. The payment and social network might be bundled with other products such as the iPhone or the Mozilla or Brave browser.
Facebook as a… Utility?
Journalist Bruce Shapiro proposed treating Facebook as a utility. “We can declare it’s time for communication platforms to be recognized as essential utilities for modern society; and like other such utilities, they should be regulated, subject to robust public scrutiny and accountability,” he wrote.
In an unexpected twist, in July 2017, Steve Bannon began pushing to treat essential tech platforms as utilities, on the grounds that these tools have effectively become necessities of contemporary life and that certain qualities of these platforms lend themselves to natural monopoly.
Online Political Advertising
Another venue for potential reform is regulation of the online political advertising market. Facebook’s business model aligns the interests of advertisers and its platform. Disinformation operators, such as those seen in the 2016 election, often are indistinguishable from any other advertiser. New America fellows Dipayan Ghosh and Ben Scott, in Digital Deceit: The Technologies Behind Precision Propaganda on the Internet, discuss the importance of reform to online political advertising:
The central problem of disinformation corrupting American political culture is not Russian spies or a particular social media platform. The central problem is that the entire industry is built to leverage sophisticated technology to aggregate user attention and sell advertising. There is an alignment of interests between advertisers and the platforms. And disinformation operators are typically indistinguishable from any other advertiser. Any viable policy solutions must start here.
And some see Facebook’s election ad market as specifically capitalizing on the lack of online ad transparency and enforcement. Noam Cohen, in an op-ed for the New York Times, wrote:
What Facebook is selling to political campaigns is the same thing Uber is selling to its drivers and customers and what YouTube is selling to advertisers who hope to reach an audience of children — namely, the right to bypass longstanding rules and regulations in order to act with impunity….
Selling relief from government scrutiny of elections is a different kind of threat to the social fabric than selling relief from government scrutiny of commerce, especially in light of our country’s record of denying voting rights to African-Americans. Facebook can’t be allowed to be a tool for enemies of democracy because it fears that regulation could hurt its bottom line.
Senator Mark Warner (D-VA) has been advocating for the Honest Ads Act, which would improve the transparency of online political advertisements. In discussing his bill in regards to the Cambridge Analytica news, Sen. Warner said, “This story is more evidence that the online political advertising market is essentially the Wild West. Whether it’s allowing Russians to purchase political ads, or extensive micro-targeting based on ill-gotten user data, it’s clear that, left unregulated, this market will continue to be prone to deception and lacking in transparency.”
On April 10, Twitter pledged to support the bill. The company also launched a new platform called the Ads Transparency Center, or ATC, that will “go beyond the requirements of the Honest Ads Act and eventually provide increased transparency to all advertisements on Twitter.”
The Federal Election Commission (FEC) voted in March to launch a proceeding to explore new rules that would bump up disclosure requirements for certain political ads on platforms like Facebook and Google. The commission voted 4-0 to open a 60-day public comment period. The FEC has also scheduled a June 27 public hearing on the matter.
On April 6, 2018, Facebook announced, through a Zuckerberg post, that advertisers who want to promote their views on key political issues will have to verify their identity and location. These “issue ads” will also be labeled as “political ads” on the platform, along with information about who’s paying for their existence. In addition to advertisers needing to prove who they are, managers of “large” pages will have to do so, as well.
In his Senate testimony April 10, Zuckerberg expressed support for the Honest Ads Act.
Immediate policy proposals could go a long way to better protect our democratic discourse from the harms of Facebook and Big Tech. But we also need a long-term policy agenda.
“Our long-term agenda should be to bolster institutions that foster democratic deliberation and the rational pursuit of knowledge,” said Professor Vaidhyanathan. “These include scientific organizations, universities, libraries, museums, newspapers and civic organizations. They have all been enfeebled over recent years as our resources and attention have shifted to the tiny addictive devices in our hands.”
“We have in the course of a single century built an entire society, economy and culture that runs on information. Yet we have hardly begun to engineer data ethics appropriate for our extraordinary information carnival,” wrote Colin Koopman, a professor at the University of Oregon. “If we do not do so soon, data will drive democracy, and we may well lose our chance to do anything about it.”
Professor Victor Pickard notes that discourses about the democratic potential of digital technologies often overlook the policy roots and normative foundations of our communication systems. “An abiding faith in technological liberation and a tendency to naturalize market forces have discouraged the implementation of public policies that could prevent corporate capture of our core information systems.” However, he thinks we need long-term policy solutions to address these challenges:
The first steps we take toward addressing these policy problems must be discursive. Articulating that our news media are public services and infrastructures—not simply commodities—is an essential starting point. Additionally, we must be adamant that the market can’t provide for all our information needs. Shifting the media regulatory paradigm from corporate libertarianism to social democracy would help facilitate policies that reduce monopoly power, remove commercial pressures, install public interest protections, and build out public alternatives.
While it’s not feasible to bring back content regulations like the long-repealed Fairness Doctrine, demanding more social responsibility from major media institutions, especially Facebook, is key. And finding ways to structurally support actual journalism is even more essential. For example, the philanthropy world should redouble efforts to shore up—and reinvent—struggling newspapers as they transition to nonprofit status.
For the long term we must also establish true public alternatives not dependent on the market—ideally a new public service media system supported through a combination of private contributions and public subsidies.
But implementing these long-term, structural reforms will be a challenge. Leah Lievrouw, professor in the department of information studies at the University of California, Los Angeles, observed:
So many players and interests see online information as a uniquely powerful shaper of individual action and public opinion in ways that serve their economic or political interests (marketing, politics, education, scientific controversies, community identity and solidarity, behavioral ‘nudging,’ etc.). These very diverse players would likely oppose (or try to subvert) technological or policy interventions or other attempts to insure the quality, and especially the disinterestedness, of information.
We are in a brave new world. Facebook and 'Big Tech' have contributed to the erosion of our democratic discourse. We need these new titans to assume responsibilities on par with the influence they have over our information ecosystem. We need to address this bug in our democracy. Short-term policy solutions can help curb some of Facebook's harmful effects, but the larger task before policymakers -- and all of us -- is to critically examine the long-term health of our democratic discourse. What's needed are policy solutions that take into account business models that are inherently at odds with the public interest.