Is Facebook a 'Bug' in Our Democracy? Part 1
Bug: noun | \ ˈbəg \ Selected definitions:
1. An error in a computer program or system.
2. A concealed listening device.
3. A microorganism (such as a bacterium or virus) especially when causing illness or disease.
4. An unexpected defect, fault, flaw, or imperfection.
5. A sudden enthusiasm.
Is it time to recognize that Facebook, and ‘Big Tech’ at large, may be a bug in our democracy? The Cambridge Analytica story reveals the harmful effects of business models that rely on massive data collection. What is lost is our privacy, and with it the health of our democratic discourse.
Privacy, Facebook’s Business Model, and ‘Surveillance Capitalism’
Loss of Privacy
1. Bug; n., An error in a computer program or system.
With its stock price tumbling, Facebook has been scrambling to respond to the news that Cambridge Analytica, for political purposes, used personal information about Facebook users acquired by an external researcher who claimed to be collecting the data for academic purposes. Facebook has admitted that data on 87 million users may have been improperly shared. On March 28, the company announced it would be rolling out a centralized system for its users to control their privacy and security settings. And Facebook CEO Mark Zuckerberg acknowledged the massive data compromise in an apologetic media tour. He has committed to testifying on April 11 before the House Energy and Commerce Committee.
Many see Zuckerberg's response as a small concession that fails to address the much bigger problem. As Ethan Zuckerman notes in The Atlantic, “Zuckerberg’s statement fell short in a very specific way: He’s treating the Cambridge Analytica breach as a bad-actor problem when it’s actually a known bug.”
And the 'bug' is this: Facebook relies on eroding users' privacy -- or, at least, their expectation of privacy -- to maintain its economic power.
In 2010, then 25-year-old Mark Zuckerberg turned over the rock, showing us the bug. "People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people," he said. "That social norm is just something that has evolved over time."
Business Model: Data Exploitation
Facebook’s business model is based on collecting users’ demographic and psychographic information and then selling the ability to target advertisements to people using this data. As University of Virginia Professor Siva Vaidhyanathan described in the New York Times:
Its core functions are to deploy its algorithms to amplify content that generates strong emotional responses among its users, and then convert what it learns about our interests and desires into targeted ads. This is what makes Facebook Facebook. This is also what makes Facebook such an effective vehicle for promoting so much garbage and what makes it the most pervasive personal surveillance system in the world. As long as that’s true, don’t expect Facebook to fix itself.
Tristan Harris, a former Google Design Ethicist and founder of the Center for Humane Technology, said, “[Product designers] play your psychological vulnerabilities (consciously and unconsciously) against you in the race to grab your attention.”
This business model creates a powerful incentive to gather as much personal information from as many people as possible.
Most ad-supported websites track their users in an effort to make their ad inventory more valuable. This digital advertising model, which other tech giants (most notably Google) have eagerly adopted, is very profitable. The granular data that Facebook harvests helped increase Facebook's advertising revenue last year by 49%. Facebook now pockets roughly a quarter of all online and mobile advertising spending, which added up to almost $40 billion in 2017. And make no mistake, Facebook is an advertising company: advertising accounted for more than 98% of Facebook's total revenue in 2017, according to company filings.
But though this business model is very profitable, it is also very troubling. Ethan Zuckerman wrote:
[This is] the ‘original sin’ of the internet. It’s a dangerous and socially corrosive business model that puts internet users under constant surveillance and continually pulls our attention from the tasks we want to do online toward the people paying to hijack our attention.
Privacy Violations are Bad. 'Surveillance Capitalism' is Worse.
2. Bug; n., A concealed listening device.
“Surveillance Capitalism” is a term first introduced by John Bellamy Foster and Robert McChesney in Monthly Review and later popularized by academic Shoshana Zuboff. The term describes an economic model that monetizes data acquired through digital surveillance and behavior tracking. While tech titans like Facebook and Google obtain a majority of their revenue through the targeted advertising this type of tracking makes possible, security technologist Bruce Schneier notes that it is not just the big companies that are amassing personal data:
There are 2,500 to 4,000 data brokers in the United States whose business is buying and selling our personal data. Last year, Equifax was in the news when hackers stole personal information on 150 million people, including Social Security numbers, birth dates, addresses, and driver's license numbers. Equifax is one of those thousands of data brokers, most of them you've never heard of, selling your personal information without your knowledge or consent to pretty much anyone who will pay for it.
"Businesses that make money by collecting and selling detailed records of private lives were once plainly described as 'surveillance companies.’ Their rebranding as 'social media' is the most successful deception since the Department of War became the Department of Defense," tweeted National Security Agency whistleblower Edward Snowden.
And this surveillance model applies to internet service providers as well. “[E]very major telecommunications firm, as well as Google and Twitter, relies on surveillance systems similar to the one Facebook uses to run targeted advertising,” wrote Professor Vaidhyanathan.
In October of 2016, the Federal Communications Commission, led by then-Chairman Tom Wheeler, adopted broadband privacy rules that increased consumer control over their own data. The rules required broadband internet access service providers to obtain permission from subscribers before gathering and sharing data on subscribers’ web browsing, app use, location, and financial information. Those rules were repealed by Republicans in Congress and President Donald Trump in April of 2017.
Privacy and Democratic Discourse
3. Bug; n., A microorganism (such as a bacterium or virus) especially when causing illness or disease.
Although enormously profitable, the "socially corrosive" surveillance capitalism business model of large tech media companies degrades our communication ecosystem and our ability to be democratically engaged online. The advertising-focused presentation of information and the surveillance of targeted communities are eroding our First and Fourth Amendment rights, chilling our speech and harming our democratic discourse.
According to Harvard Law Professor Jonathan Zittrain, digital gerrymandering is the selective presentation of information by an intermediary to meet its agenda rather than to serve its users. This is Facebook and Google's "secret sauce" and the source of most of the companies' value. By using complex algorithms to present a user with selected information, users get value from quickly receiving the information they "want," but they are also presented with a refined feed that both collects their information (clicks, behavior patterns, and, in essence, their thoughts) and commodifies it for advertisers.
Digital gerrymandering utilizes a user's data to microtarget advertisements to them -- or to exclude them. While not necessarily harmful on its own, digital gerrymandering can easily become a tool for discrimination. In October 2016, ProPublica reported that Facebook's system allows housing advertisers to exclude users by race, despite the fact that advertisements that exclude people based on race, gender, and other sensitive factors are prohibited by federal law in housing and employment.
Digital gerrymandering can also be a harm to our democratic discourse. Cambridge Analytica whistleblower Christopher Wylie summed up what microtargeted advertising means in the political sphere:
Instead of standing in the public square and saying what you think, and then letting people come and listen to you and have that shared experience, as to what your, what your narrative is, you are whispering it to the ear of each and every voter, and you may be whispering one thing to this voter, and another thing to another voter.
We risk fragmenting society in a way where we don’t have any more shared experiences. And we don’t have any more shared understanding. If we don’t have any more shared understanding, how can we be a functioning society?
The Loss of Privacy Chills Free Thought and Speech
Traditionally -- even in the Digital Age -- our First Amendment rights and our Fourth Amendment right to privacy have been seen as being at odds with each other. When, say, does someone have the right to report 'news' about another person, and when does someone have the right to keep information about themselves private? Where do your rights begin and where do mine end?
But, without getting too lost in the history and complexity of constitutional law, the First Amendment and privacy work hand-in-hand in a democratic society. The American Library Association states:
The right to privacy – the right to read, consider, and develop ideas and beliefs free from observation or unwanted surveillance by the government or others – is the bedrock foundation for intellectual freedom. It is essential to the exercise of free speech, free thought, and free association.
Losing privacy, or losing even the expectation of privacy, can dramatically decrease one's enthusiasm to participate in democracy. In "The Eternal Value of Privacy," Bruce Schneier writes:
For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that—either now or in the uncertain future—patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable.
"If we don't have privacy, what we're losing is the ability to make mistakes, we're losing the ability to be ourselves. Privacy is the fountainhead of all other rights. Freedom of speech doesn’t have a lot of meaning if you can’t have a quiet space. A space within yourself, within your mind, within the community of your friends, within your home, to decide what it is you actually want to say,” said Edward Snowden in 2016.
A 2014 Pew Research Center survey illustrates Americans’ complicated feelings about social media and privacy, finding that:
- 91% of Americans “agree” or “strongly agree” that people have lost control over how personal information is collected and used by all kinds of entities;
- 80% of social media users said they were concerned about advertisers and businesses accessing the data they share on social media platforms; and
- 64% of social media users said the government should do more to regulate advertisers.
Moreover, in 2012, 86% of internet users said they had taken steps to try to be anonymous online. “Hiding from advertisers” was relatively high on the list of things they wanted to avoid.
The erosion of online privacy has real-world consequences, especially for historically marginalized communities, such as communities of color. In October 2016, the American Civil Liberties Union reported that Facebook provided data access for a surveillance product marketed to target activists of color. Malkia Cyril of the Center for Media Justice said, “The fact that third parties are making big money off of the sale and trading of our data with law enforcement is a huge problem and one that any social media user in this country or beyond should be disgusted and surprised by. It clearly has a chilling effect on democratic protests.”
Cyril connected the history of surveilling Dr. Martin Luther King Jr. and other black activists with the digital surveillance activists face today:
Like its predecessors, the democratic movement for black lives has been met by anti-democratic state surveillance and anti-black police violence. New “smart” policing methods are being used by modern-day gumshoes who, fueled by the false rhetoric of black criminality, experiment with high-tech tools to the detriment of black democratic engagement.
Twentieth century surveillance is alive and well in the 21st century, and is one powerful reason why, in a digital age and era of big data, the fight for racial justice must also include a fight for the equal and fair application of first and fourth amendment rights.
For black communities and others pushed to the margins of political and economic power – democratic engagement and the exercise of our human and civil rights in a digital age demands the ability to encrypt our communications.
While Facebook continues to try to minimize the fallout from the latest revelations, we need to begin a deeper examination of an intrinsic bug burrowed into our social media platforms. These platforms offer us their services for no charge, but what they really cost us is our privacy. We have reached a tipping point where we, and our policymakers, must ask whether these services are worth the trade-off.
This is the first in a series of articles. Next, I will look at how, in the space vacated by reliable journalism, political campaigns have stepped in. The campaigns work hand-in-hand with data companies to exploit our personal information to, it now seems, dramatically impact our elections.