Facebook Hearing: "Big Tech now faces that Big Tobacco jaw-dropping moment of truth"

The Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety, and Data Security convened a hearing to take testimony from former Facebook employee Frances Haugen. Recent Wall Street Journal investigations have provided troubling insights into how Instagram affects teenagers, how the platform onboards children, and other consumer protection matters related to Facebook. In prepared testimony, Haugen said:

During my time at Facebook, first working as the lead product manager for Civic Misinformation and later on Counter-Espionage, I saw that Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism, and polarization — and undermining societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit optimizing machine is generating self-harm and self-hate — especially for vulnerable groups, like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research. This is not simply a matter of some social media users being angry or unstable. Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable.

In addition to promoting harmful, hyper-engaging content in the United States, Facebook’s engagement-based ranking system is “literally fanning ethnic violence” in places like Ethiopia, Haugen said. She also criticized Facebook’s reliance on automated tools to detect vaccine misinformation and other harmful content. Facebook is “overly reliant on artificial intelligence systems that they themselves say will likely never get more than 10 to 20 percent of the content,” she said.

Haugen’s appearance stood out not only for the inside look it offered at Facebook but for the way she united Republican and Democratic lawmakers around tackling the platform’s harm to teenagers. Some senators called her testimony a “Big Tobacco” moment for the technology industry. The lawmakers said Haugen’s testimony, and the thousands of pages of internal documents she gathered from the company and then leaked, showed that Facebook’s top executives had misled the public and could not be trusted. “This research is the definition of a bombshell,” said Subcommittee Chairman Richard Blumenthal (D-CT). Republican and Democratic lawmakers at the hearing renewed their calls for regulation, such as strengthening privacy and competition laws, creating special online protections for children, and toughening platform accountability. One idea that got a particular boost was requiring more visibility into social media data as well as the algorithms that shape users’ experiences. “It is clear that Facebook prioritizes profits over the well-being of children and [all users],” Sen. Marsha Blackburn (R-TN) said.

“So here’s my message for Mark Zuckerberg: Your time of invading our privacy, promoting toxic content and preying on children and teens is over,” said Sen. Ed Markey (D-MA). Sen. Blumenthal said after the hearing, “Facebook is a black box, and Mark Zuckerberg is the algorithm designer in chief.”

“The tech gods have been demystified,” said Sen. Roger Wicker (R-MS). “The children of America are hooked on their product. There is cynical knowledge on behalf of these big tech companies that this is true.” “I think the time has come for action, and I think you are the catalyst for that action,” Sen. Amy Klobuchar (D-MN) told Haugen during the hearing. “I would simply say, let’s get to work,” said Sen. John Thune (R-SD), who has sponsored several measures on algorithm transparency. “We’ve got some things we can do here.”

Haugen suggested reforming Section 230 of the Communications Decency Act so that social media companies could be sued over the decisions their algorithms make to promote certain content. “[Platforms] have 100 percent control over their algorithms,” she said. “Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety.”


Protecting Kids Online: Testimony from a Facebook Whistleblower (Senate Commerce Committee)
Testimony (Frances Haugen)
Facebook Whistle-Blower Urges Lawmakers to Regulate the Company (NYTimes)
Facebook whistleblower Frances Haugen tells lawmakers that meaningful reform is necessary ‘for our common good’ (WashPost)
Facebook Whistleblower’s Testimony Builds Momentum for Tougher Tech Laws (WSJ)
Facebook chose to maximise engagement at users’ expense, whistleblower says (FT)
Ex-Facebook employee tells Congress social media giant endangers users, democracy (LA Times)