Facebook's Mark Zuckerberg, Jeff Bezos of Amazon, Sundar Pichai of Google, and Tim Cook of Apple all dodged lawmakers’ most pointed questions, or professed their ignorance. The result was a hearing that, at times, felt less like a reckoning than an attempted gaslighting — a group of savvy executives trying to convince lawmakers that the evidence that their yearslong antitrust investigation had dug up wasn’t really evidence of anything. The performance wasn’t particularly convincing.
Within a 48-hour period this week, many of the world’s internet giants took steps that would have been unthinkable for them even months earlier. Taken independently, these changes might have felt incremental and isolated — the kind of refereeing and line-drawing that happens every day on social media. But arriving all at once, they felt like something much bigger: a sign that the Wild Wild Web — the tech industry’s decade-long experiment in unregulated growth and laissez-faire platform governance — is coming to an end.
On Dec. 30, Andrew Bosworth, the head of Facebook’s virtual and augmented reality division, wrote on his internal Facebook page that, as a liberal, he found himself wanting to use the social network’s powerful platform against President Donald Trump. But citing the “Lord of the Rings” franchise and the philosopher John Rawls, Mr. Bosworth said that doing so would eventually backfire. “So what stays my hand?”
Not long ago, many leading technologists considered themselves too lofty and idealistic to concern themselves with the petty affairs of government. But that was before privacy scandals, antitrust investigations, congressional hearings, Chinese tariffs, presidential tweets and Senator Elizabeth Warren (D-MA). Now, as they try to fend off regulation and avoid being broken up, some of the largest companies in Silicon Valley are tripping over their Allbirds in a race to cozy up to the United States government.
A Q&A with Neal Mohan, YouTube’s chief product officer.
Regulating big tech is quickly becoming a central theme of the 2020 presidential race. But many of the tech-industry insiders I spoke with, including some who agree with Senator Elizabeth Warren (D-MA) that the big companies are too powerful, cautioned that some of the details in her proposal were too vague and could backfire if put into effect as written. Warren’s plan is a bold first stab at reform, and some of her proposals make a lot of sense. But I’d offer a few edits.
As President Donald Trump and his allies have waged a fear-based campaign to drive Republican voters to the polls for the midterm elections, far-right internet communities have been buoyed as their once-fringe views have been given oxygen by prominent Republicans. Since the 2016 election, these far-right communities have entered into a sort of imagined dialogue with the president. They create and disseminate slogans and graphics, and celebrate when they show up in Trump’s Twitter feed days or weeks later. They carefully dissect his statements, looking for hints of their influence.
Just hours after the news broke that explosive devices had been sent to Bill and Hillary Clinton, Barack Obama and other prominent Democrats, a conspiracy theory began to take shape in certain corners of conservative media. The bombs, this theory went, were not actually part of a plot to harm Democrats, but were a “false flag” operation concocted by leftists in order to paint conservatives as violent radicals ahead of the elections.
A competitive race in Virginia’s 10th Congressional District has an alarming new element: anonymous attack ads on Facebook. The ads, which appeared on a Facebook page called “Wacky Wexton Not,” were purchased by a critic of Jennifer Wexton, a Democratic candidate trying to unseat Representative Barbara Comstock (R-VA). The person or group behind the ads is known to Facebook, but a mystery to the public.
Facebook has identified a coordinated political influence campaign, with dozens of inauthentic accounts and pages believed to be engaging in political activity around divisive social issues ahead of November’s midterm elections. As part of its investigations into election interference, the company detected and removed 32 pages and accounts connected to the influence campaign on Facebook and Instagram.