Digital Content

Information that is published or distributed in a digital form, including text, data, sound recordings, photographs and images, motion pictures, and software.

Facebook moving non-promoted posts out of news feed in trial

Facebook is testing a major change that would shift non-promoted posts out of its news feed, a move that could be catastrophic for publishers relying on the social network for their audience. A new system being trialled in six countries including Slovakia, Serbia and Sri Lanka sees almost all non-promoted posts shifted over to a secondary feed, leaving the main feed focused entirely on original content from friends, and advertisements.

The change has seen users’ engagement with Facebook pages drop precipitously, by 60% to 80%. If replicated more broadly, such a change would destroy many smaller publishers, as well as larger ones with an outsized reliance on social media referrals for visitors.

Tightening Political Ad Disclosure Rules May Not Curb 'Fake News,' Interactive Advertising Bureau Says

The Interactive Advertising Bureau will testify that it supports efforts to strengthen disclosure requirements for online ads that expressly advocate for particular candidates. But the group will also warn lawmakers that tightening those rules won't necessarily affect the spread of "fake news" online. "Enhancing the existing framework by clarifying the responsibility of publishers, platforms, and advertisers in making available these disclosures to the public would create greater legal certainty across the industry and provide valuable information," IAB CEO and President Randall Rothenberg plans to tell Congress in a prepared statement. "But the 'fake news' and 'fake ads' at the center of the current storm did not engage in such overt candidate support. So they were not, and based on current Supreme Court jurisprudence will not be, regulated under the Federal Election Campaign Act."

Rothenberg will testify Oct. 24 before the House Oversight subcommittee on information technology, which is slated to hold a hearing about online political ads. David Chavern, CEO of News Media Alliance, will also testify Tuesday, as will representatives from the Center for Competitive Politics and the Brennan Center for Justice, among others.

Can Alphabet’s Jigsaw Solve Google’s Most Vexing Problems?

With Alphabet’s engineering resources, Jigsaw translates its research into internet tools that combat hate speech, detect fake news, and defend against cyberattacks. Jigsaw CEO Jared Cohen’s eight-day visit to Pakistan in December provided firsthand insight into the methods extremists now use to recruit new members online, recruitment Jigsaw aims to counter with targeted advertising against terrorist propaganda. Although Cohen’s mission sounds philanthropic, Jigsaw operates as a business, no different from any of Alphabet’s moonshots. Yet Cohen says there’s no pressure on the group to generate a profit. For now, its value to the enterprise lies in the ancillary benefit of protecting Google’s myriad other businesses—Android, Gmail, YouTube—from the world’s worst digital threats. And if, in the process, Jigsaw can help address some of the most acute unintended consequences of digital communication, all the better.

“I don’t think it’s fair to ask the government to solve all these problems—they don’t have the resources,” says Alphabet executive chairman Eric Schmidt. “The tech industry has a responsibility to get this right.”

Homegrown ‘fake news’ is a bigger problem than Russian propaganda. Here’s a way to make falsehoods more costly for politicians.

[Commentary] State-sponsored propaganda like the recently unmasked @TEN_GOP Twitter account is of very real concern for our democracy. But we should not allow the debate over Russian interference to crowd out concerns about homegrown misinformation, which was vastly more prevalent during and after the 2016 election. The problem isn’t that we’re only willing to listen to sources that share our political viewpoint; it’s that we’re too vulnerable as human beings to misinformation of all sorts. Given the limitations of human knowledge and judgment, it is not clear how to best protect people from believing false claims.

Brendan Nyhan is a professor of government at Dartmouth College.

Yusaku Horiuchi is a professor of government at Dartmouth College.

How Facebook’s Master Algorithm Powers the Social Network

[Commentary] Artificial intelligence permeates everything at Facebook, the social network’s head of applied machine learning says—and humans are bound to understand Facebook less than ever. The algorithm behind Facebook’s News Feed, a “modular layered cake,” extracts meaning from every post and photo.

DC Court Allows Live Streaming

In a first for the US Court of Appeals for the DC Circuit, oral argument in a major abortion case, Garza v. Hargan, will be live streamed Oct. 20 after Fix the Court, which advocates for greater access to federal courts, made the request. Chief Judge Merrick Garland issued the decision in a one-sentence letter to Fix the Court executive director Gabe Roth, saying simply: "Thank you for your letter of today's date, requesting that the court provide a live audio feed of arguments in Garza v. Hargan, 17-5236, tomorrow."

How Fiction Becomes Fact on Social Media

At a time when political misinformation is in ready supply, and in demand, “Facebook, Google, and Twitter function as a distribution mechanism, a platform for circulating false information and helping find receptive audiences,” said Brendan Nyhan, a professor of government at Dartmouth College. For starters, said Colleen Seifert, a professor of psychology at the University of Michigan, “People have a benevolent view of Facebook, for instance, as a curator, but in fact it does have a motive of its own. What it’s actually doing is keeping your eyes on the site. It’s curating news and information that will keep you watching.” That kind of curating acts as a fertile host for falsehoods by simultaneously engaging two predigital social-science standbys: the urban myth as “meme,” or viral idea; and individual biases, the automatic, subconscious presumptions that color belief.

Stopping to drill down and determine the true source of a foul-smelling story can be tricky, even for the motivated skeptic, and mentally it’s hard work. Ideological leanings and viewing choices are conscious, downstream factors that come into play only after automatic cognitive biases have already had their way, abetted by the algorithms and social nature of digital interactions.

Smartphones Are Weapons of Mass Manipulation, and Tristan Harris Is Declaring War on Them

If, like an ever-growing majority of people in the U.S., you own a smartphone, you might have the sense that apps in the age of the pocket-sized computer are designed to keep your attention as long as possible. You might not have the sense that they’re manipulating you one tap, swipe, or notification at a time. But Tristan Harris thinks that’s just what’s happening to the billions of us who use social networks like Facebook, Instagram, Snapchat, and Twitter, and he’s on a mission to steer us toward potential solutions—or at least to get us to acknowledge that this manipulation is, in fact, going on.

Harris, a former Google product manager turned design ethicist, runs a nonprofit called Time Well Spent, which focuses on the addictive nature of technology and how apps could be better designed; it pursues public advocacy and supports design standards that take into account what’s good for people’s lives, rather than just seeking to maximize screen time. He says he’s now moving away from Time Well Spent (his new effort is as yet unnamed) to focus on holding the tech industry accountable for the way it persuades us to spend as much time as possible online, with tactics ranging from Snapchat’s snapstreaks to auto-playing videos on sites like YouTube and Facebook.

The Future of Truth and Misinformation Online

Experts are evenly split on whether the coming decade will see a reduction in false and misleading narratives online. Those forecasting improvement place their hopes in technological fixes and in societal solutions. Others think the dark side of human nature is aided more than stifled by technology.

Despite backlash over political ads, Facebook's role in elections will only grow

As the political world looks to apply the lessons of Donald Trump’s victory to future campaigns, one of the few clear conclusions is that Facebook played an outsized role in propelling the candidate to his improbable win.

The company’s ability to affordably target hyper-specific audiences with little to no transparency gives it a distinct advantage over other forms of media, researchers and political operatives believe. Political ads on Facebook have fueled controversy. They spread Russian propaganda and reportedly helped the Trump team suppress black support for Hillary Clinton and aided a conservative political action committee in targeting swing voters with scaremongering anti-refugee ads. Yet the backlash is unlikely to dissuade future campaigns from deploying one of Facebook’s most potent tools. Even the threat of new regulation governing the disclosure rules for political ads on social media can’t stunt the company’s stock price, which continues to reach new heights. If anything, the controversies appear to be functioning like a giant advertisement for the effectiveness of Facebook’s political advertising business.

“I don’t lose sleep over Facebook’s business. I lose sleep over the future of democracy,” said Siva Vaidhyanathan, a professor of media studies at the University of Virginia.