Atlantic, The

Method Journalism

[Commentary] With the launch of new site after new site in 2014, it's been a fascinating time to watch digital media try to figure itself out.

Amid the turmoil of disruption, buffeted by tech companies' control over information distribution but aware of new fields of possibility, legacy brands have spent the past few years playing defense.

There are some exciting sites doing great work, and they are also making mistakes and doing weird stuff as they find their identities. But the more I thought about what's different in this era of media relative to earlier ones, one thing stood out: none of these sites is focused on an area of coverage. They are, instead, about the method of coverage.

In a world where traditional beats may not make sense, where almost all marginal traffic growth comes from Facebook, where subscription revenue is a rumor, where business concerns demand breadth because they want scale… a big part of the industry's response has been to create sites that become known for how they cover something rather than what they cover.

Should US Hackers Fix Cybersecurity Holes or Exploit Them?

[Commentary] There’s a debate going on about whether the US government -- specifically, the National Security Agency and United States Cyber Command -- should stockpile Internet vulnerabilities or disclose and fix them.

It's a complicated problem, and one that starkly illustrates the difficulty of separating attack and defense in cyberspace.

A software vulnerability is a programming mistake that allows an adversary access to the system running it. Heartbleed is a recent example, but hundreds are discovered every year. Unpublished vulnerabilities are called “zero-day” vulnerabilities, and they’re very valuable because no one is protected. Someone with one of those can attack systems worldwide with impunity.
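Heartbleed was, at its core, a failure to check an attacker-supplied length field before echoing back data. A minimal sketch of that class of mistake, in Python (hypothetical code for illustration, not OpenSSL's actual implementation; the function names and `SECRET` value are invented):

```python
# A process's memory often holds sensitive data right next to the
# buffer containing an incoming request.
SECRET = b"session-key=hunter2"

def heartbeat_buggy(payload: bytes, claimed_len: int) -> bytes:
    # Simulate the request sitting in memory adjacent to a secret.
    memory = payload + SECRET
    # Bug: trust the attacker's claimed length instead of checking it
    # against the real payload length -- adjacent "memory" leaks out.
    return memory[:claimed_len]

def heartbeat_patched(payload: bytes, claimed_len: int) -> bytes:
    # Fix: reject any request whose claimed length exceeds the payload.
    if claimed_len > len(payload):
        raise ValueError("length field exceeds payload")
    return payload[:claimed_len]

# An honest request claims 4 bytes and gets 4 bytes back; a malicious
# one claims 23 bytes and receives the secret along with its payload.
leak = heartbeat_buggy(b"ping", claimed_len=23)
```

The patch shipped for Heartbleed amounted to exactly this kind of bounds check, which is why patching closes the hole for everyone who applies it.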

When someone discovers one, he can either use it for defense or for offense.

Defense means alerting the vendor and getting it patched. Lots of vulnerabilities are discovered by the vendors themselves and patched without any fanfare. Others are discovered by researchers and hackers. A patch doesn’t make the vulnerability go away, but most users protect themselves by patching their systems regularly.

Offense means using the vulnerability to attack others. This is the quintessential zero-day, because the vendor doesn't even know the vulnerability exists until it starts being used by criminals or hackers. Eventually the affected software's vendor finds out -- the timing depends on how extensively the vulnerability is used -- and issues a patch to close the vulnerability.

Antonin Scalia Totally Gets Net Neutrality

The Federal Communications Commission proposed new rules to regulate broadband Internet providers. Many supporters of an open web don’t like these rules. The agency’s suggested regulations, they say, will either sacrifice a key tenet of the Internet -- network neutrality, a storied and contested idea -- or prove ineffectual.

They say the agency must re-categorize broadband Internet providers, so that they become utilities -- common carriers. It’s obvious, they say, that the FCC categorized broadband incorrectly in the first place.

Turns out a member of the nation’s highest court made their case for them almost a decade ago. That judge’s name? Antonin Scalia, who wrote that the FCC’s interpretation of the law around “information services” was “implausible.” With its decision to regard cable broadband as an information service, the agency had “[established] a whole new regime of non-regulation, which will make for more or less free-market competition, depending upon whose experts are believed.” In ruling that broadband was an information service, the FCC “had exceeded the authority given it by Congress.”

Justice Scalia based his argument on an interesting analogy: “If, for example, I call up a pizzeria and ask whether they offer delivery, both common sense and common “usage,” […] would prevent them from answering: ‘No, we do not offer delivery -- but if you order a pizza from us, we’ll bake it for you and then bring it to your house.’ The logical response to this would be something on the order of, ‘so, you do offer delivery.’ But our pizza-man may continue to deny the obvious and explain, paraphrasing the FCC and the Court: ‘No, even though we bring the pizza to your house, we are not actually “offering” you delivery, because the delivery that we provide to our end users is “part and parcel” of our pizzeria-pizza-at-home service and is “integral to its other capabilities.”’”

The Library of Congress Wants to Destroy Your Old CDs

If you've tried listening to any of your old CDs lately, if you even own them anymore, you may have noticed they won't play.

CD players have long since given up on most of the burned mixes I made in college. And while most of the studio-manufactured albums I bought still play, there's really no telling how much longer they will. My once-treasured CD collection -- so carefully assembled over the course of about a decade beginning in 1994 -- isn't just aging; it's dying. And so is yours.

"All of the modern formats weren't really made to last a long period of time," said Fenella France, chief of preservation research and testing at the Library of Congress. "They were really more developed for mass production." She added: "If you want to really kill your discs, just leave them in your car over the summer."

France and her colleagues are trying to figure out how CDs age so that we can better understand how to save them. This is a tricky business, in large part because manufacturers have changed their processes over the years but won't say how. And so: we know a CD's basic composition -- there's a plastic polycarbonate layer, a metal reflective layer with all the data in it, and then the coating on top -- but it's impossible to tell just from looking at a disc how it will age.

"We're trying to predict, in terms of collections, which of the types of CDs are the discs most at risk," France said.

Edward Snowden's Other Motive for Leaking

A few pages into Glenn Greenwald's newly released book, No Place to Hide: Edward Snowden, the National Security Agency, and the US Surveillance State, there is a fascinating passage that transforms my understanding of why the contractor leaked NSA secrets.

The familiar rationale still applies. Edward Snowden wanted to inform Americans about the actions of our government and to spark a debate about mass surveillance. "My sole motive is to inform the public as to that which is done in their name," he reportedly wrote in a note to his collaborators, "and that which is done against them."

Actually, though, he had a second motive. He was also trying to reach elites. In leaking, he hoped to inform and influence a small subculture of tech influencers. Regardless of how Americans reacted to his leaks, he hoped they'd awaken to the ideology and reach of the surveillance state, and that at least some programmers would be inspired to thwart it with technology.

Russia Mulls a Digital Iron Curtain

In Russia, it is unclear how users will react to the new reality being created around an Internet that was once widely free.

In April, the State Duma passed legislation that would require non-Russian tech companies to store all domestic data within Russia for at least six months. And Kommersant, a well-regarded newspaper, reported that a commission set up by Russian President Vladimir Putin is recommending a system that would allow the government to filter and access all content passing through Russian servers.

It is still unclear whether major companies like Google and Facebook will agree to the expensive task of placing servers and data-storage centers inside Russia -- or if Moscow will follow through with blocking access to the sites if they do not.

Whatever he decides to do, Putin is representative of an accelerated push by autocratic leaders worldwide to rein in the unwieldy Internet space. But doing so once populations have already experienced the value and convenience of open access can be difficult.

Here's a look below at some case studies of web censorship -- ranging from the most extreme version of a truly "sovereign" web to one of evolving ad-hoc efforts to chip away at Internet freedom.

Can Cell Phones Stop Crime in the World's Murder Capitals?

In the last three months, Guatemala has witnessed 356 homicides, 202 armed attacks, 44 illegal drug sales, 11 kidnappings, and six cases of "extortion by cell phone."

These numbers come courtesy not of Guatemalan law-enforcement but of Alertos.org, a new platform that recruits citizens to report crimes. And they've enlisted in the effort, using email, Twitter, Facebook, mobile apps, and text messaging to chronicle thousands of criminal activities since 2013 -- in a country where a hobbled police force is struggling to address the fifth-highest murder rate in the world.

In recent years, police have courted cell phone-toting citizens as crime "sensors" everywhere from Washington, D.C. to the tiny Kenyan village of Lanet Umoja. But the practice has gained particular traction in Latin America, which, as the UN reported in April, has the highest rate of criminal violence on the planet (the region accounts for 8 percent of the world's population and a third of its murders).

The criminal syndicates and drug cartels behind this bloodshed have overwhelmed, crippled, and corrupted national police forces, resulting in the highest levels of impunity in the world as well. In these countries, criminals literally get away with murder, again and again. Amateur crime-mapping has emerged as a parallel law-enforcement mechanism -- in part owing to the popularity of cell phones in the region.

Online crime reporting can work remarkably well, harnessing the knowledge and networks of communities and saving money that would otherwise be spent on desk officers taking reports in person or by phone. But that success depends on people believing that police will swiftly take action on their reports, which in turn depends on law-enforcement agencies integrating crime-mapping initiatives into their broader operations in the first place.

The Case for Rebooting the Network Neutrality Debate

[Commentary] The Internet uproar about network neutrality tends to come in waves. Right now we’re riding the crest of one.

In the two weeks since Federal Communications Commission Chairman Tom Wheeler’s proposal for new net neutrality rules became public, the Internet has erupted in protest. The legal vacuum created by the DC Circuit’s decision striking down the FCC’s previous open Internet rules threatens the Internet that we know and love. It threatens the start-up economy.

But simply adopting rules that are network neutrality in name only is not enough. Different rules -- like a ban on access fees versus a ban on discriminatory or exclusive access fees -- will result in vastly different environments for the use of the network and in very different application innovation ecosystems. As we -- the public, policy makers, and regulators -- think through the choice between limited network neutrality regulation under Section 706 of the Telecommunications Act and more comprehensive network neutrality rules under Title II of the Communications Act, we need to ask the right questions and ask them in the right order:

  1. What kind of rules do we need to protect users and innovators against the threat of blocking and discrimination?
  2. How will access fees affect the environment for application innovation and free speech, and how does this affect what kind of rules we need?
  3. And, finally, which foundation -- Section 706 or Title II -- will allow us to adopt these rules?

The answers are clear.

  • First, we need strong network neutrality rules that prohibit blocking, discrimination against specific applications or classes of applications, and access fees -- rules that apply equally to the fixed and mobile Internet.
  • Second, we need rules that provide certainty to innovators, investors, and ISPs alike. Innovators and their investors need to know that they won’t be discriminated against and that ISPs cannot create new barriers to innovation by charging access fees.
  • Third, start-ups are small and don’t have many resources, let alone a legal team. So we need rules that can be enforced through simple, straightforward legal processes, not rules that tilt the playing field in favor of large, established companies that can pay armies of lawyers and expert witnesses and afford long, costly proceedings at the FCC.
  • Fourth, we need rules that give ISPs flexibility to realize their legitimate goals such as network management, price discrimination, or product differentiation, albeit through means that do not distort competition, harm application innovation, or violate user choice.
  • Fifth, we need rules that do not overly constrain the evolution of the Internet infrastructure and keep the costs of regulation low.

[van Schewick is a professor at Stanford Law School and the director of the school's Center for Internet and Society]

Michael Hayden's Unwitting Case Against Secret Surveillance

[Commentary] Is state surveillance a legitimate defense of our freedoms? The question was put to Michael Hayden, former director of the NSA and the CIA, during a debate in Toronto.

Alan Dershowitz joined him to argue the affirmative. Glenn Greenwald and Reddit co-founder Alexis Ohanian argued against the resolution.

"State surveillance is a legitimate defense of our freedoms," Hayden said, restating the resolution. "Well, we all know the answer to that. It depends. And it depends on facts."

In doing so, Hayden unwittingly echoed a core belief of the national security state's critics. He's absolutely right: To judge whether a particular kind of surveillance is legitimate, one must know exactly what's being considered and its purpose.

Why the British Library Is Spending $55 Million on News Archives

Just 2 percent of the British Library's massive archive of print newspapers has been digitized. That's going to change. The institution is completing a seven-year effort to upgrade its news archives, a $55 million (£33 million) project that's aimed at expanding the library's definition of "news."

Curator Luke McKernan said that "news" can mean "anything of relevance to a particular community at a particular point in time." Most people by now will acknowledge that news is recorded in newspapers and on Facebook and on Twitter and on blogs, etc., etc., but McKernan said he's also thinking about "diaries, oral history, recordings, maps, posters, letters," and so on.

McKernan wants to establish links between different kinds of resources, a strategy that's becoming increasingly important as institutions like libraries rethink how their resources will fit into a larger network of interconnected data and information online.