Atlantic, The

Why Tech Still Hasn't Solved Education's Problems

Remember MOOCs, or massive open online courses? Now, as another school year lurches into gear, the companies behind them have a meek record.

Udacity tried replacing intro courses at San Jose State; it ended in failure. So why has the promised boom in educational technology failed to appear -- and why was the technology that did appear not very good?

Paul Franz, a language arts teacher in California, suggests that education is too complex to tackle by tech alone.

What Good Is All This Tech Diversity Data, Anyway?

[Commentary] The drumbeat of diversity data coming from tech companies like Google, LinkedIn, Facebook, and Twitter has been anticlimactic, not least because it shows what most people already expected: that leaders in technology are overwhelmingly hiring white men.

All the companies say they need to do more. Few are willing to talk about the issue beyond what they've released in charts and blog posts.

As important as it is to get diversity numbers on the record, if what we're interested in is changing that record, it's worth asking: Does releasing the numbers alone catalyze change? We have some evidence on this question, and the answer is no.

The Latest Snowden Leak Is Devastating to NSA Defenders

[Commentary] Consider the latest leak sourced to Edward Snowden from the perspective of his detractors. The National Security Agency's defenders would have us believe that Snowden is a thief and a criminal at best, and perhaps a traitorous Russian spy.

In their telling, the NSA carries out its mission lawfully, honorably, and without unduly compromising the privacy of innocents. For that reason, they regard Snowden's actions as a wrongheaded slur campaign premised on lies and exaggerations.

Snowden defenders see these leaked files as necessary to prove that the NSA does, in fact, massively violate the private lives of American citizens by collecting and storing content -- not "just" metadata -- when they communicate digitally. They'll point out that Snowden turned these files over to journalists who promised to protect the privacy of affected individuals and followed through on that promise.

The NSA collects and stores the full content of extremely sensitive photographs, emails, chat transcripts, and other documents belonging to Americans -- itself a violation of the Constitution. But even if you disagree that it's illegal, there's no disputing the fact that the NSA has been proven incapable of safeguarding that data.

This is not a hypothetical risk that the data might leak at some point in the future; it has already been taken and given to reporters. The necessary reform is clear: unable to safeguard this sensitive data, the NSA shouldn't be allowed to collect and store it.

The Military Doesn't Want You to Quit Facebook and Twitter

Cornell University said the Facebook emotion study received no external funding, but it turns out that the university is currently receiving Defense Department money for some extremely similar-sounding research -- the analysis of social network posts for “sentiment,” i.e. how people are feeling, in the hopes of identifying social “tipping points.”

It’s the sort of work that the US military has been funding for years, most famously via the Open Source Indicators program, an Intelligence Advanced Research Projects Activity (IARPA) effort that looked at Twitter to predict social unrest.

Defense One recently caught up with Lt Gen Michael Flynn, the director of the Defense Intelligence Agency, who said the US military has “completely revamped” the way it collects intelligence around the existence of large, openly available data sources, and especially social media like Facebook.

“The information that we’re able to extract from social media -- it’s giving us insights that frankly we never had before,” he said. In other words, the head of one of the biggest US military intelligence agencies needs you on Facebook.

Former NSA Chief Clashes With ACLU Head In Debate

Is the National Security Agency keeping us safe? That was the question that MSNBC used to frame a debate at the Aspen Ideas Festival, which The Atlantic co-hosts with The Aspen Institute.

On one side, General Keith Alexander, former head of the National Security Agency; former Congresswoman Jane Harman; and former solicitor general Neal Katyal spoke in defense of the signals intelligence agency.

On the other side, Anthony Romero of the ACLU, academic Jeffrey Rosen, and former Congressman Mickey Edwards acknowledged the need for the NSA but argued that it transgresses against our rights with unnecessary programs that violate the Constitution. The two teams also sparred over Edward Snowden and whether his leaks were justified. By the end of the 90-minute session, the civil libertarian team had handily beaten the national security state team in audience voting.

Romero was at his strongest when pressing the other team to explain why the American people shouldn't have a right to privacy in their metadata, given how revealing it can be. He rejected the notion that the phone dragnet is permissible because, although the NSA keeps records of virtually every phone call made, it only searches that database under a narrow set of conditions.

The Test We Can -- and Should -- Run on Facebook

[Commentary] For a widely criticized study, the Facebook emotional contagion experiment -- which deployed novel techniques of its own -- managed to make at least one significant contribution. It has triggered the most far-reaching debate we’ve seen on the ethics of large-scale user experimentation: not just in academic research, but in the technology sector at large.

Perhaps we could nudge that process with Silicon Valley’s preferred tool: an experiment. But this time, we propose an experiment to run on Facebook and similar platforms. Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don’t opt in, they aren’t forced to participate.

This could be similar to the array of privacy settings that already exist on these platforms. Platforms could even offer more granular options, to specify what kinds of research a user is prepared to participate in, from design and usability studies through to psychological and behavioral experiments.
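
As a rough illustration of how granular those settings could get, here is a minimal sketch in TypeScript; the consent record, category names, and canEnroll check are hypothetical assumptions, not any platform's actual settings model.

```typescript
// Hypothetical sketch of a per-user research-consent setting, modeled the way
// existing privacy settings are stored. Names and shapes are illustrative only.
type ResearchCategory = "design_usability" | "psychological" | "behavioral";

interface ResearchConsent {
  optedIn: boolean;                      // master switch: no opt-in, no experiments
  allowedCategories: ResearchCategory[]; // granular choices within the opt-in
}

const defaultConsent: ResearchConsent = {
  optedIn: false,          // users are excluded unless they explicitly join a panel
  allowedCategories: [],
};

// An experiment would check consent before enrolling a user.
function canEnroll(consent: ResearchConsent, category: ResearchCategory): boolean {
  return consent.optedIn && consent.allowedCategories.includes(category);
}
```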

Of course, there is no easy technological solution to complex ethical issues, but this would be a significant gesture on the part of platforms toward less deception, more ethical research, and more agency for users.

[Crawford is a visiting professor at MIT’s Center for Civic Media, a principal researcher at Microsoft Research, and a senior fellow at NYU’s Information Law Institute]

How to Run Facebook's Mood Manipulation Experiment on Yourself

When news of Facebook’s attempt to emotionally manipulate its users emerged, debate quickly focused on the experiment’s ethics. Lauren McCarthy, though, kept thinking about the experiment itself. As the discussion went on, she found that “no one was talking about what the study might mean. What could we do beyond the ethics?”

Now, she has a preliminary answer. McCarthy has made a browser extension, Facebook Mood Manipulator, that lets users run Facebook’s experiment on their own News Feeds.

Just as the original 2012 study surfaced posts analyzed to be either happier or sadder than average, McCarthy’s extension skews users’ feeds either more positive or more negative -- except that, this time, users themselves control the dials.

Unlike the Facebook study, which only surfaced posts judged happier or sadder, McCarthy’s software also lets people see posts that use more “aggressive” or “open” words in their feed. The extension, in other words, lets users reclaim some control over their own feed. It lets users discover what it’s like to wrestle with their own attentional algorithm -- as subtle, or as stupid, as it can sometimes be.
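
To make the mechanism concrete, here is a minimal sketch of the kind of word-list skewing such an extension might perform; the word lists, Post shape, and skewFeed dial are illustrative assumptions, not McCarthy's actual code or Facebook's algorithm.

```typescript
// Illustrative only: a crude word-list sentiment filter of the sort a
// feed-skewing extension could use.
interface Post {
  text: string;
}

const POSITIVE = ["happy", "great", "love", "wonderful"];
const NEGATIVE = ["sad", "angry", "terrible", "awful"];

// Crude score: each positive word adds one, each negative word subtracts one.
function score(text: string): number {
  const words = text.toLowerCase().split(/\W+/);
  const pos = words.filter((w) => POSITIVE.includes(w)).length;
  const neg = words.filter((w) => NEGATIVE.includes(w)).length;
  return pos - neg;
}

// dial > 0 keeps happier-scoring posts, dial < 0 keeps sadder ones,
// and 0 leaves the feed untouched.
function skewFeed(posts: Post[], dial: number): Post[] {
  if (dial === 0) return posts;
  return posts.filter((p) => (dial > 0 ? score(p.text) >= 0 : score(p.text) <= 0));
}
```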

The Internet Is Fracturing Into Separate Country-Specific Networks

[Commentary] The World Wide Web celebrated its 25th birthday recently. Today the global network serves almost 3 billion people, and hundreds of thousands more join each day.

If the Internet were a country, its economy would be among the five largest in the world. Yet all of this growth and increasing connectedness, which can seem both effortless and unstoppable, is now creating enormous friction, as yet largely invisible to the average surfer. Fierce and rising geopolitical conflict over control of the global network threatens to create a balkanized system -- what some technorati, including Google’s executive chairman, Eric Schmidt, have called “the splinternet.”

Some experts anticipate a future with a Brazilian Internet, a European Internet, an Iranian Internet, an Egyptian Internet -- all with different content regulations and trade rules, and perhaps with contrasting standards and operational protocols. Whether this fragmentation can be managed is open to question, but at a minimum, this patchwork solution would be disruptive to American companies like Google, Facebook, Amazon, and eBay, which would see their global reach diminished. And it would make international communications and commerce more costly.

One Closed API at a Time, The Era Of The Open Web May Be Waning

[Commentary] APIs -- application programming interfaces -- are, essentially, enablers of remix culture. And what they mix is structured data. They are a way for companies and developers to talk to each other and build off of each other.

They're a means of converting the information a service contains into the stuff of the wider Internet. Because of all that, APIs have been seen, traditionally, as symbolic and practical.
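
For readers unfamiliar with the mechanics, the exchange looks roughly like the hypothetical sketch below; the endpoint URL and response shape are invented for illustration and belong to no real service.

```typescript
// Hypothetical example: a developer asks a service for structured data and
// gets back JSON that can be remixed into a new app or site.
interface Movie {
  title: string;
  year: number;
}

async function fetchCatalog(): Promise<Movie[]> {
  const response = await fetch("https://api.example.com/v1/movies?genre=drama");
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return (await response.json()) as Movie[];
}
```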

So it's hard not to see the closure of the Netflix API, on top of the closure of all the other APIs, as symbolic in its own way -- of a new era of the web that is less concerned with outreach, and more concerned with consolidation. A web controlled by companies that prefer their own way of doing things, without external input. A web that takes the productive enthusiasms of independent developers and says, essentially, "Thanks, but no thanks."

The Promise of a New Internet

[Commentary] People tend to talk about the Internet the way they talk about democracy -- optimistically, and in terms that describe how it ought to be rather than how it actually is.

This idealism is what buoys much of the network neutrality debate, and yet many of the issues considered core to that debate -- payment for tiered access, for instance -- have already been decided. Internet advocates have been asking what regulatory measures might help save the open, innovation-friendly Internet.

But increasingly, another question comes up: What if there were a technical solution instead of a regulatory one? What if the core architecture of how people connect could make an end run around the centralization of services that has come to define the modern net?

It's a question that reflects some of the Internet's deepest cultural values, and the idea that this network -- this place where you are right now -- should distribute power to people.

In the post-NSA, post-Internet-access-oligopoly world, more and more people are thinking this way, and many of them are actually doing something about it. Among them, there is a technology that's become a kind of shorthand code for a whole set of beliefs about the future of the Internet: "mesh networking." These words have become a way to say that you believe in a different, freer Internet.