We Already Know How to Protect Ourselves From Facebook

Facebook CEO Mark Zuckerberg. Credit: Jeff Roberson/Associated Press

This week, Facebook’s chief executive, Mark Zuckerberg, is scheduled to testify before two congressional committees amid the growing outcry over the company’s data collection practices. Because I have been analyzing the potential negative effects of Facebook on politics for a long time, I am fielding a lot of inquiries about what legislators should ask Mr. Zuckerberg.

Here’s my answer: Nothing. We already know nearly everything we need to know for legislators to pass laws that would protect us from what Facebook has unleashed.

The sight of lawmakers yelling at Mr. Zuckerberg might feel cathartic, but the danger of a public spectacle is that it will look like progress but amount to nothing: a few apologies from Mr. Zuckerberg, some earnest-sounding promises to do better, followed by a couple of superficial changes to Facebook that fail to address the underlying structural problems.

This has been Facebook’s public relations strategy for years. After each scandal, it expresses regrets, announces a few cosmetic fixes and then works like mad to scuttle any legislation that might address the core problem: how our data is harvested, used and profited from. It would be a shame if we went through that cycle again.

In addition to apologizing, Mr. Zuckerberg will no doubt promise more transparency. Don’t get me wrong: I’m all for transparency. But while transparency can help us diagnose problems in the online economy, it alone doesn’t fix them.

Mr. Zuckerberg is also likely to promise to lock down all the data Facebook has collected on billions of people. That sounds like a good idea, but it is mostly irrelevant now; the data is already compromised.

More important, it is in Facebook’s financial interest to lock down its stores of data. After all, the company’s product, which it sells to advertisers and other interested parties, is microtargeted access to us and our attention. The extensive data it collects on billions of people is its means of executing that business. It does not want to give that resource away.

So why did it give away people’s data in the past? In part because it is a reckless company (“Move fast and break things” used to be a company motto of sorts). And in part because the data — a tantalizing resource for programmers — could be used to lure developers to make games, quizzes and other apps for Facebook that would keep users coming back to the site.

But that phase of the company’s development is over. Because Facebook does not sell our data directly (or even want to), extracting promises from Mr. Zuckerberg that it not do so would be worse than a toothless remedy. It would only serve Facebook’s business model.

What would a genuine legislative remedy look like? First, personalized data collection would be allowed only through opt-in mechanisms that are clear, concise and transparent. There would be no more endless pages of legalese that nobody reads or can easily understand. The same would be true of any individualized targeting of users by companies or political campaigns: it, too, would have to be clear, transparent and truly consensual.

Second, people would have access, on request, to all the data a company has collected on them, including all forms of computational inference: the guesses the company makes, based on that data, about their tastes and preferences, their personal and medical history, their political allegiances and so forth.

Third, the use of any data collected would be limited to specifically enumerated purposes and to a designated period of time, after which the data would expire. The current model of harvesting all data, with virtually no limit on how it is used or for how long it is kept, must stop.

Fourth, the aggregate use of data should be regulated. Merely saying that individuals own their data isn’t enough: Companies can and will persuade people to part with their data in ways that may seem to make sense at the individual level but that work at the aggregate level to create public harms. For example, collecting health information from individuals in return for a small compensation might seem beneficial to both parties — but a company that holds health information on a billion people can end up posing a threat to individuals in ways they could not have foreseen.

Facebook may complain that these changes to data collection and use would destroy the company. But while these changes would certainly challenge the business model of many players in the digital economy, giant companies like Facebook would be in the best position to adapt and forge ahead.

If anything, we should all be thinking of ways to reintroduce competition into the digital economy. Imagine, for example, requiring that any personal data you consent to share be offered back to you in an “interoperable” format, so that you could choose to work with companies you thought would provide you better service, rather than being locked in to working with one of only a few.

Right now, Silicon Valley is stuck in a (very profitable) rut. To force it to change would not only make us safer but also foster innovation.

That would be a better, more satisfying outcome than any dramatic “Have you no sense of decency, sir?” moment that a congressional hearing might produce.

Zeynep Tufekci is an associate professor at the School of Information and Library Science at the University of North Carolina, the author of “Twitter and Tear Gas: The Power and Fragility of Networked Protest” and a contributing opinion writer.
