Facebook Released Its Content Moderation Rules. Now What?

Tuesday was a huge day for online speech. Facebook finally released the internal rules that its moderators use to decide what kinds of user content to remove from the site, including once-mysterious details on what counts as “graphic violence,” “hate speech” or “child exploitation.” It also announced the introduction of an appeals process for users who want to challenge the removal of their posts.

These developments represent a big step toward due process, which is essential on a site where so much of our speech now takes place. But an entity with such enormous power over online expression should do even more to listen to its users about what kinds of expression are allowed, and its users should be ready to be heard. Ideally, Facebook will eventually create a more robust system to respond to those who believe their posts were taken down in error, and to give its users the opportunity to weigh in on its policies.

Over the past decade, Facebook has decided how and when to take down posts using two sets of rules: one public and one private. The public rules are the “community standards” that very generally describe bans on certain types of content — like “no graphic violence” or “no sexually explicit content.” The second set of rules, used to guide moderators deciding whether to take down flagged material, remained secret, although training slides leaked in 2017 gave some insight into what they were.

Thanks in part to some high-profile controversies over Facebook’s removal of material — or its failure to delete certain content fast enough — awareness has spread that the company constantly makes choices about which kind of user speech is acceptable. For example, it was criticized for its removal of a famous photo of a girl running naked on a road after a napalm attack on her village in Vietnam. And there was an outcry over a video of a man being shot point-blank in the head, and the length of time it stayed on the site.

Those aware of this opaque side of content moderation — academics, free speech advocates and civil society organizations — have long pushed for full disclosure of the rules. But while the release of rules might appear to be a victory for these proponents of transparency, and for Facebook’s users, it’s a less powerful moment than it could have been.

First, no notice (of the kind that has appeared in the past when, say, Facebook has made changes to its privacy policy) appeared in users’ feeds to inform them that the community standards had been updated. Instead, users had to rely on news reports that the rules were finally being shared.

Second, while an appeals system for the removal of individual posts is a promising first step, the one that Facebook started this week has a lot of growing to do before it can create real procedural fairness. While users can ask Facebook to take a second look at content removed for nudity, hate speech or violence, there is no indication that these requests will be handled by more experienced moderators.

Instead, it seems possible that they will simply be put through the system again, making the process more of a retrial than an appeal. It doesn’t appear that there will be an opportunity for users to provide new information that could explain why the removal of a post was a mistake.

The good news is that there’s hope that ideas like this will be included in later iterations of the appeals process. Facebook did say on Tuesday that it will be “working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up.” It’s essential that the company follow through on this.

The most serious shortcoming highlighted by this week’s announcement was that Facebook still doesn’t have a meaningful vehicle for users to directly comment on the policies that are governing their online speech, and the speech they see on the platform. As a result, Facebook is still not truly accountable to all of its users.

The company announced that it will be holding “Facebook Forums” — public meetings around the world where attendees can discuss its content removal policies. While this certainly opens the door to many more opinions and ideas, if these forums are dominated by stakeholders or those already in dialogue with Facebook, the company will miss an opportunity to operate in a truly democratic fashion by listening directly to users.

Even if these gatherings are truly public, as Facebook has suggested they will be, it will be impractical for the vast majority of users to attend. Facebook should consider using surveys of user preference, opportunities for users to vote on sections of content policy, the establishment of an e-rulemaking approach to new policies with the chance for public comment on new rules, and even open online forums for users to participate in parallel with Facebook Forums.

In the decade since Facebook began making decisions about acceptable content, it has become one of the largest platforms for online speech. Last June, the Supreme Court declared that access to social media was an exercise of First Amendment rights, and Justice Anthony Kennedy called such sites the “modern public square.”

This idea — that Facebook is governing a public forum of expression — makes the need for due process, complete with both transparency and accountability, increasingly urgent. Tuesday’s announcement gave us the former, and it established building blocks to create the latter. This means it is now up to us, as citizen-users, to hold the platform to its promises and to be in dialogue about what we want it to be. If this grand experiment to create a better global public square fails, it won’t just be the fault of Facebook, but of all of us.

Kate Klonick is a resident fellow at Yale Law School’s Information Society Project.
