Facebook wants its users to drive out fake news

Mark Zuckerberg announced recently that Facebook plans to ask its community to help rate news producers’ credibility. Randomly selected users will be asked whether they are familiar with an outlet and, if so, invited to judge its trustworthiness. The resulting score, the share of those who know a source who also trust it, will “inform ranking in the News Feed” (though Facebook has remained vague about how much weight it will carry relative to other signals). Rather than relying on expert judgment or tasking his own staff with those assessments, Zuckerberg has concluded that crowdsourcing is the “most objective” way to ensure “high quality” news on the platform.
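As described, the score boils down to a conditional proportion: among sampled users who say they know an outlet, the share who say they trust it. Here is a minimal Python sketch of that idea; Facebook has not published the exact formula, so the input format and the handling of outlets nobody recognizes are assumptions.

    def trusted_source_score(responses):
        """responses: list of (is_familiar, trusts) answers, one pair per sampled user."""
        familiar = [trusts for is_familiar, trusts in responses if is_familiar]
        if not familiar:
            return None  # nobody in the sample knows the outlet
        return sum(familiar) / len(familiar)

    # Example: 3 of 10 sampled users know the outlet, and 2 of those 3 trust it.
    sample = [(True, True), (True, True), (True, False)] + [(False, False)] * 7
    print(trusted_source_score(sample))  # 0.666...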

But the reliability of this “trusted sources” measure is dubious, for two reasons. First, people in superficial surveys of this kind often indicate trust in fake sources that have familiar and vaguely credible names. Second, partisan Facebook users with a high interest in promoting “their” media could bias the results.

Here’s how I did my research

I conducted a study before the 2017 German elections with more than 400 participants living in Germany, recruited online by a professional survey company. I weighted the data to make it representative on a number of demographic variables (more information on the data here). For the following analysis, I also excluded respondents without a Facebook account.

I asked this group to rate the trustworthiness of what they were told were 10 different news outlets. Five outlets were established media brands with a large reach and only moderate partisan leanings. Two were smaller newspapers that were openly partisan, but still professional organizations run by full-time journalists.

Three were unprofessional “news” sites that published made-up or highly distorted stories that had circulated on Facebook. They were “unprofessional” in the sense that they offer no information about who is in charge of the publication or how information is sourced, and may even allow anyone to invent and post their own stories. Yet their names sound fairly innocuous: “24-aktuelles.com,” “Internetz-Zeitung” and “Newsblitz,” the last of which may sound aggressive in English but is less so in German.

I asked participants to rate these outlets on a trustworthiness scale from 0 to 10. Much as Facebook’s proposed metric would, I calculated the average trust score among respondents who said they were familiar with the outlet.
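A minimal Python sketch of that calculation, assuming survey rows with an outlet name, a familiarity flag and a 0-to-10 trust rating; the column names and example values below are hypothetical, and the demographic weighting mentioned above is left out for brevity.

    from collections import defaultdict

    def average_trust_among_familiar(rows):
        """rows: dicts like {"outlet": str, "familiar": bool, "trust": int from 0 to 10}."""
        sums, counts = defaultdict(float), defaultdict(int)
        for row in rows:
            if row["familiar"]:  # only respondents who say they know the outlet count
                sums[row["outlet"]] += row["trust"]
                counts[row["outlet"]] += 1
        return {outlet: sums[outlet] / counts[outlet] for outlet in counts}

    rows = [
        {"outlet": "Outlet A", "familiar": True, "trust": 7},
        {"outlet": "Outlet A", "familiar": True, "trust": 6},
        {"outlet": "Outlet B", "familiar": False, "trust": 0},
        {"outlet": "Outlet B", "familiar": True, "trust": 6},
    ]
    print(average_trust_among_familiar(rows))  # {'Outlet A': 6.5, 'Outlet B': 6.0}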

Trustworthy names are easy to fake

The figure below shows that the large professional outlets, which you can see in the left-hand group, indeed scored the highest averages.

But their advantages over the three fake outlets, in the center group, are small. Worse, the scam sites scored as high as or higher than the two partisan but more professional sites, in the right-hand group. Note that these relative measures do not account for the fact that many more people ticked “don’t know” for the fake sites.

[Figure: average trust ratings for the established outlets (left), the fake sites (center) and the partisan outlets (right)]

It’s true that survey measures have the notorious weakness of being superficial. Just because a number of people say they would trust an outlet does not mean they would do so if they came across it in real life. Trust is more about actual behavior than about quick reactions; people might be more discerning when navigating a real news environment.

But this is exactly the problem: Facebook is proposing a superficial way to measure a complex social phenomenon.

Of course, not all misinformation sources carry staid and familiar-sounding names. But they can adopt them without effort. Think about the made-up Denver Guardian: Many people have surely come across the Denver Post and the Guardian, separately. Our brains readily blend accurate memories into false ones. Within the few seconds that Facebook users have to respond to the “trusted sources” pop-up, a Denver Guardian might sound legitimate enough.

One could go further and survey people’s trust in outlets that do not even exist but have harmless names. Similar results are likely. The fake sources used in my study were small, without much reach; most participants probably had never come across them.

Partisan zeal is hard to avoid

My study data also shows that participants are suspicious of overly partisan outlets, seen in the right-hand group. This may be what Facebook intends.

But the company’s approach does not shield against sectarian zeal. Many little-known online news sites preach to the converted; the content is exclusively partisan and all the readers agree. As others have pointed out, if the only people allowed to rate their trustworthiness are their readers, these sites will fare extremely well.

Facebook has responded to this criticism: High trust scores will only be effective when they come from “a lot of different types of people.” This sounds promising, but it doesn’t address a related, more fundamental problem of surveys: Facebook cannot force the users it samples to respond, so not all of them will. Academics call this nonresponse bias and have observed various reasons for it. One is that people’s interest in a topic increases their motivation to respond. Those who are ideologically most invested in denigrating the other side’s media and cheerleading their own side’s are the most likely to take the time to rate outlets.
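A toy simulation, not based on the study’s data, shows how such selective responding can inflate a fringe outlet’s score; the population sizes and response rates below are invented purely for illustration.

    import random

    random.seed(0)
    population = (
        [{"fan": True, "familiar": True, "trusts": True}] * 100        # devoted readers
        + [{"fan": False, "familiar": True, "trusts": False}] * 400    # know the site, distrust it
        + [{"fan": False, "familiar": False, "trusts": False}] * 9500  # never heard of it
    )

    def responds(person):
        # Assumed response rates: heavily invested readers answer far more often.
        return random.random() < (0.6 if person["fan"] else 0.05)

    respondents = [p for p in population if responds(p)]
    familiar = [p for p in respondents if p["familiar"]]
    # Among everyone who knows the site, only 100 of 500 (20 percent) trust it,
    # but the respondents-only share comes out far higher because fans over-respond.
    print(sum(p["trusts"] for p in familiar) / len(familiar))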

Adjust for the absolute level of familiarity

We do not yet know exactly how Facebook’s “trusted sources” metric will work. The company has clarified in recent days that the metric will interact with many other signals and so far matters for only a few outlets. But clearly Facebook’s new policy will affect the news media market, and therefore public debate. If Facebook’s goal is to take account of the trustworthiness of news media sources, then research suggests its trusted sources metric may not actually be able to do so. In its current form, it is likely to overrate any outlet that manages to find a credible-sounding name and to let those with partisan interests bias the scores.

How could Facebook correct for this? By adjusting for the absolute level of familiarity. Looking back at my study, the trust averages of “Deutschlandfunk” (a professional outlet) and “24-aktuelles.com” (a scam site) are very close to each other. But almost twice as many participants indicated familiarity with the former. Taking this factor into account could help Facebook achieve its goal with its “trusted sources” effort.
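One possible form of such an adjustment, sketched here as an illustration rather than a definitive formula, is to multiply the average trust among familiar respondents by the share of all respondents who recognize the outlet, so that an obscure site rated highly by a handful of people no longer matches a widely known one.

    def familiarity_adjusted_score(n_familiar, n_sampled, avg_trust_among_familiar):
        """avg_trust_among_familiar is on the study's 0-to-10 scale."""
        familiarity_rate = n_familiar / n_sampled
        return avg_trust_among_familiar * familiarity_rate

    # Illustrative numbers only: two outlets with similar raw trust averages, but
    # the first is known by twice as many respondents and ends up with roughly
    # twice the adjusted score.
    print(familiarity_adjusted_score(300, 400, 6.5))  # 4.875
    print(familiarity_adjusted_score(150, 400, 6.4))  # 2.4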

Bernhard Clemm (@BernhardClemm) is a PhD researcher at the European University Institute, Florence. He studies political psychology and public opinion on the Internet.
