Earlier this year, Facebook presented its plans to create an ‘oversight board for content decisions’. According to its draft charter, the purpose of this body will be to determine how content should be governed and enforced on the platform. Brent – you have been involved in the initial phases of the board – could you tell us more about the strategy and why your company feels it must look beyond its internal structures to curate online content?
The goal of the board is to extend decision making beyond the company itself and beyond Silicon Valley, and to hear from a wide array of people, voices and experts beyond those we usually talk to.
In the last six months we’ve talked to over 650 people in workshops and roundtables around the world from over 88 countries, and we’ve heard from over 1,200 people via public submissions. As a result, they’ve spotted a lot of issues and given us a huge amount of feedback. The rationale behind involving these people is that it should result in an institution that’s stronger than if we had launched it on our own.
The approach you’ve set out for the oversight board has drawn comparisons to those of governmental structures. Yet international norms such as the Universal Declaration of Human Rights already deal with issues such as freedom of expression, opinion and hate speech. Aren’t there risks of reinventing the wheel or undermining existing mechanisms?
The approach we’ve taken is one that’s informed by a number of global international norms, international laws and human rights. Facebook itself is rooted in free expression and Article 19 of the Universal Declaration of Human Rights. [Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.]
The exercise that we’re engaged in is one that’s fundamentally grounded in and informed by work that’s been advancing over the last century around these types of international norms. What we are trying to do now is understand how these norms apply in the context of Facebook, Instagram and the digital sphere as a whole.
How might the oversight board operate in countries with a value system at odds with the international norms and human rights you’ve referred to?
When we think about building out this board, the board can have no more power, can be delegated no more power, than Facebook holds, and Facebook’s ability to operate is bound by the laws in place in the countries in which it operates.
The board is not and cannot be a substitute for local laws; it’s a question of how you extend beyond them and how you interpret situations where laws do not yet apply.
In the last five years, Facebook’s position with regard to government regulation has shifted significantly in favour of more legislation. Looking beyond the oversight board, what would be an ideal regulatory environment for a company like Facebook to continue to be successful?
Mark Zuckerberg has called for regulation in four areas: one is in election protection, a second is in data portability, a third is in privacy and data protection issues, and then a fourth is to do with harmful content.
What we believe – and this is embodied both in the oversight board and our call for regulation – is that because we’re a private company, we shouldn’t hold the full responsibility for content decisions that extend to over two billion people around the world.
What we are calling for is greater accountability, greater oversight and greater transparency. It’s also important that democratic institutions and civil society have a voice and perspective on these issues so we can make the digital sphere fairer and more just than it is today.
Senator Elizabeth Warren and Facebook co-founder Chris Hughes have called for Facebook, along with other big tech companies, to be broken up. Do you think that Facebook’s dominant market position is a threat to competition?
Firstly, the market feels very competitive for us; we’re in the middle of trying to figure out how to run the business. Secondly, the ability to deal with the types of concerns that people have articulated is not made easier by breaking up companies.
If you look at the types of issues around content moderation, around privacy, these require substantial systems and investments and technical insights that don’t get stronger if you then try to fracture parts of a company.
What about today’s start-ups which have to compete in an environment dominated by a handful of large tech companies – what challenges do you think an organization similar to Facebook would face if it were starting from scratch in 2019?
It’s a good question. I think one of the major challenges would be how they deal with some of the government regulation that will come into play, the heightened expectations it will bring, and whether they’ll have the right infrastructure in place to deal with that.
At the 2019 Chatham House Cyber Conference in June, Damian Collins MP spoke about two types of citizens: the ‘remote-controlled’ and the ‘empowered’, where the former is consuming information predetermined by algorithms, and the latter is able to access and navigate information free of algorithmic curation. Can you imagine a situation in which government regulation improves Facebook’s algorithms?
There is going to be more regulation on just about every topic associated with technology and I believe more work needs to be done on these issues. However, one concern I have is that governments aren’t necessarily in the best position to design and understand products. Moreover, they’re not always well-versed in some of the really technical issues.
In the case of the algorithms, they’re designed to bring people the content and information that they most want to see and the reason that those algorithms are in place is that they actually help people navigate a wide and vast array of different content.
LinkedIn recently announced that it has made changes to its algorithm to rank content related to a user’s work interests above clickbait and viral content. What actions is Facebook prepared to take to improve its News Feed algorithms?
We are doing a lot of work to figure out down-ranking. If you look at misinformation, for example, when third-party fact-checkers inform us that a piece of content is fake or false, we essentially apply a speed limit, making it far more difficult for that content to be seen.
Thomas Farrar, Digital Editor, Communications and Publishing Department and Brent Harris, Director of Global Affairs, Facebook.