Update, June 2019: We will be tackling the human rights issues related to freedom of expression online at RightsCon. Here’s a look at some of the relevant sessions.
In January of this year, Facebook announced the launch of a process to create an independent oversight board to review some of the company’s decisions about which user speech to leave up or remove under its terms of service, a practice known as “content moderation.”
These decisions affect our capacity to freely express our ideas and access information, and they can have a tremendous impact on human rights across the globe. This impact is felt at both the individual and collective level, since individual take-down decisions have a cumulative effect: they shape the space for discussion and can silence (or amplify) the voices of entire communities. There is a significant risk that content moderation practices, however well intended, will further censor or exclude vulnerable or marginalized communities, including activists, journalists, and human rights defenders. With dominant players like Facebook and YouTube using AI and machine learning to operate at scale, content moderation done the wrong way could shut down efforts to document war crimes and hold human rights abusers to account, en masse.
Access Now therefore welcomes Facebook’s recognition of its key role and responsibility in safeguarding fundamental rights and protecting civil liberties in the digital age, as well as its intention to explore new approaches to the important human rights issues raised by content moderation. We believe it is critically important to do this the right way. That entails taking a comprehensive approach that follows and reinforces international human rights standards, including throughout the process of setting up the content moderation oversight board.
To enable a discussion that is underpinned by these global standards, we wrote a paper that lays out a set of principles designed to help Facebook and other platforms create content moderation processes and practices that will protect free expression. The paper provides our principles for rights-based content moderation, our preliminary recommendations for the governance and functioning of Facebook’s planned oversight board, and our analysis of the possible risks and benefits of this approach.
You can find our paper here.