Access Now’s Digital Security Helpline (“the Helpline”) was formally established in 2013 as a 24/7 technical help desk supporting civil society — activists, media organizations, journalists, bloggers, and human rights defenders — in preparing for, identifying, containing, and recovering from digital security incidents affecting some of the most at-risk individuals and organizations in the world. Having resolved more than 6,000 cases and requests for assistance, we continue to adapt our operational procedures and community documentation to better serve our constituency. Today, we would like to share learnings and reflections that emerged from a review of our response protocols for incidents that affect the availability of content on online platforms.
What kinds of content-related requests do we receive?
In the past few years, we have seen what appears to be a more aggressive approach to enforcing content moderation rules. These efforts have been prone to errors, partly because of platforms’ reliance on automated content moderation and their limited capacity to understand cultural and political context or to assess the implications of their decisions for users. Together with a lack of meaningful transparency about how these platforms create and apply their rules, this has contributed to increasing reports of censorship and to the verified frustration of efforts to document human rights violations. At the same time, the platforms have provided their users with little opportunity to contest erroneous content moderation decisions or to access an effective remedy mechanism. This has led some users, especially those most at risk, to seek alternative channels for their content-related grievances, including our Digital Security Helpline.
Content-related requests typically fall into two categories: 1) requests to remove content, where at-risk users flag specific content they deem harassing or dangerous because it could put individuals or organizations at greater risk; and 2) requests for content recovery, where constituents denounce censorship, arbitrary content takedowns, or organized efforts to suppress critical voices. Content-related requests now account for approximately 20% of all cases the Digital Security Helpline receives.
These developments have put the Helpline in a difficult position. Although Access Now’s Digital Security Helpline is part of a global human rights organization with significant knowledge of the most pressing issues facing civil society around the world, we, as a technical incident response team for civil society, lack the qualifications to make content moderation decisions. Acknowledging those limits, we do not believe it would be appropriate for us to take on a more active role in the content moderation process.
However, we take our commitment to keeping at-risk users safe and defending their human rights online very seriously — whether in the context of helping a client of the Digital Security Helpline or otherwise. We understand that Access Now’s position may allow us to positively impact individual cases as well as to facilitate productive policy conversations on content governance. This is why, through a policy lens, Access Now has recently published a policy paper, 26 recommendations on content governance: a guide for lawmakers, regulators, and company policy makers, and, from the Helpline perspective, we will continue supporting a limited range of cases.
What is our approach to addressing them?
Cases the Helpline will consider taking on are limited to those in which platforms have imposed rules through their own terms of service or community standards. The Helpline does not assist with cases in which removal of content is mandated by law or required through a formal legal mechanism, such as cases of alleged child sexual abuse or violations of anti-terrorism laws.
Access Now has a strong position against censorship and limiting access to information. Accordingly, the Digital Security Helpline will only support the removal of content when the standard support mechanisms are not effective and it is clear, based on our internal assessment, that the content poses a credible and imminent risk to the physical safety of the individual or of others. That assessment includes examining, for example, whether the content contains threats of physical violence or doxxing. We also take into consideration the security consequences faced by vulnerable groups, such as women, minorities, or individuals in conflict situations.
Similarly, we will continue to intervene in cases where critical voices from civil society are being censored and the standard support mechanisms are not proving effective, provided that the content or accounts in question do not pose a risk to anyone else. We will also continue to work with our trusted partners, within the Computer Incident Response Center for Civil Society (CiviCERT) and beyond, to verify requests for support and gain insight into issues affecting some of the most vulnerable communities in the world.
Moving forward: combining policy guidance, global partnerships, and direct technical assistance to protect human rights
While the Helpline will assist in a limited capacity on individual content-related cases, Access Now will also leverage its policy work to create sustainable and impactful change. Our policy team will continue providing recommendations to social media platforms to improve their response to content moderation issues, including ensuring that their decisions conform to the principles of transparency and proportionality and that they provide all users direct access to an effective remedy. Examples of such recommendations include our reports, Protecting free expression in the era of online content moderation and 26 recommendations on content governance: a guide for lawmakers, regulators, and company policy makers. Lastly, we will continue engaging with partners, such as WITNESS, Syrian Archive, and other civil society groups who are well positioned to speak and advocate on content moderation issues on behalf of the communities they represent.
We hope that, in the long run, platforms will further improve their content reporting and appeals mechanisms so that their users — particularly those at greater risk — do not have to resort to third parties when their ability to express ideas and access information is unfairly limited. In the meantime, Access Now’s Digital Security Helpline will continue stepping in to fill this gap by responding to the most pressing needs of at-risk users.