Building international norms is a slow process. Often, before governments and multinational companies can agree to any common principles for such norms — much less take action on them — they require overwhelming evidence. They look for widespread agreement among experts on a particular issue, clarifying what the problem is and how to address it.
That’s why David Kaye, the U.N. Special Rapporteur on the freedom of opinion and expression, has begun a multi-year project to provide “guidance on how private actors should protect and promote freedom of expression in a digital age” by taking a broad approach. In his latest report (PDF), Kaye has carefully mapped out a shared pathway for progress, identifying the issues at stake, as well as multiple players, rules, knowledge gaps, and avenues for advocacy.
Access Now participated in two days of consultation for this report, and we’re excited to be in Geneva next week when he delivers it to the U.N. Human Rights Council. We strongly support the ambitious program of work that the Special Rapporteur has delineated. Everywhere we look in the digital sphere, we see a need for common standards, definitions, and best practice guides to help regulatory bodies and corporations meet their responsibility to respect human rights.
Below, we take a look at key issues the report identifies, potential solutions that are explored, and areas for further investigation as the project unfolds.
The problem: legal — and extra-legal — threats to free expression and privacy
Internet shutdown orders, filtering requirements, vague laws — these are among the threats to human rights that the Special Rapporteur has identified. The report notes that internet platforms and other “private intermediaries” are “typically ill equipped to make determinations of content illegality,” citing findings by the Inter-American Commission on Human Rights. Yet new laws like Tanzania’s Cybercrime Act of 2015, Kaye observes, pressure companies to “assess the validity of State requests” against legal criteria, and “remove or delink such content based on such assessments.” In this way, governments force companies to make decisions that mediate our fundamental rights — a difficult position that lacks a clear foundation in law.
Even without passing new laws or enforcing those already on the books, governments assert control over expression by using “extra-legal requests to suppress or remove content,” Kaye finds. The new E.U. “code of conduct” on countering (vaguely defined) terrorist activity and hate speech online falls under this category. So too does the galling practice by some government agencies of arbitrarily invoking legal mechanisms in other countries — such as the U.S. “DMCA” copyright notice-and-takedown mechanism — to silence political satire and criticism. We oppose measures like these because they represent attempts to restrict free expression outside open or participatory legal processes.
Then there are the threats to our privacy. Private sector players develop, market, and sell surveillance technology that governments use “to target, harass or intimidate members of vulnerable groups.” Companies may also be aware of — or even provide — secret government access to telecommunications networks or devices. This increases their responsibility to mitigate harm to human rights, to provide notice to customers, and to ensure our digital security.
Finally, Kaye cites the problem of governments ordering private companies to keep information about their requests secret. Slowing the trend toward increased corporate transparency, “several States prohibit disclosures concerning government requests for content removal or access to user data. India, for example, prohibits online intermediaries from disclosing details of government orders to block access to Internet content.” To establish rights-respecting law and norms, we need more, not less, evidence about how governments access our data or restrict our content.
A part of the solution? “Soft” law
As Kaye points out, international human rights law does not “directly govern the activities or responsibilities of private business.” So the Special Rapporteur highlights the need for norm-building programs to fill gaps, establish expected behavior by both states and companies, and give legislators a chance to catch up. Kaye refers to several initiatives that help build norms, going beyond the U.N. Guiding Principles on Business and Human Rights. These include the Ranking Digital Rights Corporate Accountability Index and the African Declaration on Internet Rights & Principles.
For their part, many companies have been taking proactive measures to increase our digital security and provide tools for anonymity and privacy, through the use of encryption and other measures. But instead of supporting companies that do this, governments often get in the way. The Special Rapporteur points to the U.S. Federal Bureau of Investigation’s attempt to compel “Apple to create software that facilitates access to a suspect’s iPhone in a terrorism investigation,” and the U.K.’s Investigatory Powers Bill, which “would authorize intelligence services to apply for a warrant that requires private entities to … secure interference with any equipment for the purpose of obtaining communications […] equipment data and any other information.”
Areas for further investigation: internet shutdowns, intermediary liability, government efforts to weaken digital security, and more
The dismaying trend toward internet shutdowns does not go unnoticed in Kaye’s report. “States compel or pressure companies to shut down networks or block entire services,” the Special Rapporteur finds. “This trend requires further documentation and scrutiny. Future work will examine laws, policies and extralegal measures that enable Governments to undertake such restrictions, and the costs and consequences of such restrictions.” In this vein, we recently found that there are laws in 27 countries that enable internet shutdowns. Kaye is also interested in learning more about the “responsibilities of companies to respond to such measures in a way that respects rights, mitigates harm and provides avenues for redress where abuses occur.” This is an issue we intend to address through an update to our Telco Action Plan and related guidance.
The report identifies several other areas for future work, including investigations of:
- content restrictions under terms of service and community guidelines (including efforts to combat hate speech and extremist or terrorist content);
- liability for content hosting, including in copyright contexts;
- the censorship and surveillance industry, producing tools that governments often abuse;
- efforts to undermine digital security and weaken technological safeguards; and
- internet governance and connectivity initiatives.
The Special Rapporteur has been open to input from all stakeholders, and we hope he continues to get support from both civil society and companies throughout the life of this project. For our part, we warmly welcome the report, and will continue to provide assistance as this much-needed project continues to develop.