Access comments on Commission Consultation on “Self-Regulation”
12:13pm | 5 October 2012 | by Raegan MacDonald, English
Understanding how to regulate the online world has perpetually posed dilemmas for policy makers. Given that innovation often outpaces regulation, governments frequently turn to "self-regulation" or "voluntary" efforts as an alternative to legislation.
To assist the European Commission in navigating this space, Access last week submitted comments to an open consultation on a proposed “Code for Effective Open Voluntarism: Good design principles for self- and co-regulation and other multistakeholder actions”.
The purpose of the consultation is three-fold:
1. to get feedback on the Commission’s draft code on voluntarism;
2. to develop a cross-EU and multistakeholder “network of excellence”; and
3. to gather knowledge, ideas, and opinions on how best to conduct self-regulatory initiatives.
Self-regulatory initiatives, or “voluntarism,” are essentially an alternative to governmental, or “hard,” regulation. In many cases, industry self-regulation has served the fast-moving ICT sector well by providing flexible and timely solutions that ensure the security of networks and protect consumers. A good example of this is the regulation of spam and malware.
However, the scope and understanding of “self-regulation” are increasingly blurred, as companies are often coerced into adopting measures that push them into the role of “internet police,” where they are not regulating themselves, but rather policing their users. (For a detailed analysis of this trend, see this study.)
“Clean IT” is one such initiative that has recently received a lot of attention, particularly after European Digital Rights (EDRi) published a leaked draft document in late September. Funded by the European Commission’s DG Home Affairs and led by a handful of European law enforcement authorities, the initiative seeks to “Reduce the impact of the terrorist use of the Internet” through voluntary self-regulatory measures. The proposals urge the participating companies to ban unwanted content and activity through their Terms of Service.
The document contains a laundry list of dangerous policies, including (but not limited to):
- implementation of "blocking" or filtering systems on social networks;
- allowing law enforcement authorities to take down content "without following the more labour-intensive and formal procedures for 'notice and action'";
- heightened liability for ISPs for not making "reasonable" efforts to use monitoring to identify "terrorist" use of the Internet;
- legally binding rules enforcing "real name" policies.
This is, unfortunately, only one of many troubling initiatives currently underway. A parallel initiative, the “CEO Coalition to make the internet a better place for kids”, has been discussing many of the same elements found in Clean IT, including rules on notification and removal of content, upload filters, and monitoring of networks. One of the greatest takeaways from the Clean IT discussion is that this dangerous trend is rearing its ugly head in a variety of policy forums, dialogues, and initiatives, whatever they may be called. It is further worth noting that, according to the 2003 Inter-institutional Agreement, self-regulatory initiatives are not applicable when fundamental rights are at stake. Thanks to EDRi’s leak, the public has been informed of the status of what members of Clean IT now describe as mere “food for thought”; indeed, it is difficult to see how such measures could be applicable, as many of them would restrict fundamental rights such as freedom of expression in the online environment.
The dangers of “self-regulation”
Coercing industry to adopt measures “voluntarily” is a tactic frequently used by the copyright industry and its allies. And it’s not only governments that are applying the pressure. Last year IFPI, the International Federation of the Phonographic Industry, sued Eircom, Ireland’s largest ISP, to force it to “voluntarily” adopt a “3 strikes” policy (based on accusations, not convictions). Even after the Irish Data Protection Authority appealed the ruling, IFPI came out triumphant.
We have also seen the same types of measures found in dangerous intellectual property treaties such as ACTA and the TPP (Trans-Pacific Partnership Agreement). Companies are increasingly being pushed into playing the role of “judge, jury and executioner,” deciding what content is legal, illegal, or just plain unwanted on their platforms or networks. In addition to circumventing the rule of law, these types of measures threaten fundamental rights like free expression, access to information, and privacy.
As outlined in the International Covenant on Civil and Political Rights (ICCPR), the European Convention on Human Rights, and the European Charter of Fundamental Rights, any restriction on fundamental rights must be imposed through judicial processes, not carried out by private corporations through voluntary, ad hoc agreements.
Access’ Recommendations to the Commission
Access has submitted the following guidelines to the Commission to ensure that any voluntary initiatives are appropriate, credible, and do not restrict fundamental rights (as stipulated in the 2003 Inter-institutional Agreement). The following is a summary of our recommendations. To read the submission, go here.
- “Upfront” consensus that measures adopted will not restrict fundamental rights;
- Ensuring consideration of the value for the end-user;
- Evidence- and research-based dialogues;
- Conditions for the involvement of public authorities;
- Clear, open, and formal processes for stakeholder engagement.