
Terrorist Content Regulation: the fight for fundamental rights isn’t over

When the European Parliament (EP) adopted the Report on the proposed Terrorist Content Regulation on 17 April 2019, digital rights advocates breathed a temporary sigh of relief. The EP successfully addressed many (but not all) of the serious fundamental rights concerns that stemmed from the original draft proposed by the European Commission in 2018. However, the ongoing trilogue negotiations between the European Commission, the European Parliament, and the EU Member States have so far produced worrisome outcomes that significantly depart from the EP position.

As the negotiations reach their conclusion, the fundamental rights of online users are once again in danger. This Friday, shadow rapporteurs will meet to discuss the most recent text of the Regulation proposed by the German Presidency of the Council of the EU (the Presidency text).

European Digital Rights (EDRi) and Access Now have closely reviewed the Presidency text. While we welcome a number of the proposed changes, serious risks for fundamental rights remain in the text. We therefore urge the Rapporteur leading the trilogue negotiations to seriously consider the following points and maintain a position that is in line with human rights standards and the interests of users online:

I. Stick to the narrow definition of online terrorist content

The definition of online terrorist content proposed in the Presidency text includes vague terminology, such as supplying information or material resources, funding terrorist activities in any way, or otherwise supporting criminal activities. Such broad categories of criminal offences should not be included in the scope of the definition. A vague scope will always lead to overreach and human rights abuses.

II. Fix the broken cross-border mechanism

Under the currently proposed cross-border mechanism, removal orders would have extraterritorial effect. In practice, this means that any Member State can issue a removal order to any online platform legally established in the EU. This would allow more authoritarian governments with weak rule-of-law protections to determine what stays online and what should be removed for the whole EU. Removal orders with pan-European effect that contain no built-in human rights safeguards are in direct conflict with the principles of legality and territoriality.

III. Exclude any possibility of imposing the mandatory use of automated measures

Automated measures make profound mistakes in assessing the societal, political, and historical context of a piece of content. To use more technical language, they are prone to so-called false positives, which means that they systematically suppress legitimate content and unjustifiably curtail the right to freedom of expression and information of online users. If the use of proactive measures in any shape or form is imposed on online platforms, the obligation will increase the number of swift removals without appropriate assessment of the content’s legality and will ultimately result in general monitoring of user-generated content.

IV. Remove the mechanism of referrals from the text once and for all

Despite significant criticism by experts in the field, referrals remain in the current German Presidency text. When receiving a referral, a company must decide whether to remove the content solely on the basis of its terms of service. Access Now has repeatedly underlined that Member States have a positive obligation to prevent unjustified interference by private actors with users’ fundamental rights. Online platforms should not replace the vital competence of states and act as quasi-judicial bodies.

V. Do not enforce unduly short timeframes for content removal 

While the recent Presidency text extends the one-hour timeframe for compliance with removal orders to 12 hours, in practice this will not deliver a different outcome. Unduly short timeframes will still impose a heavy burden on SMEs, which will need to ensure constant monitoring of the content shared on their platforms in order to act promptly upon receipt of a removal order. Furthermore, excessively short deadlines will always lead to illegitimate prior restrictions of users’ free expression and to the deployment of context-blind upload filters. ARTICLE 19’s analysis rightly points out that “removals within short time frames can incentivise companies to allocate resources to removal of notices regardless of their severity and to focus on content simply because it has been posted in the last 24 hours.”

VI. Do not give up on independent judicial review 

We oppose the parts of the Presidency text that would undermine the role of independent judicial authorities in issuing removal orders. Such an approach would create strong incentives for national law enforcement agencies to force online platforms to remove notified content without prior judicial authorisation and without proper legal assessment of the alleged illegality of a piece of content.

VII. Preserve a journalistic exemption that reflects the diversity of the media landscape 

It is crucial to maintain the exception for certain protected forms of expression, such as educational, journalistic, artistic, and research materials, as well as content shared to raise awareness. However, the journalistic exemption in the Presidency text only covers traditional media publishers, leaving bloggers and civil society organisations outside its scope. The jurisprudence of the European Court of Human Rights (ECtHR) specifically requires particular caution to safeguard such protected forms of speech and expression.