For the attention of the European Commissioner for Internal Market, Thierry Breton,
We, the undersigned organisations, are writing to you in response to four letters you have recently addressed to Meta, X, TikTok, and most recently, YouTube, in relation to the spread of disinformation and illegal content on their respective platforms in the context of the armed conflicts, killings, and other forms of violence in the Gaza Strip and in Israel.
We understand the sense of urgency and timeliness of your action, which seeks to ensure that no online content that is illegal under EU or member state law continues to spread on Very Large Online Platforms (VLOPs) in the EU. It is especially in times of armed conflict that the protection of civilians must be prioritised, in strict compliance with international humanitarian and human rights law.
We also welcome the European Commission’s diligent efforts to ensure that X fully complies with the DSA. Requesting information about the alleged spreading of illegal content and disinformation on the platforms clearly falls within the mandate of the DSA enforcement team. While we are alarmed by the potential non-compliance of Meta and, in particular, X with the DSA, we are concerned about the interpretation of the law put forward in these letters.
Firstly, the letters establish a false equivalence between the DSA’s treatment of illegal content and “disinformation.” “Disinformation” is a broad concept that encompasses varied content which can carry significant risk to human rights and public discourse. It does not automatically qualify as illegal and is not per se prohibited by either European or international human rights law. While the DSA contains targeted measures addressing illegal content online, it more appropriately applies a different regulatory approach with respect to other systemic risks, primarily consisting of VLOPs’ due diligence obligations and legally mandated transparency. However, the letters focus strongly on the swift removal of content rather than highlighting the importance of due diligence obligations for VLOPs, which regulate their systems and processes. We call on the European Commission to strictly respect the DSA’s provisions and international human rights law, and to avoid any future conflation of these two categories of expression.
Secondly, the DSA does not contain deadlines for content removals or time periods under which service providers need to respond to notifications of illegal content online. It states that providers have to respond in a timely, diligent, non-arbitrary, and objective manner. There is also no legal basis in the DSA that would justify the request to respond to you or your team within 24 hours. Furthermore, by issuing such public letters in the name of DSA enforcement, you risk undermining the authority and independence of DG Connect’s DSA Enforcement Team.
Thirdly, the DSA does not impose an obligation on service providers to “consistently and diligently enforce [their] own policies.” Instead, it requires all service providers to act in a diligent, objective, and proportionate manner when applying and enforcing the restrictions based on their terms and conditions and for VLOPs to adequately address significant negative effects on fundamental rights stemming from the enforcement of their terms and conditions. Terms and conditions often go beyond restrictions permitted under international human rights standards. State pressure to remove content swiftly based on platforms’ terms and conditions leads to more preventive over-blocking of entirely legal content.
Fourthly, while the DSA obliges service providers to promptly inform law enforcement or judicial authorities if they have knowledge or suspicion of a criminal offence involving a threat to people’s life or safety, the law does not mention a fixed time period for doing so, let alone one of 24 hours. The letters also call on Meta and X to be in contact with relevant law enforcement authorities and Europol, without specifying serious crimes occurring in the EU that would provide sufficient legal and procedural grounds for such a request.
Freedom of expression and the free flow of information must be vigorously defended during armed conflicts. Disproportionate restrictions on fundamental rights may suppress information that is vital for the needs of civilians caught up in the hostilities and for documenting ongoing human rights abuses and atrocities that could serve as evidence in future judicial proceedings. Experience shows that shortsighted solutions that hint at the criminal nature of “false information” or “fake news” — without further qualification — will disproportionately affect historically oppressed groups and human rights defenders fighting against aggressors perpetrating gross human rights abuses.
We would like to reiterate our support for a robust enforcement of the DSA. But that enforcement must always follow due process as prescribed by law.
We remain at your disposal for any questions you may have.
- Access Now
- ARTICLE 19
- AI Forensics
- Bits of Freedom
- Centre for Democracy & Technology, Europe Office
- Digital Action
- Digitale Gesellschaft (DE)
- Electronic Frontier Foundation
- Electronic Frontier Finland
- European Center for Not-for-Profit Law (ECNL)
- European Digital Rights (EDRi)
- Foundation The London Story
- Homo Digitalis
- INSM Foundation for Digital Rights
- IT-Pol Denmark
- Justitia/Future of Free Speech
- Masaar-Technology and Law Community
- Mozilla Foundation
- Network in Defense of Digital Rights (R3D)
- Panoptykon Foundation
- Search for Common Ground
- WHAT TO FIX