The Digital Services Act: your guide to the EU’s new content moderation rules

After years of intense negotiations, the EU Digital Services Act (DSA) has finally arrived. On 5 July 2022, the European Parliament approved the final text of the future EU content moderation rulebook. The DSA has been labelled by some as a “gold standard” for content and platform governance in the EU. In this blog post, we unpack what this long-awaited law contains and what changes people can expect to see as it becomes applicable in 2024.

What is the Digital Services Act? 

The Digital Services Act is the new EU law that aims to limit the spread of illegal content online. It establishes a new set of obligations for private actors with the aim of creating a secure and safe online environment for all. For the first time in the history of EU platform governance regulation, people’s fundamental rights are put at the forefront.

Who does it apply to?

The DSA applies to hosting services, marketplaces, and online platforms that offer services in the EU, regardless of the provider’s place of establishment. In other words, anyone residing in the EU will fully benefit from the DSA’s protective scope. Hence, the DSA puts people, not corporate interests, first by default.

The new sets of obligations are designed to be “made to measure”: the DSA singles out Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), meaning services with more than 45 million average monthly active users in the EU. This way, the law rightly recognises these platforms’ specific control over public discourse and the often manipulative influence they have on people’s behaviour online.

What does the law change in the EU? 

The DSA introduces significant changes that will safeguard fundamental rights online.

  • Legally binding transparency and algorithmic accountability: Enhanced transparency rules require online platforms to disclose the number of removal orders issued by national authorities, as well as the notices of illegal content submitted by trusted flaggers or obtained through automated means. The DSA also requires all online platforms to publicly report on how they use automated content moderation tools, those tools’ error rates, and the training and assistance they provide to their content moderators.
  • Harmonised response to illegal content online: For the first time in EU history, the DSA sets forth unified criteria for so-called notice-and-action procedures, the system that determines whether and when platforms should be held liable for the dissemination of illegal content. The law maintains and enhances an important legal principle: the conditional model of intermediary liability, which defines how online platforms should act when they become aware of illegal content, and the DSA brings more clarity and certainty about how to do this diligently and in good faith. The DSA also reinforces the prohibition of general content monitoring, preserving the distinction between knowing about specific illegal content and removing it, on the one hand, and scanning everything to fish for any and every piece of illegal content, on the other.
  • No to “dark patterns” (deceptive design), at least to some extent: In a win for people’s rights and online experience, the DSA includes a measure addressing deceptive design. On paper, it should prevent all online platforms from designing and operating their interfaces in a deceptive or manipulative way. While this is an important protection intended to ensure people can make “free and informed decisions”, the final wording adds little to the standards that already exist in consumer protection and data protection rules.
  • Ban on targeting and amplification using special categories of sensitive data: Another landmark measure in the DSA is its strict regulation of online advertising. Together with our partners, we have been fighting against surveillance-based advertising that exploits people’s vulnerabilities and results in serious human rights abuses. The DSA goes beyond pure transparency requirements on targeting: it bans advertising based on profiling that uses special categories of sensitive data (e.g. sexual orientation or political opinions). This is a real turning point that opens a pathway to effectively tackling surveillance-based advertising in the future. It is, however, only a partial victory, because targeting techniques that do not rely on sensitive data remain highly intrusive.
  • More control over the flow of information to people: In the current online ecosystem, it is close to impossible for individuals to understand how and why content is being distributed to them. The DSA obliges all online platforms to disclose the parameters of their content recommender systems, explaining why people see some information more often than other content. This information should be easily accessible via the platforms’ terms of service. Importantly, people will have the right to modify these recommender systems and to access at least one option that is not based on profiling.
  • Special due diligence obligations for VLOPs: Probably the most novel and groundbreaking element of the DSA is the set of duty-of-care obligations for VLOPs, which include (among others): mandatory risk assessments, the deployment of risk mitigation measures, and an obligation to submit to independent audits. The due diligence chapter of the DSA specifically recognises the systemic risks to fundamental rights stemming from VLOPs’ systems and operations. However, the effectiveness of these measures will depend on future delegated acts and guidelines that are yet to be drafted by the Commission.

What does this mean for individuals?

If done right and enforced effectively, the DSA will safeguard people’s rights to freedom of expression and information, freedom of thought, and the right to form an opinion freely, without manipulation. The EU stood by its promise and kept fundamental rights protection at the core of the regulation.

The DSA brings essential safeguards that will give individuals better access to effective remedies. For instance, all intermediaries will be obliged to establish a single point of contact for direct communication with their users. If they restrict user-generated content, they will be obliged to provide a statement of reasons explaining what type of action was taken and on what basis. The DSA foresees a three-tiered grievance mechanism: an internal complaint-handling system provided by platforms free of charge; out-of-court dispute settlement; and judicial redress, which must always be available. It also establishes the right to lodge a complaint and to seek collective redress. While all these measures look convincing on paper, their mutual complementarity and practical enforcement remain to be demonstrated.

What are the next steps, and when will the law become applicable?

The DSA will be directly applicable across the EU 15 months after its entry into force, or from 1 January 2024, whichever comes later. For VLOPs and VLOSEs, the DSA will apply from an earlier date: four months after their designation, which is expected in the fall of 2022. To enable that designation, all platforms will have to report the average number of monthly active users of their services so it can be determined whether they qualify as VLOPs or VLOSEs.

Conclusion: What role did Access Now play in this years-long process?

When the European Commission started working on this new rulebook, we published our DSA position papers back in 2020, presenting a wishlist for what a human rights-centric model of platform governance should look like. Joint actions with European and international coalition partners showed a unified voice on a very complicated and context-dependent topic. We are delighted to see that, despite how non-transparent and inaccessible the lawmaking process is for civil society, many of those recommendations found their way into the new rules. The final outcome bears the marks of political compromise, but the EU has set a precedent for a content moderation rulebook that puts fundamental rights at its centre.