Commission recommends that Member States fast-track DSA governance
The Commission published a set of recommendations for Member States to coordinate their response to the spread and amplification of illegal content, such as terrorist content or unlawful hate speech, before it can lead to a serious threat to public security.
The aim is for Member States to support the Commission in ensuring full compliance by Very Large Online Platforms and Very Large Online Search Engines with their new obligations under the Digital Services Act (DSA), ahead of the deadline for Member States to take up their role in DSA enforcement.
In the context of an unprecedented period of conflict and instability affecting the European Union, first with Russia's war of aggression against Ukraine, and now with the terrorist attacks by Hamas on Israel, the Commission counts on Member States to join forces to enable prompt enforcement of the DSA. The DSA establishes a set of rules for a safe, predictable and trusted online environment in the EU that is respectful of fundamental rights, in particular the freedom of expression and information. Since August 2023, the DSA has required designated Very Large Online Platforms and Very Large Online Search Engines to adopt mitigation measures tailored to the specific systemic risks posed by their systems, including systemic risks arising from the dissemination of illegal content.
President of the European Commission, Ursula von der Leyen, said: “Hamas' terrorist attack has also led to an online assault of heinous, illegal content promoting hatred and terror. With our Digital Services Act, Europe now has strong rules to protect users, including vulnerable population groups, from intimidation and to ensure fundamental freedoms online. Major platforms are subject to new obligations to mitigate such risks from their services. Today's recommendation will help us to coordinate our responses with Member States and protect our society.”
Coordinating action to tackle illegal content
With the Recommendation, the Commission is encouraging Member States to designate an independent authority now, ahead of the legal deadline of 17 February 2024, to be part of a network of prospective Digital Services Coordinators.
The Commission is proposing an incident response mechanism that outlines the cooperation between the Commission and that network in response to the dissemination of illegal online content, in particular where it poses a clear risk of intimidating population groups or destabilising political and social structures in the Union. The mechanism would include regular incident response meetings to discuss good practices and methodologies, and regular reporting on and exchange of information collected at national level. The information received from the network may provide the Commission with evidence to exercise its supervisory and investigatory powers under the DSA.
Where extraordinary circumstances – such as an international armed conflict or terror attacks – justify it, the Commission encourages Very Large Online Platforms and Very Large Online Search Engines to draw up incident protocols relevant to the specific incident.
The Recommendation also recalls the powers conferred on Member States by other instruments of European Union law to tackle illegal content, such as the Regulation on addressing the dissemination of terrorist content online, in force since June 2022. The Commission will continue to rely on existing structures to secure joined-up action, particularly for counterterrorism: the EU Crisis Protocol, which coordinates responses to online developments stemming from a terrorist or violent extremist act, and, at international level, the Christchurch Call and the industry-led Global Internet Forum to Counter Terrorism.
Next Steps
This Recommendation will apply until 17 February 2024. After that date, the enforcement framework established in the DSA will apply fully, including the Board for Digital Services, which will be composed of the independent Digital Services Coordinators of the Member States.
Background
At the end of August 2023, the DSA became legally enforceable for designated Very Large Online Platforms and Very Large Online Search Engines. The DSA aims to empower and protect users online by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools.
The designated platforms have now completed the first annual risk assessment exercise, examining risks such as how illegal content might be disseminated through their services. The DSA requires Very Large Online Platforms and Very Large Online Search Engines to adopt mitigation measures tailored to the specific systemic risks identified, including risks related to illegal content and to the protection of public interests.
Under the DSA, Member States have to designate a Digital Services Coordinator, an independent authority that supervises the compliance of online services established on their territory, by 17 February 2024. The independent authority designated by a Member State under this Recommendation may, in due time, assume the role of Digital Services Coordinator in accordance with the DSA.