Political agreement reached on the Digital Services Act

18 May 2022

On April 23, 2022, the European Parliament and EU Member States reached a political agreement on the Digital Services Act, which the European Commission proposed in 2020. European legislation in this area is outdated: digital services were last regulated at EU level more than 20 years ago, and the proliferation of counterfeit goods, hate speech and cyber threats has shown the need to modernize the rules.

The Digital Services Act introduces stricter rules on the liability of online platforms for illegal and harmful content, ensuring better protection for users and a safer digital environment. The act is now subject to formal approval in the legislative process. Once adopted, it will be directly applicable across the EU and will apply fifteen months after its entry into force or from January 1, 2024, whichever is later.

The new rules cover a wide range of online intermediary services, with obligations that depend on each provider's role, size and impact on the online ecosystem. The services concerned include: intermediary services offering network infrastructure (Internet access providers, domain name registrars); hosting services (cloud infrastructure, web-hosting services); online platforms (online marketplaces, app stores, social media); very large online search engines used by more than 10% of all EU consumers; and very large online platforms reaching more than 10% of consumers in Europe. The very large platforms are subject to special rules, as they pose specific risks of disseminating illegal content and causing societal harm. Meanwhile, milder rules apply to micro and small businesses.

The new Act contains:

- obligations on the traceability of business users and the identification of sellers of illegal goods,
- a ban on certain types of targeted advertising on online platforms,
- an obligation for online platforms to provide user-friendly internal complaint-handling systems,
- an obligation for online platforms to suspend, for a reasonable period and after prior warning, the provision of their services to users who frequently publish illegal content,
- an obligation for very large platforms to identify and assess systemic risks once a year,
- an obligation for intermediary service providers to publish, at least once a year, clear and easy-to-understand reports on the content moderation they have carried out over a given period,
- access for vetted researchers to key data of the largest platforms and search engines in order to understand how online risks develop.

An independent advisory group of Digital Services Coordinators will be established to supervise intermediary service providers. Each EU Member State will have one vote, while the European Commission will have no voting rights.
