Digital Services Act (DSA): Everything you need to know

What does the Digital Services Act (DSA) regulate?

The Digital Services Act (DSA) is a European regulation that came into force in November 2022. The aim of the DSA is to establish clear rules for online services in order to increase security, transparency and fairness in the digital space. It is part of the broader Digital Services Package, which also includes the Digital Markets Act (DMA).

When did the Digital Services Act (DSA) come into force?

The DSA rules have been binding in all EU member states since February 17, 2024 and affect all platforms and online services operating in the EU, regardless of their size. For very large online platforms and search engines (e.g. Google, Amazon, Meta), the obligations already applied from August 2023.

As an EU regulation, the DSA applies directly in Germany; it is supplemented at national level by the German Digital Services Act (Digitale-Dienste-Gesetz, DDG), which came into force on May 14, 2024 and governs enforcement and national competences.

Who is affected by the Digital Services Act (DSA)?

The DSA is aimed at all digital platforms operating in the EU, including:

  • Social networks (e.g. Facebook, Instagram)
  • Online marketplaces (e.g. Amazon, eBay)
  • Search engines (e.g. Google, Bing)
  • Hosting services (e.g. by web hosters, cloud providers, web designers, agencies)
  • Intermediary services (e.g. DNS services, VPN)

Special requirements apply to platforms with more than 45 million active users in the EU, the so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), such as Google, Amazon or TikTok.

Objectives of the Digital Services Act (DSA)

The DSA aims to create a trustworthy online environment in which illegal content is removed more quickly and users' fundamental rights are protected.

The most important goals are:

  • User protection: faster removal of illegal content, regulation of harmful goods and services.
  • Transparency: Clear information on algorithms and content (e.g. personalized advertising).
  • Responsibility: Obligation to react quickly to problematic content.
  • Fair competition: combating disinformation and promoting a healthy digital ecosystem.

What obligations arise from the Digital Services Act (DSA)?

The obligations depend on the type and size of the platform. The DSA defines four levels of regulation:

1st level: Intermediary services (Art. 11 to 15 DSA)

The first level of regulation concerns intermediary services, meaning any company that offers information society services, for example through mere transmission ("mere conduit") or caching. This covers providers that deliver electronic services over the Internet at the individual request of a recipient, typically in return for remuneration.

Examples are: registration authorities, certification authorities, providers of voice telephony, WLAN, VPN and DNS services, content delivery networks and reverse proxies.

The resulting obligations are:

  • Designation of a central electronic contact point for communication with national and European authorities, courts and users.
  • Intermediary services based in a third country must also appoint a legal representative in one of the Member States in which they offer their services.
  • General Terms and Conditions (GTC) must contain information on all guidelines, procedures, measures and applications used to moderate content. Furthermore, the respective complaints management system and the option to terminate the use of the service must also be included in the GTC.
  • Providers must publish a report on their content moderation once a year. Only micro and small enterprises (fewer than 50 employees and an annual turnover of less than 10 million euros) that are not very large online platforms within the meaning of Art. 33 DSA are exempt from this requirement.

2nd level: Hosting service providers (Art. 16 to 18 DSA)

The second level of regulation concerns hosting providers, i.e. services that store information provided by users and at their request.

Examples are: Web hosting services, cloud computing services, hosting resellers.

These hosting providers have additional obligations:

  • They must set up a procedure for reporting legal infringements (including copyright or trademark infringements). Users must be able to report problematic content and the platforms must check these reports quickly and act accordingly.

3rd level: Online platforms (Art. 19 to 32 DSA)

The third level of regulation concerns obligations for online platforms. This includes hosting services that not only store information but also disseminate it publicly.

Examples include: Online marketplaces, social media platforms, app stores and websites with a comment function, provided this is not a completely subordinate secondary function, as is the case with newspapers, for example.

The resulting obligations are:

  • Online platform providers must set up a free, electronic, internal complaints management system to enable users to complain about decisions and measures taken by the provider. To this end, qualified employees must be deployed to review a complaint and justify the decision made. A purely automated review procedure is not permitted. In addition, users must be able to turn to a certified out-of-court dispute settlement body.
  • The design of misleading user interfaces (so-called "dark patterns") is prohibited, as an autonomous and informed decision by the user is then no longer possible or is prevented. This includes, for example, hidden subscriptions or unclear opt-out options.
  • Online platforms that make use of online advertising must provide their users with information about the parameters that are decisive for the selection of advertising for the respective user.
  • Online platforms that offer a marketplace for users and businesses must meet the requirements of Art. 30 DSA in order to make the initiation of contracts between users and businesses more trustworthy and secure.

4th level: Very large online platforms and online search engines (Art. 33 to 43 DSA)

The fourth level of regulation concerns the aforementioned "Very Large Online Platform" (VLOP) and "Very Large Online Search Engines" (VLOSE). Examples include Google, Amazon, Meta (Facebook/Instagram) and TikTok.

These platforms are subject to the strictest transparency obligations and to stricter requirements for their risk and crisis management in accordance with Art. 33 ff. DSA, such as an annual risk assessment, risk mitigation measures, regular compliance audits and the establishment of a compliance department.

Penalties for violations of the Digital Services Act (DSA)

Violations of the DSA and the DDG can result in considerable fines. Fines of up to 300,000 euros can be imposed. Legal entities with an annual turnover of more than 5 million euros can even be fined up to 6% of their annual turnover.

What do companies need to do now?

If you store content for third parties or offer online commenting features, the Digital Services Act may require you to:

  • Design your legal texts (general terms and conditions, imprint) to be DSA-compliant.
  • Establish processes for dealing with illegal content.
  • Provide a reporting system for users.

Our services for you:

  • Creation of a DSA-compliant imprint.
  • Creation of content moderation guidelines for your GTC.
  • Creation of user notification forms including data protection notices.
  • Creation of internal guidelines for dealing with reports.
  • Provision of external reporting points.

Frequently asked questions (FAQ)

What is the Digital Services Act (DSA)?

The DSA is an EU regulation on the regulation of digital services and platforms.

Who is affected by the DSA?

All digital platforms and services operating in the EU.

What are the penalties for violations?

Up to 6% of global annual turnover or national fines.

If you have any questions or require individual advice, please contact us - we will help you to implement the DSA in a legally compliant manner!