10 questions about the Digital Services Act

Christiaan Alberdingk Thijm
04 Oct 2022

On 4 October 2022, the Council of the EU adopted the final version of the Digital Services Act (“DSA”). Together with the so-called Digital Markets Act (“DMA”), the DSA forms the basis of new European legislation for the digital economy.

The DSA contains EU-wide rules for online intermediaries, including online platforms and search engines. The DSA is intended to update the more than 20-year-old E-Commerce Directive. Indeed, since 2000, digital technologies, business models and services have changed significantly.

The DSA contains important new rules for virtually all online services. However, some platform and search engine operators are regulated more heavily. The aim of the DSA is, among other things, to ensure that illegal online content is addressed quickly and that the fundamental rights of internet users are protected. The DSA aims to combat current digital challenges, such as illegal products, hate speech, disinformation and fake news.

For that purpose, the DSA contains rules, inter alia, on:

  • The liability of intermediary services;
  • Notice and action mechanisms;
  • Content moderation practices;
  • Online advertising, profiling and targeting;
  • The use of algorithms and recommender systems;
  • The traceability of traders; and
  • Systemic risks of very large online platforms and very large online search engines.

The DSA also introduces a new oversight mechanism.

Enough reason, therefore, to take a closer look at this important new regulation, which comprises over 300 pages. What will change with the DSA – and what won’t? What obligations apply to which services? A Q&A on the DSA.

1) What services are covered by the DSA?

The DSA contains new rules on the responsibilities and liability of “intermediary services”, or internet intermediaries. The DSA distinguishes between the following four types of services:

  • Intermediary services, which can either be (i) mere conduit (transmission) services, (ii) caching (temporary storage) services or (iii) hosting services. According to the recitals of the DSA, these services may, inter alia, include online search engines, local wireless networks, DNS services, domain name registries, virtual private networks, cloud services, proxies and webhosting services;
  • Hosting services: services that consist of the storage of information provided by end users;
  • Online platforms: hosting services that, at the request of the user, not only store, but also disseminate information to the public. The latter means that the information, at the request of the user, is made available to a potentially unlimited number of third parties. Online platforms include, inter alia, online marketplaces, social media services and app stores;
  • Very large online platforms and search engines: online platforms and search engines with more than 45 million monthly active users in the EU, corresponding to roughly 10% of the EU population. In other words: the Facebooks and Googles of this world.

The obligations with which these services must comply are cumulative: each successive category is subject to additional requirements. The very large online platforms and search engines are therefore subject to the heaviest due diligence obligations.

2) What happens to the liability safeguards contained in the E-Commerce Directive?

The liability framework of the E-Commerce Directive remains largely intact. This framework stipulates when an intermediary service cannot be held liable for illegal content provided by the recipients of the service.

The existing liability exemptions for “mere conduit”, “caching” and “hosting” services are incorporated in full in articles 4-6 of the DSA. The prohibition on general monitoring (article 8) also remains in place.

This also means that the existing case law of the Court of Justice of the European Union (“CJEU”) concerning the liability exemptions and the measures that can be imposed on intermediaries remains authoritative. The cases L’Oréal/eBay, Scarlet/SABAM, UPC/Telekabel, McFadden, Glawischnig-Piesczek and YouTube & Cyando thus remain relevant in practice.

At the same time, the DSA clarifies certain elements of the existing framework. One of these clarifications is the introduction of a so-called “Good Samaritan” clause: the fact that a service carries out voluntary own-initiative investigations or takes other measures to combat illegal content does not render that service ineligible for the exemptions from liability (article 7).

The DSA also makes it explicit that providers of intermediary services must comply with orders issued by judicial or administrative authorities to act against one or more specific items of illegal content (article 9) and to provide information about one or more specific individual recipients of the service (article 10). The service provider must inform the authority issuing the order of the effect given to it, after which the authority transmits the order to the Digital Services Coordinator (see Question 8) of the Member State of the issuing authority. The order is then shared with all other Digital Services Coordinators.

It is not entirely clear from the DSA whether these orders, which may stem from, inter alia, law enforcement authorities (recital 32), differ from the orders that can be issued to terminate or prevent an infringement pursuant to the relevant liability clauses, although it looks like they do. Indeed, the DSA stipulates that these orders “shall be without prejudice to national civil and criminal procedural law”.

3) What obligations will apply to all intermediary services?

The DSA contains a number of “due diligence” obligations that digital services must comply with. These requirements are proportionate to the size and risks of the service: the larger the service, the greater its responsibilities.

The DSA contains a number of obligations that all intermediary services must comply with, including the obligation to:

  • designate points of contact, both for supervisors and end users (articles 11 and 12). Services established outside the EU must appoint legal representatives (article 13);
  • include information on content moderation, algorithmic decision-making and complaint handling systems in their terms and conditions (article 14);
  • publish transparency reports with information on content moderation measures taken and the number of orders received from authorities (article 15). Additional reporting obligations apply to hosting providers and (very large) online platforms.

4) What is “Notice and Action”? And how does it differ from Notice and Takedown?

The E-Commerce Directive dictates that hosting providers must have a so-called Notice and Takedown (NTD) system in place: upon receipt of a notice, they are obliged to remove (take down) illegal information.

The DSA prescribes “notice and action mechanisms”, meaning that hosting providers should “act” when they receive a notice. Unlike the E-Commerce Directive, the DSA spells out what information a notice must contain. This includes a sufficiently substantiated explanation of reasons, the exact electronic location of the illegal information, and a statement confirming that the notice is made in good faith (article 16). This system very much resembles the current DMCA system in the U.S.

From article 17 of the DSA, it can be inferred what the required “action” may entail, namely:

  • a restriction on the visibility of specific information, including the removal or demotion of content or the disabling of access to it;
  • a suspension, termination or restriction of payments;
  • a suspension or termination of the service; or
  • a suspension or termination of the account of the (alleged) infringer.

The hosting provider is obliged to notify both the user requesting the measures and the affected users of the decision it takes and the reasons for it (article 17).

What is noteworthy is that the DSA does not contain a specific staydown obligation. In other words, it does not specifically require a hosting provider to prevent the same illegal content from reappearing, although such an obligation may be inferred from the case law of the CJEU.

On the whole, Notice and Action resembles Notice and Takedown, albeit that the procedure under the DSA is far more formalised and administrative.

5) What additional obligations apply to online platforms?

In addition to Notice and Action mechanisms, online platforms must:

  • have in place an effective internal complaint-handling system through which users can lodge complaints following a decision taken with regard to illegal content (article 20);
  • give priority to notices submitted by so-called “trusted flaggers” (article 22): entities with particular expertise and competence for the purposes of detecting, identifying and notifying illegal content. The status of trusted flaggers can be awarded by the Digital Services Coordinator (see Question 8);
  • take measures against repeat infringers (article 23), meaning users that frequently provide manifestly illegal content or frequently submit notices that are manifestly unfounded;
  • refrain from using so-called “dark patterns”: user interfaces that have been crafted to (subtly) trick or manipulate users into doing certain things (article 25);
  • provide transparency regarding online advertising (article 26, also see Question 6 below);
  • ensure that recipients of their service are informed about how recommender systems impact the way information is displayed, and how users can influence how information is presented to them. Platforms should clearly present the parameters used for such recommender systems, including the most important criteria in determining the information suggested to the recipient of the service and the reasons for their respective importance, including where information is prioritised based on profiling and users’ online behaviour (article 27). Very large platforms must offer an option for recommendations that is not based on profiling (article 38);
  • vet the credentials of business users (article 30) where the platform allows consumers to conclude distance contracts with traders (KYBC: “know your business customer”). Such online platforms must further organize their online interfaces in a way that allows traders to comply with their information obligations towards consumers (article 31).

6) How does the DSA regulate online advertising?

Online advertising plays an important role in the online environment. The provision of online services is often wholly or in part remunerated through advertising revenues. Indeed, ads are Meta’s and Google’s main source of income.

Online advertising also poses significant risks, ranging from ads that are themselves illegal to the discriminatory presentation of ads with an impact on society (recital 68). For that reason, the DSA contains important new provisions on online advertising, aiming to give online users more control over, and a better understanding of, the ads they see online. For this purpose, the DSA stipulates that:

  • Commercial communication must be clearly identifiable as such (through clear markers) and users will have to be clearly informed, for each specific ad, on whose behalf the advertisement is presented and who paid for the ad (article 26). Moreover, providers of online platforms that present advertisements must also provide “meaningful information” about the main parameters used to determine the recipient(s) to whom the ad is shown. This includes information on the logic used and information about profiling techniques. This means that services should elaborate on the nature of their advertising activities: is the advertising contextual, and what profiling criteria are used? Services should also inform their users about any means available to them to change such criteria.
  • Targeted advertising based on profiling using special categories of personal data, such as sexual orientation or religious or political beliefs, is prohibited (article 26 paragraph 3). This provision thus significantly limits services in using targeting techniques to optimize ads to match a user’s interests and potentially appeal to their vulnerabilities.
  • Providers of online platforms should not present advertisements based on profiling using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor (article 28).

For very large online platforms, the DSA prescribes additional measures to mitigate risks and enable oversight. These services will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. Very large online platforms and search engines also need to assess whether and how their advertising systems are manipulated or otherwise contribute to societal risks, and take measures to mitigate these risks (see Question 7).

7) Which obligations apply to very large online platforms and search engines?

Due to the particular risks that tech giants such as Facebook, TikTok and Google pose in terms of the dissemination of illegal content and societal harm, these parties are subject to the most stringent due diligence obligations.

  • They must conduct risk assessments to identify systemic risks stemming from the design and use of their services (article 34). Systemic risks include issues such as disinformation, illegal content, election manipulation, manipulation during pandemics and harms to vulnerable groups. In conducting the risk assessment, account must be taken of all aspects of the service, including content moderation, advertising and algorithmic systems;
  • They must prevent abuse of their systems by taking risk-based action, and are subject to oversight through independent audits (articles 35 and 37). These measures must be carefully balanced against restrictions of freedom of expression;
  • They must comply with a new crisis response mechanism, which allows the Commission to require them to act in the event of serious threats to public health or security, such as a pandemic or a war (article 36);
  • When Big Tech platforms recommend content, users must be able to modify the criteria used and be given the option not to receive personalized recommendations (article 38);
  • They must comply with additional online advertising transparency obligations (see Question 6 above), including by offering a publicly available and searchable online register (article 39). This register must in any case include the following information per advertisement: (i) the content of the advertisement, (ii) the advertiser on whose behalf the ad was presented, (iii) the (legal) person who paid for the ad, (iv) the period during which the ad was presented, (v) whether the ad was specifically intended for a particular group of recipients and, if so, the parameters used to define that group and (vi) the number of recipients of the advertisement, broken down by Member State. A sketch of what such a register entry could look like follows this list.
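
By way of illustration, the sketch below shows what a single entry in such a register could look like, modelled as a Python dataclass. This is a hypothetical format: article 39 prescribes which information must be included, not how it is structured, and all field names below are invented for the example.

    from dataclasses import dataclass, field

    @dataclass
    class AdRegisterEntry:
        """Hypothetical record covering the information required by article 39 DSA."""
        ad_content: str                 # (i) the content of the advertisement
        advertiser: str                 # (ii) on whose behalf the ad was presented
        payer: str                      # (iii) the (legal) person who paid for the ad
        presented_from: str             # (iv) start of the presentation period (ISO date)
        presented_until: str            # (iv) end of the presentation period (ISO date)
        targeted: bool = False          # (v) whether the ad was aimed at a particular group
        targeting_parameters: dict = field(default_factory=dict)   # (v) parameters defining that group
        reach_per_member_state: dict = field(default_factory=dict) # (vi) recipients per Member State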

8) How will the DSA be supervised and enforced?

The DSA provides for a novel oversight structure. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in its territory.

The European Commission will be the primary regulator for very large online platforms and search engines. In the most serious cases, it can impose fines of up to 6% of the global turnover of a service provider.

An EU-wide cooperation mechanism will be established between national regulators and the Commission.

The Digital Services Coordinators will cooperate within an independent advisory group, called the European Board for Digital Services, which shall provide advice to the Digital Services Coordinators and the Commission on matters covered by the Regulation.

9) When does the DSA apply?

All online intermediaries offering their services in the EU must comply with the new rules. This is regardless of whether they are established in the EU or not. A provider offers services in the EU if a “substantial connection” to the Union exists. This is the case when a service provider has an establishment in the Union or, in the absence thereof, when the number of recipients of the service in one or more Member States is significant in relation to the population thereof. A substantial connection can also exist on the basis of the targeting of activities towards one or more Member States. This may be derived, for example, from the availability of an application in the national application store, from the provision of local advertising or advertising in a language used in that Member State, or from providing customer service in a language generally used in that Member State.

The mere fact that a website is accessible from the EU, on the other hand, cannot in itself be considered as establishing a substantial connection to the Union.

10) When will the DSA enter into force?

Today, the Council formally adopted the DSA, which will now be published in the Official Journal of the EU. The DSA will enter into force twenty days after its publication and, as a regulation, will be directly applicable across the EU without the need for national implementation.

Very large online platforms and search engines will have to comply with the new rules within four months after they are designated as such by the Commission.

All the other digital services will be obliged to comply with the DSA by 1 January 2024, or fifteen months and 20 days after the date on which the DSA is published in the Official Journal of the EU, whichever is later.
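
For illustration, the “whichever is later” rule amounts to a simple date calculation. Below is a minimal sketch in Python, assuming a hypothetical publication date (the actual date was not yet known at the time of writing); it uses the third-party python-dateutil package for calendar-month arithmetic.

    from datetime import date
    from dateutil.relativedelta import relativedelta  # pip install python-dateutil

    publication = date(2022, 10, 15)  # hypothetical publication date in the Official Journal
    compliance_date = max(date(2024, 1, 1),
                          publication + relativedelta(months=15, days=20))
    print(compliance_date)  # 2024-02-04 for this hypothetical publication date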
