On December 15, 2020, the European Commission published its proposed Regulation on a Single Market for Digital Services, more commonly known as the Digital Services Act (“DSA Proposal”).  In publishing the Proposal, the Commission noted that its goal was to protect consumers and their fundamental rights online, establish an accountability framework for online services, and foster innovation, growth and competitiveness in the single market.  On the same day, the Commission also published its proposal for a Digital Markets Act (“DMA”), which would impose new obligations and restrictions on online services that act as “designated gatekeepers” (see our analysis of the DMA Proposal here).

I. Obligations on different categories of digital service providers

The DSA Proposal would impose new obligations on four distinct categories of online services, according to their role, size, and impact.  The categories are:

  1. Intermediary services, defined as “mere conduit”, “caching” and “hosting” services (art. 2(f));
  2. Hosting services, defined as a “service that consists of the storage of information provided by, and at the request of, a recipient of the service” (art. 2(f));
  3. Online platforms, defined as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)); and
  4. “Very large” online platforms, defined as “online platforms which provide their services to a number of average monthly active recipients . . . in the Union equal to or higher than 45 million” (art. 25(1)-(4)).

Each category of services is a subset of the one that precedes it, and services in a given category must comply both with that category’s obligations and with those that apply to the preceding categories.  Thus, for instance, “online platforms” must also comply with the obligations that apply to “hosting services” and to “intermediary services.”

As discussed in Part II, providers’ compliance with their DSA obligations would be overseen and enforced by new national “Digital Services Coordinators” (“DSCs”) in each Member State.  Although the details of these obligations are too extensive to list here, they include the following:

A. Obligations on intermediary services

The DSA largely restates the safe harbors from liability for intermediary services—namely, mere conduits, caching services, and hosting services—set out in Articles 12-14 of the EU E-Commerce Directive, 2000/31/EC.  However, under the DSA, the safe harbor for hosting services would not apply to online platforms that allow consumers to engage in an e-commerce transaction where the consumer reasonably believes that the product, information, or service at issue is being offered by the platform itself, or a trader under its control (art. 5).

In addition, intermediary service providers would need to comply with Member State judicial and administrative orders to remove illegal content, and to provide information about specific users of the service (arts. 8 and 9). These providers would also need to disclose, in their terms of use, any restrictions they impose on user content (e.g., offensive or harmful but legal content), and any “policies, procedures, measures and tools” they use for content moderation (art. 12).  Finally, intermediary service providers would need to publish annual reports with data on aspects of their content moderation activities, including the number of orders they received and how they responded (art. 13).

B. Obligations on hosting services

Hosting services would need to do essentially two things (in addition to complying with the obligations on intermediary services).  First, they would need to establish a mechanism through which users could notify them of suspected illegal content on their services (art. 14).  Second, they would need to inform users when they remove the users’ content, and give those users a statement of reasons for the decision.  Providers would also need to publish their content removal decisions, along with the statements of reasons, in a publicly accessible database managed by the Commission (art. 15).

C. Obligations on online platforms

The DSA Proposal sets out significantly more obligations on “online platforms”—which, again, the DSA defines as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)).  These include obligations to:

  • Establish an internal complaint-handling procedure for users whose content has been removed, including where their service or account has been suspended or terminated (art. 17).
  • Allow users to challenge such decisions before an out-of-court dispute settlement body that has been certified by a DSC (art. 18).
  • Give priority review to notices of illegal content issued by “trusted flaggers”—entities designated by DSCs based on their particular expertise to detect and identify illegal content (art. 19).
  • Suspend users that frequently post “manifestly illegal content,” and individuals that frequently submit “manifestly unfounded” notices or complaints (art. 20).
  • Notify national law enforcement or judicial authorities where the platform suspects that a crime “involving a threat to the life or safety of persons” has taken or will take place (art. 21).
  • Demand identity-verifying information from traders that offer goods or services on the platform, and suspend traders that fail to provide the information (art. 22).
  • Provide a more enhanced version of the annual transparency reports required by article 13, and file reports every six months with their respective DSC on the average monthly active recipients of the service in each Member State (art. 23).
  • Provide basic transparency about advertisements that appear on the platform, including the identity of the person or entity on whose behalf the ad is displayed (art. 24).

D. Obligations on very large platforms

The DSA Proposal reserves the most onerous obligations for “very large online platforms”—i.e., those with at least 45 million average monthly active users in the EU.  These platforms would need to conduct risk assessments relating to illegal content on their services (art. 26); implement measures to mitigate any risks identified (art. 27); arrange for annual independent audits of their compliance with the DSA (art. 28); describe the “main parameters” of any “recommender systems” they use with respect to content on their services (art. 29); and provide enhanced transparency about ads appearing on their services (art. 30).

Very large online platforms would also need to provide their DSC or the Commission with access to internal data (art. 31), assign an internal compliance officer (art. 32), and publish extensive transparency reports every six months (art. 33).  Finally, very large online platforms would be subject to “enhanced supervision” by DSCs (arts. 50-66).

II. Oversight and enforcement regime

The DSA Proposal also introduces an extensive oversight and enforcement regime for covered services.  In addition to requiring Member States to appoint a DSC, the Proposal would establish a European Board for Digital Services (“EBDS”), made up of all of the individual Member State DSCs (arts. 47-49).  The Proposal authorizes DSCs to impose fines of up to six percent of a service’s annual global turnover for violations of the Act (art. 42).

*  *  *  *  *

The DSA Proposal is the first step in the EU legislative process, which includes the European Parliament and the Council of the European Union, both of which can amend the Commission’s proposal.  The team at Covington will continue to monitor developments in this space and stand ready to answer any questions that companies may have.

Reach out to a member of our technology, privacy, antitrust and policy teams with questions.

Lisa Peets

Lisa Peets leads the intellectual property and technology and media groups in the firm’s London office. Ms. Peets divides her time between London and Brussels, and her practice embraces legislative advocacy, trade and IP enforcement. In this context, she has worked closely with leading multinationals in a number of sectors, including many of the world’s best-known software and hardware companies.

On behalf of her clients, Ms. Peets has been actively engaged in a wide range of law reform efforts in Europe, on multilateral, regional and national levels. This includes advocacy on EU and national initiatives relating to e-commerce, copyright, patents, data protection, technology standards, compulsory licensing, IPR enforcement and emerging technologies. Ms. Peets also counsels clients on trade related matters, including EU export controls and sanctions rules and WTO compliance.

In the IP enforcement space, Ms. Peets coordinates a team of lawyers and Internet investigators who direct civil and criminal enforcement actions in countries throughout Europe and who conduct global notice and takedown programs to combat Internet piracy.

Ms. Peets is a member of the European Commission’s Expert Group on reform of the IP Enforcement Directive.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as Privacy International and the European security agency, ENISA.

Mark Young

Mark Young, an experienced tech regulatory lawyer, advises major global companies on their most challenging data privacy compliance matters and investigations.

Mark also leads on EMEA cybersecurity matters at the firm. He advises on evolving cyber-related regulations, and helps clients respond to incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, and state-sponsored attacks.

Mark has been recognized in Chambers UK for several years as “a trusted adviser – practical, results-oriented and an expert in the field;” “fast, thorough and responsive;” “extremely pragmatic in advice on risk;” and having “great insight into the regulators.”

Drawing on over 15 years of experience advising global companies on a variety of tech regulatory matters, Mark specializes in:

  • Advising on potential exposure under GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology (e.g., AI, biometric data, Internet-enabled devices, etc.).
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • GDPR and international data privacy compliance for life sciences companies in relation to:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance.
  • Cybersecurity issues, including:
    • best practices to protect business-critical information and comply with national and sector-specific regulation;
    • preparing for and responding to cyber-based attacks and internal threats to networks and information, including training for board members;
    • supervising technical investigations; advising on PR, engagement with law enforcement and government agencies, notification obligations and other legal risks; and representing clients before regulators around the world; and
    • advising on emerging regulations, including during the legislative process.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors).
  • Providing strategic advice and advocacy on a range of EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.
Stacy Young

Stacy Young is a trainee solicitor who attended the University of Law.