EU Commission consults on regulation of online platforms and services

The EU Commission’s consultation on the Digital Services Act package highlights that the regulation of online platforms is back on the agenda of EU legislators.

08 June 2020

Publication

The EU Digital Services Act package

On 02 June 2020, the European Commission (the Commission) launched a public consultation on the Digital Services Act package. The Digital Services Act package proposes increased and harmonised regulation of online intermediaries and platforms, and the content they facilitate, as well as measures to ensure that large gatekeeper platforms do not distort the digital single market. The consultation takes place in the context of increased scrutiny of online platforms in the US and EU following the proliferation of disinformation related to COVID-19, and President Trump’s executive order on section 230 of the US Communications Decency Act, which seeks to narrow the immunity enjoyed by platforms such as Twitter.

The consultation is part of the Commission’s evidence-gathering exercise to identify issues that may require intervention at an EU level.

What is the Digital Services Act package?

The Commission has not made any substantive changes to the regulation of digital services since the adoption of the E-Commerce Directive (the Directive) in 2000. The Directive harmonised the defences available to online intermediaries for their users’ content, with specific provisions for different services according to their role: from internet access providers, to messaging services, to hosting service providers. In particular, the scope of the hosting defence under Article 14 of the Directive has been scrutinised in numerous cases and references to the CJEU. In L’Oréal v eBay, the CJEU opined that the defence may only apply where the hosting service provider plays a neutral and technical role in the processing of its users’ data (not an active one), such that it has no actual knowledge of illegal activities on its platform, and it acts expeditiously to disable access to infringing content once notified.

The 20 years since the Directive was implemented have seen rapid evolution and change in the digital services sector. The Commission is currently proposing two sets of rules to address these changes:

  1. the first set of rules will set out the responsibilities of digital services to address the risks faced by their users and protect users’ rights; and
  2. the second set of rules plans to address perceived imbalances in the current market and the power of large online platforms as gatekeepers.

In its consultation, the Commission has also distinguished online intermediary services from online platforms. The Commission defines online intermediary services as services that consist of the transmission or storage of content made available by third parties and examples include: internet access providers, cloud services and messaging services. Online platforms are digital services that facilitate the interaction of two or more independent persons (individuals or firms) such as e-commerce marketplaces, social networks, search engines, app stores, and online travel and accommodation platforms.

The Consultation

The consultation focuses on six key areas (or modules):

  1. safety and responsibility of users online;
  2. liability of digital service providers as intermediaries;
  3. gatekeeper platforms;
  4. emerging issues such as online advertising and smart contracts;
  5. self-employed individuals and their use of online platforms; and
  6. governance and enforcement.

Safety and responsibility of users

This section is divided into two parts. Firstly, the Commission is seeking evidence from stakeholders regarding the online availability of illegal goods (eg counterfeit goods or illegal medicine), illegal content (eg hate speech, abusive material or content infringing IP rights) and content about other harmful activities (eg bullying or grooming); and the ease with which this material may be reported and removed. This includes the proliferation of disinformation and the risks of removing legitimate content.

Secondly, this section explores the responsibilities and obligations that could be placed on online intermediaries, in order to protect individuals from the issues raised in the first part. The Commission asks intermediaries, amongst other things, what systems they have in place to restrict illegal or harmful activities from taking place on their sites and the issues (and costs) associated with operating these systems.

The Commission is also soliciting views on the types of measures that online intermediaries (in particular platforms) should make available to moderate content. For example, the Commission is interested in the use of automated systems (such as AI and content filters) to detect, remove and/or block illegal content, goods, or user accounts. One high-profile example, particularly in the context of the DSM Directive, is YouTube’s Content ID system, which detects copyright-infringing content on its platform, notifies the copyright holder, and allows them to uphold the claim or, in some cases, monetise the content. These systems have been seen by some as a possible solution to the difficulties of moderating large platforms. However, the challenges associated with this technology include algorithmic bias and false positives, where filters wrongly flag non-infringing content for removal.

The Commission has also asked questions regarding the policing of content across multiple platforms and the cooperation of platforms with authorities. This could raise fundamental questions about how content is moderated across multiple online platforms. Different platforms may have different standards for the content they deem to be unacceptable. This was highlighted recently by Facebook and Twitter’s differing responses to President Trump’s posts. A proposal that requires platforms to collaborate when policing content may require the platforms to adopt common standards.

Intermediary Liability

This section requests input on the scope of online intermediary service providers’ liability for their users’ potentially illegal content. The Commission requests views on the appropriate legal framework. In particular, the questions focus on whether:

  • the definitions of mere conduits, caching services and hosting services under the E-Commerce Directive are clear enough;
  • the current legal framework disincentivises service providers from taking proactive measures against illegal activities;
  • the current balancing of risks against different rights and policy objectives is still appropriate today; and
  • further clarity is needed as to the parameters for general monitoring obligations.

While the Commission and European Courts have previously maintained that intermediaries should not be subject to a general monitoring obligation in respect of the content on their platforms, these questions suggest that the Commission may be reconsidering whether this policy continues to be appropriate today. If greater responsibility is placed on platforms to monitor the content they facilitate, transmit and store, this may lead to fundamental changes in who will be liable for illegal activities conducted online.

Gatekeeper platforms

This section seeks input from all stakeholders on the issues perceived with large online platforms. In particular, the Commission is interested in “the scope, the specific perceived problems, and the implications, definition and parameters for addressing possible issues deriving from the economic power of large, gatekeeper platforms”. The Commission is soliciting views on those large platforms that integrate multiple services and have the potential to become gatekeepers. Key activities that might define a gatekeeper platform include offering social media services, operating systems for smart devices, search engines, physical logistics, online advertising and cloud services.

The questions also indicate that the Commission is concerned with unfair contractual terms or the unfair practices of very large online platforms. Specifically, the Commission asks (among other things): “What practices related to the use and sharing of data in the platforms’ environment are raising particular challenges?”. This part of the consultation also asks stakeholders to consider the new competition powers also under consultation, which we have considered separately here.

Advertising and smart contracts

The Commission requests that stakeholders submit their views, data and information relating to the potential issues arising from online advertising and smart contracts. The Commission explicitly requests information relating to disinformation, but excludes data protection concerns. Particular issues reflected in the questions are the transparency of ad placement, the placement of ads next to illegal goods or content, and political advertising.

In relation to smart contracts, the Commission simply asks whether there is sufficient legal clarity in the EU for their provision and use.

Self-employed individuals and platforms

This section relates to the so-called gig economy. The Commission is asking for views and information relating to self-employed individuals using platforms to offer their services (eg ride-hailing, food delivery, domestic work, design work and microtasks). The Commission specifically excludes information relating to the criteria for determining these individuals’ legal status. The Commission is interested in “the perceived obstacles to the improvement of the situation of individuals providing services through platforms.” In addition to the relationship between these individuals and the platforms, the Commission is interested in the role of the platforms in the provision of services and the conclusion of contracts with consumers.

Governance and enforcement

Finally, the Commission is looking to gather information on “the current state of the single market and on steps for further improvements for a competitive and vibrant Single market for digital services.” This includes the disruption caused by COVID-19. This section also focuses on governance and oversight of digital services, and cooperation between authorities and regulators across the EU (including consumer protection authorities and media regulators). The Commission also solicits views on whether member states should more clearly assign competent national authorities or bodies to supervise the systems put in place by online platforms. If adopted, this may signal the creation of a specific regulator to monitor online platforms.

What should you do next?

The consultation is open to the general public, digital service providers including online platforms, businesses who reach their consumers online, authorities, NGOs, academics and other concerned parties. Online platforms, as well as businesses that rely on online intermediary services or online platforms, should consider responding to the consultation to make their views known to the Commission. Respondents have until 08 September 2020 to submit their responses. Respondents may respond to one, several or all of the modules. The Commission also allows for the upload of a position paper, article, report, or any other supporting evidence and data. On 24 June 2020, the Commission is holding a webinar and open discussion on the proposed rules, facilitated by Irene Roche Laguna, Deputy Head of the Electronic Communications Networks & Services Unit (participants can register here).

Please contact us if you are interested in responding to the consultation and shaping the Digital Services Act.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.