Margrethe Vestager, European Commission, April 2022: "Platforms should be transparent about their content moderation decisions, prevent dangerous disinformation from going viral and avoid unsafe products being offered on marketplaces."
Governments have been taking steps to hold the tech and social media giants accountable for the content that they host and disseminate, including measures to limit the spread of illegal content and disinformation. However, the legislation is broad and affects other businesses beyond the established platforms. In this article, we consider the key legislation and its likely impact.
1. Key legislation for digital services
In the EU, this includes the Digital Services Act (DSA) and Digital Markets Act (DMA), which form a single set of rules that apply across the EU to create a safer digital space for users of digital services and safeguard a fair and competitive economy on digital markets. The DSA has been directly applicable to all businesses that fall within its scope since 17 February 2024 (with some businesses having had to comply from August 2023). The fines for non-compliance with the DSA may be up to 6% of global turnover (enforced by the European Commission and national Digital Service Coordinators). Under the DMA, certain businesses designated by the European Commission as “gatekeepers” had until 6 March 2024 to comply. In addition, the Directive on Copyright in the Digital Single Market (DSM Directive) came into force in 2019. It made certain platforms liable for copyright infringement for unauthorised content posted by their users and granted new rights to press publishers.
In the UK, the Online Safety Act (OSA) became law on 26 October 2023, but many of its obligations will not come into effect until secondary legislation is passed and/or codes of practice and guidance are published by the UK regulator, Ofcom. As set out in its roadmap for implementing the OSA, Ofcom has indicated that the codes of practice will gradually come into force between Spring 2025 and Spring 2026. There are a number of sanctions for non-compliance with the OSA, including fines of up to the greater of £18m or 10% of global revenue.
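For illustration only, the OSA's "greater of" penalty cap can be expressed as a simple calculation. The £18m floor and 10% figure come from the Act; the function name and sample revenue figures are assumptions for this sketch, and the actual fine in any case would be set by Ofcom, not by a formula:

```python
def osa_max_fine(global_revenue_gbp: float) -> float:
    """Illustrative sketch: the OSA cap is the greater of
    £18m or 10% of qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

# A provider with £1bn global revenue: the 10% limb applies (£100m cap).
# A provider with £50m global revenue: the £18m floor applies.
print(osa_max_fine(1_000_000_000))  # 100000000.0
print(osa_max_fine(50_000_000))     # 18000000
```

The point of the two-limb structure is that the £18m floor keeps the maximum penalty meaningful even for providers with modest revenue.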
2. The Digital Services Act
The DSA applies to “intermediary services” where the services are offered to recipients (e.g. users) that are located or established in the EU. For most intermediary services, the DSA became directly applicable across EU member states from 17 February 2024. Intermediary services under the DSA include “hosting services”. A subset of hosting services, defined as “online platforms” and “online search engines”, was required to publish their average monthly active users in the EU by 17 February 2023. Those with at least 45 million monthly active users in the EU were designated by the European Commission on 25 April 2023 as a “Very Large Online Platform” (VLOP) or a “Very Large Online Search Engine” (VLOSE). VLOPs/VLOSEs were required to comply with their obligations under the DSA much earlier, by 24 August 2023. As at 21 June 2024, the European Commission had designated 22 VLOPs and 2 VLOSEs. Some VLOPs have been publishing details of the improvements to their policies and the steps they have been taking to comply with the DSA. Other businesses, such as Amazon and Zalando, have challenged their designations as VLOPs.
3. Intermediary services
The DSA adopts a tiered approach to the imposition of obligations on intermediary services, depending on the technical function of the service. All intermediary services will be required to establish two points of contact to communicate with the authorities and users; describe in their terms and conditions the restrictions that they impose on the use of their services (including in a way that is understandable to minors if the service is aimed at minors); and publish an annual report on content moderation.
4. Hosting services
A hosting service is a service that stores information provided by, and at the request of, a recipient of the service. For example, a business that provides online file storage to users. In addition to the obligations above that apply generally to all intermediary services under the DSA, hosting services will need to implement notice and take down mechanisms (including redress mechanisms) that allow third parties to notify the service about the presence of allegedly illegal content; provide a statement of reasons to a user whose information is removed or service is suspended/terminated; and to inform national law enforcement/judicial authorities of any information giving rise to suspicions of serious criminal offences involving a threat to the life/safety of persons.
5. Online platforms and online search engines
Online platforms are defined as hosting services which store information and disseminate it to the public at the user’s request, unless that activity is a minor and purely ancillary feature of the service and that feature cannot, for technical reasons, be used without that service. For example, a business that provides a social media network to users. Online platforms/online search engines are subject to further additional obligations under the DSA, including publishing their average monthly active recipients every six months; providing a high level of privacy, safety and security for minors; providing transparency on advertisements; explaining in their terms and conditions the main parameters used in their recommender systems; and providing an internal complaint-handling system and out-of-court mediation for content moderation decisions. Further, online platforms are required to send all of their statements of reasons to the Commission’s DSA Transparency Database, which is publicly accessible. In addition, under the DSM Directive, online platforms are already liable for acts of copyright infringement by users, unless the online platform has made “best efforts” to obtain permission from rightsholders and acted diligently to remove any infringing content once notified by rightsholders.
VLOPs and VLOSEs are subject to enhanced obligations under the DSA related to the information available on their platforms, including annual audits, publishing a database of online advertisements, providing at least one recommender system that is not based on user profiling, and publishing their terms and conditions in each of the official languages of the countries in which they offer services. Since August 2023 (when the DSA came into effect for VLOPs and VLOSEs), the European Commission has sent formal requests to some of the tech giants (e.g. X (formerly Twitter), Meta and TikTok) requesting information on the measures they are taking to comply with their obligations under the DSA. As at June 2024, the European Commission had commenced proceedings against each of X, AliExpress, TikTok and Meta for allegedly failing to comply with its obligations under the DSA. Most recently, the European Commission’s preliminary view is that X’s “verified” blue tick accounts have the potential to deceive users; X does not comply with the required transparency on advertising; and it fails to provide access to its public data to researchers.
6. Online Safety Act
Like the DSA, the UK’s OSA is aimed at providers of online services, including online platforms and search engines, where the services are used by UK users. It differs from the DSA in that providers of such services are required to actively monitor the content made available through the service. Obligations on in-scope providers vary depending on the type of content, who is likely to access the service and whether it falls into additional categories of service which are to be announced by Ofcom and the UK Secretary of State. Please see our summary article here for more information.
7. New rights for press publishers
Under the DSM Directive, press publishers have the right to prevent unauthorised reproduction and the making available to the public of their publications online, for a period of two years from 1 January following the content’s publication date. However, the right excludes the use of hyperlinks, individual words or very short extracts. In response to the press publishers’ right in the EU, Google has adopted an online contracting solution to reach licence agreements with publishers to pay for their content.
8. Gatekeepers
Under the DMA, so-called “gatekeepers” (i.e., companies that have a durable and entrenched position in the market) are subject to a series of prohibitions and obligations. While a number of these affect the way in which gatekeepers allow access to other market actors, other obligations affect the way in which gatekeepers disclose content. For example, gatekeepers must not treat their own services and products more favourably, in ranking and related indexing and crawling, than similar services or products of a third party. In the UK, the Digital Markets, Competition and Consumers Act 2024 (“DMCCA”) received Royal Assent on 24 May 2024 and creates a new regime to increase competition in digital markets by conferring powers and duties on the Competition and Markets Authority (CMA), with significant fines for non-compliance. The CMA is also given new powers to investigate and directly enforce consumer protection law, with fining powers akin to those available under competition rules, where monetary penalties of up to 10% of global turnover (or, in the case of individuals, up to £300,000) can be imposed. In addition, the DMCCA empowers the Government to confirm commercial practices as automatically unfair and prohibited outright; this will target practices such as so-called “drip-pricing” and the facilitation of fake online reviews, while also addressing transparency of mandatory fees and subscription “traps”. Please see our summary article here for more information.
9. Safety of children
The needs of specific user groups are a key focus of the increasing accountability being asked of digital platform providers by regulators. In particular, regulators are keen to ensure that children are safe in their use of online services. To that end, under the OSA, providers of relevant services must assess whether children can access their service and, if so, whether a significant number of children will use, or be attracted to, the service. Where significant numbers of children are likely to access the service, the service provider has safety duties to those children. These duties include ensuring the service is safe for children, for example by implementing safety measures and carrying out regular risk assessments of: the relevant services; the content of those services; the use patterns of those services; and the protections designed into the services. Ofcom has published draft guidance on how such risk assessments should be implemented and assessed. In addition, the ICO has published a Children's Code to support children having age-appropriate protection online. Alongside this Code, the ICO has set out its opinion on the interaction between the requirements of the Code and the OSA duties.
10. Increasing scrutiny from UK and European data protection authorities
It has been over five years since the General Data Protection Regulation (GDPR) came into force, placing greater obligations on controllers of personal data in respect of how and why they process it. Regulators have increasingly taken action against online platforms, particularly over the use of personal data for advertising (essentially mandating that consent is the only lawful basis for processing personal data for behavioural advertising). Ad revenue is typically a key revenue source for online platforms and a way providers have kept their services free for users. This is likely to cause online platforms to reconsider their business models and explore alternative sources of revenue.
Found this article useful? Read others in our TechNotes series












