UK ICO publishes guidance for platforms on moderating online content

The ICO has published its first piece of guidance on content moderation, which sets out the data privacy obligations that organisations need to consider.

06 March 2024

On 16 February 2024, the UK Information Commissioner's Office (the "ICO") published its first piece of guidance on content moderation. The guidance forms part of the ICO's collaboration with the UK Office of Communications ("Ofcom"). It is the first in a series about online safety, which will be updated to reflect technological developments and Ofcom's finalised online safety codes of practice issued in connection with the UK Online Safety Act 2023.

The first piece of guidance sets out the data privacy obligations organisations need to be aware of when they process personal data using content moderation technologies and processes.

Purpose of the guidance and intended audience

The guidance outlines how UK data protection law applies to content moderation and the impact content moderation can have on information rights. Its purpose is to help organisations subject to the UK Online Safety Act 2023 ("OSA") comply with data protection law as they carry out content moderation to meet their online safety duties. The guidance covers areas such as:

  • assessing and mitigating data processing risks;
  • conducting lawful content moderation;
  • transparency regarding content moderation; and
  • ensuring data minimisation in content moderation.

The guidance is aimed at organisations carrying out content moderation to meet their obligations under the OSA and focuses on the moderation of user-generated content on user-to-user services. However, it also applies to organisations that carry out content moderation for other reasons.

What is content moderation?

The ICO defines content moderation as:

  • the analysis of user-generated content to assess whether it meets certain standards; and
  • any action a service provider takes as a result of this analysis.

Content moderation may be carried out to comply with obligations under the OSA or to enforce an organisation's terms of service (i.e. what content is allowed on a service and how certain types of content are managed).

According to the guidance, content moderation encompasses:

  • Content removal - removing content from the service or preventing it from being published.
  • Service bans - banning users from accessing the service, either temporarily or permanently.
  • Feature blocking - restricting a user's access to certain features of a service, either temporarily or permanently.
  • Visibility reduction - reducing the visibility of content. For example, preventing content from being recommended or making content appear less prominently in users' news feeds.

How to carry out content moderation lawfully

The guidance emphasises that moderation systems involve processing personal data and sometimes special category data, and addresses good practice and legal obligations in relation to:

  • Carrying out a data protection impact assessment prior to the processing. This is required because content moderation is likely to involve the use of new technologies, or the novel application of existing technologies (including AI); the combining, comparing or matching of personal information obtained from multiple sources; solely automated processing that has a legal or similarly significant effect on the user; or decisions about a person's access to a service based on automated decision-making or the use of special category information.

  • Following a data protection by design and default approach when an organisation decides to use a content moderation system.

  • Identifying an appropriate lawful basis from Article 6(1) of the UK GDPR to process the data (and an Article 9 condition for any special category data), and ensuring data is processed fairly and transparently.

  • Only processing personal data in ways that people would reasonably expect and in a manner that does not have unjustified adverse effects on them, as well as ensuring that content moderation systems perform accurately and produce unbiased, consistent outputs.

  • Complying with the UK GDPR's purpose limitation and data minimisation principles.

  • Respecting the information and rectification rights of individual users of the online service. Where children's personal data is being processed, the ICO directs organisations towards the Children's Code.

What are the risks of content moderation not being carried out lawfully and in compliance with UK data protection laws?

The ICO warns of the potential consequences of content moderation actions (such as content removal or restricting a user's access to the online service) being taken incorrectly, where the decision has been based on inaccurate data relating to the user. Such errors could lead to individuals being wrongly accused of sharing illegal content or unfairly losing their access to the online service. This may expose the organisation to legal claims from affected individuals seeking compensation, as well as regulatory action by the ICO.

Although the guidance is not legally binding, the ICO uses specific terminology to indicate which parts refer to legislative requirements (by setting out what companies 'must' do) and separates those requirements from good practice advice, using 'should' to indicate what the ICO expects companies to do to comply with the law and 'could' to refer to examples organisations may consider to aid compliance.

Key recommendations for organisations

The guidance sets out recommendations for organisations carrying out content moderation.

  1. Assessing and Mitigating Data Processing Risks: Evaluate and minimise risks associated with data processing activities.

  2. Carrying Out Content Moderation Lawfully: Ensure that content moderation processes comply with data protection laws, such as the UK GDPR and the Data Protection Act 2018.

  3. Ensuring Fair Use of Personal Data: Use personal data fairly and transparently in content moderation processes.

  4. Defining Content Moderation Purposes: Clearly define the purposes for which personal data is used in content moderation activities.

  5. Data Minimisation: Implement measures to minimise the amount of personal data collected and processed during content moderation.

  6. Accuracy of Personal Information: Ensure the accuracy of personal data used in content moderation processes.

  7. Data Retention: Define appropriate timeframes for retaining personal data collected during content moderation activities.

  8. Security Measures: Implement robust security measures to protect personal data used in content moderation from unauthorised access or disclosure.

  9. Controller Responsibilities: Clarify the roles and responsibilities of data controllers in content moderation systems.

  10. Data Protection Rights: Respect individuals' data protection rights, including the right to have inaccurate personal data corrected.

  11. Information Sharing: Establish clear guidelines for sharing information related to content moderation while ensuring compliance with data protection laws.

  12. International Data Transfers: Consider the implications of transferring individuals' personal information outside the UK and ensure compliance with relevant regulations.

  13. Automated Decision-Making: If automated decision-making is used in content moderation, ensure transparency, accountability, and compliance with data protection laws.

If you have any questions or would like to discuss the above, please do get in touch with us.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.