Data Protection Update - July 2025

Our updates are your trusted source for the latest news, expert analyses, and practical tips on navigating the complex landscape of data protection.

28 July 2025

Publication

  • The ICO launches consultation on a new enforcement approach to low-risk online advertising

  • EDPB clarifies rules for data transfers in response to non-EU authority requests

  • Italy’s DPA identifies double opt-in as critical best practice for valid marketing consent

  • Luxembourg’s CNPD publishes guidelines for payment service providers

  • Abu Dhabi Global Market’s Registration Authority issues consultation on amendments to the Data Protection Regulations 2021

UK


UK parliament approves the Data (Use and Access) Act

We have published a detailed review of the Data (Use and Access) Act 2025, which recently received Royal Assent and is the first significant reform of UK data protection laws since Brexit. The article summarises the key changes affecting multiple areas of data privacy compliance and governance and includes considerations for businesses in specific sectors.

For more information, see our article here.

The ICO’s new AI and biometrics strategy

The ICO has launched a regulatory and enforcement strategy on AI and biometric technologies, aimed at building confidence among organisations and trust among the public.

The strategy is centred on a new statutory (and therefore binding) code of practice for organisations using AI and automated decision-making (ADM). Businesses across all sectors will also want to closely monitor the ICO’s planned statements on regulatory expectations around the use of ADM in recruitment.

The strategy will also see the ICO clarify its expectations for the development of general-purpose AI models (where there seems to be an attempt to keep pace with the EU’s new General-Purpose AI Code of Practice) and publish guidance on the use of facial recognition technology for law enforcement purposes.

For more information, see the publication here.

The ICO launches consultation on a new enforcement approach to low-risk online advertising

Under regulation 6 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), consent must be obtained before storing or accessing information on a user’s device (including through the use of cookies). Exceptions to the consent requirement, such as where cookies are strictly necessary for the provision of a service, do not usually apply to cookies used for advertising purposes. At the same time, according to the ICO, cookies used for advertising that is not based on detailed profiling pose little risk of harm to individuals, even where individuals are not asked for consent.

The ICO has said that it plans to publish a statement in early 2026 identifying advertising techniques and purposes that are unlikely to attract enforcement under regulation 6 of PECR. The consultation seeks views on the relevant advertising purposes, the impacts of the ICO’s proposal, and safeguards to mitigate risks for users.

In parallel, the ICO has updated its ongoing consultation on new guidance on Storage and Access Technologies (SATs) to reflect reforms under the Data (Use and Access) Act 2025 that will allow cookies to be used without consent for statistical or security reasons. If approved, the enforcement approach outlined in the consultation will achieve a similar practical result for some advertising cookies. The approach will provide some solace to marketers concerned about how the ICO’s wider online tracking strategy and guidance on SATs will affect them.

The consultation closes on Friday 29 August.

For more information, see the publication here.

The ICO issues draft guidance to help smart product manufacturers ensure data protection compliance

Following a series of consumer workshops in 2024, the ICO has published its Draft Guidance for consumer Internet of Things (IoT) products and services, which is open for consultation. The draft addresses some of the concerns identified during the workshops, including that products collect too much information and that consumers lack control over how their personal information is used and shared. It covers similar topics to the ICO’s 2025 online tracking strategy, which aims to give people meaningful choice in how their data is used.

The draft guidance is aimed at manufacturers, developers and other stakeholders involved in the IoT ecosystems and is designed to provide clarity on how they can ensure compliance with UK data protection laws when processing personal data through consumer IoT products.

It emphasises the need for relevant organisations to comply with data protection laws generally, covering topics including the requirements relating to data minimisation, individuals’ rights, data protection by design and security measures. It also draws out certain risks which are specific to IoT, including:

  • Special category data: organisations must meet both a lawful basis under Article 6 and a condition under Article 9 of the UK GDPR when processing special category personal data. Aside from explicit consent, most conditions require that the processing is necessary for the reasons specified, which can be a high bar to meet. Examples of where special category personal data is collected include fitness trackers calculating BMI or fertility trackers inferring reproductive health. Additionally, biometric data that is used to uniquely identify individuals (such as smart speakers creating voice IDs for user recognition purposes) is special category personal data.
  • Children’s data: for IoT products aimed at or likely to be accessed by children (including smart toys, fitness trackers and general use products such as smart speakers that children are likely to access):
    • Privacy information must be concise, prominent and in age-appropriate language, with bite-sized explanations provided at the point of data collection or use.
    • Parental authorisation is required for children under 13.
    • Geolocation settings should be switched off by default unless there is a compelling reason to enable them.
    • Data protection impact assessments are mandatory.
  • Data protection by design and by default: the guidance emphasises the importance of data protection by design throughout the IoT product lifecycle.

Organisations involved in the IoT ecosystem should review the draft guidance and consider whether to respond to the ICO consultation.

For more information, see the publication here.

EU


EU Court rules on disclosure of personal data in public contracts related to COVID-19 screening tests

On 3 April 2025, the Court of Justice of the European Union (CJEU) issued a decision in a case between L.H. and the Czech Republic’s Health Minister. The case concerned the Minister’s refusal to disclose to L.H. certain information concerning representatives of legal persons (such as companies) that are mentioned in contracts and certificates related to COVID-19 screening tests.

The CJEU confirmed that the disclosure of personal data of individuals representing legal persons, such as first names, surnames, signatures and contact details, constitutes "processing" under the General Data Protection Regulation (GDPR). This applies even if the sole purpose of the disclosure is to identify the authorised representative of the legal person.

The Court further clarified that the GDPR does not prevent national rules from requiring controllers, including public authorities, to inform and consult individuals before disclosing their personal data in official documents. However, any such obligation must a) be possible to implement, b) not require disproportionate effort, and c) not excessively restrict public access to such documents.

For more information, see the Judgment here.

EDPB adopts final guidelines on data transfers and launches AI training materials to support GDPR compliance

On 5 June 2025, the European Data Protection Board (EDPB) adopted the final version of its guidelines on Article 48 GDPR, following public consultation. These guidelines clarify how data controllers and processors in the EU should respond to requests for personal data from non-EU authorities, a topic of growing concern for companies dealing with international operations. Under Article 48, such requests - especially when not supported by an international agreement - do not in themselves constitute a lawful basis for data transfers under the GDPR.

The EDPB emphasises that organisations must assess whether the request is backed by a valid international treaty or agreement providing both a legal basis and sufficient safeguards. In the absence of such frameworks, data transfers may only be considered under specific and narrowly interpreted derogations, such as those provided by Article 49. The final version of the guidelines incorporates feedback from stakeholders and offers further clarification on complex cross-border scenarios, such as when a parent company located outside the EU relays a request to its European subsidiary, or when a processor receives a direct request from a third-country authority.

During the same plenary, the EDPB presented two new training projects under the Support Pool of Experts (SPE) programme. The first resource, Law & Compliance in AI Security and Data Protection, is intended for legal professionals such as data protection officers and compliance experts. The second, Fundamentals of Secure AI Systems with Personal Data, targets technical audiences including developers, cybersecurity professionals and AI system deployers. Both projects are designed to address the growing skills gap in the AI and data protection field and to support the development and use of AI systems that comply with the GDPR and other applicable regulations.

For more information, see the news from EDPB here.

ITALY


Italy’s DPA identifies double opt-in as critical best practice for valid marketing consent

On 4 June 2025, the Italian Data Protection Authority (Garante) issued a decision highlighting the importance of the double opt-in mechanism as an essential safeguard for demonstrating valid consent to marketing communications under the GDPR. The case involved an online car dealership and followed a complaint lodged by an individual in June 2023.

The Garante emphasised that mere technical logs, especially when disputed by the data subject and lacking integrity safeguards, are not enough to prove valid consent. It also reaffirmed a key GDPR principle: the data controller remains fully responsible for ensuring the legality and traceability of consent, even when relying on third-party data providers. In this context, the Garante identified the double opt-in method, where users must confirm their subscription via a secondary action such as clicking a link in a verification email, as the “state of the art” for documenting consent.

While not making double opt-in legally mandatory, the Garante effectively qualified it as the minimum standard of diligence, aligning with Articles 7 and 24 GDPR. The decision serves as a clear signal to all organisations engaging in digital marketing that without robust procedures for collecting and verifying consent, such as double opt-in, they risk being unable to meet their burden of proof in case of a dispute.
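
To illustrate the mechanism the Garante described, the snippet below is a minimal double opt-in sketch in Python. It is not drawn from the decision itself; the function names, in-memory stores and confirmation URL are hypothetical placeholders, and a real implementation would use a proper email service, durable storage and token expiry.

```python
# Illustrative double opt-in flow (hypothetical names throughout).
# Consent is recorded only once the user confirms via the emailed link,
# and the confirmation is logged with a timestamp so the controller can
# later demonstrate how and when consent was given.

import secrets
from datetime import datetime, timezone

pending_requests = {}   # token -> email address awaiting confirmation
consent_records = {}    # email -> evidence of confirmed consent

def send_email(to_address: str, body: str) -> None:
    """Placeholder for a real email delivery call."""
    print(f"To: {to_address}\n{body}\n")

def request_subscription(email: str) -> None:
    """Step 1: store a pending request and send a confirmation link."""
    token = secrets.token_urlsafe(32)
    pending_requests[token] = email
    send_email(email, f"Confirm your subscription: https://example.com/confirm?token={token}")

def confirm_subscription(token: str) -> bool:
    """Step 2: record consent only when the emailed token is actually used."""
    email = pending_requests.pop(token, None)
    if email is None:
        return False  # unknown or already-used token: no consent recorded
    consent_records[email] = {
        "confirmed_at": datetime.now(timezone.utc).isoformat(),
        "method": "double opt-in (email confirmation link)",
    }
    return True
```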

For more information, see the decision issued by the Garante here (in Italian).

Italy’s DPA fines employer for unlawful use of employee’s private messages in disciplinary proceedings

On 21 May 2025, the Garante fined a company €420,000 for unlawfully using an employee’s private digital communications during disciplinary proceedings that led to her dismissal.

The investigation followed a complaint by the employee, who reported that personal content from her Facebook, Messenger, and WhatsApp conversations had been used as evidence. These messages, including screenshots and quotes, were not collected directly by the company but were provided by colleagues and third parties.

The Garante found no legal basis for accessing or processing this information, which was exchanged in private contexts and was unrelated to the employee’s work. It stressed that even data shared in semi-private digital spaces remains protected under the GDPR: employers cannot repurpose such content without a valid legal basis, regardless of how it is obtained. The case reinforces the need to respect the boundary between personal and professional life and confirms that fundamental data protection rights remain intact, even in disciplinary matters.

For more information, see the decision issued by the Garante here (in Italian).

LUXEMBOURG


Luxembourg Tribunal confirms CNPD’s lack of competence in private data use dispute

On 3 June 2025, the Luxembourg Administrative Tribunal issued a significant ruling clarifying the competence of the National Commission for Data Protection (CNPD). The case involved a dispute between a lawyer, Ms (A), and the CNPD over its decision to dismiss Ms (A)’s complaint about the use of her personal address by a former client, Mr (B).

Ms (A) discovered that Mr (B) had obtained and used her private address to send her a registered letter containing various grievances, despite her not having shared this information with him. Believing that this use of her personal data violated her privacy, she filed a complaint with the CNPD on 20 June 2022.

On 29 July 2022, the CNPD informed Ms (A) that it would not pursue her complaint. It concluded that the data processing by Mr (B) fell under Article 2(2)(c) of the GDPR, which excludes from the GDPR’s scope the processing of personal data carried out by an individual in the course of a "purely personal or household activity." This position was reaffirmed by the CNPD on 6 October 2022.

Ms (A) challenged this decision before the Administrative Tribunal on 4 January 2023, seeking to have the CNPD’s decision overturned or annulled.

In its judgment, the Tribunal sided with the CNPD. It examined two key criteria: (i) whether the activity in question fell within the personal or household activity of the individual, and (ii) whether the data was accessible only to a limited group of recipients and not made public. The Tribunal found that Mr (B)’s use of Ms (A)’s private address to send a letter expressing dissatisfaction with her past legal services was a purely private act: he acted as an individual for personal purposes, unrelated to any professional or public activity. On the second criterion, the Tribunal noted that the letter was sent to only two recipients, Ms (A) and another lawyer, and was not made public. The involvement of postal services or subcontractors in delivering the letter did not constitute public disclosure.

As a result, the Tribunal dismissed Ms (A)’s action and confirmed the CNPD’s decision to declare that it lacked competence to handle the complaint, as the disputed data processing carried out by Mr (B) fell within the household exemption in Article 2(2)(c) of the GDPR.

For more information, see the Judgment here (in French).

CNPD publishes data retention guidelines for payment service providers

On 26 June 2025, Luxembourg’s National Commission for Data Protection (CNPD) released new guidelines on how payment service providers should handle the retention of personal data. These guidelines, currently available only in French, address the significant volume of personal data collected and processed by such providers—during onboarding, throughout the business relationship and even after it ends. Advances in technology have further increased the ability of payment service providers to gather, store, and analyse vast amounts of user data.

The guidelines aim to help payment service providers comply with key data protection principles while considering sector-specific obligations, such as anti-money laundering requirements. Key points include:

  • Data retention limitation: under Article 5(1)(e) of the GDPR, data must not be kept longer than necessary for the purposes for which it was processed (see the illustrative sketch after this list).
  • Legal basis and retention periods: providers must identify the legal basis and retention period for each category of data to justify how long it is kept.
  • Best practices: recommendations include implementing mechanisms to sort data, closing inactive accounts and other measures to comply with retention rules.
  • Transparency obligations: providers must inform individuals about data retention periods, aligning with the GDPR’s transparency requirements.
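
As a purely illustrative sketch of the retention-limitation point above, the Python snippet below checks whether records have exceeded their retention period. The categories and periods are hypothetical; actual retention periods must be derived from the legal bases and sector-specific obligations (such as anti-money laundering rules) identified for each category of data.

```python
# Hypothetical retention-limitation check (illustrative categories and periods only).
from datetime import datetime, timedelta, timezone

# Example mapping of data categories to retention periods.
RETENTION_PERIODS = {
    "onboarding_documents": timedelta(days=365 * 5),
    "transaction_records": timedelta(days=365 * 10),
    "marketing_preferences": timedelta(days=365 * 2),
}

def is_expired(category: str, collected_at: datetime) -> bool:
    """Return True if data in this category has exceeded its retention period."""
    return datetime.now(timezone.utc) - collected_at > RETENTION_PERIODS[category]

# Example: flag a record that should be erased or anonymised.
record = {"category": "marketing_preferences",
          "collected_at": datetime(2022, 1, 15, tzinfo=timezone.utc)}
if is_expired(record["category"], record["collected_at"]):
    print("Retention period exceeded: erase or anonymise this record.")
```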

For more information, see the guidelines here (in French).

AUSTRIA


NOYB files complaint against Bumble for unlawful use of personal data in AI-generated messages

On 26 July 2025, the privacy-focused non-profit organisation NOYB (None of Your Business) filed a complaint with the Austrian Data Protection Authority (DSB) against the dating app Bumble over its “AI Icebreakers” feature, introduced in December 2023 in the “Bumble for Friends” section. This AI tool generates personalised opening messages based on users’ profiles. NOYB claims that Bumble transmits users’ personal data, including potentially sensitive information such as sexual orientation, to OpenAI to power the feature, yet fails to inform users transparently or to obtain explicit consent, as required by Articles 6(1)(a) and 9 GDPR.

Although users are shown a banner stating “AI breaks the ice” with an “Okay” button, NOYB argues this design is manipulative “nudging” that pressures users into compliance without genuine choice. Notably, Bumble does not rely on consent as its legal basis, instead invoking “legitimate interest” under Article 6(1)(f) GDPR, an approach NOYB deems inapplicable.

For more information, see the complaint against Bumble here.

MIDDLE EAST


Proposed amendments to Saudi Arabia’s Personal Data Protection Law (PDPL) Implementing Regulations

On 27 April 2025, the Saudi Data & AI Authority (SDAIA) issued its third public consultation on proposed amendments to the Personal Data Protection Law (PDPL) Implementing Regulations. Key proposals include clearer responsibilities for data protection officers (DPOs), streamlined controller registration requirements and updated privacy notice obligations. The draft also refines consent requirements for marketing, removes the time limit for filing complaints and introduces a 10-business-day deadline for organisations to respond to regulatory requests. These changes aim to provide practical guidance for organisations, ensuring compliance with the PDPL while balancing operational flexibility and user rights.

Public comments on these proposals concluded on 27 May 2025, allowing for the legislation to be refined and potentially enacted.

For more information, see the proposed amendments here.

ADGM’s Registration Authority issues proposed amendments to the Data Protection Regulations 2021

On 11 June 2025, the Abu Dhabi Global Market (ADGM) Registration Authority issued Consultation Paper No. 6 of 2025 on proposed amendments to the Data Protection Regulations 2021. The changes aim to introduce new "Substantial Public Interest Rules" for processing special categories of personal data, with specific conditions for activities related to insurance and safeguarding children or individuals at risk. These amendments will impact ADGM-licensed entities, data controllers and processors, among others.

Public comments on these proposals concluded on 2 July 2025, allowing for the legislation to be refined and potentially enacted.

For more information, see the proposed amendments here.

ASIA


Updates to Singapore’s national certification standard for data protection

The Data Protection Trustmark (DPTM) is a voluntary certification for organisations to demonstrate their commitment to effective data protection practices. The DPTM has been updated to align its standards with global data protection benchmarks and international best practices.

The DPTM provides organisations with clear data protection requirements around key areas, such as third-party management and cross-border data transfers. Organisations seeking certification will undergo assessment by appointed certification bodies, and annual surveillance audits will be conducted to ensure ongoing compliance and commitment to data protection excellence.

For more information on the DPTM, see here.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.