Key trends
DeepSeek under scrutiny by the Italian Data Protection Authority
Amazon's fine of €746 million confirmed in Luxembourg
UK's adequacy decision may be extended until the end of 2025
Saudi Arabia publishes transfer risk assessment guidelines
China implements new CCTV regulations
UK
European Commission proposes a limited extension to UK adequacy decisions
The European Commission has published draft implementing decisions that would extend the two adequacy decisions made in 2021 in respect of the UK's data protection legal framework for a period of six months. The adequacy decisions relate to the protection for personal data transferred from the EU to the UK within the scope of the GDPR and the Law Enforcement Directive respectively. If approved by the European Data Protection Board, the decisions will continue to apply until 27 December 2025.
The key motivation for the extension is to allow time for the UK's Data (Use and Access) Bill to be approved by Parliament (expected in the coming weeks). Once this happens, the Commission is expected to start the process of assessing whether UK data protection law continues to provide an adequate level of protection for personal data.
As the Bill omits a number of the reforms proposed by the previous UK government, which would have diverged more significantly from GDPR standards, the current government will be quietly optimistic that both adequacy decisions will be renewed at the end of the Commission's assessment. That said, given the range of provisions in the Bill that rely on further implementing regulations, it remains to be seen whether the Commission will consider itself in a position to approve a full renewal or will instead opt for further short-term extensions.
For more information, see the publication here.
The ICO releases its Tech Horizons report 2025
The ICO has released its Tech Horizons report 2025, which is designed to help innovators by identifying the data protection implications of emerging technologies and providing them with guidance on how to implement related data protection safeguards. In this edition (the third edition of Tech Horizons), the ICO focuses on four significant technologies which it considers will affect society, the economy and information rights in the next 2-7 years. Its predictions on future developments (and related data protection compliance guidance) include that:
Connected transport: the next generation of vehicles will know more about their users than ever before as they will be equipped with advanced sensors, fast internet connectivity and data processing features. V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure) and V2P (vehicle-to-pedestrians) communications through various means enable vehicles to share information. Infotainment technology is also increasingly connected, and in the future it could learn users' speech patterns to improve interactions and experience and tailor advertisements based on past reactions. Predictive maintenance features could also reveal a driver's routines and driving style. Increasing amounts of data will be collected, bringing transparency and lawful basis challenges to the fore (not least as small screens make it challenging to deliver extensive information). Where tracking technologies are used, organisations will need to ensure that they store information on terminal equipment in a way that is compliant with the restrictions under e-privacy rules such as the Privacy and Electronic Communications (EC Directive) Regulations 2003.
Quantum sensing and imaging: brain scanning, cancer screening and heart sensing all involve processing personal data, such as brain patterns and cancer diagnostics. When processing special category personal data, organisations must ensure that they comply with an Article 9 condition under the UK GDPR and, where necessary, carry out data protection impact assessments. Enhanced quantum capabilities may exacerbate wider risks, such as in relation to ensuring that people understand what information is being collected about them and why.
Digital diagnostics, therapeutics and healthcare infrastructure: such as smart pills, digital twins and AI-assisted diagnosis. Smart pills gather and transmit health information in real time (such as chemical states in the stomach) to support treatment and monitoring. Digital twins are virtual counterparts of physical entities that aim to mirror them and predict outcomes; they offer significant benefits but raise data protection considerations, including:
Cybersecurity, which may be addressed through the use of Privacy Enhancing Technologies.
Automated decision-making (including biased or discriminatory outcomes for patients), including related transparency considerations.
Synthetic media and its identification and detection: personal data is often used in the creation, distribution and targeting of synthetic media (the most controversial and well-known form of which is deepfakes). Distinguishing deepfake content from real media is becoming increasingly difficult and poses security risks. Organisations need to put in place appropriate technical and organisational measures to secure their information. Additionally, automated content moderation (for example, removing content or banning a user based on checking the content against standards) could trigger UK GDPR restrictions on automated decision-making.
The ICO also comments on the steps it has taken in relation to developments identified in its previous Tech Horizons reports, such as personalised AI, next-generation IoT and quantum technologies.
For more information, see the ICO's publication here.
The ICO refreshes its online tracking strategy and expands its cookie compliance assessment programme
The ICO has announced that it is maintaining its focus on online tracking into 2025, with a specific focus on the online advertising ecosystem.
Through a series of 'citizen juries' and other engagement, the ICO has identified a number of areas where individuals are seen to lose control over how their personal data is used for the purpose of personalised advertising, and as a result is taking action to ensure people have meaningful control over how they are tracked on the top 1,000 most popular websites in the UK (extending an earlier campaign that focused on the top 200 websites). New guidance due later this year will also help the public to understand their rights in this area.
The ICO also plans to publish new guidance for industry, including a statement on low-risk advertising-related activities that are unlikely to result in enforcement action and a final version of the existing draft guidance on storage and access technologies, expected once the Data (Use and Access) Bill is approved by Parliament. The ICO has this year already published new guidance for organisations on the deployment of 'consent or pay' models.
For more information, see the ICO's publication here.
ICO investigates social media and video sharing platforms' use of children's data
The ICO has announced investigations into TikTok, Reddit, and Imgur regarding their handling of children's personal information. The investigation into TikTok centres on its use of 13-17-year-olds' data to personalise and recommend content, amid concerns that young users are being exposed to inappropriate or harmful content. For Reddit and Imgur, the focus is on how age assurance measures are implemented to estimate or verify a child's age and restrict or tailor services accordingly.
The investigations will determine whether these companies have breached data protection legislation, and are part of a broader push to keep children safe online. John Edwards, the UK Information Commissioner, emphasised that "the responsibility to keep children safe online lies firmly at the door of the companies offering these services and my office is steadfast in its commitment to hold them to account". The ICO's announcement highlights changes that have resulted from its wider regulatory intervention, with companies including X, BeReal, Dailymotion and Viber taking steps to protect children's online privacy, e.g. by removing personalised advertising or geolocation services for under 18s.
The ICO states it will continue to push for further changes where organisations do not comply with the law or the Children's code which came into force in 2021, and that it will be working closely with Ofcom (which is responsible for enforcing the Online Safety Act) to ensure a coordinated approach.
For more information, see the ICO's publication here.
EUROPEAN UNION
EDPB publishes CSC biannual report and work programme 2025-2026
The EDPB Coordinated Supervision Committee (CSC) released its biannual activity report on 13 February 2025, covering July 2022 to December 2024. The report details the CSC's work in supervising large-scale EU IT systems, such as the Schengen Information System (SIS) and the Visa Information System (VIS), and in preparing for the interoperability regulations. It includes recommendations on transparency obligations under the Internal Market Information System (IMI). A notable achievement was the July 2023 publication of a guide on exercising data subjects' rights concerning Europol's information systems, developed with input from Member States. In response to the 2022 EDPS Audit Report on Europol's processing of minors' data, the CSC coordinated national DPAs to verify the legality of data transmissions to Europol.
On 27 February 2025, the CSC adopted its Work Programme 2025-2026, which focuses on role allocation within the Justice and Home Affairs (JHA) interoperability framework and on improving the handling of complaints relating to JHA systems, Europol, Eurojust, and the European Public Prosecutor's Office. The CSC aims to expand its supervisory scope, assist national DPAs, and enhance oversight of data flows among EU institutions, in line with the EDPB's objectives of improving data protection standards and GDPR compliance across the EU.
For more information, see the EDPB Report.
FRANCE
The CNIL specifies how to handle DSARs that involve a large number of emails
Employees can request access to their personal data from their employer, including data contained in professional emails. However, requests involving a large volume of emails pose practical difficulties due to the necessary sorting and processing. To reconcile this right with the constraints faced by employers, the CNIL recommends several measures:
- Provide a summary table of emails involving the relevant data subject. If certain messages cannot be communicated, the employer must be able to justify their choice.
- Invite the data subject to specify their request to facilitate processing when it represents a significant burden.
- Communicate the requested data, once the scope of the request has been adjusted where appropriate.
For more information, see the CNIL's statement.
ITALY
The Italian Data Protection Authority blocks DeepSeek AI
On 30 January 2025, the Italian Data Protection Authority (the Garante) investigated Hangzhou DeepSeek Artificial Intelligence and Beijing DeepSeek Artificial Intelligence regarding their DeepSeek chatbot service. The investigation found that DeepSeek was offering services to EU users (including Italian users) and processing their personal data without adequate transparency, in violation of several GDPR provisions. Specifically, the privacy policy was insufficient and available only in English, failing to meet GDPR requirements; data was stored in China without proper security measures; and no EU representative had been designated.
As a result, the Garante urgently imposed a limitation on processing Italian users' data and launched an investigation into GDPR compliance and the impact of data collected unlawfully, especially concerning a reported malicious attack. Taking this into account, the Garante has mandated DeepSeek to implement specific corrective actions to ensure adherence to privacy laws, including enhancing transparency in data processing, obtaining explicit consent from individuals, and improving security measures to protect personal data.
Additionally, the Garante has imposed a deadline for DeepSeek to comply with these requirements, underscoring the importance of safeguarding individuals' privacy rights in accordance with Italian and EU data protection standards.
For more information, see the Garante's provision.
ICELAND
Fine imposed by Iceland on the Primary Health Care of the Capital Area for unlawful processing of medical records
On 7 March 2025, the Icelandic Data Protection Authority (SA) imposed a €33,854 administrative fine on the Primary Health Care of the Capital Area for unlawful processing of medical records in connection with the integration of medical record systems. The decision follows an investigation triggered by a ruling on a related complaint concerning the processing of medical records by the Transport Authority's medical officer. The SA found that the Primary Health Care had integrated medical record systems without fulfilling national legal requirements, leading to violations of Articles 5, 6, and 9 GDPR.
The investigation revealed that the Primary Health Care of the Capital Area had entered into multiple agreements for integrating medical record systems, but only one complied with Act No. 55/2009 on Medical Records. Under Article 20.2 of the Act, such integrations require a permit from the Minister of Health and confirmation from the Icelandic SA on data security. The Primary Health Care failed to obtain these approvals for eleven integrated parties, leading to unlawful processing of sensitive medical data.
The Icelandic SA imposed the fine for violating the principles of lawfulness, fairness, and transparency in the processing of sensitive health data.
The decision underscores the critical importance of robust data protection safeguards for health-related data, one of the most sensitive categories of personal data.
For more information, see the Icelandic SA's provision.
LUXEMBOURG
Luxembourg court upholds €746 million GDPR fine against Amazon
On 18 March 2025, the Luxembourg Administrative Tribunal rendered a judgement in the case between Amazon Europe Core S.à r.l. and the National Commission for Data Protection (CNPD). The dispute concerned a CNPD decision dated 15 July 2021, which imposed an unprecedented administrative fine of €746 million on Amazon, along with corrective measures to be implemented within six months, subject to a daily penalty of €746,000 for non-compliance.
The CNPD investigation concluded that Amazon infringed several provisions of the GDPR, particularly in relation to transparency, data minimisation, and lawfulness of the processing. The company was found to have collected and processed large volumes of personal data for interest-based advertising without obtaining valid consent from data subjects. Additionally, the company was also found to have failed to correctly apply several data subjects' rights. The CNPD imposed corrective measures to bring Amazon's personal data processing into compliance.
In its judgement of 18 March 2025, the Luxembourg Administrative Tribunal dismissed Amazon's requests and thus upheld the CNPD's original decision. The effects of this decision will remain suspended during any potential appeal procedure before the Luxembourg Administrative Court.
See the CNPD's publication here.
MIDDLE EAST
Saudi transfer risk assessment guidelines and criteria published
On 25 February 2025, the Saudi Data and Artificial Intelligence Authority (SDAIA) released new transfer risk assessment guidelines to safeguard personal data when transferred outside the KSA. These non-binding guidelines offer a structured approach for organisations to assess and mitigate risks associated with data transfers, ensuring compliance with the KSA Personal Data Protection Law (PDPL).
The risk assessment process is divided into four phases: 1) preparation, 2) assessing negative impacts and potential risks, 3) evaluating risks for data transfer outside the Kingdom, and 4) analysing implications for the KSA's vital interests. Notably, SDAIA's guidelines differ from international standards like the GDPR by prioritising data exporters' activities and national interests, rather than focusing only on individual rights and on the third country's laws as they affect data transfer tools and the supplementary measures used by data importers.
See the SDAIA's publication here.
DIFC launches public survey on the role of Autonomous Systems Officers (ASOs)
The DIFC's Regulation 10 mandates the appointment of an ASO for organisations using AI to process personal data in high-risk processing activities. While the role is new and its requirements are still evolving, ASOs are expected to have competencies similar to Data Protection Officers, including ensuring compliance with legal and ethical standards, monitoring AI system outcomes, and managing audits and compliance reporting.
On 25 March 2025, the DIFC initiated a public survey to gather stakeholder insights on ASO requirements, aiming to update Regulation 10 and provide guidance on the role.
See the DIFC's survey here.
DIFC publishes consultation for amendments to the Data Protection Law (DPL)
The DIFC has released Consultation Paper No.1 concerning DIFC Law Amendment Law No. 1 of 2025 on proposed amendments to several DIFC laws, including the DPL, to enhance its legal framework and align with international standards.
Key amendments to the DPL focus on clarifying its scope, ensuring redress mechanisms, and enhancing data subjects' rights. These include extending the DPL's application to non-DIFC entities processing data in the DIFC, introducing risk-based due diligence, and establishing a private right of action for data subjects to seek compensation through DIFC Courts. Public comments on these proposals concluded on 26 March 2025, allowing for the legislation to be refined and potentially enacted.
See the DIFC's publication here.
CHINA
Administrative regulation on information systems of public security video and images (CCTV Regulation)
The CCTV Regulation was issued by China's State Council on 13 January and takes effect on 1 April 2025. It regulates the deployment of CCTV systems in public spaces and the use and protection of CCTV data and related privacy interests. It specifies the public areas where the installation of CCTV cameras is required, restricted or prohibited, and sets out the cybersecurity and data security requirements for CCTV systems. The CCTV Regulation also stipulates rules on the usage, access, disclosure, sharing and retention of CCTV data.
Administrative measures on personal information protection compliance audit (PIP Compliance Audit Rule)
The PIP Compliance Audit Rule was issued by the Cyberspace Administration of China on 12 February and takes effect on 1 May 2025. It provides that personal information processors (equivalent to "data controllers" in the GDPR context) shall conduct PIP compliance audits on a regular basis, either themselves or through external professional agencies. Processors that process the personal information of over 10 million individuals must conduct PIP compliance audits at least every two years.
The new rule also empowers the data protection regulator to order a compulsory PIP compliance audit where a personal information processor's processing activities present material risks, may infringe individuals' interests, or have suffered a serious data breach. The rule includes guidance on how to conduct PIP compliance audits, and specifies that personal information processors that process the personal information of over 1 million individuals shall appoint a personal information protection officer.
Administrative measures on the secure application of facial recognition technology (Facial Recognition Rule)
The Facial Recognition Rule was issued by the Cyberspace Administration of China on 21 March and takes effect on 1 June 2025. It imposes strict data and privacy protection requirements on deployers of facial recognition technology, covering transparent notification, separate consent, in-device storage (unless otherwise provided by law or separate consent is obtained), personal information protection impact assessments, the provision of less privacy-intrusive alternatives, security measures, and a mandatory filing obligation.
SINGAPORE
Guidelines for handling NRIC numbers as personal data in Singapore
The Singapore government has recently reiterated that National Registration Identity Card (NRIC) numbers are classified as personal data. Organisations must handle them with care, notifying individuals and obtaining consent for their use, and ensuring they are given sufficient protection, including safeguards against unauthorised disclosure.
Organisations are also encouraged to stop the practice of using NRIC numbers as a factor of authentication or as default passwords as soon as possible. While organisations may continue to use partial NRIC numbers for authentication, the government's recommendation is for them to explore alternative identification methods where feasible. Notably, the physical NRIC card can also be suitable as an authenticator. The government is currently reviewing guidelines and considering updates following public consultation.
See the guidelines here.



