Key trends
The “back-to-school” period has been busy for European institutions and regulatory bodies around the world. Key highlights include updates on the AI Act, Standard Contractual Clauses (SCCs), the Digital Markets Act (DMA), GDPR penalties and data protection legislation in Saudi Arabia. We provide a summary below.
Long Story Short
UK:
- The Information Commissioner's Office (ICO) and the National Crime Agency (NCA) sign a Memorandum of Understanding to improve the UK's cyber resilience by sharing information, enhancing security measures and encouraging the reporting of cyber-crimes, while respecting each organisation's distinct role.
- The UK Government consults on a proposal to increase the fees organisations pay to the ICO by 37.2% to reflect additional responsibilities under the Digital Information and Smart Data Bill, without altering statutory exemptions.
- The Labour Government's Digital Information and Smart Data Bill aims to drive economic growth and improve digital governance through innovative data use and strong privacy measures, introducing Digital Verification Services, a National Underground Asset Register, Smart Data Schemes and reforms to public services, scientific research and the ICO's structure and powers.
EU:
- The European Data Protection Board (EDPB) publishes a statement highlighting the crucial role that data protection authorities will play in implementing the EU AI Act.
- The European Commission publishes its second report on the enforcement of the GDPR, highlighting a total of €4.2 billion in fines since the GDPR came into effect.
- New Standard Contractual Clauses (SCCs) are expected by 2025 to cover data transfers by entities subject to the GDPR.
- Guidance is expected at the European level to assist stakeholders in navigating the interplay between the GDPR and the DMA.
- The ECJ and EDPB provide further insights on what should be considered legitimate interests under Article 6 GDPR.
- The French Data Protection Authority (CNIL) issues a penalty decision on data anonymisation that confirms the application of the criteria from Opinion 05/2014 of the WP29 on this matter.
- The French National Cybersecurity Agency (ANSSI) publishes a guide on securing sensitive information stored in the cloud.
- German law makers publish a draft on a Federal Employee Data Protection Act to supplement the GDPR and the AI Act.
- The German Data Protection Conference (DSK) publishes informative recommendations on organising data processing in the context of asset deals.
- The DSK also publishes a position paper to clarify the interpretation of “scientific research purposes” under the GDPR.
- The Italian Data Protection Authority (Garante) publishes a favourable opinion on the draft Italian bill implementing the EU AI Act.
Asia:
- The Singapore government announces plans for safety guidelines requiring Generative AI developers to disclose model functionalities and risks, aiming to ensure transparency and trust. This plan sits alongside robust efforts by the Association of Southeast Asian Nations (ASEAN) to improve data governance and the upcoming release of ASEAN’s Guide on Data Anonymisation.
- The HKMA updates its 2019 guidelines for using big data and AI by authorised institutions, adding new principles for the responsible use of Generative AI, including governance, fairness, transparency and data privacy.
- The Beijing Free Trade Zone publishes a "Negative List" for data export, detailing "Important Data" for specific sectors and requiring security assessments for data transfers out of the PRC, along with adjustments to cross-border personal information transfer protocols.
Middle East:
- The grace period for the KSA Personal Data Protection Law has now ended, making compliance mandatory for organisations, with the Saudi Data & AI Authority (SDAIA) now enforcing the regulations.
- The SDAIA publishes draft guidelines for feedback on managing deepfakes in Saudi Arabia, offering advice on risk assessments, consent, watermarking and consumer protection against technology misuse.
Must Reads
- The €290 Million Wake-Up Call: GDPR Compliance in Data Transfers by Alexander Brown, Lawrence Brown, Emily Jones and Olivia Ward
- Labour's commitment to a technological revolution in healthcare by Hayley Davis
- Fit or misfit: Can GDPR and the AI Act interplay? by Christopher Götz
- ECJ rules on commercial interests as legitimate interest by Jaap Tempelman
Regional Updates
UK
The ICO signs a memorandum of understanding with the NCA for further cybersecurity collaboration
As set out in a statement from the ICO, the ICO and the National Crime Agency (NCA) have established a Memorandum of Understanding (MoU) aimed at enhancing the UK's cyber resilience. This agreement:
- outlines the collaborative efforts between the two organisations to better safeguard organisations nationwide against data theft and ransomware attacks;
- emphasises the commitment to share relevant, timely information on cybersecurity, support enhanced security measures and offer guidance on implementing changes; and
- builds upon the existing relationship between the ICO and NCA, aiming to elevate cybersecurity standards while respecting the distinct roles of each organisation.
A key focus of this partnership is to ensure organisations are directed to appropriate resources (e.g. the National Cyber Security Centre (NCSC)) and are encouraged to report cyber-crimes as soon as possible. ICO Deputy Commissioner (Regulatory Supervision), Stephen Bonner, highlighted the financial toll cyber-crime has taken on UK businesses and stressed the importance of cooperation between relevant entities to bolster the country's cyber defences.
NCA Deputy Director Paul Foster, Head of the National Cyber Crime Unit, underscored the NCA's comprehensive approach to combating cyber-crime, which includes disrupting cybercriminal activities and facilitating legal actions against them. Foster also emphasised the importance of providing support and guidance to organisations at risk of or affected by cyber-attacks, in collaboration with partner entities.
The MoU delineates several commitments, including encouraging organisations to engage with the NCA on cybersecurity issues and ensuring that any confidential information shared by an organisation with the NCA will not be passed to the ICO without the organisation's consent. Additionally, the ICO will aid the NCA's efforts to monitor cyber-attacks by sharing information about cyber incidents in an anonymised, systemic and aggregated manner or on a specific organisational basis when suitable. Both parties will strive to coordinate their responses to cyber incidents to minimise disruption and will work together to promote learning, offer consistent guidance and enhance cybersecurity standards.
The UK Government runs a consultation on increases to UK data protection fees
The UK Government is running a public consultation on proposed increases to the annual fee that organisations pay to help fund the ICO. The proposal is for an increase of 37.2% across the current three-tier fee structure. This means that the fee payable by “large” organisations, for example, would increase from £2,900 to £3,979. The existing fee structure has not changed since 2018 and the proposed increases are justified in part by the additional responsibilities that will fall on the ICO with the introduction of the Digital Information and Smart Data Bill. There are no proposals to change the existing statutory exemptions from paying a fee.
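The arithmetic behind the quoted figure can be sanity-checked with a few lines of Python (purely illustrative, assuming the 37.2% uplift is a simple percentage increase applied to the current top-tier fee):

```python
# Check the proposed uplift against the figure quoted in the consultation.
current_fee_large = 2900   # current top-tier ("large" organisation) fee in GBP
uplift = 0.372             # proposed 37.2% increase

proposed_fee = round(current_fee_large * (1 + uplift))
print(proposed_fee)  # 3979, matching the £3,979 figure in the consultation
```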
The fee uplifts considered in the consultation will have only a limited impact on organisations. The state of the ICO’s funding and resources has been a concern for some time, and the government’s focus on this aspect of the data protection legal framework may signal the start of more general scrutiny of organisations that fail to register and pay the required fee. As a further measure, provisions in the (now defunct) Data Protection and Digital Information Bill which would have put the regulator on a statutory footing as a new “Information Commission” may also be revisited in upcoming legislation.
Plans for new Digital Information and Smart Data Bill start to come to light
The Labour Government has stated its aim of leveraging the power of data for economic growth, the enhancement of digital governance, and the betterment of people's lives. Central to this vision is the introduction of the Digital Information and Smart Data Bill which is designed to create a legislative framework to encourage innovative data uses while ensuring robust data protection and privacy standards. As ever with new legislation announced in the King’s Speech, there is little detail to go on and the fuller implications of the Bill will become clearer as the legislation is developed. However, for the moment, the key aspects of the Bill are:
- Digital Verification Services: Aimed at simplifying everyday transactions, these services will enable the creation and adoption of secure digital identity products. Certified providers will offer solutions for various needs, including moving house, pre-employment checks and purchasing age-restricted items, making digital interactions more convenient and secure.
- National Underground Asset Register (NUAR): This innovative digital map will revolutionise infrastructure management by providing instant, secure access to data on underground pipes and cables. NUAR will enhance the safety and efficiency of installation, maintenance and repair activities.
- Smart Data Schemes: Focussed on empowering consumers, these schemes will enable the secure sharing of customer data with authorised third-party providers upon request. This initiative aims to encourage innovation and improve market engagement by enabling customers to make informed decisions based on a comprehensive understanding of their data.
- Enhancing Public Services and Scientific Research: The Bill seeks to improve digital public services by reforming data sharing and standards. It proposes updates to the Digital Economy Act to facilitate government data sharing about businesses and moves towards electronic registration of births and deaths. Additionally, the Bill aims to support scientific research by allowing broader consent for research areas and enabling commercial researchers to utilise the data regime effectively.
- Strengthening Data Protection: A significant aspect of the Bill is the modernisation of the ICO, which will adopt a new structure with a CEO, board and chair. The ICO will also receive enhanced powers to ensure data protection standards are met. The Bill will introduce targeted reforms to clarify data laws, promoting the safe development of new technologies while maintaining high protection standards.
- Territorial Impact and Economic Benefits: The Bill will have a UK-wide impact, introducing measures that promise substantial economic benefits. Digital Verification Services alone are estimated to contribute around £600 million per year to the economy. Furthermore, the expansion of Smart Data schemes, inspired by the success of Open Banking, is expected to stimulate economic growth across various sectors.
Prior to the change in UK Government, the Data Protection and Digital Information Bill was going through the legislative process and would have brought reforms to the UK’s data privacy regulatory regime. However, the Bill was not included in the legislative “wash up” at the end of the previous Parliament and so it is consigned to history. That said, it will be interesting to see the extent to which any of its reforms find their way into the Digital Information and Smart Data Bill.
EU
Data protection authorities continue to play a key role in the development of AI regulation
The European Union has made significant progress in regulating artificial intelligence. Following the adoption of the AI Act, the European Data Protection Board (EDPB) has recommended that data protection authorities be designated as the supervisory authorities for high-risk AI systems. Member States must appoint competent authorities within a set timeframe and, given their expertise in fundamental rights and personal data protection, data protection authorities are seen as well placed to oversee high-risk AI applications. The EDPB suggests that data protection authorities supervise AI systems used in sensitive areas such as biometric identification, law enforcement, migration management, judicial administration and democratic processes, ensuring strong protection of individual rights and freedoms. It also proposes that these authorities act as a single contact point for the public and calls for clear procedures to facilitate cooperation among different authorities, including the European Commission's AI Office. This is a significant step towards harmonised and effective regulation of high-risk AI, highlighting the central role of personal data protection in Europe's digital landscape.
For more information: [EU: The EDPB publishes a statement on the role of data protection authorities in the AI Act]
The European Commission publishes its second report on the application of the GDPR
The European Commission has published its second report on how the GDPR is being applied, six years after it came into effect. The report concludes that no major changes to the regulation are needed, but some areas need more attention. Cross-border cases have increased, and data protection authorities have used the mutual assistance mechanism under Article 61. However, joint operations under Article 62 are rarely used, and data protection authorities' decisions are often challenged in courts on procedural grounds. Over 20,000 investigations have been opened, with 100,000 complaints received each year. About 20,000 complaints have been resolved amicably, and 6,680 fines have been issued, totalling €4.2 billion. Ireland has imposed the highest fines (€2.8 billion), followed by Luxembourg (€746 million) and Italy (€197 million). Data Protection Officers face challenges related to training and integration. The Commission stresses the need to support SMEs in complying with the GDPR by providing tailored tools and financial help. It also calls for consistent application of the GDPR across the EU, better use of cooperation mechanisms and sufficient resources for DPAs, proposing a regulation to harmonise laws and improve consistency in GDPR enforcement.
For more information: [Second report on the application of the General Data Protection Regulation]
The European Commission announces new SCCs for importers under the GDPR
The European Commission has announced plans to draft new Standard Contractual Clauses (SCCs) for data importers directly subject to the GDPR. In 2021, the Commission adopted SCCs for transferring personal data to countries without an adequacy decision, including four modules for different transfer scenarios. However, these SCCs could not be used when the importer was already under the GDPR scope, leaving a legal gap. Now, the Commission is launching a public consultation on these new SCCs, with final adoption expected in the second half of 2025. This raises questions about the status of the 2021 SCCs currently used for such transfers and the potential risk of sanctions for controllers without appropriate mechanisms. The recent €290 million fine against Uber by the Dutch authority for inadequate cross-border transfers highlights the importance of these new SCCs, especially if the Data Privacy Framework were to be invalidated in the future.
For more information: [European Commission's public consultation]
The EDPB works with the European Commission to develop guidance on the interplay between the GDPR and the DMA
The Commission services responsible for enforcing the Digital Markets Act (DMA) and the European Data Protection Board (EDPB) have agreed to work together to clarify, and provide guidance on, the interplay between the DMA and the GDPR.
This enhanced dialogue between the Commission's services and the EDPB will focus on the obligations applicable to digital gatekeepers under the DMA that interact strongly with the GDPR, given the need to ensure the coherent application of both regulatory frameworks to digital gatekeepers.
Developing a coherent interpretation of the DMA and the GDPR, while respecting each regulator's competences in areas where the GDPR applies and is referenced in the DMA, is crucial to implementing the two frameworks effectively and achieving their respective and complementary objectives.
The DMA established a High Level Group to provide the Commission with advice and expertise to ensure that the DMA and other sectoral regulations applicable to gatekeepers are implemented in a coherent and complementary manner. The Commission and representatives from the EDPB and EDPS have already engaged on data-related and interoperability obligations in the High Level Group. This project builds on that engagement and deepens the cooperation in relation to these two regulatory frameworks.
For more information: [EDPB’s statement on the guidance on interplay GDPR and DMA]
ECJ Ruling and draft EDPB Guidelines on processing personal data based on legitimate interests are published
On October 4, 2024, the European Court of Justice (ECJ) issued a preliminary ruling confirming that a controller's interest in processing personal data need not be laid down in EU or national law in order to qualify as a legitimate interest under Article 6(1)(f) of the GDPR. The Court also held that purely commercial interests of a data controller, such as the sale of personal data for advertising and marketing purposes, can in principle qualify as a legitimate interest and therefore constitute a valid legal basis for the processing activities concerned.
The full judgment can be found here: ECJ Judgment of October 4, 2024 (C-621/22).
In what seems to be a coordinated action, further guidance on the processing of personal data on the basis of legitimate interests has been issued for public consultation by the European Data Protection Board on October 8, 2024: Guidelines on processing of personal data based on Article 6(1)(f) GDPR.
FRANCE
French Data Protection Authority CNIL imposes first public fine for non-compliance in relation to health data
On 5 September 2024, the CNIL fined CEGEDIM SANTE €800,000 for failing to comply with pre-processing formalities in handling health data. The company provided its clients with access to a database of patient data for studies and statistical analysis, sourced from doctors who used its software and consented to contribute their data. However, the CNIL determined that the data was pseudonymised rather than anonymised, as unique identifiers allowed individual patients to be tracked over time. As a result, the patient data remained personal data subject to data protection regulations.
This decision highlights the importance of adhering to pre-processing requirements when managing sensitive health information and serves as a reminder for organisations to ensure full compliance with data protection laws. The CNIL highlighted the ongoing ambiguity surrounding anonymisation concepts and is awaiting the highly anticipated update of the Article 29 Working Party’s opinion n°05/2014 on anonymisation techniques, which may clarify the legal framework and simplify future compliance.
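The distinction the CNIL drew can be illustrated with a minimal sketch (entirely hypothetical data and a hypothetical `pseudonymise` helper, not taken from the decision): replacing a name with a stable token is pseudonymisation, not anonymisation, because the same individual remains traceable across records over time.

```python
# Illustrative only: a stable per-patient token allows longitudinal tracking,
# so the dataset is pseudonymised (still personal data), not anonymised.
import hashlib

def pseudonymise(patient_name: str, salt: str = "clinic-secret") -> str:
    """Replace a direct identifier with a stable, non-reversible token."""
    return hashlib.sha256((salt + patient_name).encode()).hexdigest()[:12]

visits = [
    ("Alice Martin", "2023-01-10", "consultation"),
    ("Bob Dupont",   "2023-02-02", "blood test"),
    ("Alice Martin", "2023-06-15", "follow-up"),
]

pseudonymised = [(pseudonymise(name), date, act) for name, date, act in visits]

# The same token recurs for the same patient across visits, so individual
# patients can still be tracked over time.
tokens = [row[0] for row in pseudonymised]
assert tokens[0] == tokens[2] and tokens[0] != tokens[1]
```

True anonymisation, by contrast, would have to break this linkability (for example via aggregation or generalisation) so that no record can be tied back to an individual.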
For more information: [CNIL’s announcement]
The ANSSI publishes guide to secure sensitive information systems in the cloud
The French National Cybersecurity Agency (ANSSI) has released a guide to help choose secure cloud services for sensitive information systems. With numerous cloud options available, this guide is aimed at operators of critical importance and essential information systems, assisting them in making informed decisions about cloud hosting. It highlights the importance of assessing risks, including threats, data sensitivity and international legal implications.
The guide focuses on three main areas: understanding different cloud services, recognising current threats and identifying the type of information systems. It differentiates between commercial and non-commercial cloud offerings. For some systems, it recommends cloud solutions with the French SecNumCloud certification, emphasising the need for secure infrastructure. This document is a useful tool for CTOs, encouraging thorough risk analysis to secure their systems in an increasingly complex cloud environment.
For more information: [Recommendations for hosting sensitive information systems in the cloud – ANSSI – only available in French]
GERMANY
German law makers publish a draft on a Federal Employee Data Protection Act
The new draft law intends to supplement the GDPR and the AI Act and focuses on enhancing protections for employee data in the digital work environment, in particular in the context of AI. Key issues covered include:
- Purpose: The legislation seeks to address the use of data in the workplace, especially in light of AI. It aims to ensure robust employee privacy protections.
- Data Processing: The draft specifies clear guidelines for the permissible processing of employee data, including the conditions under which employers can collect, use, and store such data. It provides rules for the use of employee data in recruitment, performance evaluations, and workplace monitoring.
- Profiling and AI: The draft intends to place specific restrictions on the use of AI and automated systems, especially in relation to profiling employees. Employers are required to inform employees about the use of AI in decision-making processes and must ensure transparency and fairness in such systems. Profiling related to personal aspects such as emotional analysis is strictly prohibited.
- Employee Rights: Employees are granted the right to know how their data is being used, including detailed information on any profiling or AI-based decision-making systems. Transparency and the right to human oversight in automated decisions affecting employees is emphasised.
- Employer Obligations: Employers are required to implement protective measures to safeguard employee data, especially sensitive data, and to ensure compliance with the GDPR. This includes data minimisation, pseudonymisation, and ensuring that personal data is only accessed by authorised individuals.
- Monitoring and Surveillance: The draft sets strict limits on employee monitoring, prohibiting excessive or covert surveillance. Video surveillance and location tracking must be justified by legitimate business needs, and they cannot be used for performance control.
The German Data Protection Conference (DSK) publishes guidance for asset deals
The Conference of Independent Data Protection Authorities of the German Federal and State Governments has issued guidance on the transfer of personal data during asset deals, effective September 11, 2024. Key points include:
1. Due Diligence
- Generally, transferring personal data before contract conclusion is not permitted
- Exceptions: voluntary consent or legitimate interest for key personnel data
2. Employee Data
- Transfer allowed at time of business transfer under § 613a BGB
- Before transfer: data sharing very limited, employee consent may be required
- Employees have right to object to transfer within one month of notification
3. Customer Data
- For ongoing contracts: data transfer allowed if customer approves contract transfer
- For completed contracts: requires data processing agreement, strict separation of data
- Special categories of data (e.g. health data): requires explicit consent
- Bank details: transfer allowed for ongoing contracts, otherwise needs consent
For more information: [Guidance from the DSK] – only available in German
The DSK publishes position paper on "Scientific Research Purposes" under the GDPR
The DSK issued a position paper on September 11, 2024, clarifying the concept of "scientific research purposes" under the GDPR. The paper aims to provide guidance on when the GDPR's privileged rules for scientific research apply.
Key points:
1. Broad Interpretation
The GDPR advocates for a broad interpretation of scientific research, including technological development, fundamental research, applied research and privately funded research.
2. Criteria for Scientific Research
The DSK outlines five main criteria:
- Methodical and systematic approach: Research must follow a methodical and systematic procedure.
- Knowledge gain: The aim should be to gain new insights.
- Verifiability: Research methods and results should be documented and potentially verifiable.
- Independence and autonomy: Researchers must maintain independence, even in commissioned research.
- Public interest: Research should serve the common good, not exclusively commercial interests.
3. Commercial aspects: The pursuit of economic motives does not exclude an activity from being considered scientific research, as long as it aims to achieve societal benefits.
For more information: [Position paper from DSK] – only available in German
ITALY
The Italian Data Protection Authority publishes an Opinion on an Italian draft bill containing provisions and delegations on Artificial Intelligence (AI)
The Italian Data Protection Authority (Garante), in Opinion No. 477 of August 2, 2024, issued a favourable opinion on the Italian bill on AI, which also contains a legislative delegation for adapting national law to the EU AI Act (Regulation (EU) 2024/1689).
The Garante underlined the need for systematic and consistent coordination between new national legislation and European standards to avoid overlaps and to ensure that personal data protection always remains a priority. The Garante recommended adding clear references to GDPR compliance to the Italian bill, and suggested introducing a new article that specifically requires AI-related personal data processing to adhere to GDPR rules and other relevant national and European standards.
The Garante also commented on its suitability as the competent authority for specific aspects of AI, in particular for “real-time” remote biometric identification systems used for policing in public spaces. The Garante believes that, given its experience in the application of rules governing algorithmic decision-making processes based on personal data, this designation would be consistent and reduce administrative burdens, ensuring effective oversight of such systems.
In this opinion the Garante also expressed its concerns about the protection of personal data in the employment and healthcare sectors.
For more information: [Garante’s statement] – only available in Italian
MIDDLE EAST
Grace period for Saudi Arabia’s data protection law comes to an end
The grace period for compliance with the KSA Personal Data Protection Law (PDPL) came to an end on 14 September 2024, meaning that the framework is now in full force. Organisations must comply with its requirements and the Saudi Data & AI Authority (SDAIA) can begin enforcing and regulating these rules.
For more information: [KSA Personal Data Protection Law]
The Saudi Data & AI Authority (SDAIA) releases draft AI-related guidance, including on deepfakes
The SDAIA has released draft guidance for public feedback on deepfakes in Saudi Arabia, outlining use cases (both malicious and non-malicious) and guidance for key groups such as GenAI developers and content creators on risk assessments, consent and the watermarking of artificial content. The guidance also advises consumers on assessing messages, analysing audio-visual elements, using content authentication tools and reporting technology misuse.
For more information: [Draft guidance for public feedback on deepfakes in Saudi Arabia]
ASIA
Singapore announces plan to introduce safety guidelines for Generative AI
The Singapore government announced in July 2024 plans to introduce safety guidelines for Generative AI model developers and app deployers, aiming to enhance transparency and trust. These guidelines will require developers to disclose how their AI models and apps function, including data usage, testing outcomes and potential risks – similar to the information provided with over-the-counter medications. Additionally, the guidelines will specify attributes for safety and trustworthiness that must be verified before deployment, addressing concerns like hallucinations, toxic statements and bias. Spearheaded by the Ministry of Digital Development and Information, this initiative seeks to establish a solid framework for the ethical use of generative AI, with public consultations planned to ensure relevance and effectiveness.
As the 2024 Chair of the ASEAN Digital Ministers Meeting, Singapore is leading efforts to harmonise data governance and facilitate seamless cross-border operations for ASEAN businesses. A new ASEAN Guide on Data Anonymisation, set to be released early next year, aims to provide businesses with a valuable tool for more responsible data usage across the region.
Hong Kong Monetary Authority issues updated principles on use of Generative AI
In November 2019, the Hong Kong Monetary Authority (HKMA) introduced guiding principles for authorised institutions to follow when using big data analytics and artificial intelligence (the 2019 Principles). In August 2024, the HKMA updated these guidelines, emphasising that authorised institutions should apply the 2019 Principles to the use of Generative AI (GenAI). The update includes new principles to ensure appropriate customer protection is in place, including:
- Governance and accountability: Authorised institutions must ensure accountability for GenAI-driven decisions, with the board and senior management overseeing the responsible use of GenAI.
- Fairness: Authorised institutions should guarantee that GenAI models are objective, consistent, ethical and fair, avoiding bias against any customer or group.
- Transparency and disclosure: Authorised institutions are required to disclose the use of GenAI to customers, explaining the purpose, operation and limitations of GenAI models to enhance understanding and trust in the technology.
- Data privacy and protection: Effective measures must be implemented to protect customer data, adhering to the Personal Data (Privacy) Ordinance and guidelines issued by the Privacy Commissioner for Personal Data.
For more information: [Consumer Protection in respect of Use of Generative Artificial Intelligence]
Beijing Free Trade Zone publishes the 2024 Negative List for data export
The Beijing Free Trade Zone (FTZ) published the 2024 edition of its “Negative List” for data export on 26 August. In addition to national data transfer rules, the Beijing FTZ Negative List sets out the scope of “Important Data” for automobile, pharmaceutical, civil aviation and AI companies incorporated in the FTZ. Where such companies transfer the listed data out of the PRC, they are legally required to pass a prior security assessment. The list also fine-tunes the thresholds for adopting a security assessment or standard contract for cross-border transfers of personal information in specific business scenarios in the automobile, pharmaceutical, civil aviation, retail and AI sectors.