In today's interconnected world, where data flows effortlessly across borders, ensuring its security and compliance is paramount. Our updates are your trusted source for the latest news, expert analyses, and practical tips on navigating the complex landscape of data protection.
Long story short
The ICO has launched its consultation series on generative AI and warns against 2024 becoming the year people lose trust in AI.
EU:
- In France: (a) the CNIL has fined Amazon for employee monitoring; (b) there's been an expansion of protection of minors beyond EU requirements;
- in Italy, the Garante has probed into AI and web scraping;
- in Germany, the DSK has issued a position paper setting out the data protection requirements and challenges of cloud-based health applications;
- the EDPB has issued guidance on DPOs;
- the ECJ has confirmed that organisations can limit damages by proving that a fault was not attributable to them, whilst making it clear that a fine can only be imposed if some kind of fault is established;
- the CJEU has provided important clarification of the scope of, and damages available for, data subject claims following a cyber attack; and
- the CJEU has clarified the conditions under which national supervisory authorities may impose an administrative fine.
UK:
- the ICO has launched its consultation series on generative AI and warns against 2024 becoming the year people lose trust in AI;
- published new guidance related to employment; and
- shared its findings from its investigation on "text pests".
Asia:
- China released new guidelines facilitating cross-border personal data flows within the Greater Bay Area;
- the HKMA issued a circular on managing cyber risks associated with third party service providers; and
- The Monetary Authority of Singapore released responses to proposal on mandatory reference checks.
Middle East:
- Saudi has launched the National Data Governance Platform for registration and compliance services under its new Data Protection Law; and
- issued its first official guidance for its new Data Protection Law.
Must reads / must listen
- Quick Guide to the EU Data Act article by Christopher Götz
- The revolution in the data driven healthcare and life sciences market article by Edoardo Tedeschi and Matteo
- Find the latest news regarding contentious risk relating to data and privacy on our blog Updata
Regional updates
UK
ICO launches its consultation series on generative AI and warns against 2024 becoming the year people lose trust in AI
On 6 December 2023, the Information Commissioner, John Edwards, delivered a speech at the TechUK Digital Ethics Summit, with a focus on the importance of organisations remaining "authentic" if they are to "seize the moment" in using AI and other emerging technologies.
As detailed in our article, key takeaways from Edwards' speech included:
The need to maintain trust in AI: Edwards was clear that the ICO wants to ensure that AI is deployed in "sensible, privacy-respectful ways", so as to ensure that people do not move away from using apps or technology due to fears around the risks to their data;
Looking into the supply chain: Edwards explained that the ICO are interested in "getting into the weeds of the supply chain", asking key questions such as "What information are they trained on?";
Collaboration with other regulators: Edwards was clear that this is not just an issue for the ICO, citing various ways in which the ICO is working with other regulators (such as the Competition and Markets Authority, Ofcom and the Financial Conduct Authority) to help ensure the safe deployment of AI across the UK;
Sandbox: Edwards urged organisations to apply for the Sandbox (applications closed at the end of December 2023), which the ICO put in place to provide a "safe space" for creating innovative products and services, with staff on hand to help with "tricky" data protection issues that organisations may come up against. The Sandbox also offers a way to "stress test" products and services before releasing them to the wider market; and
Innovation advice services: Edwards reminded listeners of the ICO's Innovation Advice Service (launched in April 2023), through which the ICO promises to answer data protection questions within 10-15 working days.
More recently, on 15 January 2024, the ICO issued its consultation on generative AI and data protection, which closes on 1 March 2024.
The ICO explained that it has launched its consultation as part of its investigation into how aspects of data protection law should apply to the development and use of generative AI models, which it notes "could be transformative for people and businesses if organisations develop and deploy it responsibly".
Some of the key questions which the consultation will focus on are:
- What is the appropriate lawful basis for training generative AI models?
- How does the purpose limitation principle play out in the context of generative AI development and deployment?
- What are the expectations around complying with the accuracy principle?
- What are the expectations in terms of complying with data subject rights?
ICO publishes new guidance related to employment
On 12 December 2023, the ICO published two new pieces of employment guidance covering employment record-keeping and recruitment and selection issues respectively.
The draft guidance on record-keeping in part aims to clarify how established principles of UK data protection law, such as legal basis considerations and the rights of access and erasure, apply in respect of records commonly held about workers. Many large groups will already be across these issues.
Other sections of the guidance dealing with use of employment records in specific commercial contexts are likely to be of some interest. There is guidance on the use of employment records to detect fraud, which is timely given new UK legislation introducing a corporate offence of failure to prevent fraud.
In relation to handling of employment records during mergers and acquisitions, the guidance calls on employers to anonymise information where possible, and where not possible only to disclose personal data with assurances that the recipient will only use the data to evaluate assets and liabilities and will destroy or return the data after use. Some companies will see this as unrealistic, given the importance with some deals of having access to personal data on an on-going basis to understand the cost impact and integration challenges of transactions. Separately, the guidance helpfully considers issues associated with data sharing in the context of the Transfer of Employment Regulations (TUPE) and pension and insurance provision.
The guidance on recruitment and selection has a broader audience, which takes in employers as well as recruitment agencies and related service providers. It provides considerable detail on how data protection law principles are intended to apply in relation to recruitment activity. One section focuses on data protection impact assessments for "high risk" processing; the guidance suggests that businesses should consider carrying out assessments when they collect personal information from sources other than the candidate without providing them with privacy information, which may represent a gap in some companies' current compliance frameworks.
Many will also be drawn to the sections in the recruitment guidance on sole or partial use of automated decision-making and profiling (including where this involves the use of third-party AI solutions) for recruitment and on pre-employment vetting of candidates. The latter section deals with obtaining information about criminal convictions and using social media and performing credit checks for vetting purposes and will be useful for those clients we have seen carrying out multi-jurisdictional reviews around the rights of employers in these areas.
Consultation on both sets of guidance is open until 5 March 2024.
ICO findings from its investigation on "text pests"
The ICO has produced findings of its investigation into "text pests", including good practice that it encourages businesses to follow. "Text pests" are company personnel who use customer contact details obtained through orders placed with the business (such as for a takeaway or taxi) to contact and proposition those customers.
Being contacted by text pests can be a distressing experience for affected individuals. It can also represent a data protection compliance risk for the companies whose personnel are the "text pests", although the ICO did not find any "ongoing negligent behaviour" from specific companies in its investigation.
Good practice measures identified by the ICO included:
Data minimisation: allowing couriers to view only limited delivery and customer data and for the time necessary to carry out deliveries;
Access controls: access to customer contact data can be restricted to those involved in responding to customer queries; and
Pseudonymisation/masking techniques: such as hiding telephone numbers (and only enabling contact to be made via a central number) can likewise be effective.
As ever, the ICO found staff training, robust disciplinary measures and reporting to the ICO to be key.
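The masking and pseudonymisation measures above can be sketched in a few lines of Python. This is an illustrative sketch only: the relay number, hashing scheme and field names are our assumptions, not anything the ICO prescribes.

```python
import hashlib

# Hypothetical central relay number: customers are contacted via this,
# never via their real number (the "central number" approach the ICO describes).
CENTRAL_CONTACT_NUMBER = "+44 20 0000 0000"

def mask_phone(number: str, visible_digits: int = 3) -> str:
    """Hide all but the last few digits so staff cannot dial the customer directly."""
    digits = [c for c in number if c.isdigit()]
    return "*" * (len(digits) - visible_digits) + "".join(digits[-visible_digits:])

def pseudonymise(number: str, salt: str) -> str:
    """Stable pseudonym for internal record-matching without exposing the raw number."""
    return hashlib.sha256((salt + number).encode()).hexdigest()[:12]

# What a courier or call handler might actually see for one order:
order_view = {
    "customer_contact": CENTRAL_CONTACT_NUMBER,                      # relay only
    "customer_ref": pseudonymise("+44 7700 900123", salt="rotate-me"),
    "display_number": mask_phone("+44 7700 900123"),                 # e.g. *********123
}
```

Combined with time-limited access (revoking the view once the delivery is complete), this keeps the raw contact details out of front-line staff's hands entirely.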
EUROPE
EU
EDPB publishes high-level summary of report on DPOs
The European Data Protection Board ("EDPB") launched a coordinated enforcement action on the role of the data protection officer ("DPO") in 2023, involving 25 supervisory authorities ("SAs") across the European Economic Area ("EEA"). The purpose was to assess organisations' compliance with the GDPR provisions on DPOs and to identify the challenges and needs of DPOs and organisations.
Participating supervisory authorities drafted a questionnaire, which was sent to the stakeholders of each SA's choice at national level. The questionnaire was designed to allow different strategies and approaches by the SAs. The results were aggregated and analysed at the EEA level and presented in this report along with the national reports.
In their survey the SAs found, among other things:
- insufficient funds being allocated to DPOs;
- DPOs lacking relevant expert knowledge and training;
- conflicts of interest involving DPOs; and
- deficits in DPOs' regular reporting to management boards.
The EDPB plans to update and evolve its current guidance regarding DPOs to address these issues, specifically by introducing new best practice recommendations to aid data controllers/processors as well as DPOs in fulfilling their respective duties.
CNIL imposes a €32 million fine against Amazon France Logistics for excessively intrusive employee activity and performance monitoring
On 27 December 2023, the French data protection authority (the "CNIL") imposed a €32 million fine against Amazon France Logistics (which operates Amazon's large-scale warehouses in France) in relation to (i) the processing of data from the scanners used by employees to carry out their tasks, documenting in real time the quality, productivity and periods of inactivity of each employee, and (ii) the use of video surveillance. The fine corresponds to 3% of the company's sales. The CNIL identified three major breaches of the GDPR:
- breach of the data minimisation principle, by collecting every detail of employees' quality and productivity and keeping this information for 31 days;
- unlawful processing of data measuring periods of scanner interruption and flagging excessive scanning speed; and
- insufficient information provided to employees regarding the use of video surveillance, and insufficiently secure access to the video surveillance system.
AI and web scraping: Italy's DPA opens training data probe
On November 21, 2023, Italy's data protection authority (the "Garante") announced an investigation into how data is collected for the purpose of training algorithms. The probe covers public and private entities and aims to "verify the adoption of suitable security measures to prevent the massive collection (webscraping) of personal data."
The Garante has also opened a 60-day public consultation, starting on 18 January 2024, on potential security measures to combat data scraping. The goal is to gather observations, comments and operational proposals on the measures that public and private website and platform operators have adopted, or could adopt, against the massive collection of personal data through web-scraping techniques by companies developing generative AI systems for the purpose of training their algorithms. Contributions may be submitted by all stakeholders, in particular trade associations, consumer associations, and experts and representatives of the academic world.
France aims to protect minors online way beyond European regulation requirements
Faced with the ever-increasing number of minors surfing the web, and their growing exposure to harmful content (pornographic content, gambling websites, etc.), France has sought to reinforce the protection of minors online based on a new draft bill on digital services.
The issue French lawmakers face is the tension between protecting minors online and protecting privacy, guaranteed notably by the GDPR. The intention is not to identify users but to detect minors online, while leaving adults free access to pornographic and gambling websites.
However, the method for monitoring minors online is a matter of debate. Some methods can be highly intrusive: for example, age verification using an ID or facial analysis through a trusted third party raises the issue of sensitive data processing. Others, such as verification through possession of a credit card, can be easily circumvented. Algorithmic age-estimation solutions, which are less intrusive but also less reliable, could be adopted. A verification mechanism involving a trusted third party implementing double anonymity was to be tested, but ended up not being tested at all, as France ran up against the technical feasibility of such a control and European rules on personal data protection. After Commissioner Thierry Breton called on France to apply the DSA and DMA in August 2023, French lawmakers have been holding informal discussions with the EU to find a way forward on this particular matter.
ECJ confirms organisations can limit damages when proving that a fault was not attributable to them
The case concerns a judgment of the European Court of Justice ("ECJ") on the application of Article 82(1) GDPR and the nature of compensation for damage suffered due to an infringement of the GDPR.
To this end, the ECJ confirmed that monetary compensation must fully compensate for the damage actually suffered, and has no deterrent or punitive function.
Moreover, the ECJ clarified that a mere infringement of the GDPR is not sufficient to establish a right to compensation, pointing out that the existence of damage which has been suffered constitutes one of the conditions of the right to compensation laid down in Article 82(1), as does the existence of an infringement of the GDPR and of a causal link between that damage and that infringement (see, to that effect, judgment of 4 May 2023, Österreichische Post, C-300/21).
Therefore, even though the fault of the organisation is presumed, data controllers/processors can prove that the breach is not attributable to them. The obligation to pay damages without fault would contradict the principle of legal certainty.
ECJ makes it clear that a fine can only be imposed if some kind of fault is established
The case concerns a recent judgment of the ECJ regarding the conditions for imposing fines on legal entities for violations of the GDPR. The ECJ had to decide whether the GDPR allows a fine to be imposed on a legal entity as a data controller without attributing the violation to a specific individual, and whether the fine requires proof of fault on the part of the legal entity.
The ECJ ruled that the GDPR does not require such attribution to an individual and that a fine can be imposed directly on a legal entity as a data controller if it decides on the purposes and means of the data processing. The ECJ makes it clear that a fine can only be imposed if some kind of fault is established. However, it also clarifies that no special or particularly high requirements are placed on this culpability. Intent is not necessary, negligence is sufficient.
The ECJ's judgment confirms the broad scope of liability and sanctions under the GDPR, and the need for legal entities to ensure compliance with the data protection rules.
ECJ provides judgment on rights to obtain one's own personal data
The case concerns a judgment of the ECJ on the rights of patients to obtain a first copy of their personal data processed by their doctors free of charge, in accordance with the GDPR.
The ECJ held that the obligation of the data controller (the doctor) to provide the data subject (the patient) with a first copy of their personal data free of charge applies regardless of the purpose of the request. The ECJ also ruled that a national rule that imposes the costs of providing the first copy on the patient to protect the economic interests of the doctor is not compatible with GDPR. Furthermore, the ECJ clarified that the right to receive a copy of the personal data includes the right to receive a complete copy of the documents contained in the patient record that contain such data.
This ruling shows once again that the ECJ interprets the rights of data subjects broadly and that, provided there are no indications of misuse, the request for the provision of stored personal data should generally be complied with.
CJEU clarifies the conditions under which national supervisory authorities may impose an administrative fine
The Court of Justice clarifies the conditions under which national supervisory authorities may impose an administrative fine on one or more controllers for an infringement of the GDPR. In particular, it holds that the imposition of such a fine requires that there be wrongful conduct; in other words, that the infringement has been committed intentionally or negligently. Moreover, where the addressee of the fine forms part of a group of companies, the calculation of that fine must be based on the turnover of the entire group.
CJEU provides important clarification of the scope of, and damages available for, data subject claims following a cyber attack
Even before the inception of the GDPR there has been lively argument over what damages are available for breach of data protection rights, not least in the leading Vidal-Hall and Lloyd cases, both against Google. In December 2023, The Court of Justice of the European Union issued a judgment that provides some important points of clarity on the issue, at least for those in the European Union. For those of us in other jurisdictions, the judgment provides useful guidance which should be taken into account by Courts considering the issue.
The ruling in VB v. Natsionalna agentsia za prihodite (C‑340/21), delivered on 14 December 2023, concerned a cyber attack against the Bulgarian National Revenue Agency that affected 6 million individuals, one of whom sued alleging non-financial damage suffered as a result of the breach (under Article 82). The individual argued that this damage took the form of fear of future misuse of their data leading to possible blackmail, assault and kidnap. The CJEU's ruling contained (amongst other things) the following three key elements:
1. The judgment followed an earlier CJEU decision in May, UI -v- Osterreichische Post AG, which had determined that Article 82 GDPR does not give rise to an automatic right to damages for mere infringement of an individual's data protection rights (echoing the UK Supreme Court's view in Lloyd -v- Google). The VB judgment built on this earlier one by indicating that the alleged fear experienced by the individual may, in itself, constitute a form of non-financial damage for which the individual could receive compensation (a question to be decided back at national court level and for the data subject to prove).
2. The fact of a data breach does not lead to a presumption that the data controller's security measures are inadequate (and therefore in breach of the GDPR). The Court must conduct a detailed assessment of the security measures in place, and the controller is only liable if a failure to implement appropriate security measures caused or contributed to the breach. This is a helpful aspect of the ruling for data controllers facing litigation as a result of a breach. Many claims contain bare assertions that a single data breach means that the controller was negligent or had taken insufficient security measures, but of course this is often not the case. However, the CJEU placed the burden of proving the adequacy of the security measures on the data controller.
3. Less helpfully for data controllers, the CJEU confirmed that controllers are not necessarily absolved from liability where a data breach occurs as the result of a third party (here, cyber criminals). Again, key to avoiding liability is proving the adequacy of the security measures in place (or if they were inadequate, they did not contribute to or cause the breach).
This ruling has far-reaching implications for both data controllers and data subjects. For the former, it provides a roadmap to ensure minimisation of risk in circumstances where a data breach arises as a result of a cyber attack by ensuring the design and implementation of appropriate security measures. For the latter, it provides clear boundaries as to what a credible claim for damages following a data breach looks like, and should dissuade the more speculative claims often brought. Hopefully a ruling of similar clarity and breadth will come along in England & Wales in order to provide some direct precedent.
The German DSK has issued a position paper setting out the data protection requirements and challenges of cloud-based health applications
The position paper of the Conference of Independent Data Protection Supervisory Authorities of the Federal and State Governments (Datenschutzkonferenz - "DSK") sets out the DSK's position on the data protection requirements and challenges of cloud-based health applications. It is the opinion of the DSK that the use of health applications (e.g., an app for reading and storing glucose values) must be possible without using cloud functions and without linking to a user account, unless the cloud function is necessary for achieving a therapeutic benefit and the function is expressly desired by the data subject.
The DSK further stresses that the utilization of personal data for research purposes and quality assurance should always be assessed against whether identification of the data subject remains reasonably possible. In determining whether means are reasonably likely to be used for identification, account should be taken of all objective factors, such as the cost of and time required for identification, taking into account the technology available at the time of processing and technological developments. If a digital health application is to use anonymized data, a data protection impact assessment ("DPIA") should be conducted in order to show how the anonymization is carried out and demonstrate that the removal of the personal reference is actually guaranteed.
Finally, the DSK emphasizes the need to establish processes for the effective and prompt fulfilment of the rights of data subjects, such as access, rectification, erasure, restriction of processing, and data portability, and to ensure secure authentication of applicants.
MIDDLE EAST
SAUDI ARABIA
Saudi launches National Data Governance Platform for registration and compliance services under its new Data Protection Law
On 27 November 2023, the Saudi Authority for Data and Artificial Intelligence ("SDAIA") announced, at the Saudi Data Forum, the launch of various initiatives including the National Data Governance Platform.
SDAIA highlighted that the platform serves to register entities falling within the scope of the newly rolled-out Personal Data Protection Law ("PDPL"). In this regard, SDAIA noted that the platform would form a unified national registry and assist entities in fulfilling their obligations under the PDPL.
The platform is reported to provide government agencies with a number of services, including: (i) a data breach notification service; (ii) a privacy impact assessment service; (iii) a legal support service; and (iv) a self-assessment tool. SDAIA also plans to add a compliance assessment service and a corrective procedures follow-up service to the platform in due course.
The National Data Governance Platform has not yet been rolled out for public use - and in any case, the reports from the Saudi Data Forum suggest that the platform services (other than registration) may only be utilised by Saudi government entities in the first instance. We will keep a close watch on developments in this space.
Saudi issues first official guidance for its new Data Protection Law
On 17 January 2024, SDAIA released its first (and eagerly anticipated) official guidance in relation to the PDPL. The guidance is intended to familiarise relevant organisations with the key data protection concepts and requirements under the PDPL. As stated in the document, "the guidance will be your on-the-job reference in your data protection compliance journey".
The guidance also provides some rather useful sector-specific case studies in relation to each compliance topic, which adds some practical flavour to the guidance and will help organisations contextualise any compliance efforts.
On the theme of case studies, the guidance also provides useful instructions on how to use the National Data Governance Platform (discussed above) in certain contexts, such as conducting impact assessments and reporting data breaches. Importantly, these instructions appear to be directed at all organisations, rather than just the government agencies to which the platform's services were initially reported to be limited (as set out above) - which may be encouraging news for commercial organisations subject to the PDPL seeking to utilise these compliance services.
ASIA
CHINA / HONG KONG
China released new guidelines facilitating cross-border personal data flows within the Greater Bay Area
On 13 December 2023, the Cyberspace Administration of China and the Innovation, Technology & Industry Bureau of Hong Kong jointly released the Standard Contract for Personal Information Cross-Border Flow within the Greater Bay Area (mainland and Hong Kong) ("GBA Standard Contract") together with its implementing guidelines ("Guidelines"), effective immediately. On the same day, the Hong Kong Office of the Privacy Commissioner for Personal Data ("PCPD") issued guidance on the GBA Standard Contract ("GBA Guidance") to assist companies in applying the GBA Standard Contract.
The Standard Contract is one of the three mechanisms to transfer personal information outside of China, which works similarly to the EU Standard Contractual Clauses but with an additional filing procedure. The GBA Standard Contract aims to provide a streamlined version of the "standard" Standard Contract, with lighter contractual obligations and more friendly filing requirements, to ease the compliance burden of cross-border data flow within the GBA.
Here are some more details on the GBA Standard Contract and the Guidelines:
Territorial scope: to benefit from the GBA Standard Contract, both the data transferor and recipient must be registered in either Hong Kong or one of the nine cities of Guangdong Province, namely Guangzhou, Shenzhen, Zhuhai, Foshan, Huizhou, Dongguan, Zhongshan, Jiangmen and Zhaoqing;
No volume threshold: GBA Standard Contract applies regardless of the volume of personal information to be transferred outside of China. Data rich businesses which used to trigger security assessment (the stricter approval mechanism) may now be downgraded to the less restrictive standard contract filing mechanism;
Exclusion: the new scheme will not apply to (i) any transfer or onward transfer to recipients outside the above territorial scope - the data recipient should ensure that personal data received via the GBA Standard Contract is not transferred outside the Greater Bay Area; or (ii) any transfer of personal information which has been identified as "important data". Both will need to follow the current nationwide data transfer rules;
More flexible clauses: GBA Standard Contract cuts back some obligations for the data recipient, as well as provides for more flexibility in terms of terminology and dispute resolution clauses;
Simpler filing documents: the personal information protection impact assessment report is no longer required to be filed, and the scope of the assessment is also reduced;
Data flow from HK to mainland: the GBA Standard Contract can also apply to data flows from HK to the mainland side of the GBA. According to the latest guidance from the HK PCPD, parties are encouraged to adopt the GBA Standard Contract if they satisfy the territorial scope; and
Coordination between regulators from both sides: filings can be made with regulators of both mainland and HK, and same for raising complaints and reporting data incidents.
The Hong Kong government has initiated the first pilot scheme, inviting participants in the banking, credit referencing and healthcare sectors, all of which have high demand for data.
More importantly, the GBA Standard Contract is only the first of many facilitation measures to come under the framework of the MoU on Facilitating Cross-boundary Data Flow Within the Greater Bay Area. The GBA Standard Contract and Guidelines imply the recognition of Hong Kong as offering equivalent data protection as available under Chinese law. With that spirit, we can expect that more opening measures will materialise in the near future to further free up cross border data flow within the GBA and supercharge the economic growth in the region.
Hong Kong - HKMA issued a circular on managing cyber risks on third party service providers
On 21 December 2023, the Hong Kong Monetary Authority (the "HKMA") issued a circular to assist Authorized Institutions ("AIs") in managing cyber risk associated with third party service providers. In view of the rise of supply chain attacks on global institutions over the past year, the HKMA has undertaken thematic examinations of AIs' management of cyber risk associated with the use of third party services.
The HKMA has shared sound practices derived from the thematic examinations. The key points are as follows:
Integrating third party cyber risks into the risk governance framework: AIs shall formulate a risk governance framework which allows them to identify, assess and manage cyber risks with third parties under different scenarios (e.g. data breaches, operational disruptions or security compromises);
Identifying, assessing and mitigating cyber risk throughout the third-party management lifecycle: AIs should conduct regular reviews, cyber resilience assessments of third parties and ensure security measures are contractually agreed upon and periodically evaluated;
Assessing supply chain risks of third parties: AIs should assess the supply risks of third parties supporting critical operations of the AIs. This could be in the form of conducting additional due diligence on the dependencies of fourth-parties and end-to-end data processing. If needed, AIs shall conduct additional security reviews for higher-risk acquisitions;
Expanding cyber threat intelligence monitoring: AIs should share intelligence with peer institutions through platforms like the Cyber Intelligence Sharing Platform (CISP) to prepare for supply chain attacks;
Preparing for supply chain attacks: AIs shall formulate incident response strategies based on common risk scenarios and previous supply chain incidents. If a third party is supporting critical operations of the AI, the AI shall establish effective response protocols with the third party and conduct regular drills; and
Adopting the latest standards, practices and technologies: AIs shall regularly review and enhance their cyber defence capabilities with reference to the latest international standards and sound practices.
The HKMA expects AIs to review their current controls against this circular and adopt sound practices accordingly. The HKMA will provide further guidance as international and industry developments unfold.
Singapore - The Monetary Authority of Singapore released responses to proposal on mandatory reference checks
On 12 December 2023, the Monetary Authority of Singapore ("MAS") released its Response to Feedback Received on Proposals to Mandate Reference Checks, confirming that the regulator will proceed to impose mandatory reference check requirements on financial institutions. These requirements will be imposed via notices, which MAS is drafting, to ensure financial institutions conduct and respond to reference checks on a minimum set of standardised information. The record-keeping obligations will interact with employee data privacy: MAS has clarified that where the requirements set out in the notices are stricter than those imposed under data privacy law, the stricter requirements in the notices will apply.
Please contact one of us below or your usual Simmons contact if you have any questions.
The full team can be found on our Data Protection and Privacy webpage.