AI in recruiting: Challenges facing employers when using AI applications

Employers can automate the recruitment process with AI, enhance efficiency, and optimise decisions, but must comply with legal requirements.

17 September 2025

Publication

From drafting job advertisements, through analysing and evaluating application documents, to pre-selecting candidates, conducting interviews and handling subsequent onboarding, employers can now automate almost every stage of the application process with AI applications. Using AI tools thus opens up numerous opportunities to increase efficiency and optimise decision-making. However, in doing so, the legal requirements outlined below must be observed.

The two main cornerstones of the legal framework are the EU AI Act and the General Data Protection Regulation (GDPR).

The EU AI Act creates a binding legal framework at European level that is also relevant for employers using AI applications. The following highlights the aspects that are particularly relevant for employers conducting an application process:

  • High-risk AI systems: Systems for selecting or evaluating applicants are considered high-risk AI systems and are thus subject to special regulations. Among other things, employers must ensure that such systems are used properly and are monitored by qualified personnel.
  • Prohibited practices: The use of AI tools for certain practices is expressly prohibited. For example, AI applications that recognise and evaluate emotions are banned. This includes systems that analyse non-verbal signals such as facial expressions or gestures in video interviews.

Moreover, the GDPR is relevant to the use of AI tools in the application process. In addition to the requirements that the processing of personal data must rest on a legal basis and be limited to a specific purpose, Article 22 GDPR must be observed in particular. Under that provision, individuals have the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them.

This gives rise to the fundamental requirement of human oversight, which also ensures that an applicant's individual circumstances are appropriately taken into account during the application process. Employers should therefore use AI systems that pre-sort and evaluate applications and make recommendations, but that do not take independent decisions on rejections or hires. The final decision must always rest with a human being.

Violations of the EU AI Act and the GDPR are subject to severe fines. Breaches of the EU AI Act are punishable with fines of up to EUR 15 million or 3% of global annual turnover. Violations of the GDPR can trigger even higher fines of up to EUR 20 million or 4% of global annual turnover.

Potential discrimination through AI applications

Employers should also be aware that even supposedly objective AI applications can lead to discriminatory decisions by reproducing or reinforcing unconscious biases, especially if the training data is not sufficiently representative. For example, if women in management positions are underrepresented in the training data, the AI tool could treat this as the norm and systematically disadvantage female applicants when selecting candidates for senior positions.

Discrimination by AI is often difficult to detect. This is because so-called proxy variables, such as place of residence or educational background, allow sensitive characteristics such as gender or ethnic origin to be inferred indirectly, without that data being used directly. If a hiring or promotion decision is based on a recommendation that relies on such discriminatory assumptions, this constitutes a violation of the General Equal Treatment Act (AGG). In that case, the applicant concerned is entitled to compensation under Section 15 (2) AGG.

Conclusion

The use of AI tools in the application process allows companies to streamline and optimise their processes. However, employers should ensure that the systems used are transparent, non-discriminatory and compliant with data protection regulations. Running regular system checks, obtaining legal expertise and establishing human control mechanisms are essential in order to avoid legal pitfalls and to fully exploit the advantages of the technology.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.