AEPD Bans AI-Based Facial Recognition for Online University Exams in Spain


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

The Spanish Data Protection Agency (AEPD) banned the use of AI-based facial recognition for online exam proctoring, sanctioning the Universitat Internacional Valenciana (VIU). The decision was driven by concerns over biometric data protection and the lack of enabling legislation for such AI systems.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event clearly involves an AI system (facial recognition with AI) used in a way that has directly caused harm by violating data protection laws and individuals' rights to privacy and data security. The sanction by the AEPD confirms that the AI system's deployment led to an incident of unlawful processing of sensitive biometric data without valid consent or legal basis, which is a breach of fundamental rights under applicable law. Therefore, this qualifies as an AI Incident due to realized harm from the AI system's use. The discussion of possible future regulatory frameworks does not negate the current incident classification.[AI generated]
AI principles
Accountability, Privacy & data governance, Respect of human rights

Industries
Education and training

Affected stakeholders
Consumers

Harm types
Human or fundamental rights

Severity
AI incident

Business function
Monitoring and quality control

AI system task
Recognition/object detection


Articles about this incident or hazard


The AEPD sanctions AI-based biometric data processing at the VIU but sees room for regulatory development

2025-06-03
Valencia Plaza

Facial Recognition in Online Exams: The AEPD's Prohibition - Notiulti

2025-06-03
Notiulti
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of an AI-based facial recognition system for exam monitoring, confirming AI system involvement. The AEPD's prohibition is a governance response to the risks posed by processing sensitive biometric data without sufficient justification, highlighting potential violations of data protection rights. However, the article does not report any realized harm such as injury, rights violations, or other damages caused by the AI system's use. Instead, it documents a regulatory decision to prevent such harms. This fits the definition of Complementary Information, as it updates on societal and governance responses to AI-related risks, rather than reporting a new AI Incident or AI Hazard.

The AEPD sanctions AI-based biometric data processing at the...

2025-06-03
Europa Press
Why's our monitor labelling this an incident or hazard?
The article explicitly describes the use of AI-based facial recognition technology for biometric data processing in online exams, which is an AI system. The AEPD's sanction is due to the lack of valid consent and absence of a legal basis, meaning the AI system's use led to a breach of data protection regulations and fundamental rights. This constitutes a violation of human rights and legal obligations, fulfilling the criteria for an AI Incident. The harm is realized (sanction and rights violation), not merely potential, so it is not a hazard or complementary information.

The Data Protection Agency bans the use of facial recognition to monitor online exams

2025-06-03
EL PAÍS
Why's our monitor labelling this an incident or hazard?
An AI system (facial recognition with AI analysis) was used in a way that directly impacts individuals' biometric data and privacy rights, constituting a violation of data protection laws and fundamental rights. The system's use led to a legal complaint and regulatory prohibition, indicating realized harm in terms of rights violations. Therefore, this qualifies as an AI Incident because the AI system's use has directly led to a breach of obligations intended to protect fundamental rights (privacy and data protection).

Spanish universities were monitoring online exams with facial recognition. The AEPD has decided that enough is enough

2025-06-04
Xataka
Why's our monitor labelling this an incident or hazard?
The article explicitly involves AI systems (facial recognition and monitoring tools with AI components) used in online exam supervision. The AEPD's decision and sanctions indicate that the AI system's use has directly led to legal violations and harm to individuals' rights under GDPR, specifically regarding biometric data processing without valid consent or legal basis. This is a clear case of an AI Incident involving violations of human rights and legal obligations. The harm is realized and ongoing, not merely potential, as sanctions have been imposed and the practice is declared illegal without proper legal framework.

AEPD: Sanction for AI-based biometrics at the VIU and possible regulation - Notiulti

2025-06-04
Notiulti
Why's our monitor labelling this an incident or hazard?
The use of AI systems to process biometric data in exams directly implicates an AI system whose use led to a violation of data protection rights, which falls under violations of human rights or legal obligations. The sanction indicates that harm or a breach has occurred as a result of the AI system's use, qualifying this as an AI Incident.