
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
A class action lawsuit alleges that Workday's AI screening system discriminates against candidates over 40 based on age, race, and disability. Plaintiff Derek Mobley and four others claim the algorithm led to swift, automated rejections, sparking concerns over AI bias in recruitment and potential violations of labor laws.[AI generated]
Why is our monitor labelling this an incident or hazard?
The article explicitly states that Workday's AI screening algorithms and tools have caused harm by systematically rejecting applicants based on age, race, and disability, in violation of labor and human rights protections. This harm has already occurred, as the plaintiffs claim they were rejected from hundreds of jobs. It therefore qualifies as an AI incident because an AI system was directly involved in causing discriminatory harm to people.[AI generated]