
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Automated tenant-screening algorithms used by US landlords are denying rental applications, often without explanation, and disproportionately affecting marginalized groups. Applicants report repeated, vague rejections, raising concerns about algorithmic bias and discrimination in housing access. Civil rights groups and policymakers are calling for greater transparency and regulation of these AI systems.[AI generated]
Why is our monitor labelling this an incident or hazard?
The event involves AI systems (automated screening algorithms) whose use in rental housing decisions has directly led to discriminatory harm against applicants, particularly those from marginalized groups, by denying housing opportunities without transparency or adequate explanation. This constitutes a violation of rights and harm to communities, fitting the definition of an AI Incident. The article also references ongoing regulatory efforts, but its primary focus is the realized harm these AI systems have caused in access to housing.[AI generated]