Automated Tenant-Screening Algorithms Cause Discriminatory Housing Denials in the US

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Automated tenant-screening algorithms used by US landlords are denying rental applications, often without explanation, disproportionately affecting marginalized groups. Applicants report repeated, vague rejections, raising concerns about algorithmic bias and discrimination in housing access. Civil rights groups and policymakers are calling for greater transparency and regulation of these AI systems.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event involves AI systems (automated screening algorithms) whose use in rental housing decisions has directly led to discriminatory harm against applicants, particularly marginalized groups, by denying housing opportunities without transparency or proper explanation. This constitutes a violation of rights and harm to communities, fitting the definition of an AI Incident. The article also references ongoing regulatory efforts, but the primary focus is on the realized harm caused by these AI systems in housing access.[AI generated]
AI principles
Fairness; Transparency & explainability; Accountability; Respect of human rights

Industries
Real estate; Financial and insurance services

Affected stakeholders
Consumers

Harm types
Human or fundamental rights; Economic/Property; Psychological

Severity
AI incident

Business function:
Other

AI system task:
Organisation/recommenders; Forecasting/prediction


Articles about this incident or hazard

As US cities face a housing crisis, algorithms are deciding who can and cannot rent a home

2022-11-19
Scroll.in
US renters fall foul of algorithms in search for a home

2022-11-20
The Star
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (automated tenant screening algorithms) whose use has directly led to violations of rights, specifically discrimination in housing access, which is a breach of fundamental rights and protections. The harm is realized as applicants are denied housing unfairly due to opaque algorithmic decisions, constituting an AI Incident. The article also discusses regulatory responses, but the primary focus is on the harm caused by these AI systems in their current use.
U.S. renters fall foul of algorithms in search for a home

2022-11-16
National Post
Why's our monitor labelling this an incident or hazard?
The article references automated screening and valuation models, which are AI systems used in housing decisions. The harm described is a violation of rights and harm to communities, as biased algorithmic assessments can lead to unfair treatment of minority applicants and homeowners. Since these harms are ongoing and directly linked to the use of AI systems, this qualifies as an AI Incident. The mention of mitigation efforts does not negate the presence of harm but rather addresses it.
U.S. renters fall foul of algorithms in search for a home

2022-11-16
Devdiscourse
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (automated tenant-screening algorithms) whose use has directly led to harm in the form of discriminatory housing denials and lack of transparency, impacting fundamental rights and causing harm to communities. The harm is realized and ongoing, as applicants report repeated rejections with vague reasons, and civil rights groups highlight algorithmic discrimination. Therefore, this qualifies as an AI Incident under the framework.
U.S. renters fall foul of algorithms in search for a home

2022-11-19
UnionLeader.com
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (automated tenant-screening algorithms) whose use has directly led to harm in the form of discriminatory denial of housing applications, which constitutes violations of rights and harm to communities. The harm is realized and ongoing, as applicants are being denied housing unfairly due to these AI systems. Therefore, this qualifies as an AI Incident. The article also covers responses and potential regulation, but the primary focus is on the harm caused by the AI systems in use.