
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Multiple reports highlight how poorly designed AI algorithms have led to discriminatory outcomes in areas such as hiring, healthcare, and vaccine distribution. Examples include Amazon's recruiting tool, which penalized women, and Stanford's vaccine-allocation algorithm, which disadvantaged frontline workers, illustrating how algorithmic bias can perpetuate systemic discrimination and harm vulnerable groups. [AI generated]