
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Geolitica, a predictive policing AI used by the Plainfield Police Department in New Jersey, was found to be accurate in less than 1% of cases, resulting in ineffective policing and reinforcing biases against minority communities. Studies and police feedback highlight its failure and the resulting harm to targeted populations. [AI generated]