
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Replika, an AI chatbot app designed for companionship, abruptly disabled its sexual conversation features, leaving many users distressed, lonely, and emotionally harmed. The change, prompted in part by regulatory concerns, led to user backlash, petitions, and reports of significant psychological impact, highlighting the risks of altering AI systems that users rely on for emotional support.[AI generated]
Why is our monitor labelling this an incident or hazard?
Replika is an AI system designed to provide companionship and emotional support. The event reports that a change in the system's behavior, namely no longer responding to sexual advances, has left users feeling lonely and lost, which constitutes harm to their emotional well-being. This harm is directly linked to the AI system's use and its interaction design. Furthermore, the Italian Data Protection Authority's order to stop processing users' data, citing risks to children, underscores the seriousness of the harm and the legal concerns. Because the AI system's use has directly harmed persons and potentially violated protections for minors, this event meets the criteria for an AI Incident.[AI generated]