Australia's eSafety Commissioner has ordered four AI chatbot companies, including Character.ai, to detail the measures they take to protect children from harmful content such as sexual material and encouragement of self-harm. The action follows reported incidents, including a suicide linked to chatbot interactions, and has prompted regulatory scrutiny that could result in fines. [AI generated]