
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Scammers used AI voice-cloning technology to impersonate loved ones, convincing elderly victims like Ruth Card to urgently withdraw and send money for fake emergencies. The realistic synthetic voices led to significant financial losses and emotional distress, highlighting the growing threat of AI-enabled impersonation fraud.[AI generated]