
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Israeli startup Toka, co-founded by former Prime Minister Ehud Barak and former cyber division head Yaron Rosen, has developed AI-powered software that allows governments to access and alter live or recorded surveillance footage without leaving traces. The technology raises serious human rights concerns, as it could enable authorities to falsify video evidence and otherwise misuse surveillance systems.[AI generated]
Why is our monitor labelling this as an incident or hazard?
Toka's software qualifies as an AI system because it performs sophisticated manipulation of video surveillance footage, likely using AI techniques for undetectable real-time and retrospective editing. The event concerns governments using this system to alter video evidence, which can directly lead to violations of human and legal rights, such as wrongful incrimination and political misuse. Given the software's reported deployment, these harms have already occurred or are ongoing. This therefore qualifies as an AI Incident: direct harm arising from the use of an AI system.[AI generated]