Deepfake Tool ProKYC Exploited for Crypto Exchange Fraud

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Cybercriminals are using an AI-powered deepfake tool, ProKYC, to bypass security protocols on cryptocurrency exchanges. This tool generates fake IDs and videos to pass facial recognition systems, facilitating money laundering and fraud. Cato Networks reports that these activities are increasing fraud in the crypto industry, posing significant security threats.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event describes active misuse of an AI system (deepfake generation and video spoofing) that has directly led to financial harm through new-account fraud and money laundering operations. This constitutes a realized harm caused by an AI system, fitting the definition of an AI Incident.[AI generated]
AI principles
Accountability; Robustness & digital security; Safety; Privacy & data governance; Transparency & explainability

Industries
Financial and insurance services; Digital security; Government, security, and defence

Affected stakeholders
Business

Harm types
Economic/Property; Reputational; Public interest; Human or fundamental rights

Severity
AI incident

Business function:
Compliance and justice; ICT management and information security

AI system task:
Content generation

Articles about this incident or hazard

Bad Actors Selling Deepfake Tool To Bypass Crypto Exchange Security Protocols, According to Cybersecurity Firm

2024-10-13
The Daily Hodl
Why's our monitor labelling this an incident or hazard?
The event describes active misuse of an AI system (deepfake generation and video spoofing) that has directly led to financial harm through new-account fraud and money laundering operations. This constitutes a realized harm caused by an AI system, fitting the definition of an AI Incident.
AI deepfake tool on 'new level' at bypassing crypto exchange KYC: Report

2024-10-11
Cointelegraph
Why's our monitor labelling this an incident or hazard?
An AI system (deepfake generator) is explicitly used by nefarious actors to commit identity and financial fraud by circumventing biometric KYC measures. This misuse has directly led to harm (fraud and potential financial losses), fitting the definition of an AI Incident.
Deepfakes Can Fool Facial Recognition on Crypto Exchanges

2024-10-11
TechRepublic
Why's our monitor labelling this an incident or hazard?
The article describes a threat actor using a generative AI deepfake system to spoof liveness checks and fake government IDs, directly facilitating account fraud and money laundering on crypto platforms. This is a realized harm caused by the malicious use of an AI system.
ProKYC Deepfake Tool Exploited to Bypass Crypto Exchange Security

2024-10-11
Techopedia.com
Why's our monitor labelling this an incident or hazard?
ProKYC is an AI system whose malicious use has directly led to real-world harm: creation of fraudulent accounts, New Account Fraud on platforms such as Bybit, Stripe, and Revolut, and financial losses for investors. This is a clear case of an AI Incident.
New AI Deepfake Tool Exposes Vulnerabilities in Crypto Exchange KYC Systems

2024-10-11
Crypto News Land
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions an AI system (ProKYC) that generates deepfake videos and synthetic identities to deceive the biometric verification systems used in KYC processes. Use of this AI system has directly enabled fraudulent activity that undermines KYC rules, facilitating identity theft and financial fraud. These constitute violations of legal obligations and harm to communities by enabling criminal financial activity, so this qualifies as an AI Incident due to realized harm caused by the AI system's use in fraud.
Crypto: AI Breaks The KYC Barriers Of Exchanges

2024-10-12
Cointribune
Why's our monitor labelling this an incident or hazard?
ProKYC is an AI system that generates deepfake videos and synthetic biometric data to bypass KYC checks, directly enabling fraud on crypto exchanges and payment platforms. The article details how the tool is actively used to commit fraud, causing financial losses to businesses and users. The harm is realized, not merely potential, and the AI system's role in enabling the fraud is pivotal; hence it meets the criteria for an AI Incident.
Cato Networks Alerts Users About AI-Based Fraud Threats in Cryptocurrency Exchanges

2024-10-14
COINTURK NEWS
Why's our monitor labelling this an incident or hazard?
The article explicitly describes AI-based deepfake technology being used to create fake identity documents and videos that bypass facial recognition systems on cryptocurrency exchanges. This AI system's use has directly led to financial harm, with fraudulent accounts contributing to losses exceeding $5.3 billion in 2023. Because AI is central to both the creation and use of these deepfakes, and the resulting fraud and money laundering violate property rights and harm communities, the event fulfils the framework's criteria for an AI Incident.