Deepfake Video of FTX Founder Used in Crypto Scam Targeting Investors

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Scammers used AI-generated deepfake videos of FTX founder Sam Bankman-Fried to impersonate him on Twitter, directing victims to fraudulent websites with promises of crypto refunds or giveaways. The deepfakes exploited verified accounts to appear legitimate, aiming to steal cryptocurrency from investors affected by FTX's collapse.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly mentions the use of a deepfake video, which is an AI system generating synthetic media. The video was used maliciously to scam users, directly causing harm through phishing and potential financial loss. Therefore, this qualifies as an AI Incident because the AI system's use directly led to harm to people (scam victims).[AI generated]
AI principles
Transparency & explainability; Robustness & digital security; Safety; Accountability; Respect of human rights

Industries
Financial and insurance services; Media, social platforms, and marketing; Digital security

Affected stakeholders
Consumers

Harm types
Economic/Property; Reputational; Psychological; Public interest

Severity
AI incident

AI system task
Content generation

Articles about this incident or hazard

SBF phishing: Fake Sam Bankman-Fried video attempts to scam FTX investors on Twitter

2022-11-22
capital.com
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of a deepfake video, which is an AI system generating synthetic media. The video was used maliciously to scam users, directly causing harm through phishing and potential financial loss. Therefore, this qualifies as an AI Incident because the AI system's use directly led to harm to people (scam victims).
Modified Video of FTX Founder Sam Bankman-Fried Directs Users to Fraudulent Website - This is What You Need to Know

2022-11-22
cryptonews.com
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (deepfake video generation) used maliciously to impersonate a public figure and trick users into sending cryptocurrency to scammers. This misuse of AI has directly caused harm to individuals by facilitating fraud and theft, fitting the definition of an AI Incident due to harm to people (financial harm) and communities (trust erosion).
SBF deepfake attempts to scam FTX investors with video to steal funds

2022-11-22
Crypto News Flash
Why's our monitor labelling this an incident or hazard?
The event involves the use of an AI system (deepfake technology) to produce synthetic video and audio impersonating a real person, which is then used to scam investors and steal cryptocurrency funds. This directly leads to harm to property (financial loss) and harm to communities (investor fraud). The harm is realized as the scam is actively being circulated and has a clear intent to defraud. Therefore, this qualifies as an AI Incident due to the direct involvement of AI-generated deepfake content causing harm through fraudulent financial schemes.
Scammers target FTX victims with deepfake video of disgraced founder

2022-11-22
Daily Mail Online
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system—deepfake video technology—used in the scam. The use of a deepfake video directly led to harm by deceiving victims into sending cryptocurrency to scammers, fulfilling the criteria for an AI Incident. The harm is financial loss to individuals (harm to persons/groups), and the AI system's involvement is central to the incident. Therefore, this qualifies as an AI Incident rather than a hazard or complementary information.
Sam Bankman-Fried Deepfake Video Attempts to Scam the Scammed

2022-11-21
Gizmodo AU
Why's our monitor labelling this an incident or hazard?
The deepfake video is generated using AI techniques to manipulate video and audio, creating a realistic but fake representation of Sam Bankman-Fried. This AI-generated content is used maliciously to scam people, causing financial harm to victims who might be tricked into sending cryptocurrency to fraudulent addresses. The harm is direct and realized, as the scam targets vulnerable individuals who have already suffered losses. Therefore, this qualifies as an AI Incident due to the AI system's role in causing direct harm through deception and fraud.
SBF Deepfake Scam Offers Users 'Compensation' for FTX Collapse

2022-11-21
Yahoo News
Why's our monitor labelling this an incident or hazard?
The event describes a deepfake video created using AI technology to impersonate Sam Bankman-Fried and promote a fraudulent cryptocurrency giveaway. The AI system's role in generating the deepfake video is pivotal to the scam's effectiveness, which has already caused financial harm to some users. This fits the definition of an AI Incident, as the AI system's use has directly or indirectly led to harm to property (users' cryptocurrency).
Sam Bankman-Fried deepfake attempts to scam investors impacted by FTX

2022-11-22
Cointelegraph
Why's our monitor labelling this an incident or hazard?
The event explicitly describes the use of AI systems (deepfake video and voice emulation programs) to create a fake video that was used to scam investors, leading to direct financial harm. The AI system's use is central to the incident, as it enabled the creation of a convincing fraudulent video that misled victims. This meets the criteria for an AI Incident because the AI system's use directly led to harm (financial loss) to people affected by the scam.
SBF Deepfake Scam Offers Users 'Compensation' for FTX Collapse - Decrypt

2022-11-21
Decrypt
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (deepfake technology) used maliciously to create a fraudulent video that deceives users and causes financial harm. The harm is realized (users have lost cryptocurrency), fulfilling the criteria for an AI Incident. The AI system's use is central to the scam's effectiveness, making it a direct cause of harm. Hence, this qualifies as an AI Incident rather than a hazard or complementary information.
FTX News: SBF's Viral Deepfake Lures Customers For Refund

2022-11-21
Coingape
Why's our monitor labelling this an incident or hazard?
The event describes a deepfake video, which is an AI-generated synthetic media, used to impersonate a public figure to deceive victims into sending cryptocurrency to fraudulent addresses. This has directly led to financial harm (loss of cryptocurrency) to victims, fulfilling the criteria for an AI Incident under harm to persons or communities. The AI system's use is explicit and pivotal in enabling the scam, and the harm is realized, not just potential. Therefore, this qualifies as an AI Incident.
Sam Bankman-Fried deepfake attempts to scam investors impacted by FTX

2022-11-22
IQ Stock Market
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system in the form of deepfake technology that generates synthetic video and voice content to impersonate a real person. The use of this AI-generated content directly leads to harm by facilitating a phishing scam that attempts to steal cryptocurrency from investors, constituting harm to property and to communities. Therefore, this qualifies as an AI Incident due to the realized harm caused by the malicious use of an AI system.