The same AI tools generating fake music videos and phony IDs are now supercharging crypto scams. This week, Chainalysis reported that fraudsters using deepfakes, face-swap apps, and large language models alongside classic romance and investment cons are netting an average of $3.2 million per scheme — roughly 4.5 times the haul of a conventional crypto scam. The data lands as a string of incidents this month shows how cheap and accessible the technology has become.
Tools of the trade
The arsenal is growing fast. OpenAI's ChatGPT Images 2.0 can now generate fake IDs, prescriptions, receipts, bank alerts, and news screenshots, according to a report by The Atlantic's Lila Shroff. That's a subscription product. On the cheaper end, Haotian AI — real-time deepfake software from China, exposed by 404 Media — costs a few hundred dollars and works on Microsoft Teams. Reporter Joseph Cox used it to swap faces live on a Teams call. Resemble AI, which documented a wave of AI-generated content across politics, entertainment, and crime in early May, said the tools are consumer-grade, widely available, and improving faster than institutions can respond.
Real-world losses
The scams are landing hard. A Chicago man lost $69,000 to a scammer who flashed an AI-generated US Marshals badge on a video call. In August 2025, attackers stole $2 million by impersonating the founder of Plasma on a video call. And BeInCrypto reported that North Korean operatives run deepfake video calls on Zoom — adding a geopolitical dimension to the trend. Even political deepfakes are getting traction: an AI video of mayoral candidate Spencer Pratt drew 4.1 million views on X, and FBI Director Kash Patel posted a video that appeared to use AI to generate shots nearly identical to the Beastie Boys' 'Sabotage' video.
Why crypto is a prime target
Chainalysis didn't mince words. Pair deepfakes with a romance con and a promise of crypto returns, and the damage multiplies. The $3.2 million average is not just about better tech — it's about trust. Scammers can now impersonate founders, regulators, or even romantic partners with real-time face swaps. The same Haotian AI that fooled a live Teams call could just as easily target a Zoom call with a supposed exchange executive. And ChatGPT Images 2.0 makes fake bank alerts and transaction receipts nearly indistinguishable from the real thing.
The response gap
Resemble AI's warning is blunt: the tools are improving faster than institutions can react. That gap is where the money is going. No single regulator has moved to block deepfake software for video calls, and the platforms that host these tools — from app stores to cloud providers — have not imposed uniform bans. The question now is whether any enforcement action will come, or whether this year's scam average will keep climbing.