Scam compounds hiring “AI models” to seal the deal in deepfake video calls

March 24, 2026

Scam compounds in Southeast Asia have already become modern slave farms, trapping victims and forcing many of them to work as scammers. Now they’ve added another type of worker to the mix: so-called AI models.

These professional scammers conduct video calls with their targets, charming them into handing over their cash. As reported in WIRED this week, recruitment ads describe roles handling around a hundred live video calls per day, promoting romance scams and crypto hustles in industrial-scale scam operations across Cambodia, Myanmar, and Laos. 

These scam farms already rely on chat operators to ensnare victims via messaging apps. Many of these operators are themselves victims of trafficking, forced to work long shifts under threat of violence. They develop relationships with their targets over time, exploiting loneliness or financial worries. While they work to make a victim feel special, they’re actually juggling similar text sessions with dozens of people at once. Eventually, a victim may ask for a video call, either to meet their imagined sweetheart or to confirm that an investment opportunity is legitimate (or both).

Chat operators may not have the ability to charm victims on video, especially when they’re victims themselves, made to work long shifts and physically beaten. So when a victim asks for a video call, the scam bosses bring in a specialist “AI model” with strong interpersonal skills to charm the victim. Despite the name, these are real people hired to appear on video calls. AI deepfake software adjusts their looks to match the fictional persona that the victim is hoping to see.

Scam operations run recruitment ads for these models, and many seem willing to apply for these jobs. Humanity Research Consultancy, an investigative research group that tracks trafficking supply chains, identified a pitch from a 24-year-old Uzbekistani calling herself Angel. She claimed to speak four languages and to have a year’s experience as an AI model. She demanded $7,000 monthly for her services. 

The growth of scam compounds 

How do these scam compounds even exist? According to the Australian Strategic Policy Institute, Myanmar’s 2021 military coup helped fuel a fraud boom. Scam centers along the Thai border have more than doubled as crime syndicates expand across Myanmar, Cambodia, and Laos.

These scam centers are often tolerated because they line the coffers of local militia. But there have been some countermeasures. Raids and cross-border crackdowns have led to arrests and the movement of large numbers of suspects between countries, including operations targeting compounds such as KK Park in Myawaddy. Cambodia and Myanmar have also signalled increased efforts to tackle scam operations, although the networks remain highly resilient.

This kind of activity becomes easier as technology improves. Real-time face-swapping and deepfake tools are now good enough to support live video, not just pre-recorded clips. We’ve already seen real-time deepfakes used for everything from faking job interviews to impersonating banking executives to scam victims out of millions. What’s new here is the scale: operators handling dozens or even hundreds of calls a day for romance scams and crypto investment fraud show that this has become a mass exploit.

How to stay safe 

Here’s the problem with deepfake video: the common “tells” that let you spot it are evaporating. At one time, a sure sign of an AI deepfake was the wrong number of fingers or an odd hairline. In live calls you can up the ante: ask the person to turn sideways, touch their nose, or wave their fingers in front of their face. That extra noise is more difficult for deepfake software to handle.

But beware: the algorithms that produce deepfakes are getting better all the time, and are increasingly able to handle such tests. We’re at the point where one deepfake researcher says many more of us will be fooled by them this year.

If you can’t fully trust what you see, fall back on what you know. Be wary of unsolicited contact, especially when someone quickly builds emotional rapport or introduces an investment opportunity. Even if a profile looks well-established or a website appears legitimate, take time to dig a little deeper.

Avoid sharing personal or financial information with someone you’ve only met online, and be wary of anyone who pushes you toward quick decisions or asks to move conversations off established platforms. The FBI has some sound advice on its website.

The most dangerous part of this deepfake AI model trend is that it helps scam operations cross the final frontier. A live human can close a scam that a simple chat interaction can’t. That’s why people like Angel from Uzbekistan have a job, and why you need to be more on your guard than ever. 


We don’t just report on scams—we help detect them

Cybersecurity risks should never spread beyond a headline. If something looks dodgy to you, check if it’s a scam using Malwarebytes Scam Guard. Submit a screenshot, paste suspicious content, or share a link, text or phone number, and we’ll tell you if it’s a scam or legit. Available with Malwarebytes Premium Security for all your devices, and in the Malwarebytes app for iOS and Android.

About the author

Danny Bradbury has been a journalist specialising in technology since 1989 and a freelance writer since 1994. He covers a broad variety of technology issues for audiences ranging from consumers through to software developers and CIOs. He also ghostwrites articles for many C-suite business executives in the technology sector. He hails from the UK but now lives in Western Canada.