AI Face Models Apply for Jobs Running Deepfake Video Scams, Making 100 Calls Per Day

Reviewed by Nidhi Govil


Dozens of recruitment channels on Telegram are seeking AI face models to conduct pig-butchering scams via deepfake video calls. Workers from Turkey, Russia, Ukraine, and Asia apply for roles requiring up to 100 video calls daily, using face-swapping technology to manipulate victims into cryptocurrency and romance scams across Southeast Asia.

AI Face Models Emerge as New Frontier in Online Financial Scams

A disturbing trend has emerged in the world of cybercrime: people actively applying to become AI face models for sophisticated scam operations. Angel, a 24-year-old woman from Uzbekistan, arrived in the Cambodian city of Sihanoukville ready to leverage her fluency in English, Chinese, Russian, and Turkish. But her linguistic talents weren't destined for conventional corporate work. Instead, she was applying to sit in front of a computer all day, making deepfake video calls to manipulate Americans in elaborate pig-butchering schemes.[1]

Source: Inc.

Angel's application video reveals she already has one year of experience as an "AI model," highlighting how this exploitative industry has matured over recent months. A WIRED investigation uncovered dozens of recruitment videos and job advertisements posted to Telegram, showing applicants from Turkey, Russia, Ukraine, Belarus, and multiple Asian countries seeking these roles in Cambodia and Southeast Asia.[1] The region has become notorious for vast, industrialized scam compounds that hold thousands of human trafficking victims captive while forcing them to run cryptocurrency investment scams and romance scams.

How Recruitment for AI Models Operates on Telegram

Hieu Minh Ngo, a cybercrime investigator at the Vietnamese scam-fighting nonprofit ChongLuaDao, identified around two dozen channels on Telegram dedicated to these job postings. "In the past year until today, they are also hiring people doing AI modeling," Ngo explains. "They will give you the software so they can swap their face by using AI and they can do romance scams."[1] Humanity Research Consultancy, an anti-human-trafficking organization, has independently tracked people applying on Telegram for jobs in known scam hub cities as models and AI face models.

The job advertisements paint a grim picture of working conditions. Recruiters demand excessive hours with little free time and relentless schedules. One posting for an alleged six-month contract specifies that workers must send photos daily, make video and voice calls, and create audio and video messages, amounting to approximately 100 video calls per day.[1] These listings often contain veiled or explicit references to crypto schemes, gold trading, and romance scams, with ideal candidates speaking multiple languages.[2]

Face-Swapping Technology Powers Deepfake Video Calls

The mechanics of these AI scams reveal how criminal organizations in Southeast Asia have industrialized deception. Fraudsters typically use stolen images of celebrities or attractive individuals to create fake personas on social media and messaging platforms. Once contact is established, they bombard potential victims with attention to build relationships before requesting money. When suspicious victims request video calls to verify authenticity, that's when AI deepfake and face-swapping technology come into play.[1]

Source: Wired

Some Southeast Asian scam centers have established dedicated "AI rooms" specifically for conducting these deepfake video calls. Multiple people may control a single scammer account, messaging victims under one fake persona. The AI face models provide the human element needed to sustain the illusion during video interactions, with their real faces swapped in real time to match the fake profile images victims have been shown.[2]

The Double Exploitation of AI Scams

What makes this phenomenon particularly troubling is the multi-layered exploitation involved. Based on applications reviewed, some individuals applying for these roles understand they'll be scamming others and may even have past experience doing so.[2] Yet the exploitation is often layered: criminal enterprises lure vulnerable workers into exploitative roles, then use them to extract billions from unsuspecting victims.

While some people actively seek out this work, the broader context involves human trafficking and forced labor in scam compounds across the region. These high-tech, multibillion-dollar criminal enterprises both trick people into working in scam facilities and attract willing applicants seeking employment.[1] The rise of recruitment channels within the past year alone signals how quickly this industry is expanding, with applications requiring details like height and weight alongside language skills and prior AI modeling experience.
