A curious employment-wanted ad appears on a Chinese-language Telegram channel devoted to connecting workers with scam bosses in Cambodia. A young woman posts five pictures of herself in various alluring poses, with the caption “Willing To Work As An AI Model.” She claims to have two years of experience as a “Killer,” a term that Pig Butchering compounds use to describe their workers. Now she wants to graduate to being an “AI Model” so that her pretty face can be used for deepfakes.

Advertisements like this illustrate just how entrenched AI has become among professional scammers. They are also an ominous sign of where scams are heading.

While 2024 provided us with rare glimpses of deepfakes, voice clones, and AI-generated phishing scams, it was merely “training wheels” for fraudsters as they tested the waters. Now 2025 is poised to become the year AI scams become a dominant force in draining fintech and bank accounts.

On Pace For $40 Billion in AI-Enabled Fraud

According to the Deloitte Center for Financial Services, Generative AI will enable $40 billion in losses by 2027, up from $12.3 billion in 2023 — a 32 percent compound annual growth rate.

And that rise has gotten the attention of the FBI. Earlier this month, the bureau warned that criminals are exploiting AI to commit fraud on a larger scale and to increase the believability of their schemes.

AI Fraud As A Service Is The Next Big Threat

An advertisement for Haotian AI, a company that sells face-changing software on Telegram, shows they mean business: “We have an R&D team of hundreds of programmers and dozens of servers to serve you.”

The advertisement promotes deepfake face-changing software that is “difficult to distinguish with the naked eye” and designed for “overseas calls,” a perfect match for scammers who want to make their romance scams more believable.

On Telegram, chatter on criminally focused channels about using AI and deepfakes for fraud is rising. In 2023, there were 47,000 such messages; in 2024, that number surpassed 350,000 — a 644 percent increase.

With so much more activity on criminal channels, fraud experts now believe AI-enabled scams will flourish in 2025.

Here is what you need to know.

The Vast Majority Of BEC Attacks Will Involve AI and Deepfakes

Business Email Compromise attacks leveraging deepfakes are poised to become a major fraud threat in 2025. In two separate schemes in Hong Kong this year, fraudsters used AI-generated video and audio to impersonate company executives on Zoom calls, convincing employees to transfer nearly $30 million.

And these are not isolated incidents — AI use in BEC attacks is everywhere. A report by Medius, a US-based company, found that 53 percent of accounting professionals were targeted with deepfake AI attacks in the last year.

Another firm, VIPRE Security Group, reports that 40 percent of BEC emails are now completely AI-generated. “As AI technology advances, the potential for BEC attacks grows exponentially,” said Usman Choudhary, the company’s Chief Product and Technology Officer.

AI Romance Scam ChatBots Will Proliferate

A prominent Nigerian Yahoo Boy recently posted a video on his YouTube channel. The shocking video shows a fully automated AI chatbot communicating directly with a victim. The victim believes she is talking to her love interest – a military doctor based overseas. In reality, she is talking to a chatbot controlled by the scammer.

These chatbots let the scammer converse fluently with the victim, without an accent, making the scam more believable. Experts warn that the use of fully autonomous AI chatbots will proliferate in 2025.

Pig Butchering Operations Will Shift To AI Schemes

Imagine a wall of mobile phones scamming thousands of victims across the world every minute. Videos posted on social media and Telegram show that this is now a reality — a new technique scam compounds are using to scale up their operations.

The AI software, called “Instagram Automatic Fans,” blasts messages to thousands of people a minute to ensnare them in pig butchering scams. One message reads, “My friend recommended you. How are you?” The same message is sent over and over again.

Criminal syndicates involved in pig butchering will leverage AI-powered deepfake technology for video calls, voice clones, and chatbots to vastly scale up their operations in 2025.

AI Deepfake Extortion Scams Will Target High Profile Executives

In Singapore, scammers recently orchestrated a deepfake email extortion plot targeting 100 public servants from 30 government agencies, including Cabinet ministers.

The email demanded they pay $50,000 in cryptocurrency; if they refused, the scammers threatened to release deepfake videos showing them in compromising situations and destroy their reputations. The scammers created those convincing fakes using public photos from LinkedIn and YouTube.

With AI-driven deepfake software becoming more accessible, these extortion scams will likely spread to corporations across the world, targeting their most high-profile executives.

Deepfake Digital Arrest Impersonation Scam Wave Will Hit US

With over 92,000 deepfake digital arrest cases reported in India since January 2024, cybersecurity experts warn this could be the next major fraud trend to hit Western nations.

In these scams, criminals posing as federal law enforcement officials psychologically manipulate victims through manufactured legal emergencies, isolate them, and demand ransoms — or pressure the victim into transferring their life savings. The scammers at times use deepfake video and audio clips of government officials to make the scheme more believable.

The scammers often operate from scam compounds in Myanmar and Cambodia and have refined their tactics by leveraging deepfakes. Indian authorities have been able to trace 40% of the new wave of digital arrests to Southeast Asia.

Just as pig butchering schemes migrated to the US in 2021, some believe these deepfake digital arrest scams will make the same move in 2025.

A Watershed Moment In Financial Crime

The rise of AI scams represents a watershed moment in financial crime. While banks and fintechs rush to prepare defenses, criminals are simply moving faster.

With accessible AI available for as little as $20 a month, and new fraud-as-a-service operations expanding rapidly, 2025 promises to be a major turning point in scams.

The future of AI scams has arrived, and it just may speak in a voice that sounds exactly like yours.
