It is hardly surprising that the realm of AI deepfakes has an avid following among the criminal community. What is surprising is how suddenly that focus has shifted in recent months to a particular target—fintech’s identity controls.

Getting a glimpse into the underworld of fraud is as easy as downloading Telegram and searching open forums where fraud is discussed freely. There, one can find instant messages between fraudsters exchanging methods, selling stolen identities, and even peddling access to breached banking and payment card data.

An Explosion of Interest On Telegram Fraud Forums

A recurring area of interest that becomes immediately apparent on the forums is AI. An analysis of over ten million instant messages from the top twenty-five fraud forums from 2020 onward revealed a massive spike starting in March 2024, when mentions of "AI" surged to over 37,000 messages in a single month, a 900 percent increase over the previous month.[1]

While the frequency of these conversations is interesting, their content offers even more insight into intent. The messages indicate interest in cloning voices, creating realistic deepfake videos, and exploiting banks, lenders, and fintechs, whether by opening new accounts for money muling or by taking over existing customers' accounts to drain their funds.

“Who knows an AI voice changer that is realistic that I can use for voice calls?” one user asks. Within minutes, he gets several responses with viable options and even an offer to sell a special software package to do it for him.

Using AI Faces To Bypass Verification Mechanisms

Fraud forums on Telegram are peppered with AI-generated deepfake videos. These videos show people rotating their heads up and down and side to side. There are advertisements for AI that can turn a single photo of an AI-generated face into a convincing video that can fool liveness checks used by fintechs and banks for selfies and driver’s license verification during the Know Your Customer process.

One such piece of software will set a buyer back $2,000 or more, but its sellers claim it can bypass at least five of the largest liveness detection packages fintechs use.

And for buyers who doubt it is real, "vouches" and "proof" showing the AI working on a major fintech's online application screen are there too. One video shows a man assuming a fake identity built around a photo of a bank executive he pulled off the internet. He turns on the software, which pushes a deepfake video from his desktop to his phone through a virtual camera. He moves his head in a circle to complete the liveness check and, within seconds, receives an "all clear" on the identity checks.

Face Swaps Are The Deepfake of Choice

In some cases, simple off-the-shelf face-swapping tools are being used to create a whole web of deceit.

According to iProov’s Threat Intelligence Report 2024, “Face swaps are now firmly established as the deepfake of choice among persistent threat actors”. Their statistics show face swap injection attacks increased by a staggering 704 percent in the second half of 2023 compared to the first half.

The proliferation of easily accessible face swap apps like Xpression Camera, Faceswap Live, and AI Faceswap used in conjunction with traditional cyber tools such as emulators is bringing a new level of sophistication to threat actors intent on exploiting KYC controls.

From Realistic Masks To Deepfakes Which Are Far Harder To Detect

One person tracking the AI exploits intently on the dark web is David Maimon, a professor at Georgia State University and the head of fraud insights at identity verification firm SentiLink.

He has seen a shift from the use of ultra-realistic masks, which were popular during the pandemic, to AI-generated deepfakes designed to bypass KYC liveness checks.

“Some identity vendors are concerned with liveness detection bypass,” he says. “In the past, they were worried about the 3D silicone masks, but that does not seem to be an issue their tools cannot overcome at the moment. They are more concerned about the deepfakes.”

And Maimon is not the only one concerned. Last year, Binance chief security officer Jimmy Su expressed concern about AI-generated deepfakes’ ability to bypass KYC controls after the company saw a marked rise in attempts to use them on its platform.

Speaking to a reporter from CoinTelegraph, he explained that the sophistication of the deepfakes has become so advanced that they can respond in real-time during the liveness checks.

“Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down,” he said. “The deepfakes are advanced enough today that they can actually execute those commands.”

AI’s Impact: Brace Yourselves

If you have any doubt that we are on the cusp of an AI fraud avalanche, think again. It is about to get very interesting in financial services fraud, and here are a few reasons why:

A New Level of Realism

The quality of deepfake clones is improving dramatically. What once looked uncanny and artificial (remember the viral video of Will Smith eating spaghetti?) now appears startlingly real and human. For instance, just weeks after the release of Deep-Live-Cam, an AI tool that can generate a real-time deepfake from a single photo, a TikTok user convinced 8,000 people that he was the real Elon Musk in a livestream.

Infinite Scalability and Automation

We may live to see the day when a 16-year-old carries out a $1 billion fraud from their bedroom. That is because AI-powered tools allow fraudsters of any age or capability to generate thousands of convincing fake identities quickly for next to nothing. In some respects, that capability is already here: the website OnlyFake.org claims to be able to generate over 20,000 fake identities per day using neural networks.

Targeting The Frictionless Onboarding Experience

When it comes to balancing convenience and security, most banks and fintech companies have focused on the former, building streamlined onboarding processes with minimal human intervention, designed to reduce friction as much as possible. While this is convenient and sought after by legitimate customers, it can also make these companies an attractive target.

The Deepfake Challenge: A Call To Action

“You get 15,000 real driver’s licenses, all with front, back and selfies to use as a bonus,” says one of the biggest deepfake sellers on Telegram. “This could be very useful if you are planning to abuse referral schemes, promotions, or just trying to verify an account.”

There is a real appetite to exploit any loophole online. It is not a question of whether AI faces will defeat liveness detection at scale; they will. The only questions are how soon, and how much damage will be done.

Fintechs will have to look at how they approach identity verification very carefully.

Are they up to the challenge? Time will tell.
