WHY THIS MATTERS IN BRIEF

It doesn’t take much to get a new fake bank account, and criminals are making the most of this new deepfake tool.

Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

Deepfakes have been getting better for a while and will soon be so realistic that very few people, if any, will be able to tell them apart from real people. Now, in a step towards that fake reality, a new Artificial Intelligence (AI) powered deepfake tool called ProKYC lets nefarious actors bypass high-level Know Your Customer (KYC) measures on crypto exchanges, demonstrating a “new level of sophistication” in crypto fraud, cybersecurity firm Cato Networks said.

In an Oct. 9 report, Cato Networks’ chief security strategist Etay Maor said the new AI tool represents a significant step up from the old-fashioned methods cybercriminals used to beat two-factor authentication and KYC. Instead of buying forged ID documents on the dark web, fraudsters can now use AI-powered tools to spin brand-new identities literally out of thin air.

Cato said the new AI tool had been customised specifically to target crypto exchanges and financial firms whose KYC protocols include matching webcam pictures of a new user’s face to their government-issued ID document, such as a passport or a driver’s license.

Bank fraud, as easy as 123.

A video provided by ProKYC demonstrated how the tool can generate fake ID documents and accompanying deepfake videos to pass the facial recognition challenges used by one of the world’s largest crypto exchanges. In the video, the user creates an AI-generated face and integrates the deepfake image into a template of an Australian passport.

Next, the ProKYC tool creates an accompanying deepfake video and image of the AI-generated person, which are then used to successfully bypass the KYC protocols on the Dubai-based crypto exchange Bybit.
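
At its core, the check being defeated here is usually a face-match comparison: the exchange extracts a face from the uploaded ID document, extracts a face from the live webcam feed, and accepts the applicant if the two are similar enough. The sketch below is a minimal illustration of that idea, not any exchange’s actual implementation; get_face_embedding is a hypothetical stand-in for whatever face-recognition model a real system uses, and the 0.85 threshold is an assumed, illustrative value.

```python
# Minimal sketch of a typical KYC face-match step (illustrative only).
import numpy as np

def get_face_embedding(image: np.ndarray) -> np.ndarray:
    """Hypothetical helper: return a face embedding vector for an image."""
    raise NotImplementedError("swap in a real face-recognition model here")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def kyc_face_match(id_photo: np.ndarray, webcam_frame: np.ndarray,
                   threshold: float = 0.85) -> bool:
    """Accept the applicant if the webcam face matches the ID photo."""
    return cosine_similarity(get_face_embedding(id_photo),
                             get_face_embedding(webcam_frame)) >= threshold
```

A tool like ProKYC sidesteps this kind of check by generating the passport photo and the “live” webcam footage from the same synthetic face, so the two sides of the comparison match almost perfectly, which is why defences tend to lean on liveness detection and document forensics rather than the face match alone.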

Cato said that with AI-powered tools like ProKYC, threat actors are now far more capable of creating new accounts on crypto exchanges, a practice known as New Account Fraud (NAF).

The ProKYC website offers a $629 annual subscription package that includes a camera, a virtual emulator, facial animation, fingerprints, and verification photo generation. Outside of crypto exchanges, it also claims to be capable of bypassing KYC measures for payment platforms Stripe and Revolut, among others.

Maor said properly detecting and safeguarding against this new breed of AI fraud is challenging, as overly strict systems can produce false positives, whereas lax controls let fraudulent actors slip through the net.

“Creating biometric authentication systems that are super restrictive can result in many false-positive alerts. On the other hand, lax controls can result in fraud,” he said.
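
To make that trade-off concrete, here is a small, self-contained sketch using entirely synthetic numbers; the score distributions and thresholds are assumptions for illustration, not figures from Cato, ProKYC, or any vendor. It treats each verification session as receiving a “fakeness” score from an automated detector and shows how moving the alert threshold trades flagged legitimate applicants against missed synthetic identities.

```python
# Illustrative only: synthetic detector scores, not data from any real system.
# Each score is how "fake" an automated check thinks a verification session looks.
import numpy as np

rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.30, 0.12, 10_000)   # legitimate applicants
deepfake_scores = rng.normal(0.55, 0.12, 10_000)  # AI-generated applicants

for threshold in (0.35, 0.50, 0.65):  # flag a session if its score >= threshold
    genuine_flagged = np.mean(genuine_scores >= threshold)  # false positives
    fraud_missed = np.mean(deepfake_scores < threshold)     # fraud let through
    print(f"threshold={threshold:.2f}  "
          f"genuine flagged={genuine_flagged:.1%}  "
          f"fraud missed={fraud_missed:.1%}")
```

Because the two score distributions overlap, no single threshold drives both error rates to zero, which is the dilemma Maor is describing.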

Still, there are potential ways of detecting these AI tools, such as the services offered by Reality Defender, some of which rely on humans manually identifying unusually high-quality images and videos, as well as inconsistencies in facial movements and image quality.

The penalties for identity fraud in the United States can be severe and vary depending on the nature and extent of the crime, with maximum penalties of up to 15 years’ imprisonment and heavy fines.

In September, software firm Gen Digital, the parent company of antivirus firms Norton, Avast and Avira, reported that crypto scammers using deepfake AI videos to lure victims into fraudulent token schemes have grown increasingly active over the last 10 months – which is no surprise.
