Are you ready to face the dark side of AI? Deepfakes, voice clones and counterfeit ID documents are no longer distant threats—they are infiltrating the financial world and helping bad actors open bank accounts as we speak. It has become increasingly challenging to distinguish between authentic and fraudulent identities during remote onboarding, for humans and AI alike. The bottom line: you can’t trust what you see or hear anymore, and more AI may not be the answer.
Bad AI vs. Good AI – who will win?
Some might say combating deepfakes and other AI-generated fraud requires leveraging advanced AI-driven solutions. But is that really true? Or is it a never-ending arms race? While AI algorithms can detect anomalies in images and voice samples, these systems can still be deceived by increasingly complex attacks. That said, fraudsters often manage to get their way with far less sophisticated methods. A study by the Chaos Computer Club showed how video identity verification systems could be tricked by deepfake videos created with relatively simple technology, without the need for cutting-edge tools. Similarly, fraudsters are finding other ways to exploit technology—a 404 Media investigation uncovered OnlyFake, a website enabling users to easily create highly convincing fake ID documents that would bypass KYC verification processes.
Human agents – even less effective than AI-powered tools
Many of you might think: well, if AI can be tricked, why not return to human agents? They should be able to spot irregularities, shouldn’t they? I have bad news for those of you who think this is safer. A widespread fraud strategy is to deliberately fail biometric verification so that the process is escalated to a human operator, where fraudsters see high rates of false acceptance. You will surely remember the viral story about the fraudsters who tricked a finance worker into paying out $25 million after a video call with a deepfake ‘chief financial officer’. To cut a long story short, everyone on that call was fake except the finance worker; the only "human in the loop" failed to detect the scam because everyone looked and sounded just like colleagues he recognized.
Human-led systems face increasing cost pressures due to the need for extensive training and highly skilled examiners, making person-to-person remote video call identification not only inefficient but also uneconomical.
A possible solution – NFC-based identity verification
So, does this mean that we should go back to in-person identity verification? Not necessarily. NFC-based identity document verification is the key to combatting AI-generated identity theft. It completely removes reliance on images of the ID document; instead, it relies on cryptography as its foundation. All you need is a biometric identity document with an NFC chip and a smartphone that supports NFC. Today, more than 170 countries issue International Civil Aviation Organisation (ICAO) Doc 9303 compliant chipped identity documents. The information on the chip is digitally signed by the document-issuing country, and most countries have also implemented protection against cloning. This makes NFC-based identity verification the most secure way to remotely verify the authenticity of identity documents. The document check step is simply not subject to deepfake-type fraud risks.
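To make the cryptographic idea concrete, here is a minimal sketch of the ICAO Doc 9303 "passive authentication" check that underpins this: the chip carries a Document Security Object (SOD) signed by the issuing state, listing a hash for every data group. Reading the chip over NFC, parsing the SOD and validating its certificate chain are assumed to happen elsewhere; the names and structure below are illustrative placeholders, not a real library API.

```python
# Sketch of ICAO 9303 passive authentication, assuming the chip has already
# been read over NFC and the SOD has been parsed into per-data-group hashes.
import hashlib


def passive_authentication(data_groups: dict[int, bytes],
                           sod_hashes: dict[int, bytes],
                           sod_signature_valid: bool) -> bool:
    """data_groups: raw data group contents from the chip (e.g. DG1 = MRZ, DG2 = face image).
    sod_hashes: the hashes listed in the Document Security Object.
    sod_signature_valid: result of checking the SOD signature against the
    issuing country's Document Signer / CSCA certificate chain (done elsewhere)."""
    if not sod_signature_valid:
        return False  # SOD not signed by the issuing state -> document not trusted
    for dg_number, content in data_groups.items():
        expected = sod_hashes.get(dg_number)
        if expected is None:
            return False  # data group present on chip but not covered by the SOD
        if hashlib.sha256(content).digest() != expected:
            return False  # content was altered after issuance
    return True
```

Because the trust anchor is the issuing country’s signature rather than how the document looks on camera, there is nothing in this step for a deepfake to imitate.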
To ensure that the person carrying out the identification process is also the real owner of the identity document, NFC-based identity document verification should be combined with a liveness check. Of course, liveness checks are always subject to AI-generated fraud because they are based on imagery, but the risk is reduced to a minimum since a high-resolution colour face image (without any watermarks, security elements or glare) is retrieved from the chip and used as the basis for the liveness check. The chances of a deepfake attack succeeding against NFC-based IDV are extremely low. Fraudsters typically look for the easiest way to bypass a system, and the likelihood of someone stealing a physical ID and then trying to defeat the liveness check with a deepfake is very slim—it’s simply not worth their effort.
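As a rough illustration of that face-match step, the sketch below compares the photo extracted from the chip’s DG2 with a selfie frame that has already passed a liveness check. It uses the open-source face_recognition library purely as an example; the file names and the distance threshold are illustrative, and production IDV systems will use their own matching engines.

```python
# Illustrative face match: chip photo (from DG2) vs. liveness-checked selfie.
import face_recognition

chip_photo = face_recognition.load_image_file("dg2_face.jpg")    # image decoded from DG2
selfie = face_recognition.load_image_file("selfie_frame.jpg")    # frame passed by the liveness check

chip_encodings = face_recognition.face_encodings(chip_photo)
selfie_encodings = face_recognition.face_encodings(selfie)

if not chip_encodings or not selfie_encodings:
    raise ValueError("No face found in one of the images")

# Lower distance = more similar; 0.6 is the library's commonly used default tolerance.
distance = face_recognition.face_distance([chip_encodings[0]], selfie_encodings[0])[0]
same_person = distance < 0.6
print(f"distance={distance:.3f}, match={same_person}")
```

The key point is that the reference image comes from the signed chip rather than from a photo of the document, so the comparison starts from data the fraudster cannot tamper with.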
Besides the high-resolution face image, other personal information on the chip can also be retrieved and shared with the financial institution. Since the data is read directly from the chip, extraction is 100% accurate, allowing full automation of the KYC process.
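To show what "100% accurate data extraction" looks like in practice, here is a toy parser for the machine-readable zone (MRZ) stored in DG1 of a passport, mapping it straight onto KYC fields. It assumes the two 44-character TD3 lines have already been read from the chip; check-digit validation and other document formats are omitted for brevity.

```python
# Toy mapping of a TD3 (passport) MRZ, as stored in DG1, onto KYC fields.
def parse_td3_mrz(line1: str, line2: str) -> dict:
    surname, _, given = line1[5:44].partition("<<")
    return {
        "document_type": line1[0:2].rstrip("<"),
        "issuing_state": line1[2:5],
        "surname": surname.replace("<", " ").strip(),
        "given_names": given.replace("<", " ").strip(),
        "document_number": line2[0:9].rstrip("<"),
        "nationality": line2[10:13],
        "birth_date": line2[13:19],   # YYMMDD
        "sex": line2[20],
        "expiry_date": line2[21:27],  # YYMMDD
    }


# Usage (mrz_line1 and mrz_line2 come from DG1 read over NFC):
# kyc_fields = parse_td3_mrz(mrz_line1, mrz_line2)
```

Because these fields come from signed chip data rather than OCR of a photographed document, they can be fed into onboarding systems without manual re-keying or correction.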
Governments encouraging the use of NFC-based identity verification
An increasing number of regulators and other institutions have also come to realise that more advanced methods for remote IDV are needed and that they can no longer rely on image-based document verification. Some examples are listed below:
As we navigate the rapidly changing landscape of fraud and the role generative AI is playing, prioritizing robust and secure methods like NFC-based ID document verification becomes imperative to safeguard against increasingly sophisticated threats and ensure the integrity of remote identification processes.
This content is provided by an external author without editing by Finextra. It expresses the views and opinions of the author.