Deepfakes: The Growing Threat to Identity Verification and Authentication

March 7, 2023 | Biometric Summit 2023

This article was written by deepfake detection startup Duck Duck Goose to accompany its participation as a partner in a live session, "The threat of deepfakes to biometrics and how to counteract them", at the Goode Intelligence Biometric Summit 2023 on 23rd March. Register here to attend.


Deepfakes are hyper-realistic fake images, videos, and audio clips, and they are becoming increasingly prevalent. The technology behind deepfakes is based on artificial neural networks that can analyze and imitate faces, voices, and other aspects of media, putting the digital world to the test. This makes it possible, for instance, to generate images and videos of people who do not exist, or to puppeteer someone else's face so that it appears to say or do something it never did. Even more worrying is that a single image of the target can be enough to create these deepfakes, raising serious concerns about identity fraud.
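To make the "imitate faces" idea concrete, a common face-swap design uses one shared encoder and a separate decoder per identity. The sketch below is purely illustrative: the networks are stand-in random linear maps rather than trained models, and all names and dimensions are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a "face" is a flattened 8x8 grayscale patch.
FACE_DIM, LATENT_DIM = 64, 16

# One shared encoder, plus one decoder per identity (A and B).
# In real face-swap pipelines these are deep convolutional networks
# trained on many images of each person; here they are plain random
# linear maps, just to show the data flow.
W_enc = rng.normal(size=(LATENT_DIM, FACE_DIM))
W_dec_a = rng.normal(size=(FACE_DIM, LATENT_DIM))
W_dec_b = rng.normal(size=(FACE_DIM, LATENT_DIM))

def encode(face):
    """Map a face to a shared latent code (pose, expression, lighting)."""
    return W_enc @ face

def decode(latent, w_dec):
    """Render a latent code in one identity's appearance."""
    return w_dec @ latent

# The swap: encode a frame of person A, then decode it with B's decoder,
# producing B's identity wearing A's expression and pose.
frame_of_a = rng.normal(size=FACE_DIM)
swapped = decode(encode(frame_of_a), W_dec_b)
assert swapped.shape == (FACE_DIM,)
```

Because the encoder is shared between identities, the latent code captures what the face is doing rather than whose face it is, which is what lets the attacker drive one face with another.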

Recently, deepfake software and websites have become increasingly accessible, allowing anyone to create a deepfake easily, without any technical expertise. This growing use of deepfake technology poses a significant threat to digital identity verification and facial biometric authentication, which are increasingly used to verify and authenticate individuals, for example when accessing secure buildings or online accounts. Deepfakes can trick the technology used for these identification and authentication purposes, so it is crucial to develop improved digital identity verification and authentication strategies to address this growing threat. In this article, we discuss three examples of how deepfakes can be misused for identity fraud.

Deepfakes for live meetings

Criminals can use deepfake software to impersonate another person, or even a non-existent person, during a live meeting. With a deepfake video they can pretend to be someone else and gain access to secure systems or information. They can even replace the face on their ID document with the same deepfake running in real time, allowing them to commit fraud in, for example, financial transactions.

Fooling and bypassing liveness checks

Liveness checks are used to determine whether the face being scanned belongs to a live person. With the emergence of deepfakes, there are ways to get around this check. For example, criminals can inject a deepfake into a virtual phone environment, or tamper with a smartphone's camera stream so that the phone appears to be recording a live face while a pre-recorded or live deepfake video is injected instead.
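One common active-liveness defense against injected or pre-recorded video is challenge-response: ask the user to perform a randomly chosen action that a fixed clip is unlikely to contain. The sketch below is a minimal, hypothetical version of that flow; `detect_action` stands in for a real per-frame computer-vision model, and the challenge names and timeout are invented for illustration.

```python
import random
import time

# Illustrative challenge set; real systems use head pose, blink, and
# expression cues detected by a vision model.
CHALLENGES = ["turn_head_left", "turn_head_right", "blink_twice", "smile"]

def detect_action(frames, action):
    # Stub: a real system would run a pose/expression model per frame.
    # Here "frames" is simply a list of action labels.
    return action in frames

def liveness_check(capture_frames, timeout_s=5.0):
    challenge = random.choice(CHALLENGES)
    deadline = time.monotonic() + timeout_s
    frames = capture_frames(challenge)       # prompt the user and record
    if time.monotonic() > deadline:
        return False                         # too slow: likely not live
    return detect_action(frames, challenge)

# A live user can comply with whatever challenge is issued:
live_user = lambda challenge: [challenge]
assert liveness_check(live_user) is True

# A fixed, pre-recorded clip cannot anticipate the random challenge:
replayed_clip = ["look_straight"]
assert liveness_check(lambda challenge: replayed_clip) is False
```

The weakness the article describes remains: a real-time deepfake driven by the attacker's own head movements can comply with the challenge, which is why challenge-response alone is not enough against injected live deepfakes.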

Deepfake morphs and physical documents

Deepfakes also make it easier to create convincing morphs at scale, which can then be used to log in to secure systems. Facial authentication systems and face-match technology can fail in this case and treat the morph as a live face. Criminals can even use these morphs to create physical documents, enabling identity document fraud, a serious issue that is difficult to detect. In the world of digital identity, current technology offers techniques to detect selfie fraud attempts, such as replay attacks or mask attacks, and reject them. However, these techniques are still not accurate enough and can make mistakes when it comes to detecting specific deepfakes.
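Why do face-match systems accept morphs? In a face recognition system, each face is reduced to an embedding vector, and two faces "match" when their embeddings are similar enough. A morph blends two faces, so its embedding lands roughly midway between both identities and can match each of them. The toy demonstration below uses random vectors in place of real face-recognition embeddings; the dimensionality and threshold are illustrative assumptions.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity, the usual face-match score on embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(42)

# Toy 128-d "embeddings" for two distinct people. Real embeddings come
# from a face-recognition network; random vectors stand in, and two
# random high-dimensional vectors are nearly orthogonal (low similarity).
person_a = rng.normal(size=128)
person_b = rng.normal(size=128)

# A morph blends the two faces; in embedding space it sits between them.
morph = (person_a + person_b) / 2.0

MATCH_THRESHOLD = 0.5  # illustrative acceptance threshold

# The morph matches BOTH identities, while the two genuine faces do not
# match each other -- this asymmetry is what makes morphing attacks work.
assert cosine(morph, person_a) > MATCH_THRESHOLD
assert cosine(morph, person_b) > MATCH_THRESHOLD
assert cosine(person_a, person_b) < MATCH_THRESHOLD
```

A single enrolled morph photo (for example on a passport) can therefore be verified successfully by two different people, which is exactly the document-fraud scenario described above.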

To protect the digital world from misleading deepfakes, it is important to further develop the technology behind deepfake detection and adopt a multi-level security approach. This can help detect and prevent deepfake attacks before they cause damage. While deepfake technology can be used for positive purposes, it is important to be aware of its potential dangers and take steps to mitigate them.