
Strengthening biometrics to fight against sophisticated payments fraud


By Tim Brown, Global Identity Officer at Prove

Biometrics have become part of consumers’ everyday lives. Long celebrated for their many advantages, these technologies add convenience, speed, and ease to a wide variety of use cases. But as newer, savvier cyber threats continue to evolve, it is becoming increasingly clear that relying solely on biometrics to secure payments may not offer sufficient protection, and that additional, more sophisticated solutions are called for.

Cyber criminals are using tools designed to manipulate or generate fake images, videos, and audio with increasing sophistication, bypassing systems’ security with ease. In fact, the problem is so pervasive that Gartner research predicts that by 2026, 30% of enterprises will no longer rely on face biometrics and similar technologies as reliable, standalone identity verification and authentication methods. Deepfakes and digital injection attacks are particularly problematic.

Navigating new dangers: deepfakes and digital injection attacks

As bad actors deploy increasingly sophisticated threat tactics, including deepfakes and digital injection attacks, the financial industry’s go-to cybersecurity solutions will also need to evolve. It’s time to alter the industry’s current reliance on biometrics when it comes to online payments.

Deepfakes are images, live videos, or sound files that have been altered, manipulated, or produced by artificial intelligence (AI). Fraudsters are deploying this generated or manipulated content to attack remote identity verification systems, and the attacks are escalating in quality, scale, and complexity. Fraudsters submit this content in a number of ways, from video replay to digital injection attacks. While the former generally involves replaying high-quality video or audio from a secondary source, the latter is far more nuanced.

Video/audio injection attacks exploit vulnerabilities in the code of application and website APIs to inject AI-generated content into the application or website. This fools the system into treating the content as if it originated from a known, genuine source. Systems that rely on media capture for remote onboarding are uniquely vulnerable to deepfakes because they must accept that media from devices outside the onboarding entity’s control. What’s more, remote onboarding systems that rely on image capture, such as document authentication and selfie capture, are inherently full of user friction.
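To make the risk concrete, consider how a server could tell trusted capture apart from injected content. The sketch below is purely illustrative and assumes a hypothetical scheme in which the capture SDK holds a device-bound key provisioned at enrollment and signs each capture against a server-issued nonce; media injected directly at the API level arrives without a valid attestation and is rejected. It does not describe any particular vendor’s implementation.

```python
import hashlib
import hmac
import secrets

# Hypothetical: a device-bound key provisioned to the trusted capture SDK at
# enrollment (an assumption for illustration, not any vendor's actual scheme).
DEVICE_KEY = secrets.token_bytes(32)

def issue_capture_nonce() -> bytes:
    """Server issues a one-time nonce before the client starts capturing."""
    return secrets.token_bytes(16)

def sign_capture(media: bytes, nonce: bytes) -> str:
    """Trusted capture path signs the media together with the server's nonce."""
    return hmac.new(DEVICE_KEY, nonce + media, hashlib.sha256).hexdigest()

def verify_capture(media: bytes, nonce: bytes, tag: str) -> bool:
    """Server accepts media only if it carries a valid attestation; content
    injected directly at the API level arrives without one and is rejected."""
    expected = hmac.new(DEVICE_KEY, nonce + media, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

nonce = issue_capture_nonce()
genuine_tag = sign_capture(b"camera-frame-bytes", nonce)
print(verify_capture(b"camera-frame-bytes", nonce, genuine_tag))       # True
print(verify_capture(b"injected-deepfake-bytes", nonce, genuine_tag))  # False
```

Because each capture is bound to a one-time nonce, straightforward replay of a previously signed capture also fails verification.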

When biometrics are used for remote payments, cyber criminals can trick these systems with deepfakes and wreak havoc with large-scale, remote attacks. According to the aforementioned Gartner research, in 2023, digital injection attacks increased by 200%.

Reducing fraud with safer, user-friendly authentication

Fortunately, there are newer low-friction, low-fraud options available that can bolster biometric security measures.

Device trust is key to preventing image- and audio-based attacks. Customers’ identities can be verified with their phones without document scans or biometric captures such as selfies. Instead, device intelligence combines smart personal-information verification with real-time risk scoring for more trustworthy results. For example, phone signals can be analyzed during transactions to detect concerns such as phone takeovers, SIM swaps, and synthetic identity fraud. By ensuring that the onboarding user is actually in possession of the device, that the device is trusted, and that the onboarding user owns that device, financial institutions can ensure a secure capture and more dependable identity verification and authentication.
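As a rough illustration of how such phone and device signals might feed a real-time risk decision, here is a toy sketch in Python. The signal names, weights, and threshold are invented for illustration and do not reflect Prove’s actual scoring model.

```python
from dataclasses import dataclass

@dataclass
class PhoneSignals:
    """Illustrative device-intelligence signals gathered during a transaction."""
    sim_swapped_recently: bool    # SIM changed within a risky window
    number_ported_recently: bool  # possible phone takeover
    device_tenure_days: int       # how long this number/device pairing has existed
    possession_verified: bool     # user proved possession of the device

def risk_score(s: PhoneSignals) -> float:
    """Toy real-time risk score in [0, 1]; higher means riskier.
    Weights are made up for illustration."""
    score = 0.0
    if s.sim_swapped_recently:
        score += 0.4
    if s.number_ported_recently:
        score += 0.3
    if not s.possession_verified:
        score += 0.3
    if s.device_tenure_days < 30:
        score += 0.2
    return min(score, 1.0)

signals = PhoneSignals(sim_swapped_recently=False, number_ported_recently=False,
                       device_tenure_days=400, possession_verified=True)
print("approve" if risk_score(signals) < 0.5 else "step up authentication")
```

In practice a high score would typically trigger step-up authentication rather than an outright decline.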

There is also great potential in AI, cybersecurity’s double-edged sword. For example, the previously mentioned vulnerabilities could one day be addressed with techniques being pioneered by D-CAPTCHA, which augment liveness detection by using machine learning to change the liveness prompts in real time. This challenges the deepfake model to generate matching content on the fly, which exceeds its capabilities. Such techniques are still nascent and not yet commercially available; however, they shed light on new avenues the industry could one day explore.
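A minimal sketch of the underlying idea, randomizing liveness prompts and enforcing a tight response window so a deepfake pipeline cannot synthesize a matching response in time, might look like the following. The prompt pool and five-second deadline are assumptions for illustration, not D-CAPTCHA’s published design.

```python
import secrets
import time

# Pool of liveness prompts the server can draw from at random (illustrative).
PROMPTS = ["turn your head to the left", "blink twice", "smile",
           "read aloud: {code}", "cover one eye with your hand"]

def issue_challenge() -> dict:
    """Pick an unpredictable prompt and start a tight response window, so a
    deepfake model would have to synthesize a matching response live."""
    prompt = secrets.choice(PROMPTS).format(code=secrets.token_hex(3))
    return {"prompt": prompt, "issued_at": time.time(), "deadline_s": 5.0}

def response_in_time(challenge: dict, responded_at: float) -> bool:
    """Responses arriving after the window are treated as suspect."""
    return responded_at - challenge["issued_at"] <= challenge["deadline_s"]

challenge = issue_challenge()
print(challenge["prompt"])
print(response_in_time(challenge, time.time() + 2))   # True: within window
print(response_in_time(challenge, time.time() + 10))  # False: too slow
```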

While biometric technology alone might be appropriate for certain uses, when it comes to online payments, today’s high-tech attacks call for the additional support of alternative, sophisticated solutions. Unfortunately, incidents involving AI-generated media are only expected to increase. This is the time to act — before a cyber criminal or cyber-criminal organization has the opportunity to victimize more customers or financial institutions. The financial industry has the power to offer customers fast, convenient, frictionless ways to pay, all while protecting against fraudsters’ latest tactics.

About the author

Tim Brown is the Global Identity Officer at Prove and a well-respected expert in biometrics, authentication, and identity. He has led product architecture and development and is a frequent subject matter expert, participating in numerous standards organizations, round tables, conferences, webinars, and podcasts on subjects ranging from biometric and forensic data interchange to identity verification and the impact of mobile driver’s licenses on vertical markets. Brown holds multiple patents in authentication, biometric liveness, and identity acquisition.
