
Deepfake voice attacks are here to put detection to the real-world test

It’s put up or shut up time for biometric software companies and public researchers claiming they can detect deepfake voices.

Someone sent robocalls as part of a disinformation campaign in the United States purporting to be President Joe Biden. The calls sounded like Biden telling people not to vote in a primary election, but the voice could have been generated by AI. No one, not even vendors selling deepfake detection software, can agree on which it was.

Software maker ID R&D, a unit of Mitek, is stepping into the market. It responded to the previous big voice cloning scandal in the U.S., involving pop star Taylor Swift, with a video showing that its voice biometrics liveness code can differentiate real recordings from digital impersonations.

The electoral fraud attempt poses a different kind of challenge.

A Bloomberg article this week looked at what might have been the first deepfake audio dirty trick played on Biden, but no one knows whether the voice belonged to a human actor or an AI.

Citing two other detector makers, ElevenLabs and Clarity, Bloomberg could find no certainty.

ElevenLabs’ software found it unlikely that the recording behind the misinformation attack was synthetic. Clarity disagreed, reportedly finding it 80 percent likely to be a deepfake.

(ElevenLabs, which focuses on creating synthetic voices, became a unicorn this month after raising an $80 million series B; executives say the company is valued at more than $1 billion, according to Crunchbase.)

As is often the case, some hope springs from research, though here it is qualified.

A team of students and alumni from the University of California, Berkeley says it has developed a detection method that can operate with few to no errors.

Of course, that is in a lab setting, and the research team feels the method will require “proper context” to be understood.

The team fed a deep-learning model raw audio, from which it extracts multi-dimensional representations. The model uses these so-called embeddings to distinguish real speech from fake.
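To make that pipeline concrete, below is a minimal sketch of how a detector of this general shape can be wired together. It assumes a PyTorch implementation with an arbitrary convolutional encoder; the architecture, layer sizes and names (AudioEmbedder, DeepfakeDetector) are illustrative assumptions, not the Berkeley team’s published model.

```python
# Illustrative sketch only: a raw-audio encoder that produces fixed-size
# embeddings, plus a binary head that scores clips as real or synthetic.
# Architecture, sizes and names are assumptions, not the Berkeley team's model.
import torch
import torch.nn as nn


class AudioEmbedder(nn.Module):
    """Maps raw waveforms (batch, samples) to multi-dimensional embeddings."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=400, stride=160),  # ~25 ms frames at 16 kHz
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=3, stride=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                         # pool over time
        )
        self.proj = nn.Linear(128, embed_dim)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        x = self.encoder(waveform.unsqueeze(1))              # (batch, 128, 1)
        return self.proj(x.squeeze(-1))                      # (batch, embed_dim)


class DeepfakeDetector(nn.Module):
    """Binary classifier over the embeddings: real vs. synthetic voice."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.embedder = AudioEmbedder(embed_dim)
        self.head = nn.Linear(embed_dim, 1)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.head(self.embedder(waveform)))


# Usage: score a 3-second clip of 16 kHz audio (random noise here, standing in
# for a real recording). Output is a probability-like score that the clip is fake.
detector = DeepfakeDetector()
clip = torch.randn(1, 16000 * 3)
print(detector(clip).item())
```

The design choice the research points to is working from raw waveforms rather than hand-crafted features, letting the encoder learn which representations separate genuine voices from cloned ones.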
