US police tracking people using AI that identifies non-biometric attributes

U.S. state and local police departments are using an AI model that helps them evade facial recognition prohibitions.
MIT Technology Review reports on how police forces and universities across the U.S. are using a tool called Track. Developed by Veritone, the AI model can track people using attributes like gender, hair color and style, body size, clothing and accessories.
Use of Track is expanding: federal attorneys at the Department of Justice used the tool for criminal investigations last summer. The Department of Homeland Security, which includes immigration agencies, and the Department of Defense use Veritone’s other AI tools, including facial recognition.
On its website, Veritone advertises Track as enabling investigators to follow individuals and vehicles across videos without using any personally identifiable information (PII); compliance with privacy laws is a particular selling point. Veritone CEO Ryan Steelberg told Technology Review that Track can also follow people when their faces are obscured or not visible.
The tool raises privacy concerns similar to those around facial recognition, according to the American Civil Liberties Union, which learned of Track through MIT Technology Review. The ACLU said it was the first time it had seen a non-biometric tracking system used at scale in the U.S. In an interview, Jon Gacek, general manager of Veritone’s public sector business, said Track is not a general surveillance tool but is intended to speed up the identification of relevant parts of videos.
Steelberg claimed Track is less than a year away from being operational on live video feeds; it currently runs only on recorded video. He said the number of attributes Track uses to identify individuals will keep expanding. The majority of Veritone’s clients are media and entertainment companies, but its fastest-growing segment is the public sector, which currently makes up just six percent of the Irvine, California-based company’s business.
In the U.S., police use of facial recognition is banned outright in some places, such as San Francisco and Oakland, California, and curtailed in others, with laws in Maine and Montana limiting its use. These laws usually refer to “biometric data,” but attributes such as body size fall into a grey area: body shape is mutable and can vary, while biometrics such as irises, faces and fingerprints are generally immutable.
Police departments across the U.S. are weighing biometrics use in different ways. The Milwaukee Police Department is considering engaging biometrics firm Biometrica as a provider in exchange for giving the company access to the department’s database of 2.5 million mugshot photos. In Ohio, a legal case surfacing legal and ethical tensions over police use of facial recognition could have far-reaching consequences for law enforcement and civil liberties. In Michigan, Detroit police have been sued by a woman who says she was wrongfully arrested on the basis of a facial recognition match; Detroit Police say facial recognition was not used in the case.
Article Topics
biometrics | data privacy | police | regulation | United States | Veritone | video surveillance