Assemblyman Phil Ting, D-San Francisco, one of the California lawmakers that the ACLU says was mislabeled, is sponsoring a bill that would ban law enforcement agencies in California from using facial recognition technology in body-worn cameras.  AP Photo/Rich Pedroncelli

Face-Recognition Tool Misidentified State Lawmakers as Criminals: ACLU

The group tested Amazon's Rekognition on photos of California's lawmakers. The company says the test wasn't fair.

Twenty-six lawmakers in California were incorrectly matched with mugshots in a recent test of Amazon’s facial recognition software conducted by the American Civil Liberties Union of Northern California.

The ACLU ran photos of all 120 members of the California State Legislature through Rekognition, Amazon’s facial recognition software, which matched roughly 20 percent of them to mugshots in a separate database. Assembly member Phil Ting, one of the lawmakers falsely identified as a criminal, said at a press conference Tuesday that the experiment illustrates the limitations of the technology, which particularly struggles to correctly identify people of color—especially women.

“We wanted to run this as a demonstration about how this software is absolutely not ready for prime time,” he said.

Ting, a Democrat from San Francisco, is the primary sponsor of a bill that would ban law enforcement agencies in California from using facial recognition technology in body-worn cameras. 

Deploying the software in body cameras, Ting argued, would turn a tool designed to increase trust and transparency into an instrument of constant surveillance, with potentially devastating consequences for people arrested after being misidentified.

“It’s no laughing matter if you are an individual who is trying to get a job, if you are an individual trying to get a home,” he said. “There are real people who could have real impacts.”

The federal government so far has not regulated facial recognition technology, leaving states and cities to construct their own restrictions. A few cities, including San Francisco, have prohibited all departments from using the software. The state of Oregon has banned police from deploying the technology in body-worn cameras. Ting’s bill goes further, banning the use of all “biometric surveillance technology.”

The California test comes a year after the ACLU conducted a similar experiment using photos of members of Congress. In that test, the software incorrectly matched 28 of them with mugshots of other people.

Amazon criticized the ACLU’s method of testing its technology. In both cases, the company said, the fault for the false matches lay not with the software but with the settings the ACLU used.

Matt Cagle, an attorney for the ACLU, said the organization used the software’s default setting, an 80 percent confidence threshold, under which any face scoring 80 percent similarity or higher is returned as a match. For use by law enforcement, Amazon recommends a threshold of at least 99 percent, which requires a much closer match and therefore returns far fewer results.
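
Both the default and the recommended setting correspond to a single parameter in Amazon’s API. What follows is a minimal sketch of the kind of query at issue, assuming the comparison is done with Rekognition’s SearchFacesByImage call through the boto3 SDK; that assumption, along with the region, collection name, and image file below, is illustrative rather than a description of the ACLU’s actual setup.

    import boto3

    # Sketch only: the region, collection name, and image file are hypothetical.
    rekognition = boto3.client("rekognition", region_name="us-west-2")

    with open("lawmaker_headshot.jpg", "rb") as f:
        image_bytes = f.read()

    # Default behavior: a FaceMatchThreshold of 80 returns any face in the
    # collection that scores 80 percent similarity or higher as a "match."
    default_result = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",   # hypothetical collection of mugshots
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=80,
    )

    # Amazon's recommendation for law enforcement: a 99 percent threshold,
    # which discards lower-confidence candidates and returns fewer matches.
    strict_result = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=99,
    )

    print(len(default_result["FaceMatches"]), "candidate matches at the 80 percent default")
    print(len(strict_result["FaceMatches"]), "candidate matches at the 99 percent threshold")

Raising the threshold does not make the underlying model more accurate; it simply discards candidate matches the model is less confident about, which is why Amazon pairs the 99 percent recommendation with human review of any result.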

“The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines,” a spokeswoman said. “As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking.”

Ting’s bill passed the Assembly in May and is currently awaiting a vote in the Senate.
