Facial Recognition Misidentifies California Lawmakers as Criminals
The American Civil Liberties Union (ACLU) of Northern California recently tested Rekognition, Amazon's facial recognition technology, and found that it misidentified 26 California lawmakers as criminals. The ACLU scanned photos of all 120 state lawmakers against a database of 25,000 arrest mugshots, and 26 came back as false matches. San Francisco Assemblyman Phil Ting was among those flagged, despite never having been arrested.
Beyond not being a criminal, Ting is working to move AB1215, a bill that would ban the use of facial recognition technology on police body cameras, through the California Legislature.
"Clearly, this software is faulty," Ting told the San Francisco Chronicle. "It really should not be used by any law enforcement agency at this point. Body cameras are there to build trust, not to tear it down."
Other critics of facial recognition technology point out that it disproportionately misidentifies people of color. In the ACLU test, more than half of the 26 misidentified lawmakers were people of color.
Facial recognition technology on body cameras could be a major privacy violation for the entire populace. Matt Cagle, a technology and civil liberties attorney at the ACLU, told the Chronicle that facial recognition in body cameras represents "a public safety hazard and a threat to our fundamental rights."
"Body cameras were promised as a police accountability tool, not as a surveillance system," Cagle said. "People should be able to walk down the street without having their face logged into a government database."
Law enforcement groups, however, oppose the bill, arguing that facial recognition technology could be a strong tool for fighting crime.
"We're concerned that (Ting's bill) is an attempt to wipe out something that could identify repeat offenders, could solve cold cases and old crimes and deter future crime," Shaun Rundle, deputy director of the California Peace Officers' Association, told the Chronicle.
Amazon also took issue with the test, arguing that the ACLU set the match threshold to 80 percent confidence, whereas Amazon recommends law enforcement agencies use the higher bar of 99 percent. The company claims that the ACLU and the California lawmakers are deliberately misusing the software and misrepresenting its accuracy, and that it should still be considered for law enforcement use.
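The dispute over thresholds comes down to how a confidence (similarity) score is turned into a yes/no match. A minimal sketch of that logic, with invented face labels and scores purely for illustration (none of this is real Rekognition output or the ACLU's actual data):

```python
# Hypothetical sketch of threshold-based face matching.
# The candidate names and similarity scores are invented for illustration.

def matches_above_threshold(candidates, threshold):
    """Return the candidates whose similarity score meets the threshold."""
    return [name for name, similarity in candidates if similarity >= threshold]

# Invented similarity scores for a single probe photo.
candidates = [("mugshot_A", 99.2), ("mugshot_B", 87.5), ("mugshot_C", 81.0)]

# At the 80 percent setting the ACLU used, all three count as matches.
print(matches_above_threshold(candidates, 80.0))

# At the 99 percent setting Amazon recommends, only the strongest remains.
print(matches_above_threshold(candidates, 99.0))
```

The point of contention is visible here: lowering the threshold surfaces more candidate matches, and with them more potential false positives.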
"We continue to advocate for federal legislation of facial recognition technology to ensure responsible use," the Amazon spokesman told the Chronicle in an email.
While no California state agency currently uses facial recognition technology, some cities, namely San Francisco and Oakland, have already passed preemptive laws banning the use of facial recognition software by all city agencies, including police departments. If AB1215 passes, that prohibition will extend statewide.