“With the ban expiring, we actually have no safeguards, as we speak,” Assemblymember Phil Ting, D-San Francisco, who sponsored the 2019 moratorium, told Government Technology earlier this month. “Body cameras and facial recognition software could be used for any purpose. There are no regulations. There are no best practices.”
Many who advocate for restoring limits or bans cite concerns that the tool can make mistakes — prompting officers to arrest the wrong person — and that it can intrude on residents’ privacy and create an atmosphere of public surveillance that might chill free speech and protest.
But new measures are in the works. Ting’s new bill, AB 642, would allow police use of the tech under certain restrictions. Meanwhile, Assemblymember Lori Wilson’s bill, AB 1034, would echo the original state ban by preventing law enforcement from installing, activating or using biometric surveillance “in connection with an officer camera or data collected by an officer camera.”
Wilson brought her ban before the state’s Public Safety Committee on March 28. During that hearing, several assemblymembers said they sympathized with the concerns but argued that fully prohibiting the tool’s use was unnecessary.
“The fact that one person could be misidentified and brought into the criminal justice system is just one person too many. We should not be willy-nilly with this idea that, ‘Yeah, we captured a whole lot of criminals, but we put a few people in jail by accident,’” said committee chair Reginald Jones-Sawyer Sr. “[But] I think this technology could be very effective once it’s perfected … Part of the discussion we really should be having is, how do we perfect this? Or how do we put safeguards to ensure that if someone is misidentified, it doesn’t get to a point where they end up in prison or in jail?”
PRIVACY RISKS, EXPECTATIONS?
Using facial recognition on police body cameras means residents may be tracked and identified whenever they pass by police, Electronic Frontier Foundation (EFF) legislative associate Chao Jun Liu said during the hearing. This can be especially “unsettling” when it means being identified at sensitive locations like houses of worship or facilities that provide abortions.
“We all deserve a level of anonymity in our daily lives and in civic expression,” he said. “Face recognition technology on body cams makes it so anytime someone walks by a police officer, they could be recorded and identified without reason and without even knowing it.”
The bill describes similar concerns: “Widespread use of facial recognition on police body cameras would be the equivalent of requiring every Californian to show their photo ID card to every police officer they pass.”
Public Safety Committee vice chair Juan Alanis, however, said that people should not expect privacy when in public, given today’s technology climate.
“We have cars that are doing surveillance … We have cameras up on our houses. We have cameras at the banks, at the ATMs. Cameras are everywhere now,” Alanis said. “Pretty much everybody should assume that they're being videoed.”
Alanis, who said he previously served as a police sergeant, said he would support restrictions but that banning facial recognition technology outright would be “a terrible idea.”
WRONGFUL ARRESTS
Facial recognition technology also makes mistakes and has resulted in at least five Black men being misidentified and arrested, said Carmen-Nicole Cox, director of government affairs at the ACLU of California and co-sponsor of the bill. One of the men was arrested in front of his children and held in prison for over 30 hours, she said.
Police also may be more likely to draw weapons on residents who the technology falsely flags as violent criminals, creating a dangerous situation, Cox said.
“One wrongful misidentification, apprehension, detention, set of charges, prosecution — no matter the outcome — is traumatic, is problematic,” Cox said. “It is an invasion of privacy. It is an abusive misuse of government resources. And of course, for Black men especially, it is fraught with tension and the risk of police violence from which people don't come back from.”
Facial recognition tech has often been found to produce more false positives when trying to identify women and people of color, Cox said.
Plus, the tools perform best when comparing high-quality images, something unlikely to be generated by body-worn cameras. As the bill notes, “body cameras produce low-quality footage that is blurry, skewed and in near-constant motion.”
Alanis asked how often police had used facial recognition to successfully apprehend violent criminals or find missing people.
“That testimony is conspicuously absent here today,” Cox replied.
One witness testified in favor of police use of the tool: a man who gave no name but identified himself as speaking on behalf of the Los Angeles County sheriff. He said that law enforcement will “need every tool available at our disposal” for safely hosting the 2028 Olympics and the associated crowds.
Wilson, however, said police have other public safety resources available that carry fewer risks to residents’ rights.
TO BAN OR LIMIT?
Jones-Sawyer, who is Black, recalled the ACLU running a demo of facial recognition that misidentified him as a criminal — and then having a law enforcement officer deny his account of that experience.
“For the first time, as a Black man, I realized how devastating it is for law enforcement to accuse you of something that you did not do. Basically, he called me a liar, in this committee, to my face, and to everybody in the room,” Jones-Sawyer said.
He said any misidentifications should be taken seriously, but that he sees facial recognition technology as something that can be fixed and improved, without resorting to bans.
Assemblymember Tom Lackey similarly said the tool is too useful to reject entirely: “Perfection can be the enemy of good and there’s a lot more good comes [from this], especially from the public safety aspect.”
The committee moved to send the bill to the Privacy and Consumer Protection Committee in a 5-1 vote, with Lackey dissenting.
AB 642’S RESTRICTION-BASED APPROACH
Discussing his own bill with GovTech, Ting said he doesn’t believe a full ban or moratorium could clear today’s Legislature. Instead, his proposed AB 642 puts limits around police use of facial recognition.
“I decided to figure out, what were some of the best practices that we could employ across the state to make sure that facial recognition software was used for public safety, but not to infringe on our civil liberties?” Ting said.
The bill includes measures like requiring police to establish and publicize a written policy guiding their use of facial recognition systems, and to only allow authorized and trained personnel to use the technology.
The policy also would prevent facial recognition from serving as the sole justification for an arrest, search or affidavit for a warrant and explicitly bans using the technology to identify people solely based on protected characteristics — like sexual orientation or race.