© 2024 | Southern Oregon University
1250 Siskiyou Blvd.
Ashland, OR 97520
541.552.6301 | 800.782.6191
These wrongly arrested Black men say a California bill would let police misuse face recognition

California's Legislature is considering making it illegal for police to use facial recognition technology as the sole justification for a search or arrest. Critics say it won't stop wrongful arrests.
Rahul Lal / Sipa USA via Reuters

Three men falsely arrested based on face recognition technology have joined the fight against a California bill that aims to place guardrails around police use of the technology. They say it will still allow abuses and misguided arrests.

In 2019 and 2020, three Black men were accused of, and jailed for, crimes they didn't commit after police used face recognition to falsely identify them. Their wrongful arrest lawsuits are still pending, but their cases bring to light how AI-enabled tools can lead to civil rights violations.

Now all three men are speaking out against pending California legislation that would make it illegal for police to use face recognition technology as the sole reason for a search or arrest. Instead, it would require corroborating evidence.

The problem, critics say, is that a possible face recognition "match" is not evidence, and that it can lead investigations astray even if police seek corroborating evidence.

After a contentious hearing today, the Senate Public Safety Committee unanimously voted to advance the bill. It had cleared the Assembly last month without opposition.

Such a bill "would not have stopped the police from falsely arresting me in front of my wife and daughters," Robert Williams told CalMatters. In 2020, Detroit police accused Williams of stealing watches worth thousands of dollars, in what is believed to be the first known wrongful arrest based on face recognition in the United States, after the technology matched a surveillance video to a photo of Williams in a state database. Investigators put his photo in a "six-pack lineup" with five others, and he was chosen by a security guard who had seen only a surveillance image.

"In my case, as in others, the police did exactly what AB 1814 would require them to do, but it didn't help," said Williams, who is Black. "Once the facial recognition software told them I was the suspect, it poisoned the investigation. This technology is racially biased and unreliable and should be prohibited.

"I implore California lawmakers to not settle for half measures that won't actually protect people like me."

But the bill鈥檚 author, Democratic Assemblymember Phil Ting of San Francisco, maintained that because it bans face recognition technology from being the sole criteria for a warrant, search or arrest, it would prevent wrongful apprehensions such as those in Detroit.

And he stressed that it would improve the status quo for Californians.

"Law enforcement agencies in the state do not need any permission from anyone to do anything on facial recognition right now," Ting said. "Nothing in any state law provides guidance in that particular area.

"This actually takes a good first step to really provide some security, to provide some civil rights protections, and to ensure that we take the first step to regulate facial recognition technology."

The first face recognition searches in the United States took place more than two decades ago. The process begins with a photo of a suspect, typically taken from security camera footage. Face recognition on your iPhone is trained to match your photo, but the kind used by law enforcement agencies searches databases of mug shots or driver's license photos that can contain millions of images, and it can fail in numerous ways. Tests by researchers have shown that the technology is less accurate when attempting to identify people from certain demographic groups, including Native American people, if a probe image of a suspect is low quality, or if the image in a database is outdated.

After a computer assembles a list of possible matches from a database of images, police pick a suspect from an array of candidates, then show that photo to an eyewitness. Although people tend to think they're good at it, eyewitness identification is a notoriously unreliable form of evidence.

Because prosecutors use face recognition to identify possible suspects but ultimately rely on eyewitness testimony, the technology can play a role in a criminal investigation but remain hidden from the accused and defense attorneys.

Directives not to treat a possible match by a face recognition system as the sole basis for an arrest sometimes don't make a difference; they failed to do so, for instance, in the case of Alonzo Sawyer, a man who was falsely arrested near Baltimore.

"Once the facial recognition software told them I was the suspect, it poisoned the investigation."
ROBERT WILLIAMS, WRONGLY ARRESTED IN DETROIT

Nijeer Parks, who spent nearly a year fighting allegations that he stole items from a hotel gift shop in New Jersey and then nearly hit a police officer with a stolen vehicle, came out in opposition to the California bill in a video last week. The police "are not going to do their job if the AI is saying 'It's him' already. That's what happened to me."

"I got lucky," he told CalMatters in a phone interview about a receipt that exonerated him and kept him out of prison. "I don't want to see anybody sitting in jail for something they didn't do."

Testifying at today's hearing was the attorney for Michael Oliver, a third Black man who was wrongly accused of assaulting a high school teacher in Detroit in 2020. "The warrant request in Michael's case was based entirely on a supposed (face recognition technology) match and a photo lineup," said attorney David Robinson. "Other than a photo lineup, the detective did no other investigation. So it's easy to say that it's the officer's fault, that he did a poor job or no investigation. But he relied on (face recognition), believing it must be right. That's the automation bias that has been referenced in these sessions.

"So despite a warning to the officer, 'investigative lead only,' that prescription was trumped by the mesmerizing effect of this machine that the officer saw as faster and smarter than he, and it must be right."

Supporters of Ting's bill include the California Faculty Association and the League of California Cities. The California Police Chiefs Association argues that face recognition can reduce criminal activity and provide police with actionable leads, and that such technology will be important as California looks to host international events such as the 2026 World Cup and the 2028 Summer Olympics in Los Angeles.

"Across the country, real-world examples of law enforcement using (facial recognition technology) to solve major crimes showcases just how important this new technology can be towards protecting our communities," the California Police Chiefs Association has argued. It cited cases in which it says face recognition played a role in identifying the guilty, including an attack at a newspaper headquarters and a rape in New York.

Face recognition alone should never lead to false arrests, Jake Parker with the Security Industry Association told members of the California Assembly a few weeks ago. That's why AB 1814 is meant to corroborate investigative leads with evidence, not just a possible face recognition match.

"There's a clear need to bolster public trust that this technology is being leveraged accurately, lawfully, and in an effective way that's also limited and non-discriminatory in a way that benefits our communities," he said. "So we believe AB 1814 will help bolster this trust and for that reason we urge you to support this bill in its current form."

"I believe having a precautionary step can help protect people's privacy and due process rights, while still allowing local governments to go further and pursue their own facial recognition bans."
ASSEMBLYMEMBER PHIL TING, DEMOCRAT FROM SAN FRANCISCO

But more than 50 advocacy organizations, including the ACLU, Access Reproductive Justice and the Electronic Frontier Foundation, oppose the bill. They called face recognition unreliable, a proven threat to Black men, and a potential threat to protestors, people seeking abortions, and immigrant and LGBTQ communities.

"By allowing police to scan and identify people without limitation, AB 1814 will also increase unnecessary police interactions that too often have the potential to escalate into fatal encounters. This will remain true regardless of how accurate face recognition technology becomes," the organizations said in a letter. "There is no way for people to find out if facial recognition is used against them and no mechanism to make sure the police comply with the law."

Ting also authored a 2019 bill that sought to ban the analysis of body camera footage with face recognition. That was amended to a temporary ban, which ended in January 2023.

He told CalMatters he's uncomfortable with the fact that California currently has no limits on how law enforcement agencies use face recognition.

He said in a statement that his bill "simply requires officers to have additional evidence before they can proceed with a search, arrest, or affidavit for a warrant. I believe having a precautionary step can help protect people's privacy and due process rights, while still allowing local governments to go further and pursue their own facial recognition bans."

Ting's city of San Francisco became the first major city in the nation to ban face recognition in 2019, but an analysis by City Attorney David Chiu found that the city's rules now allow police to perform face recognition searches on imagery captured by cameras and drones. San Francisco police have also reportedly gotten around the restrictions by requesting that law enforcement in neighboring cities conduct the searches for them.

Recalled San Francisco district attorney Chesa Boudin says there have almost certainly been false arrests associated with the use of face recognition in California, but they would remain unknown to the public unless prosecutors filed charges and the accused later went to trial or brought a civil lawsuit seeking damages.

"We absolutely need a legislative, regulatory framework for these technologies, but I don't think AB 1814 is adequate in terms of protecting civil liberties or providing meaningful guardrails or safeguards for the use of these new and powerful technologies," said Boudin, who now directs UC Berkeley's Criminal Law & Justice Center.

Lawmakers have until the end of the legislative session in August to decide whether to pass AB 1814.

CalMatters is a nonprofit, nonpartisan media venture explaining California policies and politics.