Burnett Seeks Regulation Of Facial Recognition Technology In Baltimore

City Councilman Kristerfer Burnett wants strict controls on Baltimore City's use of facial recognition surveillance, a technology that privacy advocates say carries harmful implications for people of color. He has authored a bill that would prohibit its use by non-police city agencies and set strict standards for its use by the police department.

“My intention with this bill is not to impede the Baltimore City Police Department from investigating crime in Baltimore City,” he said at a hearing Wednesday afternoon. “I do believe, however, that it is important that we raise questions about how the technology is currently deployed to ensure that there's oversight, transparency and the rights and civil liberties of our citizens are protected.” 

Burnett’s bill, co-sponsored by council members Zeke Cohen, Ryan Dorsey and John Bullock, cannot ban the city police from using the technology because the department is under state, not city, control. It can, however, regulate how the technology is used and require annual audits detailing how city departments employ it.

In addition to prohibiting non-police agencies from using the technology, Burnett’s bill would require the director of the Baltimore City Information and Technology department to submit a public annual report to the mayor detailing every purchase of surveillance technology and explaining how those technologies are being used. If passed, the bill would take effect 30 days after it is signed into law.

Facial surveillance technology came under renewed scrutiny last summer amid racial justice protests following the killing of George Floyd by Minneapolis police. In the wake of Floyd’s death, tech giants IBM, Amazon, and Microsoft said they would pause or end sales of the technology to U.S. police. And within the last year, several cities, including San Francisco, Oakland, Boston and Minneapolis, enacted some form of a ban on the technology. 

Facial recognition surveillance technology refers to software that uses artificial intelligence to scan and analyze human faces, generally by cross-referencing images from databases such as FBI mugshots or driver’s license photos. 

Critics have pointed to serious flaws in the software’s algorithms documented by nonpartisan researchers, including a report from the National Institute of Standards and Technology (NIST) last fall that found widespread racial and gender disparities in its results.

The algorithms are generally best at correctly identifying middle-aged white men, NIST found. False positive rates for Asian, Black and Native American faces were 10 to 100 times higher than for white faces, depending on the algorithm.

The NIST report echoed the findings of MIT scholars Joy Buolamwini, Deb Raji, and Timnit Gebru, whose 2018 research found that some algorithms misidentified Black women nearly 35% of the time, while almost always correctly identifying white men.

“In one test I ran on Amazon's AI [facial recognition algorithm], they found the face of Oprah Winfrey male,” Buolamwini, a Black woman, said at a city hearing last year. “I personally had to wear a white mask to have my face detected.”

These mistakes can have devastating impacts: a false match can lead to a wrongful arrest and detention. Facial recognition technology is already responsible for multiple unjust incarcerations of Black men, Buolamwini said.

She pointed to the story of a Detroit man who was incorrectly identified by facial recognition software in a grainy video of a robbery. 

“Robert Williams was wrongfully arrested in front of his two young daughters due to a false face recognition match and detained for 30 hours,” Buolamwini said. “A Black man accosted by the police — we know that the outcome could have been fatal.”

And when the technology does work properly, it raises Fourth Amendment concerns, said Caylin Young, the Director of Public Policy at the ACLU of Maryland.

“Individuals who are not doing anything or committing any crime, who may be just participating in exercising their protected First Amendment speech at a protest, might be subject to unnecessary surveillance or unreasonable surveillance and therefore added to one of these databases,” Young said.

The Baltimore Police Department (BPD) and the city’s Fraternal Order of Police chapter oppose the bill. The union argues that an outright ban on the technology would remove a “valuable tool needed to fight violent crime” that is only one element of a criminal investigation.

“A positive identification utilizing facial recognition alone is not enough to generate probable cause for arrest, let alone prove a case beyond a reasonable doubt for conviction,” Sgt. Bobby Cherry and Detective Eric Perez, chairmen of the FOP legislative committee, wrote in a letter to the council. 

They noted that the technology was used to identify and arrest some of the insurgents involved in the Jan. 6 breach of the U.S. Capitol.

Michelle Wirzberger, BPD director of government affairs, said the agency is committed to safeguards that ensure the technology is used responsibly, including transparency. 

“We understand, as does this council, that the only way we can improve our relationship with the community we serve is through acknowledging their concerns and work[ing] cooperatively with them to create new ways of interacting that everyone can embrace,” Wirzberger said.

BPD and the Department of Transportation use the Maryland Image Repository System (MIRS) in their investigations. MIRS, which is maintained by the Maryland Department of Public Safety and Correctional Services, is facial recognition software that compares images of unidentified suspects to state and FBI mugshots, as well as motor vehicle records.

During the hearing, Burnett quoted from a letter sent to President Biden from more than 40 civil rights organizations across the country last week.

“All-seeing, all-knowing surveillance evokes science fiction dystopias. But in the year 2021, persistent tracking of all people in America in public spaces with facial recognition technology is no longer relegated to the realm of fiction,” he said.

Emily Sullivan is a city hall reporter at WYPR, where she covers all things Baltimore politics. She joined WYPR from NPR, where she reported for the news desk, business desk and presidential conflicts of interest team. Sullivan won a national Edward R. Murrow Award for an investigation into a Trump golf course's finances alongside members of the Embedded team. She has also won awards from the Chesapeake Associated Press Broadcasters Association for her use of sound and feature stories. She has provided news analysis on 1A, The Takeaway, Here & Now and All Things Considered.