City official says Cleveland police have facial recognition tech. How it's being used is unclear

A Cleveland surveillance camera at the Cudell Recreation Center on the West Side. The city's Safe Smart CLE program started with the installation of cameras at the city's rec centers. [Matthew Richmond / Ideastream Public Media]

Based on publicly available records and a recent comment from a Cleveland official, it appears the Cleveland Division of Police has had the capability to use facial recognition technology as part of its processing of video evidence since as early as 2019.

It appears the program started when former Mayor Frank Jackson launched his citywide streetlight replacement and surveillance camera installation initiative in 2018.

That initiative, dubbed Safe Smart CLE, replaced all of the city's streetlights with LED bulbs and swapped out the fixtures on city-owned light poles to accommodate city-owned cameras. Cleveland now has a network of around 1,400 cameras on streetlights and public buildings.

According to Board of Control records, Cleveland entered into contracts in 2019 with Paladin Protective Services, a local security system company, and Motorola Solutions for installation and monitoring of the new cameras. Motorola Solutions owns several facial recognition companies that are part of its video service.

According to NBC News, the company spent $1.7 billion between 2017 and 2019 acquiring technology companies that sell body-worn cameras, license plate readers and facial recognition technology.

Around the same time, the city hired a company that, among its services, analyzes police video using software called Briefcam. That software includes facial recognition technology, according to Briefcam’s website. The city provided Ideastream Public Media with the Briefcam contract, which does not mention facial recognition. The city has not provided its contracts with Motorola and Paladin.

The law department responded to the records request for any facial recognition contracts by saying the police department “does not have any facial recognition software and or applications that is presently running on any of its video surveillance cameras.”

The law department described Briefcam as “a video analytics tool which allows you to quickly search for vehicles, objects and patterns of traffic.”

“I think this is like vendor creep,” said Cleveland State University law professor Jonathan Witmer-Rich.

The city hired companies to set up the entire system — cameras, storage, video processing — and may not have been planning to start using facial recognition technology back then.

“And then it’s like, 'Oh wait, one additional, little bonus functionality we’re getting here is we can use their facial recognition algorithm to do certain things,'” Witmer-Rich said.

Chief Innovation and Technology Officer Froilan C. Roy Fernando acknowledged during city council’s safety committee meeting on April 27 that the city’s police department is using facial recognition technology as one of its options for video analysis.

In an exchange with Councilman Joe Jones, who asked whether the city uses facial recognition, Fernando described a camera system that can be used in several ways.

“They can be programmed to recognize specific objects such as man, woman, dog without any identity,” Fernando said. “It’s up to our deployment as to what we’re looking for in a particular field of view. In some cases, for illegal dumping, we may just want to monitor for contaminants left by an object coming in there. In other cases, where there is a potential crime incident, maybe that camera can be programmed to identify facial recognition, yes.”

When Jones followed up to learn more about the city’s use of the technology, Director of Public Safety Karrie Howard cut off the conversation.

“What I would like to do is, if possible, if we could speak about the capability of our camera system privately. You don’t want to say we have facial recognition,” Howard said before changing the subject.

The use of facial recognition by law enforcement has been criticized as an invasion of privacy and an unreliable technology, especially when identifying Black people.

A landmark study in 2018 by a researcher at the Massachusetts Institute of Technology found commercially available facial recognition software incorrectly identified dark-skinned women up to 34 percent of the time.

According to Witmer-Rich, the technology has improved rapidly since that study was done. But, he said, some software is more accurate than others.

“My first question is, ‘Do they have a written policy governing their use of facial recognition? And if so, can the citizens see it?’” Witmer-Rich said.

The police department’s rules for how facial recognition can be used, and by whom, are not included in its publicly available policies, known as General Police Orders.

Cleveland police did not respond to a request for any policies covering the use of facial recognition.

The Center for Privacy and Technology at Georgetown Law notes in its work questioning the use of the technology that, several years ago, the NYPD at times replaced grainy images of people committing crimes with photos of people officers thought the suspect resembled.

In one case, investigators swapped in a photograph of Woody Harrelson and in another a photograph of a professional basketball player to search for suspects. Officers also used artist’s sketches of suspects to find matches in their database.

A paper published in 2020 by the Institute of Electrical and Electronics Engineers found that facial recognition algorithms have to be adjusted to account for racial bias.

That can be done in part by setting a high bar for matches. It’s unclear whether Cleveland requires a high degree of certainty in its matches or whether it tracks internally the false matches produced by the software it uses.

Witmer-Rich said residents of Cleveland should know more about what products the city uses and how they’re used, and they also have the right to be involved in the decision about whether facial recognition should be used at all.

“Fundamentally, even if your system works extremely well, we may just decide we don’t want you to use it because it’s too powerful of a tool, it invades into our privacy too much,” Witmer-Rich said.

Matthew Richmond is a reporter/producer focused on criminal justice issues at Ideastream Public Media.