An Orlando Police Department patrol car. (elisfkc via Flickr / CC 2.0)

Facial Recognition Is Hard to Make Useful, Police Find

The Orlando Police Department ended a pilot program, saying it ran out of time and money to make the technology work.

Cities in California and Massachusetts have banned police use of facial recognition technology over privacy and accuracy concerns. But the rollout of the video surveillance technology could also be hampered by another factor: a lack of money or staff to make it work.

The Orlando Police Department ended its pilot program testing facial recognition technology this month after it was unable to advance the program as hoped. Amazon had provided the department free use of its Rekognition software, which scans the faces of people captured in security camera video and attempts to identify them by searching various databases. But as the second phase of Orlando’s pilot project wound down, department officials said the agency had not been able to devote the necessary resources to the program.

“At this time, the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing,” Orlando’s city administrative officer and police chief wrote in a letter to city elected leaders this month explaining the decision to abandon the pilot program.

Police departments wooed by state-of-the-art technology need to consider not just the upfront costs, but also the long-term resources needed to deploy programs like facial recognition, said Jim Burch, president of the National Police Foundation.

This issue can be seen with other, more common technology. Some smaller departments that deployed body-worn cameras in recent years are now reconsidering their use, due in part to the cost of storing videos and responding to records requests, the Washington Post reported in January. In one instance, the police department in Jeffersonville, Indiana, ended its body-worn camera program after state lawmakers voted to require storing the videos for at least 190 days.

“When you look at the cost of body-worn cameras, we looked at the cost of the hardware. But we didn’t understand the costs with retention and records requests that would come with it,” Burch said.

He believes departments could face similar challenges with facial recognition technology.

“People are looking at the marketing brochures but I don’t think anyone in law enforcement really knows enough about this… to really understand what is the total cost of ownership,” he said.

To address the complexities of purchasing and deploying emerging technologies, the National Police Foundation plans to hold a seminar in September on the benefits and potential challenges departments may face when acquiring new information technologies.

Sgt. Eduardo Bernal, a spokesman for the Orlando Police Department, told Route Fifty in May that the department was still having difficulty reliably streaming video data from the city’s cameras to Amazon. To be viable, the technology would have to be compatible with the approximately 200 cameras operated by the department, Bernal said.

Orlando police declined to discuss the resource issues at play in the discontinuation. A spokeswoman for Orlando Mayor Buddy Dyer said it was a matter of priorities within the department.

“The issue is that staff wasn't able to commit any time to the pilot to make any progress on the pilot,” spokeswoman Cassandra Anne Lafser said in an emailed statement. “Due to other ongoing priority projects, this is the situation we are in. Therefore, we have no plans to do another pilot again at this time.”

Cities that have deployed facial recognition as part of video surveillance systems have reportedly spent millions of dollars.

Detroit spent $1 million on software that would provide continuous monitoring of video streams, while Chicago received a $13 million federal grant to create a “regional transit terrorism prevention and response system” that included a face recognition system tied to the Chicago Transit Authority’s CCTV system, according to a recent Georgetown Law report.

Privacy and civil liberties advocates have raised concerns about the accuracy of facial recognition software, particularly when used to match photos of minorities, and questioned the legality of using the technology in conjunction with real-time surveillance networks.