Technologies like facial recognition and biometrics tend to conjure up images of a dystopian world. Think the "Terminator" movies or "1984," both of which depict societies using technological advances in all the wrong ways.
These stark pictures, coupled with some real-life reports, can instill fear of governmental tech use — especially in the hands of police departments.
The end result is that police departments find themselves in a conundrum: Many of these technologies enhance their ability to keep civilians safe, but civilians worry about privacy and improper arrests, not to mention fears that some technology could be used as a tool to enhance authoritarianism. This leaves police departments figuring out how to best use technology for the benefits it can provide, while assuaging citizen fears about its misuse.
"People expect police departments to keep up with the latest tech tools available in order to keep them safe," said James Slessor, managing director with Accenture. "Hand in hand with that, however, is the fact that citizens expect the police to use technology in a way that provides them privacy and in the proper manner."
Overall, citizens are supportive of improved technology within police departments. A recent study from the Pew Research Center found 56% of Americans trust law enforcement agencies to use facial recognition responsibly, while 59% of the public says it's acceptable for police to use facial recognition in assessing security threats in public places. Another Pew study from 2017 found 93% of the public is in favor of the use of body cameras by police to record interactions.
Police professionals themselves are also adapting to the evolution of technology. A 2018 Accenture survey of policing personnel found the specific technologies respondents are expecting to use more over the next few years include: body-worn cameras (48%), biometrics (37%), video analytics (42%) and predictive policing technologies (26%).
At issue, however, is building public trust that such technology will be used responsibly. To address this, Slessor recommends a path of communication and transparency.
"Police need to take the time to explain how they’re using these tools," said Slessor. "They are a necessity and if used well, can be a source for good."
Using tech to gauge perception
The Grand Rapids, Michigan Police Department (GRPD) is one of about 15 departments across the country that have implemented Elucd, a technology platform designed to gauge residents' sense of trust and safety relating to police. The platform, dubbed a "sentiment meter," regularly surveys communities to assess public perception of police.
Elucd CEO Michael Simon likened the software to an annual physical.
"If citizens tell the department that they are unhappy with neighborhood enforcement, the department can respond,” he said. "For instance, officers could set up a town hall meeting in a neighborhood to discuss issues."
Citizens learn about the surveys through social media and other electronic channels. The surveys ask how safe residents feel in their neighborhood, how respectful officers are and what they consider to be the top issue in their part of the city. In Grand Rapids, for instance, the city’s overall trust score hovers around 68 to 71, which equates to "very good."
GRPD Sgt. John Wittkowski said that initially, the public was skeptical of the department’s implementation of the software.
"There was a perception that the department was going to use the software to gather information for the wrong purpose," he explained. "They worried we would gain access to their devices and accounts." Once GRPD recognized these concerns, it addressed them directly and educated the public, creating clarity around the technology.
On the whole, the public will judge a police department’s use of technology based on several factors, according to Mike Rossler, assistant professor of criminal justice sciences at Illinois State University.
"If a department uses AI for crime mapping, it becomes a predictive tool to determine the parts of a city in which crime is likely to occur," he explained. "Then the department sends more officers to that area. This doesn’t really build strong community relations."
On the other hand, said Rossler, body cameras have served as a positive tool in the public eye. "By enhancing accountability, departments can build trust," he said. "If the police make footage available to the public via freedom of information acts, and adopt a policy to turn off body cams at the request of a victim for privacy, that goes a step further in improving relations."
The Chicago Police Department (CPD) has learned this lesson firsthand. When it recently upgraded its facial recognition software, citizens protested the move. Upon learning that the technology, Clearview AI, operated using a database of photos pulled from social media and websites, critics launched a lawsuit.
In response, the CPD shared a statement, along with statistics, on how exactly it uses the AI system: "The Chicago Police Department does not have or utilize any facial recognition technology (FRT) or software, which uses dynamic algorithms in video technology to identify individuals in a live or real-time environment ... Live FRT is very different from the facial matching tool used by the CPD."
The question remains whether the public will accept that explanation or remain skeptical of the technology. In early February, a coalition of 75 groups sent a letter to Mayor Lori Lightfoot urging her to end the practice, pointing to inaccuracies with the technology and instances of misidentification.
With all police technologies, there’s also inherent concern from the community that racial biases will come into play. Rossler said that if a department uses something like increased aerial surveillance in a community of color, it becomes an issue.
"For some of the predictive policing technologies, there are concerns because it is difficult to hold a computer program accountable,” he said. “If we’re talking AI, it depends on what data is provided to the system."
Police officer biases, Rossler explained, may be reinforced by machine learning, sending officers into places at times where they then make more arrests. "So it then becomes a self-fulfilling prophecy," he said.
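The feedback loop Rossler describes can be illustrated with a small, purely hypothetical simulation (the neighborhoods, numbers and allocation rule below are invented for illustration, not drawn from any real predictive policing system): two areas have the same underlying crime rate, but the historical arrest data is skewed toward one of them. If patrols are allocated in proportion to past arrests, the skewed record keeps generating skewed patrols, and the data never self-corrects.

```python
import random

random.seed(42)

# Two areas with the SAME underlying crime rate, but a
# historical arrest record that is skewed toward area A.
TRUE_CRIME_RATE = 0.1          # identical in both areas
arrests = {"A": 60, "B": 40}   # biased historical record
TOTAL_PATROLS = 100

for year in range(10):
    total = arrests["A"] + arrests["B"]
    for area in ("A", "B"):
        # "Predictive" allocation: patrols proportional to past arrests.
        patrols = round(TOTAL_PATROLS * arrests[area] / total)
        # More patrols in an area means more crimes observed there,
        # even though the underlying rate is the same everywhere.
        new_arrests = sum(
            1 for _ in range(patrols) if random.random() < TRUE_CRIME_RATE
        )
        arrests[area] += new_arrests

share_a = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"Share of arrests attributed to area A: {share_a:.0%}")
```

Because each year's patrols mirror the accumulated arrest counts, area A keeps producing a majority of the arrests despite being no more crime-prone than area B, which is the "self-fulfilling prophecy" in miniature.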
Taking the right steps
When aiming to build public understanding of police departments' tech use, Slessor said education efforts should address several key factors. First is sharing how a system is designed.
"Is it open? Traceable?" he asked. "Design that into the system from the start and share that information with citizens."
The second is governance. Slessor recommended the development of a code of ethics and accountability that is again shared with the public. "We need to understand it and legislate around it," said Slessor. "A strict framework is important.”
He also said there should be a monitoring system in place, as "technology moves so quickly ... It’s not ok to simply put technology in place and leave it there."
These days, social media can also prove a powerful tool for fostering public trust. "You need to meet people where they are," said Rossler. "Twitter is probably the number one place for a department to be, followed by Facebook and Instagram, among others."
Using social media properly in this case amounts to encouraging information exchanges.
"It’s important to remember that this is a public institution, so using humor as you would in other instances online won’t really work here," said Rossler. "But if a department can share information with the public, answer questions, and gather information, it can be very helpful for both sides."
Finally, Simon pointed to the value of having empathy for how the public may perceive new technologies.
"Not all technology or surveillance is suspect,” he said. “But departments must ensure that all voices are heard when there’s a concern."