ShotSpotter, a system U.S. police departments use to detect gunfire, led New York Police Department officers to spend hundreds of hours in a single month investigating incidents they could not confirm were in fact shots fired, according to an audit released last week by the New York City comptroller.
But ShotSpotter contends the audit is focusing on the wrong metrics.
The NYPD will have spent more than $54 million on the gunshot-detection service from the time the department started using it in 2014 until December, the end of its current contract with SoundThinking, the California company that owns the system.
The audit, led by New York City Comptroller Brad Lander, analyzed ShotSpotter’s accuracy over several months in 2022 and 2023. When the system was performing at its best, 20% of ShotSpotter’s alerts related to “confirmed shootings,” the audit found. But its performance was often below that level: Of the 940 alerts officers responded to in June 2023, for instance, only 13% corresponded to confirmed shootings.
That’s not the measure of accuracy the contract uses, however. The contractual standard the NYPD set was that ShotSpotter would detect and identify the location of 90% of outdoor gunfire incidents in the coverage areas, with some exceptions based on weapon caliber and suppression. (Suppressors, also known as silencers, muffle the sound of gunfire.) ShotSpotter reached its 90% target “in almost all boroughs except Manhattan, but when measured against the number of confirmed shootings, performance is far lower,” the audit states.
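The two figures answer different questions: the audit's percentage is the share of responded-to alerts that turned out to be confirmed shootings, while the contract's 90% target is the share of actual outdoor gunfire that the system detects and locates. A minimal sketch of the distinction, using only the June 2023 numbers above and leaving the true incident count symbolic, since the audit excerpt does not supply it:

```python
# Illustrative sketch only: the audit's figure and the contract's target use
# different denominators. Counts are from the audit's June 2023 example; the
# true number of outdoor gunfire incidents is not given, so the contract-style
# rate is left as a function.

alerts_responded_to = 940                      # alerts officers responded to
confirmed = round(0.13 * alerts_responded_to)  # ~13% corresponded to confirmed shootings

# Audit-style measure: confirmed shootings per alert responded to.
confirmed_per_alert = confirmed / alerts_responded_to  # ~0.13

# Contract-style measure: detected incidents per actual outdoor incident.
def detection_rate(detected_incidents: int, actual_incidents: int) -> float:
    return detected_incidents / actual_incidents

print(f"Share of alerts confirmed: {confirmed_per_alert:.0%}")  # ~13%
```

The upshot is that a system can clear a 90% detection target while most of its alerts still go unconfirmed, because the two rates divide by different things.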
“It was surprising to me to learn how ShotSpotter wastes thousands of hours of NYPD officers’ time every year, and it doesn’t necessarily make New York safer for cops to, say, chase after a car backfiring,” Lander said in an interview.
In its four-page response, the NYPD said it “is limited in what it can consider a ‘confirmed shooting’ in conjunction with a ShotSpotter alert by the nature of police work and alerts which don’t result in the recovery of evidence (i.e. ballistics, property damage, shell casings/live ammunition, firearms, video, ear or eyewitnesses and/or victims).”
The audit criticizes the NYPD for not collecting data that the comptroller and the public could use to evaluate the tool’s effectiveness and cost, and states that without that evaluation it “therefore does not currently support renewal of the contract.”
If the NYPD renews the contract, Lander said, he would like to see a clause that carries consequences for too many false positives, including a structure that reduces the fee the city pays SoundThinking if, say, more than half of all alerts sent to officers do not correspond to a confirmed shooting.
How ShotSpotter works
In more than 160 cities, ShotSpotter has placed sensors in areas known to have high crime rates. The devices listen for sharp, loud sounds and transmit the audio data to company employees, who determine whether the sounds are likely gunshots. If so, they notify the police. Because several sensors pick up the same sound at slightly different times, the system can triangulate the source and give police a precise location in about a minute.
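SoundThinking does not publish the details of its location algorithm, but the general technique, multilateration from arrival-time differences across several sensors, is well understood. The sketch below is purely illustrative: the sensor coordinates, timestamps, speed of sound, and brute-force grid search are assumptions, not the company's implementation.

```python
# Minimal, illustrative multilateration sketch; sensor positions, timestamps,
# and the brute-force search are assumptions, not SoundThinking's method.
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, in 20 C air

# Hypothetical sensors: (x, y) position in meters, and the time in seconds at
# which each heard the same impulsive sound (values generated for a source
# near (120, 250) that fired at t = 1.0 s).
sensors = [
    ((0.0, 0.0), 1.8085),
    ((400.0, 0.0), 2.0944),
    ((0.0, 400.0), 1.5601),
    ((400.0, 400.0), 1.9261),
]

def tdoa_error(x: float, y: float) -> float:
    """Squared mismatch between observed and predicted arrival-time
    differences, with the first sensor as the reference."""
    ref_pos, ref_time = sensors[0]
    ref_dist = math.dist((x, y), ref_pos)
    error = 0.0
    for pos, t in sensors[1:]:
        predicted = (math.dist((x, y), pos) - ref_dist) / SPEED_OF_SOUND
        observed = t - ref_time
        error += (predicted - observed) ** 2
    return error

# Brute-force grid search at 5-meter resolution over roughly a 1 km x 1 km area.
best_guess = min(
    ((x, y) for x in range(-300, 701, 5) for y in range(-300, 701, 5)),
    key=lambda point: tdoa_error(*point),
)
print(f"Estimated source location (meters): {best_guess}")  # ~(120, 250)
```

In this toy setup the timestamps were generated for a source near (120, 250) meters, so the search recovers roughly that point; a production system would also have to cope with echoes, sensor noise, and terrain.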
What is often misunderstood is the human element at play here, says Tom Chittum, senior vice president of forensic services at SoundThinking. “Our staff don’t just use their ears, but they have other elements to help them determine if a sound is a gunshot,” he notes. For example, reviewers can listen for audio cues such as the whistling associated with fireworks, or examine the waveform of the audio data to see whether the sound has a sharp rise and fall and to gauge its distance from the nearest sensor.
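Chittum's "sharp rise and fall" cue can be pictured as a simple envelope test. The toy check below is only an illustration of that one cue, not SoundThinking's classifier; the thresholds and window sizes are invented.

```python
# Toy illustration of a "sharp rise and fall" check on an audio waveform.
# Thresholds and windows are invented; this is not SoundThinking's classifier.
from typing import Sequence

def looks_impulsive(samples: Sequence[float], sample_rate: int,
                    max_rise_ms: float = 5.0, max_decay_ms: float = 300.0) -> bool:
    """True if the signal jumps from onset to its peak within max_rise_ms and
    falls back below 10% of the peak within max_decay_ms after the peak."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return False
    peak_idx = max(range(len(samples)), key=lambda i: abs(samples[i]))
    onset_idx = next(i for i, s in enumerate(samples) if abs(s) >= 0.1 * peak)
    decay_idx = next((i for i in range(peak_idx, len(samples))
                      if abs(samples[i]) <= 0.1 * peak), len(samples))
    rise_ok = (peak_idx - onset_idx) <= max_rise_ms / 1000 * sample_rate
    decay_ok = (decay_idx - peak_idx) <= max_decay_ms / 1000 * sample_rate
    return rise_ok and decay_ok
```

A sustained sound like a siren would fail the decay check, while a short impulse such as a clap or gunshot would pass; distinguishing the two from fireworks is where the human reviewers come in.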
In the audit report, Chittum said, the comptroller “is being disingenuous about what he found.” When the NYPD records an unconfirmed shooting, “our system could have detected a real shooting, but the perpetrators may have fled, they may have collected casings, and witnesses don’t often stick around after calling 911.”
Chittum added that because finding evidence in low light can be challenging, the recovery rate goes up when clients deploy “daytime follow-up” after a nighttime shooting.
Chittum and other SoundThinking executives have long said that ShotSpotter isn’t meant to replace law enforcement efforts or services such as 911; it’s only complementary. ShotSpotter can’t detect indoor shootings, for one, and the company doesn’t plan on updating the system to do so.
Other cities evaluating the technology
ShotSpotter has both advocates and critics. In February, Chicago Mayor Brandon Johnson said he was ending the city’s six-year contract with ShotSpotter, making good on a campaign pledge: “Chicago spends $9 million a year on ShotSpotter despite clear evidence it is unreliable and overly susceptible to human error.”
In May, Houston announced it wants to end its four-year contract with ShotSpotter before it runs out in January 2027; Houston media reports found that only 5% of alerts from December 2020 through September 2022 led to an arrest.
After a one-year pilot program in 2023 in Durham, North Carolina, the City Council voted not to continue using the system. Philip Cook, professor emeritus of public policy and economics at Duke University, led an audit of ShotSpotter’s success rate in Durham. He shared three conclusions:
- Of the 1,447 total gunshot notifications in the pilot area, 57% came only from ShotSpotter, 15% came from both a 911 call and ShotSpotter, and 28% came only from a 911 call.
- Of the 282 confirmed gunshots, 26% were reported only by ShotSpotter, 34% by both a 911 call and ShotSpotter, and 40% only by a 911 call.
- Notifications detected only by ShotSpotter led to an additional 2.3 police deployments per day in the pilot area (a rough check of that figure follows the list).
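Those figures are roughly self-consistent: assuming the pilot ran for about a full year (the exact number of days is not given here), the ShotSpotter-only share of notifications works out to about 2.3 per day, matching Cook's deployment figure.

```python
# Rough cross-check of the Durham figures; assumes a ~365-day pilot, which is
# an approximation since the exact span is not stated above.
total_notifications = 1_447
shotspotter_only = total_notifications * 0.57   # ~825 notifications
print(f"~{shotspotter_only / 365:.1f} extra deployments per day")  # ~2.3
```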
In an interview, Cook said, “The ShotSpotter system made it easier for officers to locate where the gunfire occurred, since when they get 911 calls, the caller could be blocks away and may not know exactly [where] the shooting occurred, and human error can also happen with those calls.”
For the ShotSpotter notifications for which police did not confirm a gunshot, Cook suggested one possible reason: “The area where the sensors were installed [has] a high rate of drive-by shootings, so obviously it will be challenging for officers arriving five minutes later to find any perpetrators on the scene or to locate evidence.”
The Durham audit said researchers were unable to conclude if ShotSpotter’s technology led to a reduction in gun violence.
Beyond efficacy in crime prevention, other concerns about the technology’s use in Chicago have come from the Northwestern University School of Law’s MacArthur Justice Center and the ACLU. The two groups cite the Chicago inspector general’s report on ShotSpotter, which says the perceived aggregate frequency of ShotSpotter alerts in some areas encourages officers to engage in more stops and pat-downs there.
Also, as the ACLU writes, “ShotSpotter’s methodology is used to provide evidence against defendants in criminal cases, but isn’t transparent and hasn’t been peer-reviewed or otherwise independently evaluated. That simply isn’t acceptable for data that is used in court.”