Update (October 22): Earlier this month, ShotSpotter filed a lawsuit alleging that the VICE report linked below contains false and defamatory statements.
Court documents recently reviewed by VICE have revealed that ShotSpotter, a company that makes and sells audio gunshot detection systems to cities and police departments, may not be as accurate or reliable as the company claims. In fact, the documents suggest that ShotSpotter employees may be altering alerts generated by the technology in order to justify arrests and buttress prosecutors’ cases. For many reasons, including the concerns raised by these recent reports, police must stop using technologies like ShotSpotter.
Acoustic gunshot detection relies on a network of sensors, often mounted on lamp posts or buildings. When a gun is fired, the sensors detect the sound’s acoustic signature and send the time and location to the police. The location is estimated by comparing how long the sound takes to reach sensors in different places: because each sensor hears the bang at a slightly different moment, the differences in arrival times can be used to work out an approximate point of origin.
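For readers curious about the underlying math, here is a minimal sketch of that arrival-time approach, written in Python. It illustrates the general technique only, not ShotSpotter’s actual system: the sensor coordinates, the timing noise, and the least-squares solver are all assumptions made for the example.

```python
# Illustrative sketch of arrival-time (TDOA) localization. NOT ShotSpotter's
# proprietary algorithm; all positions and noise values below are hypothetical.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second at roughly 20 C (assumed constant)

# Hypothetical sensor positions (x, y) in meters, e.g. on four lamp posts.
sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])

def arrival_times(source_xy, emission_time):
    """Time at which each sensor would hear a sound fired at source_xy."""
    distances = np.linalg.norm(sensors - source_xy, axis=1)
    return emission_time + distances / SPEED_OF_SOUND

# Simulate a shot at an unknown point, with a little timing noise per sensor.
true_source = np.array([120.0, 340.0])
observed = arrival_times(true_source, emission_time=0.0)
observed = observed + np.random.default_rng(0).normal(scale=1e-4, size=len(sensors))

def residuals(params):
    """Mismatch between predicted and observed arrival times for a guess (x, y, t0)."""
    x, y, t0 = params
    return arrival_times(np.array([x, y]), t0) - observed

# Solve for the source position and firing time by nonlinear least squares,
# starting from the centroid of the sensor network.
initial_guess = np.array([*sensors.mean(axis=0), 0.0])
result = least_squares(residuals, initial_guess)
print("Estimated source location (meters):", result.x[:2])
```

In a toy setting like this the estimate lands very close to the true point, but real streets add echoes, overlapping noises, and imperfect clocks, which is one reason the process still depends on the human reviewers described next.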
According to ShotSpotter, the largest vendor of acoustic gunshot detection technology, this information is then verified by human acoustic experts, who confirm that the sound is gunfire and not a car backfire, a firecracker, or another noise that could be mistaken for a gunshot. The sensors themselves can only determine that there is a loud noise that somewhat resembles a gunshot. It’s still up to people listening on headphones to say whether or not shots were fired.
In a recent statement, ShotSpotter denied the VICE report and claimed that the technology is “100% reliable.” Absolute claims like these are always dubious. And according to the testimony of a ShotSpotter employee and expert witness in court documents reviewed by VICE, the company’s accuracy claims come from its marketing department, not from its engineers.
Moreover, ShotSpotter presents a real and disturbing threat to people who live in cities blanketed with these AI-augmented listening devices, which are all too often over-deployed in majority Black and Latine neighborhoods. A recent study of Chicago found that, over the span of 21 months, ShotSpotter sent police to dead-end reports of shots fired more than 40,000 times, though some experts and studies have disputed this figure. This shows, once again, that the technology is not as accurate as the company’s marketing department claims. It also means that police officers are routinely deployed to neighborhoods expecting to encounter an armed shooter, and instead encounter innocent pedestrians and neighborhood residents. This creates a real risk that officers will interpret anyone they encounter near the projected site of the loud noise as a threat, a scenario that could easily result in civilian casualties, especially in over-policed communities.
In addition to its history of false positives, the danger it poses to pedestrians and residents, and reports that the company has altered data at the behest of police departments, there is a further civil liberties concern: microphones intended to detect gunshots can also record human voices.
Yet people in public places—for example, having a quiet conversation on a deserted street—are often entitled to a reasonable expectation of privacy, without overhead microphones unexpectedly recording their conversations. Federal and state eavesdropping statutes (sometimes called wiretapping or interception laws) typically prohibit the recording of private conversations absent consent from at least one person in that conversation.
In at least two criminal trials, prosecutors sought to introduce as evidence audio of voices recorded on acoustic gunshot detection systems. In the California case People v. Johnson, the court admitted it into evidence. In the Massachusetts case Commonwealth v. Denison, the court did not, ruling that a recording of “oral communication” is prohibited “interception” under the Massachusetts Wiretap Act.
It’s only a matter of time before police and prosecutors’ reliance on ShotSpotter leads to tragic consequences. It’s time for cities to stop using ShotSpotter.