Johnson, a first-term mayor, campaigned on a promise to end the use of ShotSpotter, putting him at odds with police leaders who have praised the system.
They argue that crime rates – not residents’ race – determine where the technology is deployed.
And due to decades of racist policing that targeted communities based on residents' race, those crime rates reflect exactly that.
This is why police forces love AI; they can feed the systems data based on their own history of racist policing and absolve themselves of responsibility when the garbage coming out matches the garbage going in.
Doesn't make any sense anyway. If I'm Whitey McWhiterson (I am, but if) and I want law enforcement protecting me and not minorities, I'd want this deployed in my neighborhood, right? Or is the false positive rate too high?
Not only is the false positive rate way too high, they’ve caught the company working with law enforcement to retroactively add fake data points to support raids and arrests.
You don't want that. Because at any given moment, kids are committing all sorts of minor violations. They trespass, stay out too late, get into scuffles, etc.
But now there's a cop watching you 24x7, and he catches you smoking at 14 in a closed construction site with your friends... you get arrested because the cop is a jackass. Now you've got a record, you missed a couple weeks of school, college won't accept you, and your future is fucked.
You did the exact same thing that millions of other teens did, but because a cop happened to feel kinda like fucking you over, your life is ruined.
You DO NOT want cops patrolling your neighborhood, or anywhere you hang out.
I'll preface this by saying I have no idea how the system works, but I wouldn't be surprised. I have an old motorcycle that will occasionally get in a mood where it doesn't want to start. If I'm not in a rush, I'll let it sit a few minutes between tries, but if I have somewhere to be, I'll keep fighting with it and keep cranking the starter, which often leads to a massive backfire. I've made neighbors think someone's shooting before.
I'm not going to comment on the racial bias part as I have no data on any of that, but I'm not sure they use "AI" in any modern sense of the term. It's basically the same tech that triggers on the activation words of our voice-activated home assistants, but massively scaled.
I lived in a city with ShotSpotter long before modern AI was popular. It's simply sound detection and triangulation, as far as I'm aware.
The night it was activated, blanks were fired by police to calibrate the network. The next morning, there was an article that the system detected and triangulated 4 additional shots.
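The detect-and-triangulate idea described above can be sketched in a few lines. This is a toy illustration only: the microphone positions, city-block scale, and brute-force grid search are all invented for the example; a real system uses many more sensors and a proper multilateration solver.

```python
# Toy time-difference-of-arrival (TDOA) triangulation.
# All positions and the solver approach are made up for illustration.
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 °C

# Hypothetical microphone positions (metres) on a city grid.
MICS = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]

def arrival_times(source):
    """Time for the bang to reach each microphone."""
    return [math.dist(source, m) / SPEED_OF_SOUND for m in MICS]

def locate(times, step=5.0, size=400.0):
    """Brute-force grid search: find the point whose predicted
    time *differences* between microphones best match the
    observed ones (absolute times aren't known in practice)."""
    observed = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    x = 0.0
    while x <= size:
        y = 0.0
        while y <= size:
            pred = arrival_times((x, y))
            diffs = [p - pred[0] for p in pred]
            err = sum((d - o) ** 2 for d, o in zip(diffs, observed))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

true_source = (120.0, 255.0)
print(locate(arrival_times(true_source)))  # (120.0, 255.0)
```

The hard part in the real world isn't this geometry; it's deciding which bangs to feed into it, and echoes off buildings corrupting the arrival times.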
I work in a field where "AI" has been all the rage for the last few years (cybersecurity). In my experience, if a vendor touts that their product uses "AI", run. Run far, far away. The one thing AI is really good at is turning noisy data into a fuck ton of false positives. And I can't imagine any noisier data than the noise in a city (pun not intended). Cities are a 24x7 cacophony of loud noise and you expect anything to pick out and triangulate gun shots? Sure, they are loud as can be, but that sound also reflects and there are lots of other loud sounds to deal with. And that doesn't even touch on the problem of unscrupulous police forces using either bad data or just making shit up about the data to go harass people. Good riddance to bad rubbish.
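The "noisy data into a fuck ton of false positives" problem is just base-rate arithmetic. The numbers below are invented for illustration, not ShotSpotter's actual rates, but they show how even a seemingly decent detector drowns in false alerts when true events are rare:

```python
# Back-of-the-envelope precision calculation. All rates are
# hypothetical, chosen only to illustrate the base-rate effect.
loud_bangs_per_day = 10_000   # fireworks, backfires, construction, ...
real_gunshots = 20            # the rare true events
sensitivity = 0.95            # detector catches 95% of real shots
false_positive_rate = 0.02    # flags 2% of all the other bangs

true_alerts = real_gunshots * sensitivity                            # 19
false_alerts = (loud_bangs_per_day - real_gunshots) * false_positive_rate  # ~200

precision = true_alerts / (true_alerts + false_alerts)
print(round(precision, 2))  # 0.09 -- over 90% of alerts are false
```

Cutting the false positive rate to 2% of a 10,000-bang day still buries the 19 real hits under ~200 false ones. That ratio is what the cops responding to alerts actually experience.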
*scraps eventually. It's better than nothing, but they are signing a new contract through September of this year, supposedly so CPD can "transition appropriately" for the next six months, whatever that entails.
First you deploy this technology to the areas that you know have been experiencing gunfire. Several microphones are used, and when gunfire is "heard" by the server, it can be triangulated to the source location. If you have a video network, you can also point cameras at the source. My guess is the server program is not identifying the gunshot's acoustic signature correctly.
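The "is this bang actually a gunshot?" step is where the signature matching can go wrong. A minimal sketch of the idea, with an invented signal and invented thresholds: look for a spike that is both loud and brief, since a gunshot is impulsive while sirens and construction noise are sustained. Real classifiers compare spectral signatures, and backfires and fireworks are impulsive too, which is exactly where the false positives come from.

```python
# Toy impulse detector: flag loud spikes that are also short.
# Signal values and thresholds are made up for the example.
def detect_impulses(samples, threshold=0.8, max_width=5):
    """Return sample indices where a loud, brief spike occurs."""
    hits = []
    i = 0
    while i < len(samples):
        if abs(samples[i]) >= threshold:
            # measure how long the signal stays loud
            j = i
            while j < len(samples) and abs(samples[j]) >= threshold:
                j += 1
            if j - i <= max_width:  # short = impulsive, gunshot-like
                hits.append(i)
            i = j
        else:
            i += 1
    return hits

# quiet background, one brief spike (bang), one long loud run (siren)
signal = [0.1] * 20 + [0.95, 0.9, 0.4] + [0.1] * 20 + [0.85] * 30 + [0.1] * 5
print(detect_impulses(signal))  # [20] -- only the brief spike counts
```

Note that this detector can't tell a gunshot from a firecracker or a motorcycle backfire; both produce the same short loud spike, which is the core false-positive problem mentioned upthread.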