A federal lawsuit filed Thursday alleges Chicago police misused “unreliable” gunshot detection technology and failed to pursue other leads in investigating a grandfather from the city’s South Side who was charged with killing a neighbor.
. . . . .
ShotSpotter’s website says the company is “a leader in precision policing technology solutions” that help stop gun violence by using sensors, algorithms and artificial intelligence to classify 14 million sounds in its proprietary database as gunshots or something else.
Lawsuit: Chicago police misused ShotSpotter in murder case
Some commentators (e.g., link) have jumped on this story as an example of someone (allegedly) being wrongly imprisoned due to AI.
But maybe ShotSpotter is just bad software that was used improperly. Does the distinction matter?
Defining AI is so difficult that we may soon find ourselves regulating all software.