Axon (formerly Taser) says facial recognition on police body cams is unethical

Chaim Gartenberg, writing for The Verge:
Axon (formerly known as Taser) has been shifting its business toward body cameras for police officers for the past few years, but today, the company is making a big change. At the recommendation of its AI ethics board, “Axon will not be commercializing face matching products on our body camera,” the company announced in a blog post today.
[…]
According to the board’s report, “Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras.” It cites that, at the very least, more accurate technology that “performs equally well across races, ethnicities, genders, and other identity groups” would be required, assuming facial recognition technology for police body cameras can ever be considered ethical at all, a conversation that the board has begun to examine.
One issue we keep sidestepping is that facial recognition technology is never going to be either perfectly accurate or perfectly equal across all classes of people. In other words, no matter how accurate the technology becomes, there will always be some small difference in performance between, for example, recognizing light-skinned and dark-skinned faces. So the question becomes: is any difference in accuracy tolerable? And if so, how much?