A group of 27 AI researchers affiliated with various academic institutions as well as Microsoft, Google, and Facebook have written an open letter calling on Amazon to stop selling its face recognition technology (Rekognition) to law enforcement. The letter gets into the weeds very quickly, but the main complaint is that Rekognition is biased against darker-skinned individuals:
A recent study conducted by Inioluwa Deborah Raji and Joy Buolamwini, published at the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society, found that the version of Amazon’s Rekognition tool available in August 2018 had much higher error rates when classifying the gender of darker-skinned women than of lighter-skinned men (31% vs. 0%).
“On Recent Research Auditing Commercial Facial Analysis Technology”
Amazon’s response has essentially been: “No, that’s not quite right; we’re also concerned and continually improving, but none of this is any reason to stop selling the product.”
What all of this highlights is:
- No consensus on the amount of tolerable bias. Exactly zero bias may be unachievable. Do we insist on it, or something near it? Or is there a level of tolerable bias? Less bias than an average human exhibits would be an improvement in most cases.
- No framework for assessing bias. We don’t have any standards for judging whether an AI system is “tolerably biased” or not. Much of the debate here is over how the bias was measured.
- No framework for assessing the impact of bias. Objections to Amazon’s Rekognition technology are premised on its commercial sale, especially to law enforcement. If Amazon had simply released the technology as a research project, it would have joined many other examples of bias in AI research that cause concern but not outrage. Should we insist on zero bias for law enforcement applications? Can retail applications be more tolerably biased?
- No laws or regulations at all. And of course there are no laws or regulations governing the sale or use of these systems anywhere in the United States. But… perhaps coming soon.
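The measurement dispute in the second point is easy to illustrate: an audit’s headline numbers depend entirely on how the test data is grouped and what counts as an error. A minimal sketch of a disaggregated error-rate calculation of the kind behind figures like 31% vs. 0% (the function, subgroup labels, and data here are hypothetical illustrations, not the actual audit methodology or results):

```python
# Sketch: per-subgroup misclassification rate for a binary classifier.
# All names and data below are made up for illustration.
from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of (subgroup, true_label, predicted_label) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, true_label, predicted in records:
        totals[subgroup] += 1
        if predicted != true_label:
            errors[subgroup] += 1
    # Error rate per subgroup = misclassified / total seen in that subgroup.
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit sample: (subgroup, actual gender, classifier output)
audit = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
]
print(subgroup_error_rates(audit))
# → {'darker-skinned women': 0.5, 'lighter-skinned men': 0.0}
```

Even this toy version shows where the arguments start: results shift depending on how subgroups are defined, which images are sampled, and which version of the API was tested, which is exactly the ground Amazon’s rebuttal contests.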