Floyd Abrams, one of the most prominent First Amendment lawyers in the country, has a new client: the facial recognition company Clearview AI.
Litigation against the start-up “has the potential of leading to a major decision about the interrelationship between privacy claims and First Amendment defenses in the 21st century,” Mr. Abrams said in a phone interview. He said the underlying legal questions could one day reach the Supreme Court.
Software that tweaks photos to hide them from facial recognition:
A start-up called Clearview AI, for example, scraped billions of online photos to build a tool for the police that could lead them from a face to a Facebook account, revealing a person’s identity.
Now researchers are trying to foil those systems. A team of computer engineers at the University of Chicago has developed a tool that disguises photos with pixel-level changes that confuse facial recognition systems.
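The pixel-level approach can be sketched in a toy form. Everything below is illustrative: a random linear map stands in for a deep feature extractor, and the actual University of Chicago tool (Fawkes) optimizes against real face-recognition models. Only the idea carries over: small, budgeted pixel changes that push an image's feature-space embedding toward a decoy identity, so a recognizer indexes the photo under the wrong face.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 64))          # stand-in "feature extractor" (real tools use a deep net)
image = rng.uniform(0, 1, size=64)    # flattened 8x8 grayscale image
decoy = rng.normal(size=8)            # embedding of some other identity

def embed(x):
    return W @ x

def cloak(x, target, step=0.01, eps=0.03, iters=100):
    """Nudge pixels toward the target embedding, within a +/- eps per-pixel budget."""
    x_adv = x.copy()
    for _ in range(iters):
        grad = W.T @ (embed(x_adv) - target)        # gradient of 0.5*||Wx - t||^2
        x_adv -= step * grad / (np.linalg.norm(grad) + 1e-12)
        x_adv = np.clip(x_adv, x - eps, x + eps)    # keep the change imperceptible
    return np.clip(x_adv, 0, 1)                     # stay in valid pixel range

cloaked = cloak(image, decoy)
pixel_change = np.max(np.abs(cloaked - image))
dist_before = np.linalg.norm(embed(image) - decoy)
dist_after = np.linalg.norm(embed(cloaked) - decoy)
```

The tension the researchers exploit is visible even here: the pixel change is capped at a level the eye would miss, yet the embedding moves meaningfully toward the decoy identity.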
The cameras matched facial images of customers entering a store to those of people Rite Aid previously observed engaging in potential criminal activity, causing an alert to be sent to security agents’ smartphones. Agents then reviewed the match for accuracy and could tell the customer to leave.
The DeepCam systems were primarily deployed in “lower-income, non-white neighborhoods,” and, according to current and former Rite Aid employees, a previous system called FaceFirst regularly made mistakes:
“It doesn’t pick up Black people well,” one loss prevention staffer said last year while using FaceFirst at a Rite Aid in an African-American neighborhood of Detroit. “If your eyes are the same way, or if you’re wearing your headband like another person is wearing a headband, you’re going to get a hit.”
So starts a great article on DNNs and learning shortcuts:
Recently, researchers trained a deep neural network to classify breast cancer, achieving a performance of 85%. When used in combination with three other neural network models, the resulting ensemble method reached an outstanding 99% classification accuracy, rivaling expert radiologists with years of training.
The result described above is true, with one little twist: instead of using state-of-the-art artificial deep neural networks, researchers trained “natural” neural networks – more precisely, a flock of four pigeons – to diagnose breast cancer.
Researchers at Duke University have released a paper on PULSE, an AI algorithm that constructs a high-resolution face from a low-resolution image. And the results look pretty good:
6/23/2020 Update: The PULSE algorithm exhibits a notable bias towards Caucasian features:
It’s a startling image that illustrates the deep-rooted biases of AI research. Input a low-resolution picture of Barack Obama, the first black president of the United States, into an algorithm designed to generate depixelated faces, and the output is a white man.
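The mechanism behind PULSE helps explain the update above: rather than "enhancing" the input, it searches a generative model's latent space for a high-resolution image whose downscaled version matches the low-resolution one. Many different faces downscale to the same pixels, so the generator's learned prior picks the winner, and that is exactly where training-set bias enters. Here is a toy linear sketch of that search; the stand-in "generator" and all names are illustrative (PULSE itself searches StyleGAN's latent space with gradient-based optimization):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.normal(size=(16, 4))                   # stand-in generator: latent(4) -> hi-res(16)
D = np.kron(np.eye(4), np.full((1, 4), 0.25))  # 4x average pooling: hi-res(16) -> low-res(4)

true_hi = G @ rng.normal(size=4)               # a hi-res image the generator can produce
lowres = D @ true_hi                           # the only observation we get

# Search the latent space: minimize ||downscale(G z) - lowres||^2 by gradient descent.
A = D @ G
step = 1.0 / np.linalg.norm(A, 2) ** 2         # conservative step from the spectral norm
z = np.zeros(4)
for _ in range(5000):
    z -= step * A.T @ (A @ z - lowres)

recon = G @ z                                  # hi-res output consistent with the low-res input
err = np.linalg.norm(D @ recon - lowres)
```

Note what the objective does and does not constrain: the reconstruction is forced to downscale to the input, but everything else about it comes from the generator. Obama's depixelated photo came out white because the prior, not the input, filled in the detail.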
The reports can be created in a few seconds, using searches based on partial names or incomplete dates of birth. Tenants generally have no choice but to submit to the screenings and typically pay an application fee for the privilege. Automated reports are usually delivered to landlords without a human ever glancing at the results to see if they contain obvious mistakes, according to court records and interviews.
A similar problem exists in information security, and one solution gaining traction is the “bug bounty program”. Bug bounty programs allow security researchers and laymen to submit their exploits directly to the affected parties in exchange for compensation.
The market rate for security bounties for the average company on HackerOne ranges from \$100 to \$1,000. Bigger companies can pay more. In 2017, Facebook disclosed paying \$880,000 in bug bounties, with a minimum of \$500 per bounty. Google pays from \$100 to \$31,337 per exploit and paid \$3,000,000 in security bounties in 2016.
It seems reasonable to suggest that at least big companies with large market caps, which already have bounty-reporting infrastructure, attempt to reward and collaborate with those who find bias in their software, rather than have them take it to the press in frustration, with no compensation for their efforts.