Clearview AI scores a PR win in the NYT

Kashmir Hill:

If Clearview AI, which is based in New York, hadn’t granted his lawyer special access to a facial recognition database of 20 billion faces, Mr. Conlyn might have spent up to 15 years in prison because the police believed he had been the one driving the car.

Clearview AI, Used by Police to Find Criminals, Is Now in Public Defenders’ Hands

Clearview allowed use of its facial recognition service to identify a Good Samaritan who had pulled Mr. Conlyn from the passenger side of the vehicle, thereby providing evidence that Mr. Conlyn was not the driver.

Facebook does not know what data it has

Bruce Schneier, linking to an article in The Intercept about a court hearing in the Cambridge Analytica suit:

Facebook’s inability to comprehend its own functioning took the hearing up to the edge of the metaphysical. At one point, the court-appointed special master noted that the “Download Your Information” file provided to the suit’s plaintiffs must not have included everything the company had stored on those individuals because it appears to have no idea what it truly stores on anyone. Can it be that Facebook’s designated tool for comprehensively downloading your information might not actually download all your information? This, again, is outside the boundaries of knowledge.

“The solution to this is unfortunately exactly the work that was done to create the DYI file itself,” noted Zarashaw. “And the thing I struggle with here is in order to find gaps in what may not be in DYI file, you would by definition need to do even more work than was done to generate the DYI files in the first place.”

FACEBOOK ENGINEERS: WE HAVE NO IDEA WHERE WE KEEP ALL YOUR PERSONAL DATA

Schneier has repeatedly made this fundamental but counter-intuitive point: “Today, it’s easier to build complex systems than it is to build simple ones.”

None of this is surprising to people familiar with modern data center services at scale. Twitter allegedly doesn’t know how to restart its services if they really go down:

The company also lacks sufficient redundancies and procedures to restart or recover from data center crashes, Zatko’s disclosure says, meaning that even minor outages of several data centers at the same time could knock the entire Twitter service offline, perhaps for good.

Ex-Twitter exec blows the whistle, alleging reckless and negligent cybersecurity policies

Most of this is overblown rhetoric, but the underlying point is that no single person understands how any of these complex systems work. And they are not easy to fix or change.

Maybe the police should be able to use facial recognition…

Scott Ikeda for CPO Magazine:

Some cities and states that were early to ban law enforcement from using facial recognition software appear to be having second thoughts, which privacy advocates with the Electronic Frontier Foundation (EFF) and other organizations largely attribute to an uptick in certain types of urban crime.

Facial Recognition Bans Begin To Fall Around the US as Re-Funding of Law Enforcement Becomes Politically Popular

New Orleans and Virginia have both backtracked a bit: facial recognition technology is now allowed with supervision and for more serious types of crime.

Virginia in particular has imposed a requirement that facial recognition technology have an accuracy rating of at least 98% across all demographics.
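Virginia's rule is essentially a threshold that must hold for every demographic group, not just in aggregate. A minimal sketch of that check, with invented group names and numbers (the statute's actual testing methodology is not specified here):

```python
# Hypothetical check of a per-demographic accuracy floor, in the spirit of
# Virginia's 98% requirement. The groups and counts below are made up for
# illustration; they are not real benchmark data.

THRESHOLD = 0.98

# accuracy per demographic group: (correct identifications, attempts)
results = {
    "group_a": (4920, 5000),  # 98.4%
    "group_b": (4890, 5000),  # 97.8% -- fails the floor
    "group_c": (4975, 5000),  # 99.5%
}

def meets_requirement(results, threshold=THRESHOLD):
    """True only if EVERY group clears the threshold; a high average
    cannot compensate for one underperforming group."""
    return all(correct / total >= threshold for correct, total in results.values())

print(meets_requirement(results))  # False: group_b is below 98%
```

The point of the `all(...)` formulation is that aggregate accuracy is irrelevant; one group below the floor fails the whole system.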

Discovery sanctions for GDPR redactions

An order by Judge Payne out of the Eastern District of Texas rejects the argument that redactions allegedly required by the GDPR were proper:

To further demonstrate the alleged bad faith application of the GDPR, Arigna showed where Continental blacked out the faces of its Executive Board in a picture even though that picture was available on Continental’s public website without the redactions. Based on these redactions and failure to timely produce the ESI, Arigna seeks an adverse inference instruction; an order precluding Continental from using any document that it did not timely produce, and Arigna’s costs and fees.

In response, Continental argued (but did not show) that it received an opinion letter from a law firm based in Europe stating the redactions were required by the GDPR, and that it had worked diligently to produce the ESI while also complying with the GDPR.

July 29, 2022 Memorandum Order, Case No. 22-cv-00126 (EDTX)

Convenience vs Privacy

Very cool technology:

Delta Air Lines recently introduced a “Parallel Reality” system that lets travelers access individual flight information on a shared overhead screen based on a scan of their boarding pass — or their face. The twist is that 100 people can do this at a time, all using the same digital screen but only seeing their own personal details.

Unlike a regular TV or video wall, in which each pixel would emit the same color of light in every direction, the board sends different colors of light in different directions.

Coming to a giant airport screen: Your personal flight information

But it does require that computers know exactly who and where you are.
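The mechanism described in the quote — each pixel emitting different colors in different directions — can be sketched in a few lines. This is purely illustrative; Delta's actual hardware and software are not public:

```python
# Toy model of a directional-pixel board: each pixel stores one color per
# angular "bin," so viewers standing in different bins see entirely
# different images on the same physical screen. All structures invented.

def render(frames, width, height):
    """frames: {angle_bin: frame}, where frame[y][x] is a color.
    Returns board[y][x] as a dict mapping angle_bin -> color."""
    board = [[{} for _ in range(width)] for _ in range(height)]
    for angle_bin, frame in frames.items():
        for y in range(height):
            for x in range(width):
                board[y][x][angle_bin] = frame[y][x]
    return board

# Two viewers, 2x2 board: the viewer in bin 0 sees all red,
# the viewer in bin 1 sees all blue.
red = [["red"] * 2 for _ in range(2)]
blue = [["blue"] * 2 for _ in range(2)]
board = render({0: red, 1: blue}, 2, 2)
print(board[0][0])  # {0: 'red', 1: 'blue'}
```

A conventional display is the degenerate case where every angular bin of a pixel holds the same color.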

“The internet is less free, more fragmented, and less secure”

The Council on Foreign Relations, described by Wikipedia as a “right leaning American think tank specializing in U.S. foreign policy and international relations,” has issued a report titled Confronting Reality in Cyberspace:

The major findings of the Task Force are as follows:

The era of the global internet is over.

U.S. policies promoting an open, global internet have failed, and Washington will be unable to stop or reverse the trend toward fragmentation.

Data is a source of geopolitical power and competition and is seen as central to economic and national security.

The report is a warning that the U.S. needs to get serious about a fragmenting internet or risk losing digital leadership entirely.

Keyword search warrants are (too?) powerful

Three teenagers set fire to a home in Denver because they believed someone who stole a phone lived there. Five members of a family died.

The police had video from a neighbor’s house showing three people in hooded sweatshirts and masks near the home at the time of the fire. But for weeks they had no further evidence.

Then the police subpoenaed cell tower data to see who was in the area. They got 7,000 devices, which they narrowed by excluding neighbors and any device whose movements did not match those of a vehicle observed on the video. Only 33 devices remained.
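That narrowing amounts to successive filters over a device set. A rough sketch of the idea, with invented names and data structures — the actual forensic tooling is unknown:

```python
# Hypothetical sketch: start with every device the towers saw, drop
# residents, keep devices whose tower pings follow the observed vehicle's
# route. Everything here is illustrative, not real forensic method.
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    is_resident: bool          # registered to a nearby address
    pings: list                # (timestamp, tower_id) observations

def follows_route(pings, route):
    """Crude subsequence check: did the device hit the route's towers in
    order? Real cell-site analysis is far more involved."""
    towers = iter(tower for _, tower in sorted(pings))
    return all(stop in towers for stop in route)

def narrow(devices, route):
    non_residents = [d for d in devices if not d.is_resident]  # exclude neighbors
    return [d for d in non_residents if follows_route(d.pings, route)]

devices = [
    Device("a", True,  [(1, "T1"), (2, "T2"), (3, "T3")]),  # a neighbor
    Device("b", False, [(1, "T1"), (2, "T2"), (3, "T3")]),  # matches the route
    Device("c", False, [(1, "T3"), (2, "T1")]),             # wrong direction
]
print([d.device_id for d in narrow(devices, ["T1", "T2", "T3"])])  # ['b']
```

Each filter is individually mundane; the privacy concern is that composing them over a tower dump shrinks 7,000 people to a handful of suspects without any individualized suspicion up front.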

Then they went to Google:

[A] warrant to Google asked for any searches for the destroyed house’s address anytime in the two weeks before the fire. Google provided five accounts that made that search — including three accounts with email addresses that included [the suspect’s names].

Teen charged in deadly Denver arson told investigators he set fire over stolen phone, detective says

One of the defendants has filed a motion to suppress the Google search evidence, and the EFF has filed an amicus brief in support:

Should the police be able to ask Google for the name of everyone who searched for the address of an abortion provider in a state where abortions are now illegal? Or who searched for the drug mifepristone? What about people who searched for gender-affirming healthcare providers in a state that has equated such care with child abuse? Or everyone who searched for a dispensary in a state that has legalized cannabis but where the federal government still considers it illegal?

EFF to File Amicus Brief in First U.S. Case Challenging Dragnet Keyword Warrant

Fascinating case. Some version of this feels destined for the U.S. Supreme Court.

States aren’t any better at privacy

A press release by the California Department of Justice acknowledges that it leaked personal data on individuals who applied for a concealed-carry weapons permit between 2011 and 2021.

The leaked data included “names, date of birth, gender, race, driver’s license number, addresses, and criminal history.”

The California Attorney General page on the California Consumer Privacy Act:

https://oag.ca.gov/privacy/ccpa

At least GDPR applies to public entities in Europe.

Some companies agree to not use location data from “sensitive points of interest”

A subset of Network Advertising Initiative companies have voluntarily agreed that they will not use location data associated with “sensitive points of interest,” which include:

Places of religious worship

Correctional facilities

Places that may be used to infer an LGBTQ+ identification

Places that may be used to infer engagement with explicit sexual content, material, or acts

Places primarily intended to be occupied by children under 16

Domestic abuse shelters, including rape crisis centers

Welfare or homeless shelters and halfway houses

Dependency or addiction treatment centers

Medical facilities that cater predominantly to sensitive conditions, such as cancer centers, HIV/AIDS, fertility or abortion clinics, mental health treatment facilities, or emergency room trauma centers

Places that may be used to infer refugee or immigrant status, such as refugee or immigration centers and immigration services

Credit repair, debt services, bankruptcy services, or payday lending institutions

Temporary places of assembly such as political rallies, marches, or protests, during the times that such rallies, marches, or protests take place

Military bases

NAI PRECISE LOCATION INFORMATION SOLUTION PROVIDER VOLUNTARY ENHANCED STANDARDS

The announcement comes amid increasing public concern that location data brokers might, willingly or under legal compulsion, provide data on individuals visiting abortion clinics.