Discovery sanctions for GDPR redactions

Judge Payne of the Eastern District of Texas issued an order rejecting the argument that redactions allegedly required by the GDPR were proper:

To further demonstrate the alleged bad faith application of the GDPR, Arigna showed where Continental blacked out the faces of its Executive Board in a picture even though that picture was available on Continental’s public website without the redactions. Based on these redactions and failure to timely produce the ESI, Arigna seeks an adverse inference instruction; an order precluding Continental from using any document that it did not timely produce; and Arigna’s costs and fees.

In response, Continental argued (but did not show) that it received an opinion letter from a law firm based in Europe stating the redactions were required by the GDPR, and that it had worked diligently to produce the ESI while also complying with the GDPR.

July 29, 2022 Memorandum Order, Case No. 22-cv-00126 (EDTX)

Convenience vs Privacy

Very cool technology:

Delta Air Lines recently introduced a “Parallel Reality” system that lets travelers access individual flight information on a shared overhead screen based on a scan of their boarding pass — or their face. The twist is that 100 people can do this at a time, all using the same digital screen but only seeing their own personal details.

Unlike a regular TV or video wall, in which each pixel would emit the same color of light in every direction, the board sends different colors of light in different directions.

Coming to a giant airport screen: Your personal flight information

But it does require that computers know exactly who and where you are.
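Conceptually it's a per-viewer rendering problem. Here is a toy Python sketch of the directional-pixel idea; every name and detail below is invented for illustration and says nothing about how the real system works:

```python
# Toy sketch of a directional display: each pixel can emit a different
# color toward each discrete viewing direction, so the board can show
# a different image to each tracked viewer.
import math

def direction_bin(pixel_xy, viewer_xy, num_bins=360):
    """Quantize the angle from a pixel to a viewer into one of the
    discrete emission directions the hardware can target."""
    dx = viewer_xy[0] - pixel_xy[0]
    dy = viewer_xy[1] - pixel_xy[1]
    angle = math.atan2(dy, dx)  # radians in [-pi, pi]
    return int((angle + math.pi) / (2 * math.pi) * num_bins) % num_bins

def render(pixels, viewers):
    """viewers: list of (position, personal_frame) pairs, where
    personal_frame maps each pixel to the color that viewer should see.
    Returns (pixel, direction) -> color for the board to emit."""
    frame = {}
    for px in pixels:
        for position, personal_frame in viewers:
            frame[(px, direction_bin(px, position))] = personal_frame[px]
    return frame
```

The privacy catch is visible in the inputs: render() only works if the system continuously knows each viewer's identity (to choose personal_frame) and physical position (to aim the light).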

“The internet is less free, more fragmented, and less secure”

The Council on Foreign Relations, described by Wikipedia as a “right-leaning American think tank specializing in U.S. foreign policy and international relations,” has issued a report titled Confronting Reality in Cyberspace:

The major findings of the Task Force are as follows:

The era of the global internet is over.

U.S. policies promoting an open, global internet have failed, and Washington will be unable to stop or reverse the trend toward fragmentation.

Data is a source of geopolitical power and competition and is seen as central to economic and national security.

The report is a warning that the U.S. needs to get serious about a fragmenting internet or risk losing digital leadership entirely.

Keyword search warrants are (too?) powerful

Three teenagers set fire to a home in Denver because they believed someone who stole a phone lived there. Five members of a family died.

The police had video from a neighbor’s house showing three people in hooded sweatshirts and masks near the home at the time of the fire. But for weeks they had no further evidence.

Then the police subpoenaed cell tower data to see who was in the area. They got roughly 7,000 devices, which they narrowed down by excluding neighbors and any device whose movements did not match those of a vehicle that had been observed. Only 33 devices remained.
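The reporting does not say how investigators actually filtered the tower dump, but the described narrowing amounts to a simple pipeline. A crude, entirely hypothetical sketch:

```python
# Hypothetical tower-dump filtering: drop known residents, keep only
# devices whose ping sequence tracks the observed vehicle's route.
from dataclasses import dataclass

@dataclass
class Ping:
    timestamp: float  # seconds since epoch
    lat: float
    lon: float

def matches_route(pings, route, window_s=300, tolerance_deg=0.01):
    """Crude check: for every (time, lat, lon) waypoint of the observed
    vehicle, the device has a ping nearby at roughly the same time.
    (0.01 degrees is on the order of a kilometer.)"""
    return all(
        any(abs(p.timestamp - t) < window_s and
            abs(p.lat - lat) + abs(p.lon - lon) < tolerance_deg
            for p in pings)
        for t, lat, lon in route)

def narrow(devices, resident_ids, route):
    """devices: device_id -> list[Ping] from the tower dump."""
    return [d for d, pings in devices.items()
            if d not in resident_ids and matches_route(pings, route)]
```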

Then they went to Google:

[A] warrant to Google asked for any searches for the destroyed house’s address anytime in the two weeks before the fire. Google provided five accounts that made that search — including three accounts with email addresses that included [the suspects’ names].

Teen charged in deadly Denver arson told investigators he set fire over stolen phone, detective says

One of the defendants has filed a motion to suppress the Google search evidence, and the EFF has filed an amicus brief in support:

Should the police be able to ask Google for the name of everyone who searched for the address of an abortion provider in a state where abortions are now illegal? Or who searched for the drug mifepristone? What about people who searched for gender-affirming healthcare providers in a state that has equated such care with child abuse? Or everyone who searched for a dispensary in a state that has legalized cannabis but where the federal government still considers it illegal?

EFF to File Amicus Brief in First U.S. Case Challenging Dragnet Keyword Warrant

Fascinating case. Some version of this feels destined for the U.S. Supreme Court.

States aren’t any better at privacy

A press release by the California Department of Justice acknowledges that it leaked personal data on individuals who applied for a concealed-carry weapons permit between 2011 and 2021.

The leaked data included “names, date of birth, gender, race, driver’s license number, addresses, and criminal history.”

The same office also maintains the California Attorney General’s page on the California Consumer Privacy Act:

https://oag.ca.gov/privacy/ccpa

At least the GDPR applies to public entities in Europe.

Some companies agree to not use location data from “sensitive points of interest”

A subset of Network Advertising Initiative companies have voluntarily agreed that they will not use location data associated with “sensitive points of interest,” which include:

Places of religious worship

Correctional facilities

Places that may be used to infer an LGBTQ+ identification

Places that may be used to infer engagement with explicit sexual content, material, or acts

Places primarily intended to be occupied by children under 16

Domestic abuse shelters, including rape crisis centers

Welfare or homeless shelters and halfway houses

Dependency or addiction treatment centers

Medical facilities that cater predominantly to sensitive conditions, such as cancer centers, HIV/AIDS, fertility or abortion clinics, mental health treatment facilities, or emergency room trauma centers

Places that may be used to infer refugee or immigrant status, such as refugee or immigration centers and immigration services

Credit repair, debt services, bankruptcy services, or payday lending institutions

Temporary places of assembly such as political rallies, marches, or protests, during the times that such rallies, marches, or protests take place

Military bases

NAI PRECISE LOCATION INFORMATION SOLUTION PROVIDER VOLUNTARY ENHANCED STANDARDS

The announcement comes amid growing public concern that location data brokers might provide data on individuals visiting abortion clinics, whether intentionally or under legal compulsion.
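The standard describes what to exclude, not how to exclude it. A minimal sketch of one way a provider might implement the commitment; the category names and the poi_index lookup are invented for illustration:

```python
# Drop any location event that falls inside a geofence tagged with a
# sensitive POI category, so it is never stored or used downstream.
SENSITIVE_CATEGORIES = {
    "place_of_worship", "correctional_facility", "lgbtq_associated",
    "adult_venue", "childrens_facility", "domestic_abuse_shelter",
    "welfare_or_homeless_shelter", "addiction_treatment",
    "sensitive_medical", "refugee_immigrant_services",
    "debt_or_payday_services", "temporary_assembly", "military_base",
}

def scrub(events, poi_index):
    """events: iterable of (device_id, lat, lon, timestamp) tuples.
    poi_index.categories_at(lat, lon, ts) is a hypothetical geofence
    lookup returning the set of POI categories covering that point
    at that time."""
    for device_id, lat, lon, ts in events:
        if poi_index.categories_at(lat, lon, ts) & SENSITIVE_CATEGORIES:
            continue  # discard sensitive events entirely
        yield (device_id, lat, lon, ts)
```

Note the timestamp in the lookup: temporary places of assembly are sensitive only while the rally, march, or protest is actually taking place.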

Microsoft discontinues face, gender, and age analysis tools

Kashmir Hill for the NYT:

“We’re taking concrete steps to live up to our A.I. principles,” said Ms. Crampton, who has worked as a lawyer at Microsoft for 11 years and joined the ethical A.I. group in 2018. “It’s going to be a huge journey.”

Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.’

This coincides with the release of Microsoft’s Responsible AI Standard, v2 (see also the accompanying blog post).

Note, however, that these tools may have been useful for accessibility:

The age and gender analysis tools being eliminated — along with other tools to detect facial attributes such as hair and smile — could be useful to interpret visual images for blind or low-vision people, for example, but the company decided it was problematic to make the profiling tools generally available to the public, Ms. Crampton said.

Trade-offs everywhere.

AI model predicts who will become homeless

Emily Alpert Reyes for the LA Times:

It pulls data from eight county agencies to pinpoint whom to assist, looking at a broad range of data in county systems: Who has landed in the emergency room. Who has been booked in jail. Who has suffered a psychiatric crisis that led to hospitalization. Who has gotten cash aid or food benefits — and who has listed a county office as their “home address” for such programs, an indicator that often means they were homeless at the time.

A computer model predicts who will become homeless in L.A. Then these workers step in

That’s a lot of sensitive personal data. The word “privacy” does not appear in the article.
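The article gives no technical detail about the model itself, so the following is only a guess at the general shape of such a system: merge yes/no indicators from the agency feeds into per-person features, then rank people by a trained model's risk score. The feature names are taken from the quote; everything else is invented:

```python
# Hypothetical sketch of cross-agency risk scoring for outreach.
from sklearn.linear_model import LogisticRegression

FEATURES = ["er_visit", "jail_booking", "psychiatric_hospitalization",
            "cash_aid", "food_benefits", "county_office_home_address"]

def to_vector(person_record):
    """person_record: indicator -> bool, merged across agency systems."""
    return [int(person_record.get(f, False)) for f in FEATURES]

def rank_for_outreach(model: LogisticRegression, records, top_n=100):
    """records: person_id -> person_record. Returns the top_n ids,
    with scores, by predicted probability of becoming homeless."""
    ids = list(records)
    scores = model.predict_proba(
        [to_vector(records[i]) for i in ids])[:, 1]
    return sorted(zip(ids, scores), key=lambda kv: -kv[1])[:top_n]
```

Every row in records represents exactly the kind of cross-agency linkage the article describes.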

Data is of course exceptionally helpful in making sure money and resources are applied efficiently. (See also personalized advertising.)

This seems great, so… ok?

Frustration with GDPR bottleneck in Ireland

Vincent Manancourt writing for Politico:

So far, officials at the EU level have put up a dogged defense of what has become one of their best-known rulebooks, including by publicly pushing back against calls to punish Ireland for what activists say is a failure to bring Big Tech’s data-hungry practices to heel.

Now, one of the European Union’s key voices on data protection regulation is breaking the Brussels taboo of questioning the bloc’s flagship law’s performance so far.

“I think there are parts of the GDPR that definitely have to be adjusted to the future reality,” European Data Protection Supervisor Wojciech Wiewiórowski told POLITICO in an interview earlier this month.

What’s wrong with the GDPR?

The main complaint appears to be that the Irish Data Protection Commission (which handles most big-tech privacy complaints) is overworked and slow.

Otherwise there appears to be a sense that things haven’t quite worked out as hoped, whatever that means.

The Privacy “Duty of Loyalty”

The draft American Data Privacy and Protection Act has a section called “duty of loyalty.” What the heck is that?

In the draft it’s a collection of specific requirements to minimize data collection and prohibit the use and transfer of Social Security numbers, precise geolocation, etc. See Sections 101, 102, and 103 in the Discussion Draft.

But the “duty of loyalty” as a data privacy concept is broader. It means that data collectors must use data in a way that benefits users, placing users’ interests above the interest in making a profit, much like the duty of loyalty (or fiduciary duty) that a lawyer owes to a client.

Neil M. Richards and Woodrow Hartzog explain the concept in a 2021 paper:

Put simply, under our approach, loyalty would manifest itself primarily as a prohibition on designing digital tools and processing data in a way that conflicts with a trusting party’s best interests. Data collectors bound by such a duty of loyalty would be obligated to act in the best interests of the people exposing their data and engaging in online experiences, but only to the extent of their exposure. 

A Duty of Loyalty for Privacy Law at 966.

Richards and Hartzog suggest that a broad duty of loyalty combined with specific prohibitions against especially troubling practices would work like other areas of regulation (e.g., “unfair and deceptive trade practices”).

But although the American Data Privacy and Protection Act refers to this concept, the broad duty of loyalty is not (yet) part of the draft.