A proposal to tax targeted digital ads

Paul Romer proposes tax policy, instead of antitrust, to nudge privacy in the right direction:

Of course, companies are incredibly clever about avoiding taxes. But in this case, that’s a good thing for all of us. This tax would spur their creativity. Ad-driven platform companies could avoid the tax entirely by switching to the business model that many digital companies already offer: an ad-free subscription. Under this model, consumers know what they give up, and the success of the business would not hinge on tracking customers with ever more sophisticated surveillance techniques. A company could succeed the old-fashioned way: by delivering a service that is worth more than it costs.

A Tax That Could Fix Big Tech

Not a bad idea.

Sixth Circuit says chalking tires is an unreasonable search

In Taylor v. City of Saginaw, the Sixth Circuit U.S. Court of Appeals (covering Kentucky, Michigan, Ohio, and Tennessee) has concluded that the common – indeed, ubiquitous! – practice of tracking how long a car has been parked by chalking its tires is unconstitutional:

Alison Taylor, a frequent recipient of parking tickets, sued the City and its parking enforcement officer Tabitha Hoskins, alleging that chalking violated her Fourth Amendment right to be free from unreasonable search. The City moved to dismiss the action. The district court granted the City’s motion, finding that, while chalking may have constituted a search under the Fourth Amendment, the search was reasonable. Because we chalk this practice up to a regulatory exercise, rather than a community-caretaking function, we REVERSE.

This is a great example of a court following individual precedent down a winding path to a conclusion that is actually very strange. Here’s how they got there:

  1. Start with the Constitution. The Fourth Amendment to the Constitution protects the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”
  2. Is it a search? Yes. But really only because the Supreme Court recently decided that attaching GPS devices to cars is a search. Attaching GPS devices is a search because it is a trespass. And chalking is also a trespass because the common law says that “acting upon a chattel as intentionally to cause it to come in contact with some other object” is a trespass. So chalking is a trespass to obtain information. And that makes it a search.
  3. Is it unreasonable? We assume so. The government bears the burden of proving that the search was not unreasonable, and this is where they fell down. First, the government said people have a reduced expectation of privacy in cars. Nope, the Court says, that analysis only applies when you have a warrant or probable cause, and the government didn’t have either. Second, the government said the parking officers weren’t operating as law enforcement; they were operating as “community caretakers” and another standard applies. Nope, the Court says, the government is actually enforcing laws so that doesn’t apply either. Hearing no other arguments, the Court concludes the search was unreasonable.
  4. And now tire chalking is an unconstitutional, unreasonable search.

I’m not sure the drafters of the Fourth Amendment would agree with this analysis. Chalking a tire doesn’t seem to be either unreasonable or a search. And of course there are a number of other ways to argue this case, including with the “administrative search exception,” which the government failed to raise. It’s possible this case gets reviewed.

On the other hand, plenty of other options are available to parking enforcement officers, including video, photos, parking meters, and plain old note-taking!

Ode to Obscurity

Recording almost everywhere we go and everything we do has become increasingly cheap and easy. “Obscurity” is becoming rarer. Dr. Hartzog (law, computer science) and Dr. Selinger (philosophy) make the point that a lack of obscurity may limit our growth as individuals:

Obscurity makes meaningful and intimate relationships possible, ones that offer solidarity, loyalty and love. It allows us to choose with whom we want to share different kinds of information. It protects us from having everyone know the different roles we play in the different parts of our lives. We need to be able to play one role with our co-workers while revealing other parts of ourselves with friends and family. . . .

Obscurity protects us from being pressured to be conventional. This buffer from a ubiquitous permanent record is especially important for the exploratory phase of youth. To develop as humans, people must be free to try things they might later regret. This is how we become better people. Without obscurity, we wouldn’t have the freedom to take risks, fail and dust ourselves off. We’d be stymied by the fear of being deemed failures by a system that never forgets.

Why You Can No Longer Get Lost in the Crowd

Is obscurity different than privacy? Or perhaps it is another name for a privacy concept that has a million shades of gray. Privacy is weird.

Summary of the DETOUR Act

On April 9, 2019, Senators Warner (D-VA) and Fischer (R-NE) introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act. The bill would prohibit user interface “dark patterns” around user consent, which trick or nudge (depending on your perspective) users into consenting to things that may not be in their best interests.

There is a whole website devoted to dark patterns, and it is pretty informative. A classic example is a sign-up flow that makes it hard to figure out how not to sign up for a service.
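
To make the pattern concrete, here is a minimal sketch of my own (the names and copy are invented for illustration; they are not from the bill or from any real site): the consent button is big, pre-focused, and friendly, while the decline option is shrunk, greyed out, and worded to shame the user out of clicking it.

    // Hypothetical "dark pattern" sign-up dialog, for illustration only.
    // The consent path is prominent and pre-focused; the decline path is
    // tiny, low-contrast, and worded as a loss ("confirmshaming").
    function buildSignupDialog(): HTMLElement {
      const dialog = document.createElement("div");

      const pitch = document.createElement("p");
      pitch.textContent = "Get personalized offers and updates!";
      dialog.appendChild(pitch);

      // Accepting: a large, brightly styled button that receives focus,
      // so simply pressing Enter consents by default.
      const accept = document.createElement("button");
      accept.textContent = "Yes, sign me up";
      accept.style.fontSize = "20px";
      accept.style.padding = "12px 32px";
      accept.autofocus = true;
      dialog.appendChild(accept);

      // Declining: a tiny grey text link, visually buried and phrased so
      // that saying no feels like admitting something embarrassing.
      const decline = document.createElement("a");
      decline.textContent = "No thanks, I don't like saving money";
      decline.href = "#";
      decline.style.fontSize = "10px";
      decline.style.color = "#bbb";
      dialog.appendChild(decline);

      return dialog;
    }

    document.body.appendChild(buildSignupDialog());

None of the individual styling choices here is deceptive on its own; the pattern comes from how heavily the two options are weighted against each other.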

The DETOUR Act gives the FTC power to regulate user interfaces that have “the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.” It also prohibits the segmentation of users into groups for behavioral experimentation without informed consent.

The Act would apply only to “large online operators,” defined as those with more than 100M authenticated users in any 30-day period. (Small online operators can still trick people?) Large online operators would also have to disclose their experiments every 90 days and establish Independent Review Boards to oversee any such user research.

Summary of the proposed Algorithmic Accountability Act of 2019

Senators Wyden (D-OR) and Booker (D-NJ) have proposed a Senate bill that would require big businesses and data brokers to conduct “impact assessments” for (1) “high-risk automated decision systems”; and (2) “high-risk information systems”.

The bill essentially gives the FTC power to promulgate regulations requiring companies with a lot of personal data to conduct studies of how their use of that data impacts people. Think of it as the equivalent of an environmental impact study for big data, or the US equivalent of GDPR’s Data Protection Impact Assessment process. In fact, it is very similar to the GDPR requirement.

Here’s a summary of the key terms:

Covered entities. The bill would apply to anyone that (a) receives more than $50M in revenue over the preceding three-year period; (b) possesses personal information on more than 1M consumers or consumer devices; or (c) is a “data broker,” meaning a business that, as a substantial part of its business, possesses personal information on individuals who are not its customers or employees.

Definition of personal information. Broadly defined as any information “reasonably linkable to a specific consumer or consumer device.”

Impact assessments. At a minimum, requires a description of the system; its design, training process, and data; its purpose; relative benefits and costs; data minimization practices; retention policies; access to data by consumers; the ability of consumers to correct or object to the data; sharing of data; risks of inaccurate, biased, unfair, or discriminatory decisions; and safeguards to minimize risks.

Systems which must be evaluated. Must evaluate any system that “poses a significant risk” to the privacy and security of personal information or results in inaccurate, unfair, biased, or discriminatory decisions, especially if the system alters legal rights or profiles “sensitive aspects” of consumer lives such as protected class, criminal convictions, work performance, economic situation, health, personal preferences, interests, behavior, location, etc.

Enforcement. Enforced by the FTC or the Attorney General of any State upon notice to the FTC.

What is privacy?

Kate Klonick in an opinion piece for the NYT:

The instructions were straightforward: At some point in the next two weeks, try to determine a stranger’s identity, in a public place, using only Google search on your phone, based on things the stranger said loudly enough for lots of others to hear and things that are displayed on the person’s clothing or bags, like logos or a monogram.

Is Your Seatmate Googling You?

Most privacy is really obscurity. It’s just that we mostly don’t care, until we do.

What is private information? Is it that which literally no one else knows about you? Almost certainly not. Information that just your spouse or immediate family know? Probably broader than that. Information that isn’t Google-able? Maybe.

Privacy is more likely just the sense that there is information about yourself whose distribution you would like to limit in some way. And this is a core tension: privacy is different for every single person, and yet it is a burden to constantly tweak privacy controls to match.

Privacy is complicated and no one wants to deal

From the RSA security conference this year comes a new survey on consumer attitudes towards privacy. And it’s not really surprising:

According to survey findings from research released at the RSA Conference 2019 on Tuesday, data privacy is a top concern for most people; out of 4,000 participants queried from January 14 to February 15, a vast majority (96 percent) said that they care about their privacy (including most Millennials at more than 93 percent); and 93 percent said they use security software.

RSAC 2019: Most Consumers Say ‘No’ to Cumbersome Data Privacy Practices

So they asked people if they cared about privacy and got 96% agreement. Yep, almost all of us really like privacy.

But of course there’s more:

However, users did not follow through with some of the more difficult and cumbersome best practices for data privacy. For instance, only 32 percent said that they read privacy policies and End User License Agreements (EULAs) (and 66 percent say they simply skim through or do not read EULA or other consent forms at all).

No surprise either. And of course a lot of people reuse passwords and don’t bother to check permissions. Because that stuff is a hassle.

I say again: the fundamental premise of GDPR and most other privacy legislation (including California’s upcoming Consumer Privacy Act) is wrong. Legislators think that if you only tell users how their data is being collected and used, then they will strike the right privacy balance for themselves. But apart from a few privacy activists, they won’t.

Better to start carving off the low-hanging privacy fruit. While we’re all concerned that companies might not be telling us exactly how they are using our data, some companies are out in the open (or at least in the EULA) selling our location data to the grey market. How’s this for a simple rule? Don’t sell user location data. Start there.

Done with Facebook

I’m done. John Gruber links to yet another story of Facebook’s fundamental inability to govern itself:

On the surface, Facebook prompting people to enable 2FA was a good thing — if you have 2FA enabled it’s much harder for someone who isn’t you to log in to your account. But this being Facebook, they’re not just going to do something that is only good for the user, are they?

Last year it came to light that Facebook was using the phone numbers people submitted to the company solely so they could protect their accounts with 2FA for targeted advertising. And now, as security researcher and New York Times columnist Zeynep Tufekci pointed out, Facebook is allowing anyone to look up a user by their phone number, the same phone number that was supposed to be for security purposes only.

Facebook Is Allowing Anyone to Look You Up Using Your Two-Factor Authentication Phone Number

I’m offended as a data privacy lawyer and a cybersecurity professional and a user. I’m just done with Facebook.

Taking a Facebook Break

What happens if you pick some people and boot them off Facebook for a month?

Those booted off enjoyed an additional hour of free time on average. They tended not to redistribute their liberated minutes to other websites and social networks, but chose instead to watch more television and spend time with friends and family. They consumed much less news, and were thus less aware of events but also less polarised in their views about them than those still on the network. Leaving Facebook boosted self-reported happiness and reduced feelings of depression and anxiety.

It also helped some to break the Facebook habit. Several weeks after the deactivation period, those who had been off Facebook spent 23% less time on it than those who had never left, and 5% of the forced leavers had yet to turn their accounts back on. And the amount of money subjects were willing to accept to shut their accounts for another four weeks was 13% lower after the month off than it had been before. Users, in other words, overestimate how much they value the service: a misperception corrected by a month of abstention. . . .

What would happen if Facebook was turned off? (emphasis added)

No surprise, but illustrates the power of Facebook’s algorithmic magic: just one more video, just one more update. Makes you forget there are other things you enjoy more.