Kate Klonick in an opinion piece for the NYT:
The instructions were straightforward: At some point in the next two weeks, try to determine a stranger’s identity, in a public place, using only Google search on your phone, based on things the stranger said loudly enough for lots of others to hear and things that are displayed on the person’s clothing or bags, like logos or a monogram.
Is Your Seatmate Googling You?
Most privacy is really obscurity. It’s just that we mostly don’t care, until we do.
What is private information? Is it that which literally no one else knows about you? Almost certainly not. Information that just your spouse or immediate family know? Probably broader than that. Information that isn’t Google-able? Maybe.
Privacy is more likely just the sense that there is information about yourself that you would like to limit distribution of in some way. And this is a core tension: privacy is different for every single person and yet it is a burden to constantly tweak those privacy controls.
From the RSA security conference this year comes a new survey on consumer attitudes towards privacy. And it’s not really surprising:
According to survey findings from research released at the RSA Conference 2019 on Tuesday, data privacy is a top concern for most people; out of 4,000 participants queried from January 14 to February 15, a vast majority (96 percent) said that they care about their privacy (including most Millennials at more than 93 percent); and 93 percent said they use security software.
RSAC 2019: Most Consumers Say ‘No’ to Cumbersome Data Privacy Practices
So they asked people if they cared about privacy and got 96% agreement. Yep, almost all of us really like privacy.
But of course there’s more:
However, users did not follow through with some of the more difficult and cumbersome best practices for data privacy. For instance, only 32 percent said that they read privacy policies and End User License Agreements (EULAs) (and 66 percent say they simply skim through or do not read EULA or other consent forms at all).
No surprise either. And of course a lot of people reuse passwords and don’t bother to check permissions. Because that stuff is a hassle.
I say again: the fundamental premise of GDPR and most other privacy legislation (including California’s upcoming Consumer Privacy Act) is wrong. Legislators think that if you only tell users how their data is being collected and used, then they will strike the right privacy balance for themselves. But apart from a few privacy activists, they won’t.
Better to start carving off the low-hanging privacy fruit. While we’re all concerned that companies might not be telling us exactly how they are using our data, some companies are out in the open (or at least in the EULA) selling our location data on the grey market. How’s that for a simple rule? Don’t sell user location data. Start there.
I’m done. John Gruber links to yet another story of Facebook’s fundamental inability to govern itself:
On the surface, Facebook prompting people to enable 2FA was a good thing — if you have 2FA enabled it’s much harder for someone who isn’t you to log in to your account. But this being Facebook, they’re not just going to do something that is only good for the user, are they?
Last year it came to light that Facebook was taking the phone numbers people had submitted solely to protect their accounts with 2FA and using them for targeted advertising. And now, as security researcher and New York Times columnist Zeynep Tufekci pointed out, Facebook is allowing anyone to look up a user by their phone number, the same phone number that was supposed to be for security purposes only.
Facebook Is Allowing Anyone to Look You Up Using Your Two-Factor Authentication Phone Number
I’m offended as a data privacy lawyer and a cybersecurity professional and a user. I’m just done with Facebook.
What happens if you pick some people and boot them off Facebook for a month?
Those booted off enjoyed an additional hour of free time on average. They tended not to redistribute their liberated minutes to other websites and social networks, but chose instead to watch more television and spend time with friends and family. They consumed much less news, and were thus less aware of events but also less polarised in their views about them than those still on the network. Leaving Facebook boosted self-reported happiness and reduced feelings of depression and anxiety.
It also helped some to break the Facebook habit. Several weeks after the deactivation period, those who had been off Facebook spent 23% less time on it than those who had never left, and 5% of the forced leavers had yet to turn their accounts back on. And the amount of money subjects were willing to accept to shut their accounts for another four weeks was 13% lower after the month off than it had been before. Users, in other words, overestimate how much they value the service: a misperception corrected by a month of abstention. . . .
What would happen if Facebook was turned off? (emphasis added)
No surprise, but it illustrates the power of Facebook’s algorithmic magic: just one more video, just one more update. It makes you forget there are other things you enjoy more.
Now this is how you respond to an extortion attempt. Worth reading in full.
My estimation of, and respect for, Mr. Bezos has risen substantially.
It’s unclear why the New York Times’ Editorial Board thinks the EU’s GDPR is “leading the way” if they are concerned about the “legal fiction of consent”:
The average person would have to spend 76 working days reading all of the digital privacy policies they agree to in the span of a year. Reading Amazon’s terms and conditions alone out loud takes approximately nine hours.
GDPR does not fix this problem. The whole premise of GDPR is that consumers could balance their privacy concerns if they only understood and controlled their own data: “Natural persons should have control of their own personal data.”
But look, data is really complicated. That’s sort of the entire point of big data. Meanwhile the public seems outraged on the one hand that there is not enough disclosure (“users are not able to fully understand the extent of the processing operations carried out by Google”), and on the other hand that there is too much disclosure (“we discovered that privacy policies have greatly increased in length over the past 10 years”).
Here’s the bottom line: no one wants to read privacy policies. And unless it becomes comically simple, users will not balance their own privacy interests. Solve for the equilibrium.
Online companies that are labeled as disrupters may not give you the best deal and – get this – may use your personal data to get additional value from you! Shocking. Thanks, NYTimes! I guess it was a slow news day.
Following up on my post yesterday (ugh privacy):
In a recent Wall Street Journal commentary, Mark Zuckerberg claimed that Facebook users want to see ads tailored to their interests. But the data show the opposite is true. With the help of major polling firms, we conducted two large national telephone surveys of Americans in 2012 and 2009. When we asked people whether they wanted websites they visit to show them commercial ads, news or political ads “tailored to your interests,” a substantial majority said no. Around half did say they wanted discounts tailored to their interests. But that too changed after we told them how companies gathered the information that enables tailoring, such as following you on a website.
Mark Zuckerberg’s Delusion of Consumer Consent
Survey responses depend greatly on how questions are phrased and contextualized, and I’m not even sure how I’d answer this question myself.
I feel like I want relevant ads. But maybe not too relevant? There’s a line between “get the most durable baby swing on the market!” (no thanks, go away!) and “buy that genetic test you’ve been researching!” (how did you know that??)
All we really know is that if there is a “right balance” between personalized ads and privacy, Facebook is obviously not incentivized to find it. But… who is? What is the right balance? I see precious little commentary on that.
If you go to Facebook’s Marking Data Privacy Day 2019 post and click on the byline link, you get the Facebook profile of their Chief Privacy Officer, and a bunch of her family photos.
At first I thought, “this is weird.”
And then I thought, “no of course this is expected.”
And then I thought, “ugh privacy.”