Done with Facebook

I’m done. John Gruber links to yet another story of Facebook’s fundamental inability to govern itself:

On the surface, Facebook prompting people to enable 2FA was a good thing — if you have 2FA enabled it’s much harder for someone who isn’t you to log in to your account. But this being Facebook, they’re not just going to do something that is only good for the user, are they?

Last year it came to light that Facebook was using the phone numbers people submitted to the company solely so they could protect their accounts with 2FA, for targeted advertising. And now, as security researcher and New York Times columnist Zeynep Tufekci pointed out, Facebook is allowing anyone to look up a user by their phone number, the same phone number that was supposed to be for security purposes only.

Facebook Is Allowing Anyone to Look You Up Using Your Two-Factor Authentication Phone Number

I’m offended as a data privacy lawyer and a cybersecurity professional and a user. I’m just done with Facebook.

Taking a Facebook Break

What happens if you pick some people and boot them off Facebook for a month?

Those booted off enjoyed an additional hour of free time on average. They tended not to redistribute their liberated minutes to other websites and social networks, but chose instead to watch more television and spend time with friends and family. They consumed much less news, and were thus less aware of events but also less polarised in their views about them than those still on the network. Leaving Facebook boosted self-reported happiness and reduced feelings of depression and anxiety.

It also helped some to break the Facebook habit. Several weeks after the deactivation period, those who had been off Facebook spent 23% less time on it than those who had never left, and 5% of the forced leavers had yet to turn their accounts back on. And the amount of money subjects were willing to accept to shut their accounts for another four weeks was 13% lower after the month off than it had been before. Users, in other words, overestimate how much they value the service: a misperception corrected by a month of abstention. . . .

What would happen if Facebook was turned off? (emphasis added)

No surprise, but it illustrates the power of Facebook’s algorithmic magic: just one more video, just one more update. Makes you forget there are other things you enjoy more.

Who Balances Privacy?

It’s unclear why the New York Times’ Editorial Board thinks the EU’s GDPR is “leading the way” given its concern about the “legal fiction of consent”:

The average person would have to spend 76 working days reading all of the digital privacy policies they agree to in the span of a year. Reading Amazon’s terms and conditions alone out loud takes approximately nine hours.

Why would anyone read the terms of service when they don’t feel as though they have a choice in the first place? It’s not as though a user can call up Mark Zuckerberg and negotiate his or her own privacy policy. The “I agree” button should have long ago been renamed “Meh, whatever.”

How Silicon Valley Puts the ‘Con’ in Consent

GDPR does not fix this problem. The whole premise of GDPR is that consumers would be able to balance their privacy concerns if only they understood and controlled their own data: “Natural persons should have control of their own personal data.”

But look, data is really complicated. That’s sort of the entire point of big data. Meanwhile, the public seems outraged on the one hand that there is not enough disclosure (“users are not able to fully understand the extent of the processing operations carried out by Google”), and on the other hand that there is too much disclosure (“we discovered that privacy policies have greatly increased in length over the past 10 years”).

Here’s the bottom line: no one wants to read privacy policies. And unless understanding them becomes comically simple, users will not balance their own privacy interests. Solve for the equilibrium.

Facebook Privacy Noise

Following up on my post yesterday (ugh privacy):

In a recent Wall Street Journal commentary, Mark Zuckerberg claimed that Facebook users want to see ads tailored to their interests. But the data show the opposite is true. With the help of major polling firms, we conducted two large national telephone surveys of Americans in 2012 and 2009. When we asked people whether they wanted websites they visit to show them commercial ads, news or political ads “tailored to your interests,” a substantial majority said no. Around half did say they wanted discounts tailored to their interests. But that too changed after we told them how companies gathered the information that enables tailoring, such as following you on a website.

Mark Zuckerberg’s Delusion of Consumer Consent

Survey responses depend greatly on how questions are phrased and contextualized, and I’m not even sure how I’d answer this question myself.

I feel like I want relevant ads. But maybe not too relevant? There’s a line between “get the most durable baby swing on the market!” (no thanks, go away!) and “buy that genetic test you’ve been researching!” (how did you know that??)

All we really know is that if there is a “right balance” between personalized ads and privacy, Facebook is obviously not incentivized to find it. But… who is? What is the right balance? I see precious little commentary on that.