An engineer has built a counter-surveillance tool on top of the hardware and software stack for Tesla vehicles:
It taps the existing video feeds created by Tesla’s Sentry Mode feature and applies license plate and facial detection to determine if you are being followed.
Scout does all that in real-time and sends you notifications if it sees anything suspicious.
Turn your Tesla into a CIA-like counter-surveillance tool with this hack
A video demonstration is embedded in the article.
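Scout’s implementation isn’t published, but the “am I being followed?” heuristic the article describes can be sketched in a few lines. In this hypothetical sketch, assume an upstream detector has already turned Sentry Mode frames into (timestamp, plate) sightings; the thresholds are invented for illustration:

```python
from collections import defaultdict

# Assumed thresholds -- not Scout's actual values.
FOLLOW_WINDOW_S = 30 * 60   # only consider sightings from the last 30 minutes
MIN_SIGHTINGS = 3           # a plate seen 3+ times is a candidate follower
MIN_SPREAD_S = 5 * 60       # sightings must span at least 5 minutes of driving

def suspicious_plates(sightings, now):
    """sightings: list of (timestamp_seconds, plate_string) tuples
    produced by a license-plate detector. Returns plates that keep
    reappearing over a sustained stretch of time."""
    recent = defaultdict(list)
    for ts, plate in sightings:
        if now - ts <= FOLLOW_WINDOW_S:
            recent[plate].append(ts)
    flagged = []
    for plate, times in recent.items():
        # A car glimpsed once is noise; one that reappears repeatedly
        # over several minutes is worth a notification.
        if len(times) >= MIN_SIGHTINGS and max(times) - min(times) >= MIN_SPREAD_S:
            flagged.append(plate)
    return flagged

sightings = [
    (0, "ABC123"), (400, "ABC123"), (900, "ABC123"),  # persistent tail
    (100, "XYZ789"),                                   # one-off passerby
]
print(suspicious_plates(sightings, now=1000))  # ['ABC123']
```

The same structure extends naturally to face embeddings instead of plate strings; only the matching step changes.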
This is a reminder that intelligent surveillance tools are going to be available at massive scale to even private citizens, not just the government. As governments track citizens, will citizens track government actors and individual police officers? What will we do with all of this data?
Rachel Thomas with an excellent essay on 8 Things You Need to Know About Surveillance:
I frequently talk with people who are not that concerned about surveillance, or who feel that the positives outweigh the risks. Here, I want to share some important truths about surveillance:
1. Surveillance can facilitate human rights abuses and even genocide
2. Data is often used for different purposes than why it was collected
3. Data often contains errors
4. Surveillance typically operates with no accountability
5. Surveillance changes our behavior
6. Surveillance disproportionately impacts the marginalized
7. Data privacy is a public good
8. We don’t have to accept invasive surveillance
The issues are of course more complex than anyone can summarize in a brief essay, but Thomas’s points on data often containing errors (3) and the frequent lack of processes to remedy those errors (4) deserve special emphasis. We tend to assume that automated systems are accurate and work well because they have done so in our own experience. But for many people they do not, and the impact on their lives is dramatic.
Angry lamentation about the effects of new tech on privacy has flabbergasted me the most. For practical purposes, we have more privacy than ever before in human history. You can now buy embarrassing products in secret. You can read or view virtually anything you like in secret. You can interact with over a billion people in secret.
Then what privacy have we lost? The privacy to not be part of a Big Data Set. The privacy to not have firms try to sell us stuff based on our previous purchases. In short, we have lost the kinds of privacy that no prudent person loses sleep over.
The prudent will however be annoyed that – thanks to populist pressure – we now have to click “I agree” fifty times a day to access our favorite websites. Implicit consent was working admirably, but now we all have to suffer to please people who are impossible to please. . . . .
Historically Hollow: The Cries of Populism
Meanwhile of course the government takes (mostly innocent!) state driver’s license biometrics without notice or consent.
The endowment effect is a psychological phenomenon in which people overvalue the things they own as compared to equivalent things they do not own. For example, a person given a mug and then offered the chance to trade it for equally valued pens will typically demand twice as much in pens as they would have been willing to pay to acquire the mug in the first place. In other words, a thing that is “yours” has more value than the same thing in the abstract.
Turns out the endowment effect is on steroids in the data privacy realm:
Do consumers value data privacy? How much? In a survey of 2,416 Americans, we find that the median consumer is willing to pay just $5 per month to maintain data privacy (along specified dimensions), but would demand $80 to allow access to personal data. This is a “superendowment effect,” much higher than the 1:2 ratio often found between willingness to pay and willingness to accept.
How Much Is Data Privacy Worth? A Preliminary Investigation
The researchers conclude that “policymakers should give little or no attention” to these economic measures of value because willingness to pay and willingness to accept are “highly unreliable guides to the welfare effects of retaining or giving up data privacy.”
Maybe it’s because people have been told data privacy is a major issue, but they actually haven’t seen any real-world impact on their lives.
The EU High-Level Expert Group on Artificial Intelligence released its Policy and Investment Recommendations for Trustworthy AI today. The 50-page document is a bit more prescriptive than their previous Ethics Guidelines, and suggests that governments “refrain from disproportionate and mass surveillance of individuals” and “introduce mandatory self-identification of AI systems.” (But see deceptive NYPD chatbots.)
A big chunk of the report also urges the EU to invest in education and subject matter expertise.
So far the discussion around AI mass surveillance has been relatively binary: do it or not. At some point I expect we will see proposals to do mass surveillance while maintaining individual privacy. The security benefits of mass surveillance are too attractive to forego.
In his book “Deep Medicine,” which is about how A.I. is changing medicine across all fields, Eric Topol describes a study in which a learning algorithm was given medical records to predict who was likely to attempt suicide. It accurately predicted attempts nearly 80 percent of the time. By incorporating data of real-world interactions such as laughter and anger, an algorithm in a similar study was able to reach 93 percent accuracy.
[. . . . .]
Medicine is hard because, as A.I. is teaching us, we’re much more different from one another than we thought. There is no single diet approach that is best for all people because we all process food in our own distinct way. Diet, like other treatments, has to be customized.
You can be freaked out by the privacy-invading power of A.I. to know you, but only A.I. can gather the data necessary to do this.
How Artificial Intelligence Can Save Your Life
Rephrasing a sentence from an earlier post, health is halfway around the block before privacy can get its shoes on.
Niraj Chokshi, writing for the New York Times:
To prevent the worst outcomes, the A.C.L.U. offered a range of recommendations governing the use of video analytics in the public and private sectors.
No governmental entity should be allowed to deploy video analytics without legislative approval, public notification and a review of a system’s effects on civil rights, it said. Individuals should know what kind of information is recorded and analyzed, have access to data collected about them, and have a way to challenge or correct inaccuracies, too.
To prevent abuses, video analytics should not be used to collect identifiable information en masse or merely for seeking out “suspicious” behavior, the A.C.L.U. said. Data collected should also be handled with care and systems should make decisions transparently and in ways that don’t carry legal implications for those tracked, the group said.
Businesses should be governed by similar guidelines and should be transparent in how they use video analytics, the group said. Regulations governing them should balance constitutional protections, including the rights to privacy and free expression.
How Surveillance Cameras Could Be Weaponized With A.I.
These recommendations appear to boil down to transparency and not tracking everyone all the time without a specific reason. Seems reasonable as a starting point.
Bruce Schneier advocates for a technological “pause” to allow policy to catch up:
[U]biquitous surveillance will drastically change our relationship to society. We’ve never lived in this sort of world, even those of us who have lived through previous totalitarian regimes. The effects will be felt in many different areas. False positives — when the surveillance system gets it wrong — will lead to harassment and worse. Discrimination will become automated. Those who fall outside norms will be marginalized. And most importantly, the inability to live anonymously will have an enormous chilling effect on speech and behavior, which in turn will hobble society’s ability to experiment and change.
Computers and Video Surveillance
On the other hand, isn’t it a good thing we can spot police officers with militant, racist views?
US Customs revealed the name of a hacked subcontractor (presumably accidentally) in the title of a Word document:
A contractor for US Customs and Border Protection has been breached, leaking photos and other sensitive data, the agency announced on Monday. Initially described as “traveler photos,” many of the images seem to be pictures of traveler license plates, likely taken from cars at an automotive port of entry.
Customs has not named the contractor involved in the breach, but a Washington Post article noted that the announcement included a Word document with the name Perceptics, a provider of automated license plate readers used at a number of southern ports of entry.
License plate photos compromised after Customs contractor breach
So they can’t secure the data. And they can’t secure the data announcing the failure to secure the data.
If you think the government can actually secure your data (or even their own hacking tools), you are fooling yourself.
Update: Customs and Border Protection has suspended all contracts with this supplier as a result of the breach.
The research is complex, but it raises the question of what all this fuss about privacy is really worth:
How much value do online publishers derive from behaviorally targeted advertising that uses privacy-hostile tracking technologies to determine which advert to show a website user?
A new piece of research suggests publishers make just 4% more than if they were to serve a non-targeted ad.
Targeted ads offer little extra value for online publishers, study suggests