Users on social media are often in their own universes. Liberals often don’t even see the content that conservatives see, and vice versa.
Imagine if that kind of segmentation extended to augmented reality as well:
Imagine a world that’s filled with invisible graffiti. Open an app, point your phone at a wall, and blank brick or cement becomes a canvas. Create art with digital spraypaint and stencils, and an augmented reality system will permanently store its location and placement, creating the illusion of real street art. If friends or social media followers have the app, they can find your painting on a map and come see it. You might scrawl an in-joke across the door of a friend’s apartment, or paint a gorgeous mural on the side of a local store.
Now imagine a darker world. Members of hate groups gleefully swap pictures of racist tags on civil rights monuments. Students bully each other by spreading vicious rumors on the walls of a target’s house. Small businesses get mobbed beyond capacity when a big influencer posts a sticker on their window. The developers of Mark AR, an app that’s described as “the world’s first augmented reality social platform,” are trying to create the good version of this system. They’re still figuring out how to avoid the bad one.

Is the world ready for virtual graffiti?
I first read China Miéville’s The City & the City many years ago, and I keep thinking about how strange it was then, and how much the ideas have resonated since.
Joseph Cox with Motherboard has authored a story on a massive private license plate surveillance network called DRN:
This tool, called Digital Recognition Network (DRN), is not run by a government, although law enforcement can also access it. Instead, DRN is a private surveillance system crowdsourced by hundreds of repo men who have installed cameras that passively scan, capture, and upload the license plates of every car they drive by to DRN’s database. DRN stretches coast to coast and is available to private individuals and companies focused on tracking and locating people or vehicles. The tool is made by a company that is also called Digital Recognition Network.

This Company Built a Private Surveillance Network. We Tracked Someone With It
I wrote recently about private surveillance projects that may meet or exceed government efforts. It won’t be long before license plate readers become facial recognition scanners. It’s probably happening now.
An engineer has built a counter-surveillance tool on top of the hardware and software stack for Tesla vehicles:
It uses the existing video feeds created by Tesla’s Sentry Mode features and uses license plate and facial detection to determine if you are being followed.
Scout does all that in real-time and sends you notifications if it sees anything suspicious.

Turn your Tesla into a CIA-like counter-surveillance tool with this hack
A video demonstration is embedded in the article.
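The core follow-detection idea can be sketched simply: flag any plate that keeps reappearing across sightings. This is a minimal sketch of that logic, not Scout’s actual implementation; `record_sighting`, the threshold, and the window are hypothetical stand-ins for a real plate-recognition pipeline running on Sentry Mode footage.

```python
from collections import defaultdict
from datetime import datetime, timedelta

FOLLOW_THRESHOLD = 5            # distinct sightings before we flag a plate
WINDOW = timedelta(minutes=30)  # rolling window the sightings must fall within

sightings = defaultdict(list)   # plate -> list of sighting timestamps

def record_sighting(plate: str, seen_at: datetime) -> bool:
    """Record one plate sighting; return True if the plate looks like a tail."""
    sightings[plate].append(seen_at)
    # Drop sightings that have aged out of the rolling window.
    cutoff = seen_at - WINDOW
    sightings[plate] = [t for t in sightings[plate] if t >= cutoff]
    return len(sightings[plate]) >= FOLLOW_THRESHOLD
```

A car glimpsed once is noise; the same plate showing up five times in half an hour is a pattern. Any real system would also have to handle misreads and plates that legitimately share your commute.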
This is a reminder that intelligent surveillance tools will be available at massive scale even to private citizens, not just governments. As governments track citizens, will citizens track government actors and individual police officers? What will we do with all of this data?
Rachel Thomas with an excellent essay on 8 Things You Need to Know About Surveillance:
I frequently talk with people who are not that concerned about surveillance, or who feel that the positives outweigh the risks. Here, I want to share some important truths about surveillance:
1. Surveillance can facilitate human rights abuses and even genocide
2. Data is often used for different purposes than why it was collected
3. Data often contains errors
4. Surveillance typically operates with no accountability
5. Surveillance changes our behavior
6. Surveillance disproportionately impacts the marginalized
7. Data privacy is a public good
8. We don’t have to accept invasive surveillance
The issues are of course more complex than anyone can summarize in a brief essay, but Thomas’ points on data often containing errors (3) and the frequent lack of processes to remedy those errors (4) deserve special emphasis. We tend to assume that automated systems are accurate and work well, so long as that matches our own experience. But for many people they do not, and this has a dramatic impact on their lives.
Angry lamentation about the effects of new tech on privacy has flabbergasted me the most. For practical purposes, we have more privacy than ever before in human history. You can now buy embarrassing products in secret. You can read or view virtually anything you like in secret. You can interact with over a billion people in secret.
Then what privacy have we lost? The privacy to not be part of a Big Data Set. The privacy to not have firms try to sell us stuff based on our previous purchases. In short, we have lost the kinds of privacy that no prudent person loses sleep over.
The prudent will however be annoyed that – thanks to populist pressure – we now have to click “I agree” fifty times a day to access our favorite websites. Implicit consent was working admirably, but now we all have to suffer to please people who are impossible to please. […]

Historically Hollow: The Cries of Populism
Meanwhile, of course, the government takes the biometric data of (mostly innocent!) state driver’s license holders without notice or consent.
The endowment effect is a psychological phenomenon in which people overvalue the things they own as compared to equivalent things they do not own. For example, a person given a mug and then offered the chance to trade it for equally valued pens will typically demand roughly twice as much in pens as they would have been willing to pay to acquire the mug in the first place. In other words, a thing that is “yours” has more value than the same thing in the abstract.
Turns out the endowment effect is on steroids in the data privacy realm:
Do consumers value data privacy? How much? In a survey of 2,416 Americans, we find that the median consumer is willing to pay just $5 per month to maintain data privacy (along specified dimensions), but would demand $80 to allow access to personal data. This is a “superendowment effect,” much higher than the 1:2 ratio often found between willingness to pay and willingness to accept.

How Much Is Data Privacy Worth? A Preliminary Investigation
The researchers conclude that “policymakers should give little or no attention” to these economic measures of value because willingness to pay and willingness to accept are “highly unreliable guides to the welfare effects of retaining or giving up data privacy.”
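The “superendowment” label follows directly from the quoted figures: the median demand to sell access is sixteen times the median willingness to pay to keep privacy, against the roughly 2:1 gap typical of endowment-effect studies. A trivial check:

```python
# Median figures from the survey quoted above.
willingness_to_pay = 5      # $/month to maintain data privacy
willingness_to_accept = 80  # $/month demanded to allow access to personal data

ratio = willingness_to_accept / willingness_to_pay
print(ratio)  # 16.0 — far above the ~2:1 WTA/WTP gap usually observed
```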
Maybe it’s because people have been told data privacy is a major issue, but they actually haven’t seen any real-world impact on their lives.
The EU High-Level Expert Group on Artificial Intelligence released its Policy and Investment Recommendations for Trustworthy AI today. The 50-page document is a bit more prescriptive than their previous Ethics Guidelines, and suggests that governments “refrain from disproportionate and mass surveillance of individuals” and “introduce mandatory self-identification of AI systems.” (But see deceptive NYPD chatbots.)
A big chunk of the report also urges the EU to invest in education and subject matter expertise.
So far the discussion around AI mass surveillance has been relatively binary: do it or not. At some point I expect we will see proposals to do mass surveillance while maintaining individual privacy. The security benefits of mass surveillance are too attractive to forgo.
In his book “Deep Medicine,” which is about how A.I. is changing medicine across all fields, Eric Topol describes a study in which a learning algorithm was given medical records to predict who was likely to attempt suicide. It accurately predicted attempts nearly 80 percent of the time. By incorporating data of real-world interactions such as laughter and anger, an algorithm in a similar study was able to reach 93 percent accuracy.
[…]
Medicine is hard because, as A.I. is teaching us, we’re much more different from one another than we thought. There is no single diet approach that is best for all people because we all process food in our own distinct way. Diet, like other treatments, has to be customized.
You can be freaked out by the privacy-invading power of A.I. to know you, but only A.I. can gather the data necessary to do this.

How Artificial Intelligence Can Save Your Life
Rephrasing a sentence from an earlier post, health is halfway around the block before privacy can get its shoes on.
Niraj Chokshi, writing for the New York Times:
To prevent the worst outcomes, the A.C.L.U. offered a range of recommendations governing the use of video analytics in the public and private sectors.
No governmental entity should be allowed to deploy video analytics without legislative approval, public notification and a review of a system’s effects on civil rights, it said. Individuals should know what kind of information is recorded and analyzed, have access to data collected about them, and have a way to challenge or correct inaccuracies, too.
To prevent abuses, video analytics should not be used to collect identifiable information en masse or merely for seeking out “suspicious” behavior, the A.C.L.U. said. Data collected should also be handled with care and systems should make decisions transparently and in ways that don’t carry legal implications for those tracked, the group said.
Businesses should be governed by similar guidelines and should be transparent in how they use video analytics, the group said. Regulations governing them should balance constitutional protections, including the rights to privacy and free expression.

How Surveillance Cameras Could Be Weaponized With A.I.
These recommendations appear to boil down to transparency and not tracking everyone all the time without a specific reason. Seems reasonable as a starting point.
Bruce Schneier advocates for a technological “pause” to allow policy to catch up:
[U]biquitous surveillance will drastically change our relationship to society. We’ve never lived in this sort of world, even those of us who have lived through previous totalitarian regimes. The effects will be felt in many different areas. False positives — when the surveillance system gets it wrong — will lead to harassment and worse. Discrimination will become automated. Those who fall outside norms will be marginalized. And most importantly, the inability to live anonymously will have an enormous chilling effect on speech and behavior, which in turn will hobble society’s ability to experiment and change.

Computers and Video Surveillance
On the other hand, isn’t it a good thing we can spot police officers with militant, racist views?