Users on social media are often in their own universes. Liberals often don’t even see the content that conservatives see, and vice versa.
Imagine if that kind of segmentation extended to augmented reality as well:
Imagine a world that’s filled with invisible graffiti. Open an app, point your phone at a wall, and blank brick or cement becomes a canvas. Create art with digital spraypaint and stencils, and an augmented reality system will permanently store its location and placement, creating the illusion of real street art. If friends or social media followers have the app, they can find your painting on a map and come see it. You might scrawl an in-joke across the door of a friend’s apartment, or paint a gorgeous mural on the side of a local store.
Now imagine a darker world. Members of hate groups gleefully swap pictures of racist tags on civil rights monuments. Students bully each other by spreading vicious rumors on the walls of a target’s house. Small businesses get mobbed beyond capacity when a big influencer posts a sticker on their window. The developers of Mark AR, an app that’s described as “the world’s first augmented reality social platform,” are trying to create the good version of this system. They’re still figuring out how to avoid the bad one.
Is the world ready for virtual graffiti?
I first read China Miéville’s The City & the City many years ago, and I keep thinking about how strange it was then, and how much the ideas have resonated since.
According to the Pew Research Center, a full 56 percent of Americans said that they trust police and officials to use these technologies responsibly. That goes for situations in which no consent is given: about 59 percent said it is acceptable for law enforcement to use facial recognition tools to assess security threats in public spaces.
Police Use of Facial Recognition is Just Fine, Say Most Americans
Black and Hispanic adults approve at lower rates. See the study for details.
In contrast to recent U.S. municipal decisions restricting government use of facial recognition technology, a UK court has ruled that police use of the technology does not violate any fundamental rights.
In one of the first lawsuits to address the use of live facial recognition technology by governments, a British court ruled on Wednesday that police use of the systems is acceptable and does not violate privacy and human rights.
Police Use of Facial Recognition Is Accepted by British Court
The UK is of course one of the most surveilled countries in the world.
President Trump tweeted an apparently classified image of an Iranian launch pad on August 30. He has the right to do so. But he probably did not expect everything that the tweet would reveal.
Astronomers have since identified the exact satellite that took the image. By measuring the semi-major and semi-minor axes of the ellipse that the circular launch platform forms in the image, they were able to determine the viewing angle. This matched precisely with a satellite known as USA 224, whose capabilities were previously unknown. Google Earth shows the launch pad as about 60 meters in diameter, which suggests a satellite resolution of about 10 centimeters per pixel. That resolution is very impressive, and also previously unknown.
The detail in the image is surprising, even to satellite imagery experts. In an interview with NPR, Melissa Hanham of the Open Nuclear Network in Vienna said, “… I did not believe [the image] could come from a satellite.” Hanham also said that “I imagine adversaries are going to take a look at this image and reverse-engineer it to figure out how the sensor itself works and what kind of post-production techniques they’re using.”
Thanks to Trump, We’ve Got a Better Idea of the Capabilities of US Surveillance Satellites
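The geometry behind the analysis is simple enough to sketch: a circle viewed off-nadir appears as an ellipse whose minor axis is foreshortened by the cosine of the viewing angle, and a feature of known real-world size gives you meters per pixel. A minimal sketch, with hypothetical pixel measurements chosen to illustrate the method (not the analysts' actual numbers):

```python
import math

def viewing_angle_deg(semi_major_px, semi_minor_px):
    # A circle viewed off-nadir is foreshortened into an ellipse whose
    # minor axis is cos(theta) times its major axis, so theta = arccos(b/a).
    return math.degrees(math.acos(semi_minor_px / semi_major_px))

def ground_resolution_m_per_px(feature_size_m, feature_extent_px):
    # Known real-world size of a feature divided by its extent in pixels.
    return feature_size_m / feature_extent_px

# Hypothetical measurements: launch pad spans 600 px along the major axis,
# 480 px along the minor axis.
angle = viewing_angle_deg(600, 480)          # off-nadir viewing angle
res = ground_resolution_m_per_px(60.0, 600)  # 60 m pad / 600 px = 0.1 m per px
```

With those made-up numbers, the 60-meter pad spanning 600 pixels yields the reported 10 cm/pixel, and the viewing angle falls out of the ellipse's axis ratio alone.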
An engineer has built a counter-surveillance tool on top of the hardware and software stack for Tesla vehicles:
It uses the existing video feeds created by Tesla’s Sentry Mode features and uses license plate and facial detection to determine if you are being followed.
Scout does all that in real-time and sends you notifications if it sees anything suspicious.
Turn your Tesla into a CIA-like counter-surveillance tool with this hack
A video demonstration is embedded in the article.
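Once plates are being read from the video feeds, the "am I being followed" logic presumably reduces to counting repeat sightings of the same plate within a time window. A minimal sketch of that idea, with all names and thresholds hypothetical (this is not Scout's actual code):

```python
from collections import defaultdict, deque

class FollowDetector:
    """Flag license plates seen repeatedly within a sliding time window."""

    def __init__(self, window_s=1800, threshold=3):
        self.window_s = window_s          # how far back to look, in seconds
        self.threshold = threshold        # sightings needed to raise a flag
        self.sightings = defaultdict(deque)

    def observe(self, plate, timestamp):
        """Record one sighting; return True if this plate now looks suspicious."""
        q = self.sightings[plate]
        q.append(timestamp)
        # Drop sightings that have aged out of the window.
        while q and timestamp - q[0] > self.window_s:
            q.popleft()
        return len(q) >= self.threshold

detector = FollowDetector(window_s=600, threshold=3)
```

The real system would feed this from a plate-recognition model rather than strings, and would likely discount plates seen on common commute routes, but the sliding-window count is the core of the heuristic.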
This is a reminder that intelligent surveillance tools are going to be available at massive scale to even private citizens, not just the government. As governments track citizens, will citizens track government actors and individual police officers? What will we do with all of this data?
Rachel Thomas with an excellent essay on 8 Things You Need to Know About Surveillance:
I frequently talk with people who are not that concerned about surveillance, or who feel that the positives outweigh the risks. Here, I want to share some important truths about surveillance:
1. Surveillance can facilitate human rights abuses and even genocide
2. Data is often used for different purposes than why it was collected
3. Data often contains errors
4. Surveillance typically operates with no accountability
5. Surveillance changes our behavior
6. Surveillance disproportionately impacts the marginalized
7. Data privacy is a public good
8. We don’t have to accept invasive surveillance
The issues are of course more complex than anyone can summarize in a brief essay, but Thomas’ points on data often containing errors (3) and the frequent lack of processes to remedy those errors (4) deserve special emphasis. We tend to assume that automated systems are accurate and work well, as long as that matches our own experience. But for many people they do not, and this has a dramatic impact on their lives.
Peter Thiel in a NYT op-ed:
A.I.’s military power is the simple reason that the recent behavior of America’s leading software company, Google — starting an A.I. lab in China while ending an A.I. contract with the Pentagon — is shocking. As President Barack Obama’s defense secretary Ash Carter pointed out last month, “If you’re working in China, you don’t know whether you’re working on a project for the military or not.”
Good for Google, Bad for America
And he’s not wrong. But it’s also not possible to prevent China from ultimately obtaining this technology.
Thiel is correct in the short term, but also dangerously short-sighted. What’s the plan here? Further isolation and an arms race? Liberal democracies need to be focused on global frameworks (rule of law, free speech, free trade, free movement of people and information) that prevent war and human misery. This is an easy, opportunistic rhetorical point, not a strategy.
Bruce Schneier on a speech by Attorney General Barr on encryption policy:
I think this is a major change in government position. Previously, the FBI, the Justice Department and so on had claimed that backdoors for law enforcement could be added without any loss of security. They maintained that technologists just need to figure out how: an approach we have derisively named “nerd harder.”
With this change, we can finally have a sensible policy conversation. Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone. This is exactly the policy debate we should be having, not the fake one about whether or not we can have both security and surveillance.
Attorney General William Barr on Encryption Policy
Schneier still believes it is more important to keep everyone secure than to provide backdoors to law enforcement, but at least everyone is starting to acknowledge the reality that law enforcement backdoors weaken security.
Niraj Chokshi, writing for the New York Times:
To prevent the worst outcomes, the A.C.L.U. offered a range of recommendations governing the use of video analytics in the public and private sectors.
No governmental entity should be allowed to deploy video analytics without legislative approval, public notification and a review of a system’s effects on civil rights, it said. Individuals should know what kind of information is recorded and analyzed, have access to data collected about them, and have a way to challenge or correct inaccuracies, too.
To prevent abuses, video analytics should not be used to collect identifiable information en masse or merely for seeking out “suspicious” behavior, the A.C.L.U. said. Data collected should also be handled with care and systems should make decisions transparently and in ways that don’t carry legal implications for those tracked, the group said.
Businesses should be governed by similar guidelines and should be transparent in how they use video analytics, the group said. Regulations governing them should balance constitutional protections, including the rights to privacy and free expression.
How Surveillance Cameras Could Be Weaponized With A.I.
These recommendations appear to boil down to transparency and not tracking everyone all the time without a specific reason. Seems reasonable as a starting point.