Survey suggests most Americans support police use of facial recognition technology

According to the Pew Research Center, a full 56 percent of Americans surveyed said they trust police and other officials to use these technologies responsibly. That holds even when no consent is given: about 59 percent said it is acceptable for law enforcement to use facial recognition tools to assess security threats in public spaces.

Police Use of Facial Recognition is Just Fine, Say Most Americans

Black and Hispanic adults approve at lower rates. See the study for details.

UK court approves police use of facial recognition

In contrast to recent U.S. municipal decisions restricting government use of facial recognition technology, a UK court has ruled that police use of the technology does not violate any fundamental rights.

In one of the first lawsuits to address the use of live facial recognition technology by governments, a British court ruled on Wednesday that police use of the systems is acceptable and does not violate privacy and human rights.

Police Use of Facial Recognition Is Accepted by British Court

The UK is of course one of the most surveilled countries in the world.

The astronomy community has identified the spy satellite revealed by Trump

President Trump tweeted an apparently classified image of an Iranian launch pad on August 30. He has the right to do so. But he probably did not expect everything that the tweet would reveal.

Now astronomers have easily identified the exact satellite that took the image. The launch platform is circular, so by measuring the semi-major and semi-minor axes of the ellipse it forms in the image, they could determine the viewing angle, which matched precisely with the known orbital position of a satellite designated USA 224, previously of unknown capability. Google Earth shows the launch pad as about 60 meters in diameter; dividing that by the pad’s apparent width in pixels suggests a resolution of roughly 10 centimeters per pixel. That resolution is very impressive and was also previously unknown.
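The geometry is simple enough to sketch. Here is a minimal worked example of the calculation in Python; the pixel measurements are illustrative placeholders chosen to be consistent with the reported figures, not the analysts’ actual numbers:

```python
import math

# Illustrative inputs, not the analysts' actual measurements.
pad_diameter_m = 60.0    # approximate launch-pad diameter, per Google Earth
major_axis_px = 600.0    # hypothetical: apparent major axis of the pad in the image
minor_axis_px = 480.0    # hypothetical: apparent minor axis of the pad in the image

# A circular pad viewed obliquely appears as an ellipse. The minor/major ratio
# gives the elevation of the line of sight above the pad, which is what lets
# you check the viewing geometry against known satellite positions.
elevation_deg = math.degrees(math.asin(minor_axis_px / major_axis_px))

# The major axis is not foreshortened, so real size / apparent size in pixels
# gives the ground sample distance (resolution).
resolution_m_per_px = pad_diameter_m / major_axis_px

print(f"viewing elevation ~ {elevation_deg:.0f} degrees")
print(f"resolution ~ {resolution_m_per_px * 100:.0f} cm per pixel")
```

With a 60-meter pad spanning roughly 600 pixels, that works out to about 10 centimeters per pixel, which matches the figure the analysts reported.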

Unreal.

The detail in the image is surprising, even to satellite imagery experts. In an interview with NPR, Melissa Hanham of the Open Nuclear Network in Vienna said, “… I did not believe [the image] could come from a satellite.” Hanham also said, “I imagine adversaries are going to take a look at this image and reverse-engineer it to figure out how the sensor itself works and what kind of post-production techniques they’re using.”

Thanks to Trump, We’ve Got a Better Idea of the Capabilities of US Surveillance Satellites

14 iPhone 0-Days

Bruce Schneier:

This upends pretty much everything we know about iPhone hacking. We believed that it was hard. We believed that effective zero-day exploits cost $2M or $3M, and were used sparingly by governments only against high-value targets. We believed that if an exploit was used too frequently, it would be quickly discovered and patched.

None of that is true here. This operation used fourteen zero-day exploits. It used them indiscriminately. And it remained undetected for two years. (I waited before posting this because I wanted to see if someone would rebut this story, or explain it somehow.)

Massive iPhone Hack Targets Uyghurs

Real-time counter-surveillance tool for Tesla vehicles

An engineer has built a counter-surveillance tool on top of the hardware and software stack for Tesla vehicles:

It uses the existing video feeds created by Tesla’s Sentry Mode feature and applies license plate and facial detection to determine if you are being followed.

Scout does all that in real time and sends you notifications if it sees anything suspicious.

Turn your Tesla into a CIA-like counter-surveillance tool with this hack

A video demonstration is embedded in the article.
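The core idea is easy to picture: run plate recognition on the Sentry Mode footage and flag any plate that keeps reappearing within a short window. Here is a minimal, hypothetical sketch of that logic in Python; the function names, thresholds, and the assumption that plate strings have already been extracted from the video are mine, not Scout’s actual implementation:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sketch of a "being followed" detector; not Scout's actual code.
# Assumes an upstream recognition step has already turned Sentry Mode frames
# into (plate, timestamp) sightings.

FOLLOW_WINDOW = timedelta(minutes=30)  # look-back window for repeat sightings
ALERT_THRESHOLD = 3                    # repeat sightings before alerting

sightings = defaultdict(list)          # plate -> recent sighting timestamps

def record_sighting(plate: str, seen_at: datetime) -> bool:
    """Record a plate sighting; return True if it looks like a possible tail."""
    recent = [t for t in sightings[plate] if seen_at - t <= FOLLOW_WINDOW]
    recent.append(seen_at)
    sightings[plate] = recent
    return len(recent) >= ALERT_THRESHOLD

# Example: the same plate shows up three times in twenty minutes -> notify.
now = datetime.now()
for minutes in (0, 10, 20):
    if record_sighting("ABC123", now + timedelta(minutes=minutes)):
        print("possible tail: ABC123 seen repeatedly; send a notification")
```

The same window-and-threshold approach would presumably apply to face matches; the hard part in practice is the recognition itself and keeping false positives low.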

This is a reminder that intelligent surveillance tools are going to be available at massive scale even to private citizens, not just the government. As governments track citizens, will citizens track government actors and individual police officers? What will we do with all of this data?

Rachel Thomas on Surveillance

Rachel Thomas with an excellent essay on 8 Things You Need to Know About Surveillance:

I frequently talk with people who are not that concerned about surveillance, or who feel that the positives outweigh the risks. Here, I want to share some important truths about surveillance:

1. Surveillance can facilitate human rights abuses and even genocide
2. Data is often used for different purposes than why it was collected
3. Data often contains errors
4. Surveillance typically operates with no accountability
5. Surveillance changes our behavior
6. Surveillance disproportionately impacts the marginalized
7. Data privacy is a public good
8. We don’t have to accept invasive surveillance

The issues are of course more complex than anyone can summarize in a brief essay, but Thomas’ points about data often containing errors (3) and the frequent lack of accountability or processes to remedy those errors (4) deserve special emphasis. We tend to assume that automated systems are accurate and work well, so long as our own experience confirms it. But for many people they do not work well, and that has a dramatic impact on their lives.

Peter Thiel goes after Google on AI

Peter Thiel in a NYT op-ed:

A.I.’s military power is the simple reason that the recent behavior of America’s leading software company, Google — starting an A.I. lab in China while ending an A.I. contract with the Pentagon — is shocking. As President Barack Obama’s defense secretary Ash Carter pointed out last month, “If you’re working in China, you don’t know whether you’re working on a project for the military or not.”

Good for Google, Bad for America

And he’s not wrong. But it’s also not possible to prevent China from ultimately obtaining this technology.

Thiel is correct in the short term, but also dangerously short-sighted. What’s the plan here? Further isolation and an arms race? Liberal democracies need to focus on global frameworks (rule of law, free speech, free trade, free movement of people and information) that prevent war and human misery. This is an easy, opportunistic rhetorical point, not a strategy.

A More Nuanced Encryption Policy Debate

Bruce Schneier on a speech by Attorney General Barr on encryption policy:

I think this is a major change in government position. Previously, the FBI, the Justice Department and so on had claimed that backdoors for law enforcement could be added without any loss of security. They maintained that technologists just need to figure out how: an approach we have derisively named “nerd harder.”

With this change, we can finally have a sensible policy conversation. Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone. This is exactly the policy debate we should be having, not the fake one about whether or not we can have both security and surveillance.

Attorney General William Barr on Encryption Policy

Schneier still believes that keeping everyone secure is more important than giving law enforcement backdoor access, but at least everyone is starting to acknowledge the reality that law enforcement backdoors weaken security.

The ACLU has some AI surveillance recommendations

Niraj Chokshi, writing for the New York Times:

To prevent the worst outcomes, the A.C.L.U. offered a range of recommendations governing the use of video analytics in the public and private sectors.

No governmental entity should be allowed to deploy video analytics without legislative approval, public notification and a review of a system’s effects on civil rights, it said. Individuals should know what kind of information is recorded and analyzed, have access to data collected about them, and have a way to challenge or correct inaccuracies, too.

To prevent abuses, video analytics should not be used to collect identifiable information en masse or merely for seeking out “suspicious” behavior, the A.C.L.U. said. Data collected should also be handled with care and systems should make decisions transparently and in ways that don’t carry legal implications for those tracked, the group said.

Businesses should be governed by similar guidelines and should be transparent in how they use video analytics, the group said. Regulations governing them should balance constitutional protections, including the rights to privacy and free expression.

How Surveillance Cameras Could Be Weaponized With A.I.

These recommendations appear to boil down to transparency and not tracking everyone all the time without a specific reason. Seems reasonable as a starting point.

The Complexity of Ubiquitous Surveillance

Bruce Schneier advocates for a technological “pause” to allow policy to catch up:

[U]biquitous surveillance will drastically change our relationship to society. We’ve never lived in this sort of world, even those of us who have lived through previous totalitarian regimes. The effects will be felt in many different areas. False positives — when the surveillance system gets it wrong — will lead to harassment and worse. Discrimination will become automated. Those who fall outside norms will be marginalized. And most importantly, the inability to live anonymously will have an enormous chilling effect on speech and behavior, which in turn will hobble society’s ability to experiment and change.

Computers and Video Surveillance

On the other hand, isn’t it a good thing we can spot police officers with militant, racist views?