Entries Tagged "terrorism"


Paris Terrorists Used Double ROT-13 Encryption

That is, no encryption at all. The Intercept has the story:

Yet news emerging from Paris—as well as evidence from a Belgian ISIS raid in January—suggests that the ISIS terror networks involved were communicating in the clear, and that the data on their smartphones was not encrypted.

European media outlets are reporting that the location of a raid conducted on a suspected safe house Wednesday morning was extracted from a cellphone, apparently belonging to one of the attackers, found in the trash outside the Bataclan concert hall, the site of the massacre. Le Monde reported that investigators were able to access the data on the phone, including a detailed map of the concert hall and an SMS message saying “we’re off; we’re starting.” Police were also able to trace the phone’s movements.

The obvious conclusion:

The reports note that Abdelhamid Abaaoud, the “mastermind” of both the Paris attacks and a thwarted Belgium attack ten months ago, failed to use encryption whatsoever (read: existing capabilities stopped the Belgium attacks and could have stopped the Paris attacks, but didn’t). That’s of course not to say batshit religious cults like ISIS don’t use encryption, and won’t do so going forward. Everybody uses encryption. But the point remains that to use a tragedy to vilify encryption, push for surveillance expansion, and pass backdoor laws that will make everybody less safe—is nearly as gruesome as the attacks themselves.

And what is it about this “mastermind” label? Why do we have to make them smarter than they are?
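A side note on the headline joke: ROT-13 shifts each letter of the alphabet thirteen places, so applying it a second time undoes the first. “Double ROT-13” is therefore the identity function, i.e., no encryption at all. A quick Python sketch, using the intercepted message quoted above:

```python
import codecs

plaintext = "we're off; we're starting"

# ROT-13 once: every letter shifts 13 places in the alphabet
once = codecs.encode(plaintext, "rot13")
print(once)  # jr'er bss; jr'er fgnegvat

# ROT-13 twice: the second 13-place shift undoes the first (13 + 13 = 26)
twice = codecs.encode(once, "rot13")
print(twice == plaintext)  # True
```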

EDITED TO ADD: More information.

EDITED TO ADD: My previous blog post on this.

Posted on November 18, 2015 at 3:35 PM

Refuse to Be Terrorized

Paul Krugman has written a really good update of my 2006 essay.

Krugman:

So what can we say about how to respond to terrorism? Before the atrocities in Paris, the West’s general response involved a mix of policing, precaution, and military action. All involved difficult tradeoffs: surveillance versus privacy, protection versus freedom of movement, denying terrorists safe havens versus the costs and dangers of waging war abroad. And it was always obvious that sometimes a terrorist attack would slip through.

Paris may have changed that calculus a bit, especially when it comes to Europe’s handling of refugees, an agonizing issue that has now gotten even more fraught. And there will have to be a post-mortem on why such an elaborate plot wasn’t spotted. But do you remember all the pronouncements that 9/11 would change everything? Well, it didn’t—and neither will this atrocity.

Again, the goal of terrorists is to inspire terror, because that’s all they’re capable of. And the most important thing our societies can do in response is to refuse to give in to fear.

Me:

But our job is to remain steadfast in the face of terror, to refuse to be terrorized. Our job is to not panic every time two Muslims stand together checking their watches. There are approximately 1 billion Muslims in the world, a large percentage of them not Arab, and about 320 million Arabs in the Middle East, the overwhelming majority of them not terrorists. Our job is to think critically and rationally, and to ignore the cacophony of other interests trying to use terrorism to advance political careers or increase a television show’s viewership.

The surest defense against terrorism is to refuse to be terrorized. Our job is to recognize that terrorism is just one of the risks we face, and not a particularly common one at that. And our job is to fight those politicians who use fear as an excuse to take away our liberties and promote security theater that wastes money and doesn’t make us any safer.

This crass and irreverent essay was written after January’s Paris terrorist attack, but is very relevant right now.

Posted on November 17, 2015 at 6:36 AM

Paris Attacks Blamed on Strong Cryptography and Edward Snowden

Well, that didn’t take long:

As Paris reels from terrorist attacks that have claimed at least 128 lives, fierce blame for the carnage is being directed toward American whistleblower Edward Snowden and the spread of strong encryption catalyzed by his actions.

Now the Paris attacks are being used as an excuse to demand back doors.

CIA Director John Brennan chimed in, too.

Of course, this was planned all along. From September:

Privately, law enforcement officials have acknowledged that prospects for congressional action this year are remote. Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

There is value, he said, in “keeping our options open for such a situation.”

I was going to write a definitive refutation to the meme that it’s all Snowden’s fault, but Glenn Greenwald beat me to it.

EDITED TO ADD: It wasn’t fair for me to characterize Ben Wittes’s Lawfare post as agitating for back doors. I apologize.

Better links are these two New York Times stories.

EDITED TO ADD (11/17): These two essays are also good.

EDITED TO ADD (11/18): The New York Times published a powerful editorial against mass surveillance.

EDITED TO ADD (11/19): The New York Times deleted a story claiming the attackers used encryption. Because it turns out they didn’t use encryption.

Posted on November 16, 2015 at 2:39 PM

Automatic Face Recognition and Surveillance

ID checks were a common response to the terrorist attacks of 9/11, but they’ll soon be obsolete. You won’t have to show your ID, because you’ll be identified automatically. A security camera will capture your face, and it’ll be matched with your name and a whole lot of other information besides. Welcome to the world of automatic facial recognition. Those who have access to databases of identified photos will have the power to identify us. Yes, it’ll enable some amazing personalized services; but it’ll also enable whole new levels of surveillance. The underlying technologies are being developed today, and there are currently no rules limiting their use.

Walk into a store, and the salesclerks will know your name. The store’s cameras and computers will have figured out your identity, and looked you up in both their store database and a commercial marketing database they’ve subscribed to. They’ll know your name, salary, interests, what sort of sales pitches you’re most vulnerable to, and how profitable a customer you are. Maybe they’ll have read a profile based on your tweets and know what sort of mood you’re in. Maybe they’ll know your political affiliation or sexual identity, both predictable by your social media activity. And they’re going to engage with you accordingly, perhaps by making sure you’re well taken care of or possibly by trying to make you so uncomfortable that you’ll leave.

Walk by a policeman, and she will know your name, address, criminal record, and with whom you are routinely seen. The potential for discrimination is enormous, especially in low-income communities where people are routinely harassed for things like unpaid parking tickets and other minor violations. And in a country where people are arrested for their political views, the use of this technology quickly turns into a nightmare scenario.

The critical technology here is computer face recognition. Traditionally it has been pretty poor, but it’s slowly improving. A computer is now as good as a person. Already Google’s algorithms can accurately match child and adult photos of the same person, and Facebook has an algorithm that works by recognizing hair style, body shape, and body language, and works even when it can’t see faces. And while we humans are pretty much as good at this as we’re ever going to get, computers will continue to improve. Over the next years, they’ll continue to get more accurate, making better matches using even worse photos.

Matching photos with names also requires a database of identified photos, and we have plenty of those too. Driver’s license databases are a gold mine: all shot face forward, in good focus and even light, with accurate identity information attached to each photo. The enormous photo collections of social media and photo archiving sites are another. They contain photos of us from all sorts of angles and in all sorts of lighting conditions, and we helpfully do the identifying step for the companies by tagging ourselves and our friends. Maybe this data will appear on handheld screens. Maybe it’ll be automatically displayed on computer-enhanced glasses. Imagine salesclerks, or politicians, being able to scan a room and instantly see wealthy customers highlighted in green, or policemen seeing people with criminal records highlighted in red.

Science fiction writers have been exploring this future in both books and movies for decades. Ads followed people from billboard to billboard in the movie Minority Report. In John Scalzi’s recent novel Lock In, characters scan each other like the salesclerks I described above.

This is no longer fiction. High-tech billboards can target ads based on the gender of who’s standing in front of them. In 2011, researchers at Carnegie Mellon pointed a camera at a public area on campus and were able to match live video footage with a public database of tagged photos in real time. Already government and commercial authorities have set up facial recognition systems to identify and monitor people at sporting events, music festivals, and even churches. The Dubai police are working on integrating facial recognition into Google Glass, and more US local police forces are using the technology.

Facebook, Google, Twitter, and other companies with large databases of tagged photos know how valuable their archives are. They see all kinds of services powered by their technologies: services they can sell to businesses like the stores you walk into and the governments you might interact with.

Other companies will spring up whose business models depend on capturing our images in public and selling them to whoever has use for them. If you think this is farfetched, consider a related technology that’s already far down that path: license-plate capture.

Today in the US there’s a massive but invisible industry that records the movements of cars around the country. Cameras mounted on cars and tow trucks capture license plates along with date/time/location information, and companies use that data to find cars that are scheduled for repossession. One company, Vigilant Solutions, claims to collect 70 million scans in the US every month. The companies that engage in this business routinely share that data with the police, giving the police a steady stream of surveillance information on innocent people that they could not legally collect on their own. And the companies are already looking for other profit streams, selling that surveillance data to anyone else who thinks they have a need for it.

This could easily happen with face recognition. Finding bail jumpers could even be the initial driving force, just as finding cars to repossess was for license plate capture.

Already the FBI has a database of 52 million faces, and describes its integration of facial recognition software with that database as “fully operational.” In 2014, FBI Director James Comey told Congress that the database would not include photos of ordinary citizens, although the FBI’s own documents indicate otherwise. And just last month, we learned that the FBI is looking to buy a system that will collect facial images of anyone an officer stops on the street.

In 2013, Facebook had a quarter of a trillion user photos in its database. There’s currently a class-action lawsuit in Illinois alleging that the company has over a billion “face templates” of people, collected without their knowledge or consent.

Last year, the US Department of Commerce tried to prevail upon industry representatives and privacy organizations to write a voluntary code of conduct for companies using facial recognition technologies. After 16 months of negotiations, all of the consumer-focused privacy organizations pulled out of the process because industry representatives were unable to agree on any limitations on something as basic as nonconsensual facial recognition.

When we talk about surveillance, we tend to concentrate on the problems of data collection: CCTV cameras, tagged photos, purchasing habits, our writings on sites like Facebook and Twitter. We think much less about data analysis. But effective and pervasive surveillance is just as much about analysis. It’s sustained by a combination of cheap and ubiquitous cameras, tagged photo databases, commercial databases of our actions that reveal our habits and personalities, and, most of all, fast and accurate face recognition software.

Don’t expect to have access to this technology for yourself anytime soon. This is not facial recognition for all. It’s just for those who can either demand or pay for access to the required technologies, most importantly the tagged photo databases. And while we can easily imagine how this might be misused in a totalitarian country, there are dangers in free societies as well. Without meaningful regulation, we’re moving into a world where governments and corporations will be able to identify people both in real time and backwards in time, remotely and in secret, without consent or recourse.

Despite protests from industry, we need to regulate this budding industry. We need limitations on how our images can be collected without our knowledge or consent, and on how they can be used. The technologies aren’t going away, and we can’t uninvent these capabilities. But we can ensure that they’re used ethically and responsibly, and not just as a mechanism to increase police and corporate power over us.

This essay previously appeared on Forbes.com.

EDITED TO ADD: Two articles that say much the same thing.

Posted on October 5, 2015 at 6:11 AM

Defending All the Targets Is Impossible

In the wake of the recently averted mass shooting on a French train, officials are realizing that there are just too many potential targets to defend.

The sheer number of militant suspects combined with a widening field of potential targets have presented European officials with what they concede is a nearly insurmountable surveillance task. The scale of the challenge, security experts fear, may leave the Continent entering a new climate of uncertainty, with added risk attached to seemingly mundane endeavors, like taking a train.

The article talks about the impossibility of instituting airport-like security at train stations, but of course even if it were feasible to do that, it would only serve to move the threat to some other crowded space.

Posted on August 27, 2015 at 6:57 AM

Movie Plot Threat: Terrorists Attacking US Prisons

Kansas Senator Pat Roberts wins an award for his movie-plot threat: terrorists attacking the maximum-security federal prison at Ft. Leavenworth:

In an Aug. 14 letter to Defense Secretary Ashton B. Carter, Roberts stressed that Kansas in general—and Leavenworth, in particular—are not ideal for a domestic detention facility.

“Fort Leavenworth is neither the ideal nor right location for moving Guantánamo detainees,” Roberts wrote to Defense Secretary Ashton B. Carter. “The installation lies right on the Missouri River, providing terrorists with the possibility of covert travel underwater and attempting access to the detention facility.”

Not just terrorists, but terrorists with a submarine! This is why Ft. Leavenworth, a prison from which no one has ever escaped, is unsuitable for housing Guantanamo detainees.

I’ve never understood the argument that terrorists are too dangerous to house in US prisons. They’re just terrorists, it’s not like they’re Magneto.

Posted on August 25, 2015 at 2:19 PM

Human and Technology Failures in Nuclear Facilities

This is interesting:

We can learn a lot about the potential for safety failures at US nuclear plants from the July 29, 2012, incident in which three religious activists broke into the supposedly impregnable Y-12 facility at Oak Ridge, Tennessee, the Fort Knox of uranium. Once there, they spilled blood and spray painted “work for peace not war” on the walls of a building housing enough uranium to build thousands of nuclear weapons. They began hammering on the building with a sledgehammer, and waited half an hour to be arrested. If an 82-year-old nun with a heart condition and two confederates old enough to be AARP members could do this, imagine what a team of determined terrorists could do.

[…]

Where some other countries often rely more on guards with guns, the United States likes to protect its nuclear facilities with a high-tech web of cameras and sensors. Under the Nunn-Lugar program, Washington has insisted that Russia adopt a similar approach to security at its own nuclear sites­—claiming that an American cultural preference is objectively superior. The Y-12 incident shows the problem with the American approach of automating security. At the Y-12 facility, in addition to the three fences the protestors had to cut through with wire-cutters, there were cameras and motion detectors. But we too easily forget that technology has to be maintained and watched to be effective. According to Munger, 20 percent of the Y-12 cameras were not working on the night the activists broke in. Cameras and motion detectors that had been broken for months had gone unrepaired. A security guard was chatting rather than watching the feed from a camera that did work. And guards ignored the motion detectors, which were so often set off by local wildlife that they assumed all alarms were false positives….

Instead of having government forces guard the site, the Department of Energy had hired two contractors: Wackenhut and Babcock and Wilcox. Wackenhut is now owned by the British company G4S, which also botched security for the 2012 London Olympics, forcing the British government to send 3,500 troops to provide security that the company had promised but proved unable to deliver. Private companies are, of course, driven primarily by the need to make a profit, but there are surely some operations for which profit should not be the primary consideration.

Babcock and Wilcox was supposed to maintain the security equipment at the Y-12 site, while Wackenhut provided the guards. Poor communication between the two companies was one reason sensors and cameras were not repaired. Furthermore, Babcock and Wilcox had changed the design of the plant’s Highly Enriched Uranium Materials Facility, making it a more vulnerable aboveground building, in order to cut costs. And Wackenhut was planning to lay off 70 guards at Y-12, also to cut costs.

There’s an important lesson here. Security is a combination of people, process, and technology. All three have to be working in order for security to work.

Slashdot thread.

Posted on July 14, 2015 at 5:53 AM

Tracking the Psychological Effects of the 9/11 Attacks

Interesting research from 2012: “The Dynamics of Evolving Beliefs, Concerns, Emotions, and Behavioral Avoidance Following 9/11: A Longitudinal Analysis of Representative Archival Samples“:

Abstract: September 11 created a natural experiment that enables us to track the psychological effects of a large-scale terror event over time. The archival data came from 8,070 participants of 10 ABC and CBS News polls collected from September 2001 until September 2006. Six questions investigated emotional, behavioral, and cognitive responses to the events of September 11 over a five-year period. We found that heightened responses after September 11 dissipated and reached a plateau at various points in time over a five-year period. We also found that emotional, cognitive, and behavioral reactions were moderated by age, sex, political affiliation, and proximity to the attack. Both emotional and behavioral responses returned to a normal state after one year, whereas cognitively-based perceptions of risk were still diminishing as late as September 2006. These results provide insight into how individuals will perceive and respond to future similar attacks.

Posted on June 30, 2015 at 6:27 AM

