Essays Tagged "IEEE Security & Privacy"


The Security Value of Muddling Through

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2015

View or Download in PDF Format

Of all the stories to come out of last year’s massive Sony hack, the most interesting was the ineffectiveness of the company’s incident response. Its initial reactions were indicative of a company in panic, and Sony’s senior executives even talked about how long it took them to fully understand the attack’s magnitude.

Sadly, this is more the norm than the exception. It seems to be the way Target and Home Depot handled their large hacks in 2013 and 2014, respectively. The lack of immediate response made the incidents worse…

The Future of Incident Response

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2014

View or Download in PDF Format

Security is a combination of protection, detection, and response. It’s taken the industry a long time to get to this point, though. The 1990s was the era of protection. Our industry was full of products that would protect your computers and network. By 2000, we realized that detection needed to be formalized as well, and the industry was full of detection products and services.

This decade is one of response. Over the past few years, we’ve started seeing incident response (IR) products and services. Security teams are incorporating them into their arsenal because of three trends in computing. One, we’ve lost control of our computing environment. More of our data is held in the cloud by other companies, and more of our actual networks are outsourced. This makes response more complicated, because we might not have visibility into parts of our critical network infrastructures…

Metadata = Surveillance

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2014

View or Download in PDF Format

Ever since reporters began publishing stories about NSA activities, based on documents provided by Edward Snowden, we’ve been repeatedly assured by government officials that it’s “only metadata.” This might fool the average person, but it shouldn’t fool those of us in the security field. Metadata equals surveillance data, and collecting metadata on people means putting them under surveillance.

An easy thought experiment demonstrates this. Imagine that you hired a private detective to eavesdrop on a subject. That detective would plant a bug in that subject’s home, office, and car. He would eavesdrop on his computer. He would listen in on that subject’s conversations, both face to face and remotely, and you would get a report on what was said in those conversations. (This is what President Obama repeatedly reassures us isn’t happening with our phone calls. But am I the only one who finds it suspicious that he always uses very specific words? “The NSA is not listening in on your phone calls.” This leaves open the possibility that the NSA is recording, transcribing, and analyzing your phone calls—and very occasionally reading them. This is far more likely to be true, and something a pedantically minded president could claim he wasn’t lying about.)…
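
To make the claim that metadata equals surveillance data concrete, here is a minimal sketch, in Python, of what a single piece of phone metadata might look like. The record layout and the values are hypothetical, invented for illustration rather than taken from the essay or any real call-detail format.

```python
# A hypothetical call-detail record, illustrating the kind of "metadata"
# the essay is talking about. Field names and values are invented for
# this sketch; they are not drawn from the essay or any real system.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallRecord:
    caller: str         # number that placed the call
    callee: str         # number that received it
    started: datetime   # when the call began
    seconds: int        # how long it lasted
    cell_tower: str     # which tower handled it, i.e. roughly where you were

record = CallRecord(
    caller="+1-202-555-0101",
    callee="+1-202-555-0199",
    started=datetime(2014, 3, 1, 23, 47),
    seconds=1260,
    cell_tower="DC-014",
)

# There is no content field: nothing above is a recording or transcript.
# Yet the record still says who you talked to, when, for how long, and
# roughly where you were, which is exactly the detective's report.
print(record)
```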

Trust in Man/Machine Security Systems

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2013

View or Download in PDF Format

I jacked a visitor’s badge from the Eisenhower Executive Office Building in Washington, DC, last month. The badges are electronic; they’re enabled when you check in at building security. You’re supposed to wear your badge on a chain around your neck at all times and drop it through a slot when you leave.

I kept the badge. I used my body as a shield, and the chain made a satisfying noise when it hit bottom. The guard let me through the gate.

The person after me had problems, though. Some part of the system knew something was wrong, and wouldn’t let her out. Eventually, the guard had to manually override something…

IT for Oppression

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2013

View or Download in PDF Format

Whether it’s Syria using Facebook to help identify and arrest dissidents or China using its “Great Firewall” to limit access to international news throughout the country, repressive regimes all over the world are using the Internet to more efficiently implement surveillance, censorship, propaganda, and control. They’re getting really good at it, and the IT industry is helping. We’re helping by creating business applications—categories of applications, really—that are being repurposed by oppressive governments for their own use:…

The Importance of Security Engineering

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2012

View or Download in PDF Format

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition…

How Changing Technology Affects Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2012

View or Download in PDF Format

This essay was republished in Wired on February 24, 2014.

Security is a tradeoff, a balancing act between attacker and defender. Unfortunately, that balance is never static. Changes in technology affect both sides. Society uses new technologies to decrease what I call the scope of defection—what attackers can get away with—and attackers use new technologies to increase it. What’s interesting is the difference between how the two groups incorporate new technologies.

Changes in security systems can be slow. Society has to implement any new security technology as a group, which implies agreement and coordination and—in some instances—a lengthy bureaucratic procurement process. Meanwhile, an attacker can just use the new technology. For example, at the end of the horse-and-buggy era, it was easier for a bank robber to use his new motorcar as a getaway vehicle than it was for a town’s police department to decide it needed a police car, get the budget to buy one, choose which one to buy, buy it, and then develop training and policies for it. And if only one police department did this, the bank robber could just move to another town. Defectors are more agile and adaptable, making them much better at being early adopters of new technology…

Empathy and Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2011

View or Download in PDF Format

Several independent streams of research seem to have converged on the role of empathy in security. Understanding how empathy works and fails—and how it can be harnessed—could be important as we develop security systems that protect people over computer networks.

Mirror neurons are part of a recently discovered brain system that activates both when an individual does something and when that individual observes someone else doing the same thing. They’re what allow us to “mirror” the behaviors of others, and they seem to play a major role in language acquisition, theory of mind, and empathy…

Detecting Cheaters

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2011

View or Download in PDF Format

Our brains are specially designed to deal with cheating in social exchanges. The evolutionary psychology explanation is that we evolved brain heuristics for the social problems that our prehistoric ancestors had to deal with. Once humans became good at cheating, they then had to become good at detecting cheating—otherwise, the social group would fall apart.

Perhaps the most vivid demonstration of this can be seen with variations on what’s known as the Wason selection task, named after the psychologist who first studied it. Back in the 1960s, it was a test of logical reasoning; today, it’s used more as a demonstration of evolutionary psychology. But before we get to the experiment, let’s get into the mathematical background…
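
The excerpt stops short of that background, but the structure of the Wason selection task itself is standard and can be sketched in a few lines of Python. The card values and the drinking-age framing below are the textbook versions of the puzzle, not necessarily the variants the essay works through.

```python
# A minimal sketch of the logic behind the Wason selection task; the card
# values and the drinking-age framing are the textbook versions of the
# puzzle, not necessarily the ones used in the essay.

def cards_to_check(cards, is_p, is_not_q):
    """Return the cards that could falsify the rule "if P then Q".

    Only a card showing P (the antecedent) or a card showing not-Q
    (the negated consequent) can hide a counterexample, so those are
    the only cards worth turning over.
    """
    return [card for card in cards if is_p(card) or is_not_q(card)]

# Abstract version: rule is "if a card has a vowel on one side, it has
# an even number on the other." Most people get this version wrong.
cards = ["E", "K", "4", "7"]
is_vowel = lambda c: c.isalpha() and c.upper() in "AEIOU"
is_odd = lambda c: c.isdigit() and int(c) % 2 == 1
print(cards_to_check(cards, is_vowel, is_odd))          # ['E', '7']

# Social-contract version: rule is "if you are drinking beer, you must
# be over 21." The logic is identical, but most people now answer
# correctly, which is the cheater-detection effect the essay describes.
people = ["beer", "soda", "25", "17"]
drinking_beer = lambda c: c == "beer"
under_21 = lambda c: c.isdigit() and int(c) < 21
print(cards_to_check(people, drinking_beer, under_21))  # ['beer', '17']
```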

A Taxonomy of Social Networking Data

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2010

Portuguese translation

Lately I’ve been reading about user security and privacy—control, really—on social networking sites. The issues are hard and the solutions harder, but I’m seeing a lot of confusion in even forming the questions. Social networking sites deal with several different types of user data, and it’s essential to separate them.

Below is my taxonomy of social networking data, which I first presented at the Internet Governance Forum meeting last November, and again—revised—at an OECD workshop on the role of Internet intermediaries in June…
