Essays Tagged "IEEE Security & Privacy"

Metadata = Surveillance

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2014

Ever since reporters began publishing stories about NSA activities, based on documents provided by Edward Snowden, we’ve been repeatedly assured by government officials that it’s “only metadata.” This might fool the average person, but it shouldn’t fool those of us in the security field. Metadata equals surveillance data, and collecting metadata on people means putting them under surveillance.

An easy thought experiment demonstrates this. Imagine that you hired a private detective to eavesdrop on a subject. That detective would plant a bug in that subject’s home, office, and car. He would eavesdrop on the subject’s computer. He would listen in on that subject’s conversations, both face to face and remotely, and you would get a report on what was said in those conversations. (This is what President Obama repeatedly reassures us isn’t happening with our phone calls. But am I the only one who finds it suspicious that he always uses very specific words? “The NSA is not listening in on your phone calls.” This leaves open the possibility that the NSA is recording, transcribing, and analyzing your phone calls—and very occasionally reading them. This is far more likely to be true, and something a pedantically minded president could claim he wasn’t lying about.)…

Trust in Man/Machine Security Systems

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2013

I jacked a visitor’s badge from the Eisenhower Executive Office Building in Washington, DC, last month. The badges are electronic; they’re enabled when you check in at building security. You’re supposed to wear your badge on a chain around your neck at all times and drop it through a slot when you leave.

I kept the badge. I used my body as a shield, and the chain made a satisfying noise when it hit bottom. The guard let me through the gate.

The person after me had problems, though. Some part of the system knew something was wrong, and wouldn’t let her out. Eventually, the guard had to manually override something…

IT for Oppression

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2013

Whether it’s Syria using Facebook to help identify and arrest dissidents or China using its “Great Firewall” to limit access to international news throughout the country, repressive regimes all over the world are using the Internet to more efficiently implement surveillance, censorship, propaganda, and control. They’re getting really good at it, and the IT industry is helping. We’re helping by creating business applications — categories of applications, really — that are being repurposed by oppressive governments for their own use:

  • What is called censorship when practiced by a government is content filtering when practiced by an organization. Many companies want to keep their employees from viewing porn or updating their Facebook pages while at work. In the other direction, data loss prevention software keeps employees from sending proprietary corporate information outside the network and also serves as a censorship tool. Governments can use these products for their own ends…

The Importance of Security Engineering

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2012

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition…

How Changing Technology Affects Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2012

This essay was republished in Wired on February 24, 2014.

Security is a tradeoff, a balancing act between attacker and defender. Unfortunately, that balance is never static. Changes in technology affect both sides. Society uses new technologies to decrease what I call the scope of defection — what attackers can get away with — and attackers use new technologies to increase it. What’s interesting is the difference between how the two groups incorporate new technologies.

Changes in security systems can be slow. Society has to implement any new security technology as a group, which implies agreement and coordination and — in some instances — a lengthy bureaucratic procurement process. Meanwhile, an attacker can just use the new technology. For example, at the end of the horse-and-buggy era, it was easier for a bank robber to use his new motorcar as a getaway vehicle than it was for a town’s police department to decide it needed a police car, get the budget to buy one, choose which one to buy, buy it, and then develop training and policies for it. And if only one police department did this, the bank robber could just move to another town. Defectors are more agile and adaptable, making them much better at being early adopters of new technology…

Empathy and Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2011

Several independent streams of research seem to have converged on the role of empathy in security. Understanding how empathy works and fails—and how it can be harnessed—could be important as we develop security systems that protect people over computer networks.

Mirror neurons are part of a recently discovered brain system that activates both when an individual does something and when that individual observes someone else doing the same thing. They’re what allow us to “mirror” the behaviors of others, and they seem to play a major role in language acquisition, theory of mind, and empathy…

Detecting Cheaters

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2011

Our brains are specially designed to deal with cheating in social exchanges. The evolutionary psychology explanation is that we evolved brain heuristics for the social problems that our prehistoric ancestors had to deal with. Once humans became good at cheating, they then had to become good at detecting cheating — otherwise, the social group would fall apart.

Perhaps the most vivid demonstration of this can be seen with variations on what’s known as the Wason selection task, named after the psychologist who first studied it. Back in the 1960s, it was a test of logical reasoning; today, it’s used more as a demonstration of evolutionary psychology. But before we get to the experiment, let’s get into the mathematical background…
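
The excerpt cuts off before the setup, so here is the task’s standard form: you see four cards, each with a letter on one side and a number on the other, plus a rule such as “if a card has a vowel on one side, it has an even number on the other side.” Which cards must you turn over to test the rule? Below is a minimal sketch of the logic in Python; the card faces and helper names are illustrative, not taken from the essay.

```python
# The classic Wason selection task: rule is "if a card has a vowel
# on one side, it has an even number on the other." Each card shows
# one face; a card is worth turning over only if its hidden face
# could falsify the rule. That means the vowel ("E", hidden side
# might be odd) and the odd number ("7", hidden side might be a
# vowel). Most people wrongly pick "E" and "4".

VOWELS = set("AEIOU")

def is_vowel(face):
    return face in VOWELS

def is_even_number(face):
    return face.isdigit() and int(face) % 2 == 0

def must_turn_over(visible_face):
    """True if the hidden face could violate the rule."""
    if is_vowel(visible_face):                  # P: vowel showing
        return True
    if visible_face.isdigit() and not is_even_number(visible_face):
        return True                             # not-Q: odd number showing
    return False                                # "K" or "4" can never falsify it

cards = ["E", "K", "4", "7"]
print([c for c in cards if must_turn_over(c)])  # ['E', '7']
```

The evolutionary-psychology punch line, well established in the literature, is that most people fail this abstract version yet answer correctly when the identical logic is framed as catching cheaters, for instance checking who may drink beer under a legal-age rule.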

A Taxonomy of Social Networking Data

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2010

A Portuguese translation of this essay is available.

Lately I’ve been reading about user security and privacy — control, really — on social networking sites. The issues are hard and the solutions harder, but I’m seeing a lot of confusion in even forming the questions. Social networking sites deal with several different types of user data, and it’s essential to separate them.

Below is my taxonomy of social networking data, which I first presented at the Internet Governance Forum meeting last November, and again — revised — at an OECD workshop on the role of Internet intermediaries in June…

Security and Function Creep

  • Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2010

Security is rarely static. Technology changes the capabilities of both security systems and attackers. But there’s something else that changes security’s cost/benefit trade-off: how the underlying systems being secured are used. Far too often we build security for one purpose, only to find it being used for another purpose — one it wasn’t suited for in the first place. And then the security system has to play catch-up.

Take driver’s licenses, for example. Originally designed to demonstrate a credential — the ability to drive a car — they looked like other credentials: medical licenses or elevator certificates of inspection. They were wallet-sized, of course, but they didn’t have much security associated with them. Then, slowly, driver’s licenses took on a second application: they became age-verification tokens in bars and liquor stores. Of course the security wasn’t up to the task — teenagers can be extraordinarily resourceful if they set their minds to it — and over the decades driver’s licenses got photographs, tamper-resistant features (once, it was easy to modify the birth year), and technologies that made counterfeiting harder. There was little value in counterfeiting a driver’s license, but a lot of value in counterfeiting an age-verification token…

Security, Group Size, and the Human Brain

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2009

If the size of your company grows past 150 people, it’s time to get name badges. It’s not that larger groups are somehow less secure; it’s just that 150 is the cognitive limit to the number of people a human brain can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing the volume of the neocortex — the “thinking” part of the mammalian brain — with the size of primate social groups. By analyzing data from 38 primate genera and extrapolating to the human neocortex size, he predicted a human “mean group size” of roughly 150…
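
The extrapolation itself is simple enough to show. The commonly cited regression from Dunbar’s 1992 paper relates the log of mean group size linearly to the log of the neocortex ratio; the coefficients and the human ratio in the sketch below are the usual published values, not numbers taken from this essay.

```python
import math

# Dunbar's 1992 cross-genera regression (commonly cited coefficients,
# included here for illustration): log10 of mean group size scales
# linearly with log10 of the neocortex ratio, i.e. neocortex volume
# divided by the volume of the rest of the brain.
def predicted_group_size(neocortex_ratio, intercept=0.093, slope=3.389):
    return 10 ** (intercept + slope * math.log10(neocortex_ratio))

# Extrapolating to the human neocortex ratio of roughly 4.1 yields
# about 148, popularly rounded to the famous 150.
print(round(predicted_group_size(4.1)))  # 148
```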
