Essays Tagged "IEEE Security & Privacy"

How Changing Technology Affects Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2012

This essay was republished in Wired on February 24, 2014.

Security is a tradeoff, a balancing act between attacker and defender. Unfortunately, that balance is never static. Changes in technology affect both sides. Society uses new technologies to decrease what I call the scope of defection—what attackers can get away with—and attackers use new technologies to increase it. What’s interesting is the difference between how the two groups incorporate new technologies.

Changes in security systems can be slow. Society has to implement any new security technology as a group, which implies agreement and coordination and—in some instances—a lengthy bureaucratic procurement process. Meanwhile, an attacker can just use the new technology. For example, at the end of the horse-and-buggy era, it was easier for a bank robber to use his new motorcar as a getaway vehicle than it was for a town’s police department to decide it needed a police car, get the budget to buy one, choose which one to buy, buy it, and then develop training and policies for it. And if only one police department did this, the bank robber could just move to another town. Defectors are more agile and adaptable, making them much better at being early adopters of new technology…

Empathy and Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2011

Several independent streams of research seem to have converged on the role of empathy in security. Understanding how empathy works and fails—and how it can be harnessed—could be important as we develop security systems that protect people over computer networks.

Mirror neurons are part of a recently discovered brain system that activates both when an individual does something and when that individual observes someone else doing the same thing. They’re what allow us to “mirror” the behaviors of others, and they seem to play a major role in language acquisition, theory of mind, and empathy…

Detecting Cheaters

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2011

Our brains are specially designed to deal with cheating in social exchanges. The evolutionary psychology explanation is that we evolved brain heuristics for the social problems that our prehistoric ancestors had to deal with. Once humans became good at cheating, they then had to become good at detecting cheating—otherwise, the social group would fall apart.

Perhaps the most vivid demonstration of this can be seen with variations on what’s known as the Wason selection task, named after the psychologist who first studied it. Back in the 1960s, it was a test of logical reasoning; today, it’s used more as a demonstration of evolutionary psychology. But before we get to the experiment, let’s get into the mathematical background…
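The structure of the task is worth making concrete. In the classic version, each card has a letter on one side and a number on the other, and subjects must decide which cards to turn over to test a rule such as "if a card has a vowel on one side, it has an even number on the other." A minimal sketch of the underlying logic (my illustration, not from the essay): the rule is a conditional P → Q, so only cards showing P (a vowel) or not-Q (an odd number) can falsify it.

```python
def cards_to_flip(visible_faces):
    """Return the faces that must be checked to test the vowel/even rule."""
    def is_vowel(face):
        return face in "AEIOU"

    def is_odd_number(face):
        return face.isdigit() and int(face) % 2 == 1

    # P (a vowel) could hide not-Q; not-Q (an odd number) could hide P.
    # Cards showing a consonant or an even number cannot falsify the rule.
    return [f for f in visible_faces if is_vowel(f) or is_odd_number(f)]

print(cards_to_flip(["E", "K", "4", "7"]))  # → ['E', '7']
```

Most people correctly pick "E" but wrongly pick "4" instead of "7"; the evolutionary-psychology finding is that performance improves dramatically when the same logic is framed as detecting cheaters.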

A Taxonomy of Social Networking Data

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2010

Lately I’ve been reading about user security and privacy—control, really—on social networking sites. The issues are hard and the solutions harder, but I’m seeing a lot of confusion in even forming the questions. Social networking sites deal with several different types of user data, and it’s essential to separate them.

Below is my taxonomy of social networking data, which I first presented at the Internet Governance Forum meeting last November, and again—revised—at an OECD workshop on the role of Internet intermediaries in June…

Security and Function Creep

  • Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2010

Security is rarely static. Technology changes the capabilities of both security systems and attackers. But there’s something else that changes security’s cost/benefit trade-off: how the underlying systems being secured are used. Far too often we build security for one purpose, only to find it being used for another purpose—one it wasn’t suited for in the first place. And then the security system has to play catch-up.

Take driver’s licenses, for example. Originally designed to demonstrate a credential—the ability to drive a car—they looked like other credentials: medical licenses or elevator certificates of inspection. They were wallet-sized, of course, but they didn’t have much security associated with them. Then, slowly, driver’s licenses took on a second application: they became age-verification tokens in bars and liquor stores. Of course the security wasn’t up to the task—teenagers can be extraordinarily resourceful if they set their minds to it—and over the decades driver’s licenses got photographs, tamper-resistant features (once, it was easy to modify the birth year), and technologies that made counterfeiting harder. There was little value in counterfeiting a driver’s license, but a lot of value in counterfeiting an age-verification token…

Security, Group Size, and the Human Brain

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2009

If the size of your company grows past 150 people, it’s time to get name badges. It’s not that larger groups are somehow less secure; it’s just that 150 is the cognitive limit to the number of people a human brain can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing the volume of the neocortex—the “thinking” part of the mammalian brain—with the size of primate social groups. By analyzing data from 38 primate genera and extrapolating to the human neocortex size, he predicted a human “mean group size” of roughly 150…
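Dunbar’s extrapolation is essentially a regression in log-log space: fit a line to log(group size) versus log(neocortex ratio) across primate species, then read off the prediction at the human value. A minimal sketch of that method follows; the data points here are invented for illustration (not Dunbar’s 38-genera dataset), chosen so the extrapolation lands near the essay’s figure.

```python
import math

# Hypothetical primate data (for illustration only): neocortex ratio
# versus mean social group size.
neocortex_ratio = [1.5, 2.0, 2.5, 3.0]
mean_group_size = [4, 11, 25, 49]

xs = [math.log10(r) for r in neocortex_ratio]
ys = [math.log10(g) for g in mean_group_size]

# Ordinary least-squares fit in log-log space.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Extrapolate to an assumed human neocortex ratio of about 4.1.
predicted = 10 ** (intercept + slope * math.log10(4.1))
print(round(predicted))  # → 150
```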

Architecture of Privacy

  • Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2009

The Internet isn’t really for us. We’re here at the beginning, stumbling around, just figuring out what it’s good for and how to use it. The Internet is for those born into it, those who have woven it into their lives from the beginning. The Internet is the greatest generation gap since rock and roll, and only our children can hope to understand it.

Larry Lessig famously said that, on the Internet, code is law. Facebook’s architecture limits what we can do there, just as gravity limits what we can do on Earth. The 140-character limit on SMSs is as effective as a legal ban on grammar, spelling, and long-winded sentences: KTHXBYE…

How the Human Brain Buys Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2008

People tend to be risk-averse when it comes to gains, and risk-seeking when it comes to losses. If you give people a choice between a $500 sure gain and a coin-flip chance of a $1,000 gain, about 75 percent will pick the sure gain. But give people a choice between a $500 sure loss and a coin-flip chance of a $1,000 loss, about 75 percent will pick the coin flip.

People don’t have a standard mathematical model of risk in their heads. Their trade-offs are subtler, and result from how our brains have developed. A computer might not see the difference between the two choices—it’s simply a measure of how risk-averse you are—but humans do…
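The asymmetry is easier to see once you notice that the two choices in each pair are mathematically equivalent. A quick sketch (my own illustration, not from the essay): each pair has the same expected value, so a purely expected-value decision-maker would be indifferent; people reliably are not.

```python
def expected_value(outcomes):
    """outcomes: list of (probability, amount) pairs."""
    return sum(p * amount for p, amount in outcomes)

sure_gain   = [(1.0, 500)]
gamble_gain = [(0.5, 1000), (0.5, 0)]
sure_loss   = [(1.0, -500)]
gamble_loss = [(0.5, -1000), (0.5, 0)]

print(expected_value(sure_gain), expected_value(gamble_gain))  # → 500.0 500.0
print(expected_value(sure_loss), expected_value(gamble_loss))  # → -500.0 -500.0
```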

The Death of the Security Industry

  • Bruce Schneier
  • IEEE Security & Privacy
  • November/December 2007

The hardest thing about working in IT security is convincing users to buy our technologies. An enormous amount of energy has been focused on this problem—risk analyses, ROI models, audits—yet critical technologies still remain uninstalled and important networks remain insecure. I’m constantly asked how to solve this by frustrated security vendors and—sadly—I have no good answer. But I know the problem is temporary: in the long run, the information security industry as we know it will disappear.

The entire IT security industry is an accident: an artifact of how the computer industry developed. Computers are hard to use, and you need an IT department staffed with experts to make them work. Contrast this with other mature high-tech products such as those for power and lighting, heating and air conditioning, automobiles and airplanes. No company has an automotive-technology department, filled with car geeks to install the latest engine mods and help users recover from the inevitable crashes…

Nonsecurity Considerations in Security Decisions

  • Bruce Schneier
  • IEEE Security & Privacy
  • May/June 2007

Security decisions are generally made for nonsecurity reasons. For security professionals and technologists, this can be a hard lesson. We like to think that security is vitally important. But anyone who has tried to convince the sales VP to give up her department’s Blackberries or the CFO to stop sharing his password with his secretary knows security is often viewed as a minor consideration in a larger decision. This issue’s articles on managing organizational security make this point clear.

Below is a diagram of a security decision. At its core are assets, which a security system protects. Security can fail in two ways: either attackers can successfully bypass it, or it can mistakenly block legitimate users. There are, of course, more users than attackers, so the second kind of failure is often more important. There’s also a feedback mechanism with respect to security countermeasures: both users and attackers learn about the security and its failings. Sometimes they learn how to bypass security, and sometimes they learn not to bother with the asset at all…
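The point about user numbers can be made concrete with a toy calculation (the figures here are mine, not the essay’s): even a small false-positive rate, applied to a much larger population of legitimate users, produces far more bad outcomes than the attackers do.

```python
# Assumed figures for illustration only.
legitimate_users = 100_000
attackers = 50
false_positive_rate = 0.01   # fraction of legitimate users wrongly blocked
false_negative_rate = 0.10   # fraction of attackers who slip through

blocked_users = legitimate_users * false_positive_rate
missed_attackers = attackers * false_negative_rate
print(blocked_users, missed_attackers)  # → 1000.0 5.0
```

Here the system wrongly blocks 1,000 legitimate users while there are only 50 attackers in total, which is why the second failure mode so often dominates the cost/benefit analysis.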
