Essays Tagged "Guardian"


Facebook Should Compete on Privacy, Not Hide It Away

  • Bruce Schneier
  • The Guardian
  • July 15, 2009

Reassuring people about privacy makes them more, not less, concerned. It’s called “privacy salience”, and Leslie John, Alessandro Acquisti, and George Loewenstein — all at Carnegie Mellon University — demonstrated this in a series of clever experiments. In one, subjects completed an online survey consisting of a series of questions about their academic behaviour — “Have you ever cheated on an exam?” for example. Half of the subjects were first required to sign a consent warning — designed to make privacy concerns more salient — while the other half did not. Also, subjects were randomly assigned to receive either a privacy confidentiality assurance, or no such assurance. When the privacy concern was made salient (through the consent warning), people reacted negatively to the subsequent confidentiality assurance and were less likely to reveal personal information…

Raising the Cost of Paperwork Errors Will Improve Accuracy

  • Bruce Schneier
  • The Guardian
  • June 24, 2009

It’s a sad, horrific story. Homeowner returns to find his house demolished. The demolition company was hired legitimately but there was a mistake and it demolished the wrong house. The demolition company relied on GPS co-ordinates, but requiring street addresses isn’t a solution. A typo in the address is just as likely, and it would have demolished the house just as quickly. The problem is less how the demolishers knew which house to knock down, and more how they confirmed that knowledge. They trusted the paperwork, and the paperwork was wrong. Informality works when everybody knows everybody else. When merchants and customers know each other, government officials and citizens know each other, and people know their neighbours, people know what’s going on. In that sort of milieu, if something goes wrong, people notice…

Be Careful When You Come to Put Your Trust in the Clouds

Cloud computing may represent the future of computing but users still need to be careful about who is looking after their data

  • Bruce Schneier
  • The Guardian
  • June 4, 2009

This year’s overhyped IT concept is cloud computing. Also called software as a service (SaaS), cloud computing is when you run software over the internet and access it via a browser. The salesforce.com customer management software is an example of this. So is Google Docs. If you believe the hype, cloud computing is the future.

But, hype aside, cloud computing is nothing new. It’s the modern version of the timesharing model from the 1960s, which was eventually killed by the rise of the personal computer. It’s what Hotmail and Gmail have been doing all these years, and it’s social networking sites, remote backup companies, and remote email filtering companies such as MessageLabs. Any IT outsourcing – network infrastructure, security monitoring, remote hosting – is a form of cloud computing…

We Shouldn't Poison Our Minds with Fear of Bioterrorism

  • Bruce Schneier
  • The Guardian
  • May 14, 2009

Terrorists attacking our food supply is a nightmare scenario that has been given new life during the recent swine flu outbreak. Although it seems easy to do, understanding why it hasn’t happened is important. GR Dalziel, at the Nanyang Technological University in Singapore, has written a report chronicling every confirmed case of malicious food contamination in the world since 1950: 365 cases in all, plus 126 additional unconfirmed cases. What he found demonstrates the reality of terrorist food attacks.

It turns out 72% of the food poisonings occurred at the end of the food supply chain – at home – typically by a friend, relative, neighbour, or co-worker trying to kill or injure a specific person. A characteristic example is Heather Mook of York, who in 2007 tried to kill her husband by putting rat poison in his spaghetti…

How the Great Conficker Panic Hacked into Human Credulity

  • Bruce Schneier
  • The Guardian
  • April 23, 2009

This essay also appeared in the Gulf Times.

Conficker’s April Fool’s joke — the huge, menacing build-up and then nothing — is a good case study on how we think about risks, one whose lessons are applicable far outside computer security. Generally, our brains aren’t very good at probability and risk analysis. We tend to use cognitive shortcuts instead of thoughtful analysis. This worked fine for the simple risks we encountered for most of our species’s existence, but it’s less effective against the complex risks society forces us to face today…

An Enterprising Criminal Has Spotted a Gap in the Market

  • Bruce Schneier
  • The Guardian
  • April 2, 2009

Before his arrest, Tom Berge stole lead roof tiles from several buildings in south-east England, including the Honeywood Museum in Carshalton, the Croydon parish church, and the Sutton high school for girls. He then sold those tiles to scrap metal dealers.

As a security expert, I find this story interesting for two reasons. First, amid attempts to ban, or at least censor, Google Earth, lest it help the terrorists, here is an actual crime that relied on the service: Berge needed Google Earth for reconnaissance.

But more interesting is the discrepancy between the value of the lead tiles to the original owner and to the thief. The Sutton school had to spend £10,000 to buy new lead tiles; the Croydon Church had to repair extensive water damage after the theft. But Berge only received £700 a tonne from London scrap metal dealers…

Blaming the User Is Easy — But It's Better to Bypass Them Altogether

  • Bruce Schneier
  • The Guardian
  • March 12, 2009

Blaming the victim is common in IT: users are to blame because they don’t patch their systems, choose lousy passwords, fall for phishing attacks, and so on. But, while users are, and will continue to be, a major source of security problems, focusing on them is an unhelpful way to think.

People regularly don’t do things they are supposed to: changing the oil in their cars, going to the dentist, replacing the batteries in their smoke detectors. Why? Because people learn from experience. If something is immediately harmful, such as touching a hot stove or petting a live tiger, they quickly learn not to do it. But if someone skips an oil change, ignores a computer patch, or …

The Secret Question Is: Why Do IT Systems Use Insecure Passwords?

  • Bruce Schneier
  • The Guardian
  • February 19, 2009

Since January, the Conficker.B worm has been spreading like wildfire across the internet, infecting the French navy, hospitals in Sheffield, the court system in Houston, Texas, and millions of computers worldwide. One of the ways it spreads is by cracking administrator passwords on networks. Which leads to the important question: why are IT administrators still using easy-to-guess passwords?

Computer authentication systems have two basic requirements. They need to keep the bad guys from accessing your account, and they need to allow you to access your account. Both are important, and every system is a balancing act between the two. Too little security, and the bad guys will get in too easily. But if the authentication system is too complicated, restrictive, or hard to use, you won’t be able, or won’t bother, to use it…

Terrorists May Use Google Earth, But Fear Is No Reason to Ban It

  • Bruce Schneier
  • The Guardian
  • January 29, 2009

This essay also appeared in The Hindu, Brisbane Times, and The Sydney Morning Herald.


It regularly comes as a surprise to people that our own infrastructure can be used against us. And in the wake of terrorist attacks or plots, there are fear-induced calls to ban, disrupt or control that infrastructure. According to officials investigating the Mumbai attacks, the terrorists used images from Google Earth to help learn their way around. This isn’t the first time Google Earth has been charged with helping terrorists: in 2007, Google Earth images of British military bases were found in the homes of …

Tigers Use Scent, Birds Use Calls — Biometrics Are Just Animal Instinct

  • Bruce Schneier
  • The Guardian
  • January 8, 2009

Biometrics may seem new, but they’re the oldest form of identification. Tigers recognise each other’s scent; penguins recognise calls. Humans recognise each other by sight from across the room, voices on the phone, signatures on contracts and photographs on drivers’ licences. Fingerprints have been used to identify people at crime scenes for more than 100 years.

What is new about biometrics is that computers are now doing the recognising: thumbprints, retinal scans, voiceprints, and typing patterns. There’s a lot of technology involved here, in trying to both limit the number of false positives (someone else being mistakenly recognised as you) and false negatives (you being mistakenly not recognised). Generally, a system can choose to have less of one or the other; less of both is very hard…
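The tradeoff comes down to where the system sets its matching threshold. A toy model (the score distributions below are invented for illustration): genuine users score high on average, impostors score low, but the distributions overlap, so any threshold trades one kind of error for the other.

```python
import random

# Invented match scores: genuine users centred at 0.7, impostors at 0.4,
# both with enough spread that the two populations overlap.
random.seed(0)
genuine = [random.gauss(0.7, 0.1) for _ in range(1000)]
impostor = [random.gauss(0.4, 0.1) for _ in range(1000)]

def rates(threshold):
    """False-positive and false-negative rates at a given match threshold."""
    fp = sum(s >= threshold for s in impostor) / len(impostor)  # impostor accepted
    fn = sum(s < threshold for s in genuine) / len(genuine)     # genuine rejected
    return fp, fn

for t in (0.45, 0.55, 0.65):
    fp, fn = rates(t)
    print(f"threshold {t:.2f}: false positives {fp:.1%}, false negatives {fn:.1%}")
```

Raising the threshold cuts false positives but rejects more legitimate users; lowering it does the reverse. Reducing both at once requires better separation between the two score distributions — that is, a better sensor or matching algorithm, which is exactly the hard part.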
