Blog: November 2009 Archives

The Psychology of Being Scammed

This is a very interesting paper: “Understanding scam victims: seven principles for systems security,” by Frank Stajano and Paul Wilson. Paul Wilson produces and stars in the British television show The Real Hustle, which does hidden camera demonstrations of con games. (There’s no DVD of the show available, but there are bits of it on YouTube.) Frank Stajano is at the Computer Laboratory of the University of Cambridge.

The paper describes a dozen different con scenarios — entertaining in itself — and then lists and explains six general psychological principles that con artists use:

1. The distraction principle. While you are distracted by what retains your interest, hustlers can do anything to you and you won’t notice.

2. The social compliance principle. Society trains people not to question authority. Hustlers exploit this “suspension of suspiciousness” to make you do what they want.

3. The herd principle. Even suspicious marks will let their guard down when everyone next to them appears to share the same risks. Safety in numbers? Not if they’re all conspiring against you.

4. The dishonesty principle. Anything illegal you do will be used against you by the fraudster, making it harder for you to seek help once you realize you’ve been had.

5. The deception principle. Things and people are not what they seem. Hustlers know how to manipulate you to make you believe that they are.

6. The need and greed principle. Your needs and desires make you vulnerable. Once hustlers know what you really want, they can easily manipulate you.

It all makes for very good reading.

Two previous posts on the psychology of conning and being conned.

EDITED TO ADD (12/12): Some of the episodes of The Real Hustle are available on the BBC site, but only to people with UK IP addresses — or people with a VPN tunnel to the UK.

Posted on November 30, 2009 at 6:17 AM

Fear and Public Perception

This 1996 interview with psychiatrist Robert DuPont was part of a Frontline program called “Nuclear Reaction.”

He’s talking about the role fear plays in the perception of nuclear power. It’s a lot of the sorts of things I say, but particularly interesting is this bit on familiarity and how it reduces fear:

You see, we sited these plants away from metropolitan areas to “protect the public” from the dangers of nuclear power. What we did when we did that was move the plants away from the people, so they became unfamiliar. The major health effect, adverse health effect of nuclear power is not radiation. It’s fear. And by siting them away from the people, we insured that that would be maximized. If we’re serious about health in relationship to nuclear power, we would put them in downtown, big cities, so people would see them all the time. That is really important, in terms of reducing the fear. Familiarity is the way fear is reduced. No question. It’s not done intellectually. It’s not done by reading a book. It’s done by being there and seeing it and talking to the people who work there.

So, among other reasons, terrorism is scary because it’s so rare. When it’s more common — England during the Troubles, Israel today — people have a more rational reaction to it.

My recent essay on fear and overreaction.

Posted on November 27, 2009 at 8:25 AM

Leaked 9/11 Text Messages

Wikileaks has published pager intercepts from New York on 9/11:

WikiLeaks released half a million US national text pager intercepts. The intercepts cover a 24 hour period surrounding the September 11, 2001 attacks in New York and Washington.

[…]

Text pagers are usually carried by persons operating in an official capacity. Messages in the archive range from Pentagon, FBI, FEMA and New York Police Department exchanges, to computers reporting faults at investment banks inside the World Trade Center.

Near as I can tell, these messages are from the commercial pager networks of Arch Wireless, Metrocall, Skytel, and Weblink Wireless, and include all customers of those services: government, corporate, and personal.

There are lots of nuggets in the data about the government response to 9/11:

One string of messages hints at how federal agencies scrambled to evacuate to Mount Weather, the government’s sort-of secret bunker buried under the Virginia mountains west of Washington, D.C. One message says, “Jim: DEPLOY TO MT. WEATHER NOW!,” and another says “CALL OFICE (sic) AS SOON AS POSSIBLE. 4145 URGENT.” That’s the phone number for the Federal Emergency Management Agency’s National Continuity Programs Directorate — which is charged with “the preservation of our constitutional form of government at all times,” even during a nuclear war. (A 2006 article in the U.K. Guardian newspaper mentioned “a traffic jam of limos carrying Washington and government license plates” heading to Mount Weather that day.)

FEMA’s response seemed less than organized. One message at 12:37 p.m., four hours after the attacks, says: “We have no mission statements yet.” Bill Prusch, FEMA’s project officer for the National Emergency Management Information System at the time, apparently announced at 2 p.m. that the Continuity of Operations plan was activated and that certain employees should report to Mt. Weather; a few minutes later he sent out another note saying the activation was cancelled.

Historians will certainly spend a lot of time poring over the messages, but I’m more interested in where they came from:

It’s not clear how they were obtained in the first place. One possibility is that they were illegally compiled from the records of archived messages maintained by pager companies, and then eventually forwarded to WikiLeaks.

The second possibility is more likely: Over-the-air interception. Each digital pager is assigned a unique Channel Access Protocol code, or capcode, that tells it to pay attention to what immediately follows. In what amounts to a gentlemen’s agreement, no encryption is used, and properly-designed pagers politely ignore what’s not addressed to them.

But an electronic snoop lacking that same sense of etiquette might hook up a sufficiently sophisticated scanner to a Windows computer with lots of disk space — and record, without much effort, gobs and gobs of over-the-air conversations.
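The addressing scheme described above can be sketched in a few lines. The frame format here is a deliberate simplification for illustration, not the actual POCSAG or FLEX wire format:

```python
# Toy model of pager addressing: every message on the shared channel
# carries a capcode, and a well-behaved pager keeps only its own traffic.
# Capcodes and messages below are invented for illustration.

channel = [
    (1048576, "Jim: DEPLOY TO MT. WEATHER NOW!"),
    (2097152, "Server rack 4 offline"),
    (1048576, "CALL OFICE AS SOON AS POSSIBLE. 4145 URGENT."),
]

def polite_pager(my_capcode, frames):
    """A properly designed pager ignores frames not addressed to it."""
    return [text for capcode, text in frames if capcode == my_capcode]

def promiscuous_logger(frames):
    """A scanner with no such etiquette records everything on the channel."""
    return [text for _, text in frames]

mine = polite_pager(1048576, channel)    # only the two frames addressed to us
everything = promiscuous_logger(channel)  # all three frames
```

The point is that the filtering happens in the receiver, not on the air: nothing in the channel itself stops a listener from keeping every frame.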

Existing products do precisely this. Australia’s WiPath Communications offers Interceptor 3.0 (there’s even a free download). Maryland-based SWS Security Products sells something called a “Beeper Buster” that it says lets police “watch up to 2500 targets at the same time.” And if you’re frugal, there’s a video showing you how to take a $10 pager and modify it to capture everything on that network.

It’s disturbing to realize that someone, possibly not even a government, was routinely intercepting most (all?) of the pager data in lower Manhattan as far back as 2001. Who was doing it? For what purpose? That, we don’t know.

Posted on November 26, 2009 at 7:11 AM

Virtual Mafia in Online Worlds

If you allow players in an online world to penalize each other, you open the door to extortion:

One of the features that supported user socialization in the game was the ability to declare that another user was a trusted friend. The feature involved a graphical display that showed the faces of users who had declared you trustworthy outlined in green, attached in a hub-and-spoke pattern to your face in the center.

[…]

That feature was fine as far as it went, but unlike other social networks, The Sims Online allowed users to declare other users untrustworthy too. The face of an untrustworthy user appeared circled in bright red among all the trustworthy faces in a user’s hub.

It didn’t take long for a group calling itself the Sims Mafia to figure out how to use this mechanic to shake down new users when they arrived in the game. The dialog would go something like this:

“Hi! I see from your hub that you’re new to the area. Give me all your Simoleans or my friends and I will make it impossible to rent a house.”

“What are you talking about?”

“I’m a member of the Sims Mafia, and we will all mark you as untrustworthy, turning your hub solid red (with no more room for green), and no one will play with you. You have five minutes to comply. If you think I’m kidding, look at your hub: three of us have already marked you red. Don’t worry, we’ll turn it green when you pay…”

If you think this is a fun game, think again: a typical response to this shakedown was for the user to decide that the game wasn’t worth $10 a month. Playing dollhouse doesn’t usually involve gangsters.
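The leverage in this shakedown comes from the display mechanic itself: a handful of coordinated red marks can dominate a newcomer's hub. A minimal model of that dynamic (the hub capacity and representation here are invented for illustration, not The Sims Online's actual mechanics):

```python
# Toy model of the trust-hub mechanic: a user's hub displays marks from
# other players, and coordinated "untrustworthy" marks crowd out the rest.
# True = a green (trusted) mark, False = a red (untrusted) mark.

def hub_display(marks, capacity=8):
    """Show up to `capacity` of the most recent marks on the hub."""
    return marks[-capacity:]

genuine = [True]        # one genuine friend has marked the newcomer green
mafia = [False] * 3     # three mafia members mark the newcomer red

display = hub_display(genuine + mafia)
red_fraction = display.count(False) / len(display)  # three quarters red
```

Because anyone can mark anyone, a small conspiracy controls most of what a new user's hub shows, which is exactly the threat being monetized.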

EDITED TO ADD (12/12): The Sims Mafia existed in 2004.

Posted on November 25, 2009 at 6:36 AM

Users Rationally Rejecting Security Advice

This paper, by Cormac Herley at Microsoft Research, sounds like me:

Abstract: It is often suggested that users are hopelessly lazy and unmotivated on security questions. They choose weak passwords, ignore security warnings, and are oblivious to certificate errors. We argue that users’ rejection of the security advice they receive is entirely rational from an economic perspective. The advice offers to shield them from the direct costs of attacks, but burdens them with far greater indirect costs in the form of effort. Looking at various examples of security advice we find that the advice is complex and growing, but the benefit is largely speculative or moot. For example, much of the advice concerning passwords is outdated and does little to address actual threats, and fully 100% of certificate error warnings appear to be false positives. Further, if users spent even a minute a day reading URLs to avoid phishing, the cost (in terms of user time) would be two orders of magnitude greater than all phishing losses. Thus we find that most security advice simply offers a poor cost-benefit tradeoff to users and is rejected. Security advice is a daily burden, applied to the whole population, while an upper bound on the benefit is the harm suffered by the fraction that become victims annually. When that fraction is small, designing security advice that is beneficial is very hard. For example, it makes little sense to burden all users with a daily task to spare 0.01% of them a modest annual pain.

Sounds like me.
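The "two orders of magnitude" claim in the abstract is easy to sanity-check with a back-of-envelope calculation. All of the figures below are my own rough assumptions (population, value of time, loss estimate), not numbers from the paper:

```python
# Back-of-envelope check of the phishing cost-benefit claim.
# Every figure here is an assumed, illustrative 2009-era number.
online_adults = 180e6           # assumed US online population
minutes_per_day = 1             # one minute a day spent reading URLs
hourly_value = 14.50            # assumed value of a user's time, $/hour
annual_phishing_losses = 100e6  # assumed direct phishing losses, $/year

hours_per_year = online_adults * minutes_per_day * 365 / 60
time_cost = hours_per_year * hourly_value  # roughly $16 billion per year

ratio = time_cost / annual_phishing_losses  # about 159x
```

Even if you halve the population or double the loss estimate, the aggregate cost of the advice dwarfs the harm it prevents, which is Herley's point.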

EDITED TO ADD (12/12): Related article on usable security.

Posted on November 24, 2009 at 12:40 PM

Decertifying "Terrorist" Pilots

This article reads like something written by the company’s PR team.

When it comes to sleuthing these days, knowing your way within a database is as valued a skill as the classic, Sherlock Holmes-styled powers of detection.

Safe Banking Systems Software proved this very point in a demonstration of its algorithm acumen — one that resulted in a disclosure that convicted terrorists actually maintained working licenses with the U.S. Federal Aviation Administration.

The algorithm seems to be little more than matching up names and other basic info:

It used its algorithm-detection software to sift out uncommon names such as Abdelbaset Ali Elmegrahi, aka the Lockerbie bomber. It found that a number of licensed airmen all had the same P.O. box as their listed address — one that happened to be in Tripoli, Libya. These men all had working FAA certificates. And while the FAA database information investigated didn’t contain date-of-birth information, Safe Banking was able to use content on the FAA Website to determine these key details as well, to further gain a positive and clear identification of the men in question.
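The technique the article describes, clustering license records by shared attributes like a common mailing address, can be sketched simply. This is an illustration of the general idea only, not Safe Banking Systems' actual software, and the names and addresses are placeholders:

```python
from collections import defaultdict

# Illustrative records of (airman name, listed mailing address).
# All names and addresses below are placeholders, not real FAA data.
airmen = [
    ("A. Example", "P.O. Box 12345, Tripoli, Libya"),
    ("B. Example", "P.O. Box 12345, Tripoli, Libya"),
    ("C. Example", "12 Main St, Wichita, KS"),
]

def shared_addresses(records, min_count=2):
    """Group licensees by address; flag addresses listed by several people."""
    by_address = defaultdict(list)
    for name, address in records:
        by_address[address].append(name)
    return {addr: names for addr, names in by_address.items()
            if len(names) >= min_count}

flagged = shared_addresses(airmen)  # one shared P.O. box, two names
```

Note that this kind of grouping is exactly where false positives come from: plenty of innocent people legitimately share an address.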

In any case, they found these three people with pilot’s licenses:

Elmegrahi, who had been posted on the FBI Most Wanted list for a decade and was convicted of blowing up Pan Am Flight 103, killing 259 people in 1988 over Lockerbie, Scotland. Elmegrahi was an FAA-certified aircraft dispatcher.

Re Tabib, a California resident who was convicted in 2007 for illegally exporting U.S. military aircraft parts — specifically export maintenance kits for F-14 fighter jets — to Iran. Tabib received three FAA licenses after his conviction, qualifying to be a flight instructor, ground instructor and transport pilot.

Myron Tereshchuk, who pleaded guilty to possession of a biological weapon after the FBI caught him with a brew of ricin, explosive powder and other essentials in Maryland in 2004. Tereshchuk was a licensed mechanic and student pilot.

And the article concludes with:

Suffice to say, after the FAA was made aware of these criminal histories, all three men have since been decertified.

Although I’m all for annoying international arms dealers, does anyone know the procedures for FAA decertification? Did the FAA have the legal right to do this, after being “made aware” of some information by a third party?

Of course, they don’t talk about all the false positives their system also found. How many innocents were also decertified? And they don’t mention the fact that, in the 9/11 attacks, FAA certification wasn’t really an issue. “Excuse me, young man. You can’t hijack and fly this aircraft. It says right here that the FAA decertified you.”

Posted on November 23, 2009 at 2:36 PM

Al Qaeda Secret Code Broken

I would sure like to know more about this:

Top code-breakers at the Government Communications Headquarters in the United Kingdom have succeeded in breaking the secret language that has allowed imprisoned leaders of al-Qaida to keep in touch with other extremists in U.K. jails as well as 10,000 “sleeper agents” across the islands….

[…]

For six months, the code-breakers worked around the clock deciphering the code the three terrorists created.

Between them, the code-breakers speak all the dialects that form the basis for the code. Several of them have high-value skills in computer technology. The team worked closely with the U.S. National Security Agency and its station at Menwith Hill in the north of England. The identity of the code-breakers is so secret that not even their gender can be revealed.

“Like all good codes, the one they broke depended on substituting words, numbers or symbols for plain text. A single symbol could represent an idea or an entire message,” said an intelligence source.

The code the terrorists devised consists of words chosen from no fewer than 20 dialects from Afghanistan, Iran, Pakistan, Yemen and Sudan.

Inserted with the words, either before or after them, is local slang. The completed message is then buried in Islamic religious tracts.
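The general technique the source describes, a codebook that substitutes innocuous words for plaintext meanings, is a classic. A toy illustration (the codebook here is entirely invented and has nothing to do with any real scheme):

```python
# Toy codebook cipher: each covert word stands in for a plaintext idea.
# The codebook below is invented purely for illustration.
codebook = {"rain": "meet", "market": "safehouse", "friday": "tonight"}

def decode(message, book):
    """Recover plaintext words; pass through anything not in the codebook."""
    return " ".join(book.get(word, word) for word in message.lower().split())

decode("rain market friday", codebook)  # -> "meet safehouse tonight"
```

Unlike a cipher, a codebook has no algorithm to break; "breaking" it means reconstructing the book itself, which is why such work reportedly takes months of linguistic rather than mathematical effort.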

EDITED TO ADD: Here’s a link to the story that still works. I didn’t realize this came from WorldNetDaily, so take it with an appropriate amount of salt.

Posted on November 23, 2009 at 7:24 AM
