Second SHB Workshop Liveblogging (2)

The first session was about deception, moderated by David Clark.

Frank Stajano, Cambridge University (suggested reading: Understanding victims: Six principles for systems security), presented research with Paul Wilson, who films actual scams for “The Real Hustle.” His point is that we build security systems based on our “logic,” but users don’t always follow our logic. It’s fraudsters who really understand what people do, so we need to understand what the fraudsters understand. Things like distraction, greed, unknown accomplices, and social compliance are important.

David Livingstone Smith, University of New England (suggested reading: Less than human: self-deception in the imagining of others; Talk on Lying at La Ciudad de Las Ideas; a subsequent discussion; Why War?), is a philosopher by training, and goes back to basics: “What are we talking about?” A theoretical definition—“that which something has to have to fall under a term”—of deception is difficult to formulate. “Cause to have a false belief,” from the Oxford English Dictionary, is inadequate. “To deceive is to intentionally cause someone to have a false belief” also doesn’t work. “Intentionally causing someone to have a false belief that the speaker knows to be false” still isn’t good enough. The fundamental problem is that these are anthropocentric definitions. Deception is not unique to humans; it gives organisms an evolutionary edge. For example, the mirror orchid fools a wasp into landing on it by looking like, and giving off chemicals that mimic, the female wasp. This example shows that we need a broader definition of “purpose.” His formal definition: “For systems A and B, A deceives B iff A possesses some character C with proper function F, and B possesses a mechanism C* with the proper function F* of producing representations, such that the proper function of C is to cause C* to fail to perform F* by causing C* to form false representations, and C does so in virtue of performing F, and B’s falsely representing enables some feature of A to perform its proper function.”
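Smith’s definition has the shape of a biconditional over functional roles. Loosely rendered as a formula (my paraphrase only; the predicate names are invented for illustration, with PF(x, f) read as “f is the proper function of x”):

```latex
% Sketch of Smith's definition; predicate names are mine, not his.
\mathrm{Deceives}(A,B) \iff \exists C \,\exists C^{*} \big[
  \mathrm{Has}(A,C) \wedge \mathrm{PF}(C,F) \wedge {}\\
\quad \mathrm{Has}(B,C^{*}) \wedge \mathrm{PF}(C^{*},F^{*})
  \wedge \text{``} F^{*} \text{ produces representations''} \wedge {}\\
\quad \mathrm{PF}\big(C,\ \text{``cause } C^{*} \text{ to fail at } F^{*}
  \text{ by forming false representations''}\big) \wedge {}\\
\quad \text{``} C \text{ achieves this in virtue of performing } F \text{''}
  \wedge \text{``} B\text{'s false representation enables some feature of } A
  \text{ to perform its proper function''} \big]
```

Note how every clause is stated in terms of proper functions rather than beliefs or intentions, which is what lets the definition apply to orchids and wasps as well as to people.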

I spoke next, about the psychology of Conficker, how the human brain buys security, and why science fiction writers shouldn’t be hired to think about terrorism risks (to be published on Wired.com next week).

Dominic Johnson, University of Edinburgh (suggested reading: Paradigm Shifts in Security Strategy; Perceptions of victory and defeat), talked about his chapter in the book Natural Security: A Darwinian Approach to a Dangerous World. Life has 3.5 billion years of experience in security innovation; let’s look at how biology approaches security. Biomimicry, ecology, paleontology, animal behavior, evolutionary psychology, immunology, epidemiology, selection, and adaptation are all relevant. Redundancy is a very important survival tool for species. Here’s an adaptation example: The 9/11 threat was real and we knew about it, but we didn’t do anything. His thesis: Adaptation to novel security threats tends to occur after major disasters. There are many historical examples of this; Pearl Harbor, for example. Causes include sensory biases, psychological biases, leadership biases, organizational biases, and political biases—all pushing us towards maintaining the status quo. So it’s natural for us to adapt poorly to security threats in the modern world. A questioner from the audience asked whether control theory had any relevance to this model.

Jeff Hancock, Cornell University (suggested reading: On Lying and Being Lied To: A Linguistic Analysis of Deception in Computer-Mediated Communication; Separating Fact From Fiction: An Examination of Deceptive Self-Presentation in Online Dating Profiles), studies interpersonal deception: how the way we lie to each other intersects with communications technologies, how technologies change the way we lie, and whether technology can be used to detect lying. Despite new technology, people lie for traditional reasons. For example: on dating sites, men tend to lie about their height and women tend to lie about their weight. The recordability of the Internet also changes how we lie. The use of the first person singular tends to go down the more people lie. He verified this in many spheres, such as how people describe themselves in chat rooms, and in true versus false statements that the Bush administration made about 9/11 and Iraq. The effect was more pronounced when administration officials were answering questions than when they were reading prepared remarks.
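The first-person-singular cue can be sketched in a few lines of Python. This is only a toy illustration of the idea, not Hancock’s actual method (his work uses full linguistic-analysis lexicons; the word list and example sentences here are mine):

```python
import re

# A minimal first-person-singular pronoun list; real lexicons
# (e.g., those used in linguistic deception research) are larger.
FIRST_PERSON_SINGULAR = {"i", "me", "my", "mine", "myself"}

def fps_rate(text: str) -> float:
    """Fraction of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FIRST_PERSON_SINGULAR)
    return hits / len(words)

# The cue: lower rates of "I"/"me"/"my" tend to accompany deception,
# as speakers psychologically distance themselves from the lie.
direct = "I went to the store and I bought my groceries myself."
distanced = "The store was visited and the groceries were purchased."
print(fps_rate(direct), fps_rate(distanced))
```

A real detector would combine this rate with many other linguistic features and compare it against a baseline for the same speaker, since absolute pronoun rates vary widely between individuals.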

EDITED TO ADD (6/11): Adam Shostack liveblogged this session, too. And Ross’s liveblogging is in his blog post’s comments.

EDITED TO ADD (6/11): Audio of the session is here.

Posted on June 11, 2009 at 9:37 AM

Comments

Pat Cahalan June 11, 2009 3:05 PM

I think I like D. Smith’s logical definition, and I’m musing it over right now. If I decide I like it, we need another term with a near but different definition, as well.

It implies that the deceiver must profit from the action (C does so in virtue of performing F, etc.). While there is certainly a difference between an attacker executing a deceit for some functional advantage, and an attacker executing a deceit without caring about the result, from a counter-strategy standpoint the ability to plan for either of these presupposes that you can understand the functional advantage.

If you can’t do this prior to an attack, your method of preventing the deceit is likely to be insufficient and incomplete. In some cases, it may be better to assume that “deceit” does not require that the attacker gain a functional advantage.

Of course, trying to protect from everything is likely to fail, as well.

It’s a good logical definition for a start, certainly… but if you’re trying to have a taxonomy of attacks and defenses against misrepresentations, you’re going to have to extend that…

GregW June 11, 2009 3:34 PM

I was intrigued by the “better” definition of deception described above, but I had trouble parsing it and seeing what improvement in understanding it contained.

So I tried to understand it in my own way by going concrete, mapping the most classic example of deception I could think of, the story of Eve and Satan (sorry Alice and Bob!), onto the terms A, B, C, C*, F, and F*. The definition and story thus posed go like this:

For systems A (Satan) and B (Eve), A (Satan) deceives B (Eve) iff A (Satan) possesses some character C (Satan’s tongue) with proper function F (Satan speaking), and B (Eve) possesses a mechanism C* (Eve’s brain? body?) with the proper function F* (Eve thinking? obeying God?) of producing representations, such that the proper function of C (Satan’s tongue) is to
cause C* (Eve’s Brain? body?) to fail to perform F* (Eve thinking? obeying God?) by causing C* (Eve’s brain? body?) to form false representations, and C (Satan’s tongue) does so in virtue of performing F (Satan speaking), and B’s (Eve’s) falsely representing enables some feature of A (Satan) to perform its proper function.”

I note that I am not a logician and I don’t know if my methodology is valid, but based on this exercise I reached three tentative conclusions: 1) it’s difficult, practically, to define/determine C/C*/F/F*, and 2) there seems to be some weird logical wiggling going on with this concept of “representations” that I clearly don’t understand or that is not well-defined as far as I can tell. And 3) given all that, I am intrigued but not convinced the definitional ball has been moved forward.

Am I missing something?

(Side-new-thought: Not to be heretical, but it’s kinda funny to think that the very first story/lesson in the Torah/Bible could be viewed not just as a story about “original sin” but alternatively as a story about “the original security failure”! Now that security breach got us all in a lot of trouble! Ah, user training, so rarely sufficient!)

HJohn June 11, 2009 4:34 PM

@GregW: “Side-new-thought: Not to be heretical, but it’s kinda funny to think that the very first story/lesson in the Torah/Bible could be viewed not just as a story about “original sin” but alternatively as a story about “the original security failure”! Now that security breach got us all in a lot of trouble! Ah, user training, so rarely sufficient!)”


Basically, the Fall of Man occurred, not because Satan was able to crack God’s design, but because he deceived someone into ignoring policy through social engineering. (I guess “don’t eat the apple” is the ancient equivalent of “don’t give anyone your password.”)

Moral of the story: Even a “perfect” system is vulnerable when it has human users.

David Livingstone Smith June 12, 2009 12:26 PM

GregW

The Adam and Eve analysis does not work as presented, both because you need to understand the somewhat technical notion of proper function to deploy it properly and because it is intended for analyzing actual, rather than mythological, examples.

Analysis would need to go like this.
For systems A (Satan) and B (Eve), A (Satan) deceives B (Eve) iff A (Satan) possesses some character C (Satan’s ability to speak) with proper function F (tempting), and B (Eve) possesses a mechanism C* (Eve’s belief-forming apparatus) with the proper function F* of producing representations, such that the proper function of C (Satan’s ability to speak) is to
cause C* (Eve’s belief-forming apparatus) to fail to perform F* (accurately represent the world) by causing C* (Eve’s belief-forming apparatus) to form false representations, and C (Satan’s ability to speak) does so in virtue of performing F (tempting), and B’s (Eve’s) falsely representing enables some feature of A (Satan) to perform its proper function.”

In fact, though, this can’t possibly work, because it was Yahweh who was lying about the fruit (he told Adam that the fruit was poisonous)!

By the way, this analysis would entail that God created Satan to tempt Eve (tempting is the proper function–the raison d’etre–of Satan’s ability to speak).

Finally (and also by the way), the third chapter of the Book of Genesis does not claim that the serpent is identical to Satan. The tempter is just described as a snake with legs. God removes his legs as punishment for tempting Eve, which is apparently an explanation of why snakes don’t have legs.

HJohn June 12, 2009 12:58 PM

@David Livingstone Smith

Good additions, and I add a couple more to relate this more to modern dilemmas:
* Satan was actually a disgruntled former employee. He was fired (cast out of Heaven) by the boss (God) for insubordination (trying to take God’s throne, i.e., become CEO). True, the Bible says serpent, but let’s assume the serpent was Satan or someone taking orders from Satan.
* The disgruntled employee (Satan/serpent) wanted to undermine God Company as retribution, and did so by social engineering a new hire (Eve).
* The new hire (Eve) sought collusion from a coworker (Adam) seeking her approval.
* The employees (Adam and Eve) were promised a reward for violating the “don’t eat fruit from that tree” policy (the reward was “your eyes will be open, and you will be like God”).
* The result is termination (dust to dust), just as the disgruntled former employee wanted. Now everyone at the company has a shorter tenure, more difficult working conditions, and is under more scrutiny.

Okay, that was fun. lol. Kudos to Greg and David.

GregW June 12, 2009 9:07 PM

David, thanks for filling in the blanks with my hypothetical example. I’m honored to hear from the new definition’s author himself!

Re-assessing the definition and the concrete example of mine you corrected, I buy that A) the new definition seems legit, and B) it does advance things, removing the notions of “belief” and “intentionality” and removing a bit of an anthropocentric definitional perspective, and replacing them with this notion of entities that “form representations”.

I am still mildly uneasy that I don’t know logically quite how to define this concept, “forming representations,” but as a layperson I surmise there’s a moderate amount of philosophical precedent behind those terms. The “forming representations” concept sort of reminds me of something you probably know–that old AI chestnut, Searle’s Chinese room–the Chinese room can be said to “form representations” as part of its process, but the notions of intentionality are removed when viewed in this functional, mimicry-oriented way. In a similar way, your new definition of deception removes intentionality from the definition and focuses more on functional deception.

As side notes, you are quite right about the Serpent/Satan distinction not being made in Genesis 3, but only made by later interpreters of and references to that story. As for your opening comment, I’m not sure why your definition would fail with mythological examples and not actual ones; stories of actual deception are not that different from mythological ones. But I take your point. Finally, you assert (essentially agreeing with the serpent’s claim) that Yahweh was lying about the fruit by saying it was poisonous. However, God’s assertion that Adam would die if he ate the fruit was not precisely a claim of its poisonous nature (and thus a lie). In fact, Adam did die, because A) God removed his access to the tree of life to ensure he would not live eternally, and/or B) he died a “spiritual death” of separation from God, and/or C) his disobedience/sin did lead to eventual physical death that was a byproduct of him eating the apple, even if not an immediate byproduct. Or so some say; my apologies for drifting into issues less relevant to our core discussion of deception. Anyway, thanks for your kind and thoughtful reply.

Online dating June 16, 2009 10:04 PM

“For example: on dating sites, men tend to lie about their height and women tend to lie about their weight. ”

IIRC, a study some years ago showed that men tended to lie also about their marital and financial status, and women about their age (men too, but not to the same degree.) The author’s conclusion was that each lied about factors that they perceived to be valued by the other party — which, the study showed, was a correct perception. Sorry I can’t point to it.

There was a good deal of self-deception or creation of fantasy relationships, in that lying about such salient characteristics as height and weight is blown at the first meeting. Many, women somewhat more than men, deceive themselves about the extent to which they can get away with age deception. All of these were involved in the common plaint, “Why don’t I ever get a second date?” (Because the first one exposed all your lies, showing that you’re not only older, shorter, fatter, and uglier than your picture and description, but that you’re also a liar.)

The role of self-deception in such cases would be a fertile field of study: How much self-deception precedes an individual’s attempts to provide false information on the dating site?
