How We Become Habituated to Security Warnings on Computers

New research: “How Polymorphic Warnings Reduce Habituation in the Brain – Insights from an fMRI Study.”

Abstract: Research on security warnings consistently points to habituation as a key reason why users ignore security warnings. However, because habituation as a mental state is difficult to observe, previous research has examined habituation indirectly by observing its influence on security behaviors. This study addresses this gap by using functional magnetic resonance imaging (fMRI) to open the “black box” of the brain to observe habituation as it develops in response to security warnings. Our results show a dramatic drop in the visual processing centers of the brain after only the second exposure to a warning, with further decreases with subsequent exposures. To combat the problem of habituation, we designed a polymorphic warning that changes its appearance. We show in two separate experiments using fMRI and mouse cursor tracking that our polymorphic warning is substantially more resistant to habituation than conventional warnings. Together, our neurophysiological findings illustrate the considerable influence of human biology on users’ habituation to security warnings.
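The paper does not publish its implementation, but the core idea of a polymorphic warning is simple: vary the warning’s appearance on each exposure so that repeated viewings stay visually novel. A toy sketch of that idea (the variant pool and rendering are illustrative assumptions, not the study’s actual design):

```python
import random

# Toy illustration of a "polymorphic" warning: each time the warning is
# shown, its appearance is drawn from a pool of visual variants so that
# consecutive exposures never look identical.
VARIANTS = [
    {"border": "solid",  "color": "red",    "icon": "triangle"},
    {"border": "dashed", "color": "orange", "icon": "octagon"},
    {"border": "double", "color": "yellow", "icon": "circle"},
    {"border": "solid",  "color": "purple", "icon": "diamond"},
]

def render_warning(text, last_variant=None):
    """Pick a variant different from the one shown last time."""
    pool = [v for v in VARIANTS if v != last_variant]
    variant = random.choice(pool)
    banner = f"[{variant['icon']} | {variant['color']} | {variant['border']}] {text}"
    return banner, variant

msg, v1 = render_warning("This site's certificate is not trusted.")
msg2, v2 = render_warning("This site's certificate is not trusted.", last_variant=v1)
assert v1 != v2  # consecutive renderings never use the same variant
```

The key design choice the study tests is exactly this: holding the message constant while perturbing its presentation, so the visual-processing habituation the fMRI data shows has less to latch onto.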

Webpage.

EDITED TO ADD (3/21): News article.

Posted on March 18, 2015 at 6:48 AM • 26 Comments

Comments

RonK March 18, 2015 7:30 AM

Interesting research, however, there is a trade-off between defeating habituation and the ability to make genuine OS warnings recognizable as such and also hard to imitate.

Dr. I. Needtob Athe March 18, 2015 8:17 AM

I’d say the mere fact that they claim to “open the ‘black box’ of the brain” is in itself a frightening security warning.

vas pup March 18, 2015 9:07 AM

Bruce, thank you! Good link to the web page as well. There is more research inside:
“Using Measures of Risk Perception to Predict Information Security Behavior: Insights from Electroencephalography (EEG)”. Deserves attention.

Frank March 18, 2015 9:21 AM

We see this with users all the time when it comes to SSL warnings in the browser. They become desensitized to them over time and happily click through them. Another reason why no one should be using self-signed certificates.

Celos March 18, 2015 9:25 AM

Any countermeasures will be completely temporary. People will adapt and while they may take a second longer to realize what that message is, they will still not try to understand its implications.

The only thing that can be done is to warn sparingly. For that to be secure, the underlying system has to be well-designed and designed with security in mind. Warning often will always fail.

chuckb March 18, 2015 10:05 AM

For many habitual web browsers, the same process operates to generate responses to headlines rather than the content of stories…

jeff March 18, 2015 10:09 AM

I’ve written security software products. Even when using my own product I would see warnings where I couldn’t tell whether I should allow things to proceed or not. It’s not always a question of becoming habituated to these warnings. Sometimes, nobody knows the right answer.

David Leppik March 18, 2015 11:22 AM

People become habituated because that’s the only rational response to error messages they don’t understand and that get in their way.

The solution is not to make them more annoying, it’s to only ask people to make decisions that are relevant to them and that they are qualified to make. If they aren’t qualified to make the decision, they should be able to seek the advice of someone who is qualified– which in turn means these should not be “Surprise! Must act now!” pop-ups presented when someone is in a hurry to accomplish something.

As an example of a case where security warnings shouldn’t be shown, consider a nurse who is looking up a drug fact sheet and sees a “Do you want to run this file?” warning. If the nurse guesses wrong, there could either be (a) a security breach, or (b) medical complications and possible malpractice. Guess which scares the nurse more.

JeffH March 18, 2015 11:30 AM

It’s a great piece of research from a biological standpoint, but is it really habituation that causes people to click through warnings, or is it just habituation that takes over once they determine the warning is unhelpful jargon-laden half-truth? Poor UI is as much to blame as the users, if not more.

Stop popping up unhelpful message boxes to click through and we won’t have anything like as much habituation to overcome. People invented the green bar for certificate validation for a reason.

My favourite on Windows: ‘Revocation information for this certificate is not available. Do you want to proceed?’. This statement tells one nothing unless one has an understanding of how CAs & revocation lists work. It is also likely that the user is going to click Yes anyway even after a detailed risk analysis, because we now habitually connect over HTTPS (which is a good thing) and often nothing genuinely sensitive is transferred. Failed at the first hurdle.

The top hits in Google are all advice on how to turn off the publisher check. It’s also a warning message that can be pointing the user to the wrong problem – revocation data might well be available, but another problem like a mismatched clock could be at fault – it doesn’t even hint at that.

The least it could do is frame the question as a risk posed to the user, with a troubleshooter or links on where to go for more details if the risk is considered high. Even better if it were applied only to pages where the browser/server knows there is a risk, or allowed the user to tune where they themselves consider risk to apply.

JeffP March 18, 2015 1:54 PM

A)bort? R)etry? I)gnore?
I’m still not sure how I should have responded to that one. If it polymorphed, I’m sure my head would explode.

@David Leppik. Isn’t this the Aunt Tillie and CUPS configuration discussion? “If Aunt Tillie can’t [understand your security warning], scolding her for being a brainless luser buys you exactly nothing.”

quixote March 18, 2015 3:36 PM

1) Too many of the warnings are ass-covering. We’d click through even if we did understand them. Life is too short to study every blithering notice, so if they start yanking at our attention, the only result will be lots of irritated users.

Solution? Limit warnings to real potential problems for the user and word it so any user can understand what the problem might be.

Many years ago I grew tired enough of Windows to switch to Linux. At the command line, there’s definitely the assumption you know what you’re doing. But sometimes at critical points, there are dialogs such as Delete All? Yes No. (The default being “yes” since that was the command.) Mindlessly hitting return will get you exactly what you asked for. Believe me, only once did I skip over that boring text. It didn’t need any embellishments or de-habituation triggers.

DB March 18, 2015 3:59 PM

This whole approach is flawed.

Instead of trying to reduce habituation of bothersome warnings… we need to make the warnings actually MEAN something, so that users care about them. Windows Vista, for example, gave so many warnings for every little thing, that the whole system of warnings became meaningless. Later versions improved it some, but the problem still persists, due to lack of user education and user caring.

If I care about what the warning says, and I know that I should care and why, then I will self-resist any habituation, and pay attention to the warning. If I don’t care, and the warning is just “in my way” then I will dismiss it out of habit regardless of any method you use to defeat habituation.

So the real solution is twofold:
1) only show warnings that really matter to the user (according to the user, not according to security experts)
2) educate users better what really should matter, so that the things that should matter do matter to them more

NOTE: The warnings themselves CANNOT be education, that just causes habituation!

Mike Amling March 18, 2015 4:00 PM

I can’t tell you how many times I’ve gotten a dire-looking SSL warning from a browser, and by going through a lot of clicking, was able to determine that the problem is “You connected to example.com but the certificate that the site presented is for www.example.com.” What is it that browser writers expect my Mom to do when presented with the warning?
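The mismatch itself is mechanical. A simplified sketch of the hostname check involved (a hypothetical helper, far less strict than the full RFC 6125 rules browsers actually apply):

```python
def hostname_matches(hostname, cert_name):
    """Simplified hostname check: case-insensitive exact match, or a
    single left-most wildcard label ("*") matching exactly one label.
    Real browser validation handles many more cases."""
    host = hostname.lower().split(".")
    name = cert_name.lower().split(".")
    if len(host) != len(name):
        return False
    for h, n in zip(host, name):
        if n != "*" and n != h:
            return False
    return True

# A certificate for www.example.com does not cover the bare domain:
assert hostname_matches("www.example.com", "www.example.com")
assert not hostname_matches("example.com", "www.example.com")
# Even a wildcard *.example.com covers www but not the bare domain:
assert hostname_matches("www.example.com", "*.example.com")
assert not hostname_matches("example.com", "*.example.com")
```

Which is exactly the trap above: the site operator needs the certificate to name (or wildcard) every hostname users might type, or the check fails and the user gets the dire warning.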

Can the web site send a 301 (or is it a 302) HTTP redirect from example.com to www.example.com when the user tries https://example.com if the site doesn’t have an example.com certificate?
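A rough sketch of such a redirect (assuming nginx; paths and names are placeholders). The catch is that the TLS handshake happens before any HTTP redirect can be sent, so for https://example.com this only avoids the warning if the certificate also covers example.com, for instance via a SAN entry:

```nginx
# Redirect the bare domain to www. Caveat: with a www-only certificate
# the browser warns during the TLS handshake, before it ever sees the 301,
# so the certificate here must cover example.com itself.
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.pem;   # must cover example.com
    ssl_certificate_key /etc/ssl/example.key;
    return 301 https://www.example.com$request_uri;
}
```

For plain http://example.com the redirect works unconditionally, since no certificate is involved.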

I’ve been in software design meetings where the question of “What should it do if XYZ happens?” comes up and someone suggests “Ask the user” and I’ve been the only one who considers whether the user will be in a position to answer the proposed question.

Max March 18, 2015 5:32 PM

Security warnings shouldn’t exist in the first place, except as a super advanced option. Either make it an error, with no easy way around it, or shut up.

Andrew March 18, 2015 7:15 PM

I think Raymond Chen nailed it when he described this Windows 95 placeholder dialog text:

    In order to demonstrate our superior intellect, we will now ask you a question you cannot answer.

This is exactly the attitude of security system designers, albeit often inadvertently or through a misguided desire to offer choices that are irrelevant. Do you want secure software – yes or no? Well, yes, please! Just give me sane and secure defaults and have the means to alter or subvert them safely kept away from average users. The cleaner in a power plant should never have to keep dismissing a “Do you want to shut down the nuclear reactor” button when they plug in their floor polisher…

Carlos March 18, 2015 8:26 PM

@Frank • March 18, 2015 9:21 AM

Actually, I think it’s worse than that.

You see, while browsers will nag you and try hard to stop you if you try to log in to your company webmail that uses self-signed SSL certificates, they issue no warnings at all if a phishing site doesn’t actually use SSL.

So yeah, people get used to ignoring the warnings because they’re dumb warnings that in most cases shouldn’t be shown to begin with.

Because we’re pretty good at learning to ignore useless stuff.

Ben R March 18, 2015 8:43 PM

I click through bad certificate warnings 100% of the time, because 100% of the time they pop up on random sites that happen to use self-signed certificates—sites that I would be willing to visit over HTTP. If I got one of these warnings when doing online banking, I would heed it and not click through. Since that never happens, telemetry will show that I always click through the warnings. It’s not because I’m ignoring them; it’s because 100% of the warnings are false alarms. Making them even more difficult to click through would just waste even more of my time.

Browsers need to stop acting as though HTTPS without a validated certificate is less secure than HTTP. If you want web encryption to become ubiquitous then stop assuming that the only sites that would ever use HTTPS are super secure sites that I’m going to give all my personal information to and would never want to visit without a validated certificate. By all means draw a red background in the URL bar to make it clear that this isn’t a validated site (and use a red background for HTTP too, while you’re at it), but let me see the stupid web page. It’s just a stupid post on a message board that somebody linked me to. I just want to read the stupid thing.

Wendy M. Grossman March 19, 2015 2:58 AM

Ben: which raises another point. The security warnings browsers display do not distinguish the severity of the consequences. If you click through a warning to visit example.com instead of www.example.com and it’s a news site or something like that, the consequences of making a mistake likely aren’t that great. But the warning looks to the user exactly the same if it comes from that site or a banking site, where the consequence (as you indicate) could be having your bank account wiped out.

A real-world equivalent might be replacing stop signs, traffic lights, and automated barriers at level crossings all with the same warning sign.

wg

k10 March 19, 2015 10:56 AM

What good is a warning when there’s nothing you can do about it, and no one to report it to who’s motivated to address the problem?

David Leppik March 19, 2015 11:22 AM

@Wendy: I can’t imagine how a computer is going to be able to tell that website A is supposed to be a bank, while website B is supposed to be a news site. Nor is it even safe to assume that a bogus news site isn’t important. Faking the NY Times could be used to incite a riot, or to provide false instructions for reporting an anonymous tip.

Computers can’t afford to assume that any site is trivial to a particular user. The best they could do is either assume all sites are important, or provide clear warnings that actually succeed in eroding a user’s trust in a site.

Anonymous Cow March 19, 2015 11:27 AM

Here’s one reason why it’s annoying: how many warnings one page has. I can understand one warning that the page wants to go in and out of SSL mode, but when one page requires 6 or more such warnings, what is going on with that page? I see a lot more of those than I see certification warnings.

Starship Buzzing By March 19, 2015 7:53 PM

This is how people operate. I agree with posters who argue it is ‘bad ui’. There can be a profound ‘cry wolf’ effect. Of all of the psychological problems attributable to humankind, the ‘cry wolf’ effect is one of the most scary ones. Especially when that ‘crying wolf’ is induced and controlled by the ‘wolf’ to begin with.

I have seen this happen, to frightening effect: the person erects a literal – though psychological – stealth wall in their own mind on behalf of the “attacker”.

The largest incidence of this is with people’s inner “sensors”, which might include internal warnings. Such as “this will backfire on you”, “you will go to hell for this”, and so on. Deadening the conscience until the conscience no longer is present, and with it their underlying “early warning radar system” for danger.

HTTPS systems, kind of “meh” there. There have been extraordinary changes in terms of defense against MITM attacks between the browser and server. These have come about because of the attacks which have targeted Google and Facebook. Firesheep, and the like. Problem is the solution was very conducive to Google, being both a main browser vendor and a major server services provider. But, not so conducive to the industry at large, as specific and relatively obscure browser and server changes have to be made.

Still, in general, signal to noise is a consistent problem even for end users, and if subjected to significant noise, end users will learn to tune out the signal. Just as, for instance, Schneier points out in his latest book end users have learned to tune out online advertisements.

Julius March 20, 2015 7:24 AM

I completely agree that users become desensitised to security warnings because they don’t find them useful or actionable enough.

Consider this: every time my computer is low on battery, I get a warning telling me that the system is about to hibernate. I then take the charger out of my bag and plug it in. There is a large “Dismiss” button on the warning, but I don’t think I ever ignored the warning and let my computer turn off. This is precisely because the warning is informing me of something genuinely useful that I know how to fix. A typical security dialog is much more visible than the “low battery” warning so the problem is not with its visuals but with the contents.

vas pup March 20, 2015 8:46 AM

On Internet, psychology and security:
http://www.bbc.com/news/technology-31976567
“The internet had made people numb to the suffering and humiliation of others, she concluded.”

http://www.bbc.com/news/technology-31983905
“US lobby group Consumer Watchdog agreed with Mr Tusch’s concerns and wrote a letter to Facebook boss Mark Zuckerberg asking him to “suspend the suicide prevention program until it is fully protective of the rights of all individuals and contains safeguards against abuse”.
