Cybersecurity and Privacy
Security Fatigue
Brian Stanton and Mary F. Theofanos, US National Institute of Standards and Technology
Sandra Spickard Prettyman, Independent Consultant
Susanne Furman, US National Institute of Standards and Technology
Security fatigue has been used to describe experiences with online
security. This study identifies the affective manifestations resulting
from decision fatigue and the role it plays in users’ security decisions.
I think I am desensitized to it—I know bad things can happen. You get this warning that some virus is going to attack your computer, and you get a bunch of emails that say don't open any emails, blah, blah, blah. I think I don't pay any attention to those things anymore because it's in the past. People get weary of being bombarded by "watch out for this or watch out for that" (participant 101).

Over and over again today, people are bombarded with messages about the dangers lurking on the Internet, about security breaches in major corporations1 and the US government, and about the need to be constantly attentive while online. To combat these dangers, users are forced to update passwords, run antivirus software programs, and accept unwieldy terms of agreement, often without a clear understanding of why and to what end. One result is that people reach a saturation point and become inured to the issue of cybersecurity.2

Here, we argue that people have indeed reached this saturation point, one that results in what Steven Furnell and Kerry-Lynn Thomson call "security fatigue."3 They propose that "there is a threshold at which it simply gets too hard or burdensome for users to maintain security."3 When this fatigue happens, people become "desensitized" and "get weary," as participant 101 from our study notes at the beginning of the article. We define fatigue as a type of weariness, a reluctance to see or experience any more of something.
IT Pro September/October 2016
Published by the IEEE Computer Society
1520-9202/16/$33.00 © 2016 IEEE

When these feelings are related to security, we use the term security fatigue. Although other factors might be included in security fatigue, including vigilance and loss of control, this article focuses on the role that decision fatigue plays and the affective manifestations resulting from it. This weariness often manifests as resignation or a loss of control in people's responses to online security. People are told they need to be constantly on alert, constantly "doing something," but they are not even sure what that something is or what might happen if they do or do not do it. Security fatigue, and the resignation and loss of control associated with it, certainly present a challenge to efforts aimed at promoting online security and protecting online privacy.

Furnell and Thomson identify security fatigue as a concept related to people's experiences with online security in the workplace. Our work here examines security fatigue among the general public (average users, not security experts or IT professionals), presenting empirical evidence for its existence in their everyday lives and technology usage. We argue that security fatigue is one cost that users experience when bombarded with security messages, advice, and demands for compliance—and that this cost often results in what security experts might consider less secure online behavior. Such work can help advance our understanding of how the public thinks about and approaches cybersecurity, and can provide us with a better understanding of how to help users be more secure in their online environments.

Background Literature
Alessandro Acquisti and Jens Grossklags argue that bounded rationality4 limits our ability to acquire and then apply information in the online privacy and security space.5 Several factors can limit our rationality: the amount of information we can process, our minds' cognitive limitations, the time we have available to make a decision, incomplete information, and systematic psychological deviations from rationality. From this perspective, individuals generally have access to limited information, and even with complete information, are often unable to act in optimal ways given the vast amount of data they need to process. In addition, Acquisti and Grossklags argue that even if individuals had complete information, they might still not make rational decisions, because they are "influenced by motivational limitations and misrepresentations of personal utility."5

Another limit to rationality related to online privacy and security is what Adam Beautement, M. Angela Sasse, and Mike Wonham call the "compliance budget."6 They argue that employees engage in a cost-benefit analysis in which they weigh the advantages and disadvantages of compliance. Once their compliance limit is reached, people choose not to comply or find ways to work around compliance; their willingness to comply stops as they are confronted with additional security policies and requirements. This work clearly addresses security behaviors in organizations, where workers are expected to comply with workplace policies and practices related to online security. Because there are no clear security policies or best practices for the general public to follow, the compliance budget does not seem to apply in the same way it does in the workplace. Anne Adams and Sasse argue that, in fact, many security policies promote an adversarial relationship with users,7 putting them in what Cormac Herley calls an "impossible compliance regime."8

All this research deals with limits to the decision-making process. The ability to make decisions, like the compliance budget, is a finite resource. Decision fatigue occurs when individuals are inundated with choices and asked to make more decisions than they can process.9 The result is that they often make irrational tradeoffs, avoid making decisions, and have impaired self-regulation.10 These indicators of decision fatigue are also present in the cybersecurity space. Amos Tversky and Daniel Kahneman argue that when people are fatigued, they fall back on heuristics and cognitive biases when making decisions.11 In our previous work based on this dataset,12 we discovered that participants often relied on multiple mental models that were partially formed and incomplete to help them make sense of and negotiate their experiences with online privacy and security. These mental models reflect the use of heuristics and cognitive biases previously identified by Tversky and Kahneman. We see security fatigue as yet another piece that influences the decision-making process: a subset of decision fatigue, but in a different domain.

Table 1. Demographic data about study participants (Washington DC metropolitan area).
Data for this article are part of a larger qualitative study related to average users' (not computer experts') perceptions and beliefs about cybersecurity and online privacy. Data collection took place from January to March 2011, and included 40 semistructured interviews (see Table 1 for participant demographics). Participants came from urban, rural, and suburban areas and from a range of employment settings.

The semistructured protocol asked questions about online activities, computer security, security icons and tools, and security terminology. Participants also answered demographic questions and provided a self-assessment of their computer knowledge. We piloted the protocol with a small group of participants to assess the questions' face validity and language appropriateness, adjusting the instrument based on feedback from the pilot. When all interviews were completed, one of the researchers on the team transcribed them, along with field notes from the interviews. The question categories that the protocol utilized to elicit participant knowledge, behaviors, and emotions related to online activity and cybersecurity were as follows:

• list of, frequency of, reasons for, security needs of, security tools for, and feelings about online activities
• definition of, knowledge about, reasons for, levels of, training in, use of, and feelings about computer security
• identification of, knowledge about, beliefs about, use of, and feelings about security icons and tools
• familiarity with, knowledge about, and understanding of security terminology

Data analysis began with the development of an a priori code list (informed by the literature and our knowledge of the field) constructed by the research team. We operationalized all codes, and then each researcher worked with a subset of interviews to determine inter-coder reliability. We met regularly as a team to discuss codes and their application and to revise the code list as needed. Once we reached agreement on the codes and their operationalization, we continued to code interviews independently, with each interview being coded by at least two researchers. We continued to meet to discuss the coding process until we reached saturation—the point at which no new properties or dimensions emerged from the coding process.13 At this point, we shifted from coding to analysis, discussing relationships in the data and among the codes. We wrote memos individually and shared ideas related to our interpretation of the data and codes. This iterative and recursive analytic process provided opportunities for interdisciplinary discussions and alternative explanations—especially important in this multidisciplinary team-based study.

Although the interviews did not specifically address security fatigue, we began to notice many indicators in which fatigue surfaced as participants discussed their perceptions and beliefs about online privacy and security. When we recoded the data for security fatigue, it surfaced in 25 out of 40 interviews, being one of the most consistent codes across the dataset. We then refined our analysis and examined the data for contributing factors, symptoms, and outcomes of fatigue. When compiled together, there were more than eight single-spaced pages of data related to security fatigue. It permeated the data and tinged it with a tremendous negativity, often expressed as resignation or loss of control. We explore these responses next, as well as how security fatigue might in fact be the result of a "bad cost-benefit tradeoff to users."8

Adopting security advice is an ongoing cost that users experience and that contributes to security fatigue. The result is that users are likely to ignore such advice and engage in behaviors that experts believe put them at risk. From the perspective of a cybersecurity expert, this behavior is often seen as irrational. Yet we argue that when examined from the lens of security fatigue, these behaviors (and the beliefs that drive them) make much more sense. Users are tired of being overwhelmed by the need to be constantly on alert,
tired of all the measures they are asked to adopt to keep themselves safe, and tired of trying to understand the ins and outs of online security. All of this leads to security fatigue, which causes a sense of resignation and a loss of control. Our data clearly demonstrate the manifestations of security fatigue as a specific example of decision fatigue,10 including

• avoiding unnecessary decisions,
• choosing the easiest available option,
• making decisions driven by immediate motivations,
• choosing to use a simplified algorithm,
• behaving impulsively, and
• feeling resignation and a loss of control.

Security fatigue, like decision fatigue, occurs when individuals are asked to make more decisions than they can process, depleting their resources and resulting in the behaviors and emotions just listed. In previous work, we discussed how these were linked to the public's use of incomplete and often contradictory multiple mental models.12 In the following sections, we present evidence of security fatigue in the general public, using direct quotes from our participants rather than our synthesis of them, given that these are critical in qualitative research to understanding the data and its implications.

I Get Tired Just Thinking about It
More than half of our participants alluded to fatigue in one way or another during the course of their interview, even though fatigue was not a direct part of the interview protocol. The quote from participant 101 used at the beginning of this article highlights this idea succinctly: "People get weary of being bombarded by 'watch out for this and watch out for that.'" A little further on in the interview, he states, "it also bothers me when I have to go through more additional security measures to access my things, or get locked out of my own account because I forgot [and] I accidentally typed in my password incorrectly." When discussing security, he uses words such as "irritating," "annoying," and "frustrating," which all contribute to an overarching sense of the fatigue he experiences in the online environment. Other participants articulate similar positions, whether speaking about passwords, antivirus software, or security in general:

I get tired of remembering my username and passwords (participant 204).

If you give me too many more blocks, I am going to be turned off. My [XXX] site, first it gives me a login, then it gives me a site key I have to recognize, and then it gives me a password. So that is enough, don't ask me anything else (participant 109).

[Security] seems to be a bit cumbersome, just something else to have and keep up with (participant 117).

There is the firewalls, and Norton, and there is this and antivirus, and run your checkup, and so many things that you can do, I just get overwhelmed (participant 108).

Participants are "tired," "turned off," and "overwhelmed" by it all. This overarching fatigue factors into their cost-benefit analysis, and the result is that many reject security advice or practices that they realize might protect them more, in part because they are driven by immediate motivations—usually related to the completion of their primary task. For example, participant 101 recognizes that some of his behaviors are not what they should be ("I am lazy about my passwords, and I use the same ones; I know they should have random numbers and letters"), but chooses them anyway, because following the security advice "just makes things more difficult." For these participants, security is cumbersome, difficult, and altogether overwhelming, and they choose to follow practices that make things easier and less complicated. In many ways, they are making what seem to be irrational tradeoffs,10 although Herley would argue that, in fact, the tradeoffs make sense economically in terms of users' cost-benefit analyses.8

In addition, when people are fatigued, they are prone to fall back on heuristics and cognitive biases when making decisions.11 Our data show that, based on their experience, users have several cognitive biases that result from security fatigue:

• they personally are not at risk (they have nothing of value that someone else would want);
• someone else is responsible for security, and if targeted, they will be protected (not their responsibility); and
• no security measure that they put in place will really make a difference (large corporations and the government cannot even protect themselves, so how can any one individual?).

When participants expressed these beliefs, it was often with a sense of resignation or a sense that they had no control over the situation.

Why Would I Be Targeted?
Our data show participants often feel they are not personally at risk—they are not important enough for anyone to care about their information. There is an edge of frustration to this data, as if participants are tired of hearing about having to protect themselves when they do not feel at risk. The frustrated tone, minimization of risk, and devaluing of information are evident in the following participant comments:

I don't see what of value I have on there that would make a difference (participant 110).

It doesn't appear to me that it poses such a huge security risk. I don't work for the state department, and I am not sending sensitive information in an email. So, if you want to steal the message about [how] I made blueberry muffins over the weekend, then go ahead and steal that (participant 108).

If someone needs to hack into my emails to read stuff, they have problems. They need more important things to do (participant 119).

I am not working for the government like you are, where everything is top secret and important. If [my data] is stolen or hacked into, no big deal (participant 112).

In addition, participants who have not experienced a problem themselves, or who do not know others who have experienced any security issues, are prone to ignore advice in spite of recognizing that threats exist, as in the following quote:

I know that the risk exists, and my security can be compromised, and my information can be stolen. But I don't hear it happening often to my peers. I haven't heard horror stories of anyone getting my email (participant 101).

Again, participants are likely to avoid making decisions related to security because they do not feel at risk.

Whose Job Is It, Anyway?
Many participants relegate online security to another source, whether that is their bank, the store or site with which they are interacting, or someone with more expertise. This represents a form of decision avoidance, another characteristic of decision fatigue.10 The following participant comments clearly articulate that they are relying on others to take care of them:

There seems to be so many updates. I would like to think that Norton is supposed to be handling that security. I don't want to do all those updates (participant 112).

It is up to the banks to make sure they protect your information (participant 115).

[Security is] something that I rely on somebody else to take care of (participant 206).

For these participants, security is not really something they want to do, they feel comfortable doing, or for which they feel responsible. Instead, it is something to be relegated to others who are more capable or more responsible. Many participants seem resigned to the idea that cybersecurity is something they are not going to understand and that if it were up to them, they would not be secure. Having another entity responsible for their online security makes them feel safer, even though they recognize it as a loss of control. This often results in avoiding decisions related to security or in choosing the easiest available option.

Will It Really Matter?
Participants often articulate a position that no matter what they do, it will never be enough, so why do it? In part, this stems from the fact that they do not see the direct benefit from the behavior. In addition, many participants recognize that major institutions (banks, stores, and the government) have experienced security breaches. If these large, wealthy organizations cannot protect themselves online, how can any one individual be expected to do so? Again, participant 101 provides insight here:

If I took all the security measures possible, and I made my password d3121, unlike scissors90, is it going to make all that difference? I don't have to be vigilant all the time. If it is going to happen, it is going to happen (participant 101).

There is a sense of fatalism here, as if something will eventually happen no matter what a person does. The following three participants clearly expressed this same sense:

If someone wants to get into my account, they are going to get into my account (participant 116).

It is not the end of the world. If something happens it is going to happen (participant 119).

I haven't kept up with the latest and greatest software applications for antivirus or antispyware. I know there are risks, and there are ways to prevent some of the risks. I don't know that I feel completely comfortable that all of the risks are completely avoidable or that everything can be blocked. You hear that things—I don't even know how real or threatening threats are. Don't open this, don't open that, there is a new worm or Trojan horse. There is a lot of information and there may be a lot of misinformation. And I tried—it is all behind me, and I cannot ever secure my computer. There is a lot to keep up with (participant 117).

In the physical world, we lock our doors to protect ourselves, and we know it works because we know when someone has broken into our space. However, with cybersecurity, we might not know when the attempts have been made and thwarted, perhaps leading to the idea of misinformation as expressed by participant 117 in the previous quote.

Users feel inundated by the tasks, choices, and work that security decisions require of them and are unsure that compliance actually makes them any more secure. Whatever they do, it is never quite enough. Herley agrees, noting that "users ignore new advice for several reasons. First, they are overwhelmed. Given the sheer volume of advice offered, no user has any real prospect of keeping up."8 Things are constantly changing, both in terms of the security advice they receive and the tactics used by those who want to violate their security. As one participant put it, "I think I am probably two weeks behind those who are out there to try and break into computers, forever two weeks behind" (participant 107).

While users' cybersecurity behavior is often portrayed as irrational, in fact it might be quite rational and reflect an astute cost-benefit analysis that results in users choosing to ignore "complex security advice that promises little and delivers less."8 We argue that users experience a sense of security fatigue that also contributes to this cost-benefit analysis and reinforces their ideas about the lack of benefit for following security advice. From this perspective, we in the IT community need to rethink the way we currently conceptualize the public's relationship to cybersecurity. Current mental models that position cybersecurity as something that is not worth the effort will be challenging if not impossible to change. Yet, as IT professionals, it is our responsibility to take up this challenge and work to alleviate the security fatigue users experience.

Our data provide evidence for three specific ways to minimize security fatigue and help users adopt more secure online practices: limit the decisions users have to make related to security; make it easy for users to do the right thing related to security; and provide consistency (whenever possible) in the decisions users need to make. For example, consider a work environment that offers different ways for users to log into the system. The first is the traditional user name and password. The second is a personal identity verification (PIV) card. The card is easier for the user and more secure than the alternative. Thus, the PIV option should show up as the default login option—it should not be difficult to access. This is but one example of a way to alleviate decisions that cause security fatigue and make it easier for the user to do the right thing.

As we design security solutions, we must be conscious of those areas that cause users to experience fatigue, so they do not become resigned and complacent or feel a loss of control related to their online security. We must also continue to investigate users' beliefs, knowledge, and use of cybersecurity advice and the factors, such as security fatigue, that inform them, so we can ultimately provide more benefit and less cost for adopting cybersecurity advice that will keep users safer online.
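As a minimal sketch of the default-option recommendation, the following Python snippet (all names and rankings are illustrative, not drawn from the study) orders a set of login methods so that the one that is both more secure and easier for the user, like the PIV card, surfaces as the default:

```python
# Hypothetical sketch: pick a default login method so the most secure
# option is also the one users reach first. Names and ranks are
# illustrative assumptions, not part of the study.
from dataclasses import dataclass

@dataclass
class LoginMethod:
    name: str
    security_rank: int  # higher = more secure
    effort_rank: int    # lower = less effort for the user

METHODS = [
    LoginMethod("username_password", security_rank=1, effort_rank=2),
    LoginMethod("piv_card", security_rank=2, effort_rank=1),
]

def default_login(methods):
    # Prefer the method that maximizes security and, among equally
    # secure methods, minimizes user effort, so "doing the right
    # thing" is the path of least resistance.
    return max(methods, key=lambda m: (m.security_rank, -m.effort_rank))

print(default_login(METHODS).name)  # prints piv_card
```

The design point is the ordering, not the code: whichever mechanism the ranking selects is the one the login screen should present by default, sparing the user one more security decision.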

References
1. G. McGraw, "Security Fatigue? Shift Your Paradigm," Computer, vol. 47, no. 3, 2014, pp. 81–83.
2. A. Viseau, A. Clement, and J. Aspinall, "Situating Privacy Online: Complex Perceptions and Everyday Practices," Information, Communication, and Society, vol. 7, 2004, pp. 92–114.
3. S. Furnell and K.L. Thomson, "Recognising and Addressing 'Security Fatigue,'" Computer Fraud and Security, Nov. 2009, pp. 7–11.
4. H.A. Simon, "Theories of Bounded Rationality," Decision and Organization, C.B. McGuire and Roy Radner, eds., North-Holland Publishing, 1972, pp. 161–176.
5. A. Acquisti and J. Grossklags, "Privacy and Rationality in Individual Decision Making," IEEE Security & Privacy, vol. 3, no. 1, 2005, pp. 26–33.
6. A. Beautement, M.A. Sasse, and M. Wonham, "The Compliance Budget: Managing Security Behaviour in Organisations," Proc. 2008 Workshop on New Security Paradigms, 2008, pp. 47–58.
7. A. Adams and M.A. Sasse, "Users Are Not the Enemy," Comm. ACM, vol. 42, no. 12, 1999, pp. 40–46.
8. C. Herley, "So Long, and No Thanks for the Externalities: The Rational Rejection of Security Advice by Users," Proc. 2009 Workshop on New Security Paradigms, 2009, pp. 133–144.
9. K.D. Vohs et al., "Making Choices Impairs Subsequent Self-Control: A Limited-Resource Account of Decision Making, Self-Regulation, and Active Initiative," J. Personality and Social Psychology, vol. 94, no. 5, 2008, pp. 883–898.
10. B. Oto and D. Limmer, "When Thinking Is Hard: Managing Decision Fatigue," EMS World, vol. 41, no. 5, 2012, pp. 46–50.
11. A. Tversky and D. Kahneman, "Availability: A Heuristic for Judging Frequency and Probability," Cognitive Psychology, vol. 5, no. 2, 1973, pp. 207–232.
12. S.S. Prettyman et al., "Privacy and Security in the Brave New World: The Use of Multiple Mental Models," Human Aspects of Information Security, Privacy, and Trust, Springer, 2015, pp. 260–270.
13. K. Charmaz, Constructing Grounded Theory: A Practical Guide through Qualitative Research, Sage Publications.

Brian Stanton is a cognitive scientist in the Visualization and Usability Group at the US National Institute of Standards and Technology. He works on the Common Industry Format project developing usability standards and investigates usability and security issues ranging from password rules and analysis to privacy concerns. Stanton has also worked on biometric projects for the US Department of Homeland Security and the Federal Bureau of Investigation's Hostage Rescue Team, and with latent fingerprint examiners. He received an MS in cognitive psychology from Rensselaer Polytechnic Institute.

Mary F. Theofanos is a computer scientist with the US National Institute of Standards and Technology's Materials Measurement Laboratory. She performs research on usability and human factors of systems. Theofanos is the principal architect of the Usability and Security Program evaluating the human factors and usability of cybersecurity and biometric systems, and the convener of the ISO SC7/Working Group 28 on usability standards. She received an MS in computer science from the University of Virginia.

Sandra Spickard Prettyman is an independent research consultant. She specializes in qualitative research methods, providing expertise in designing and implementing rigorous qualitative research projects. Prettyman was a professor at the University of Akron, where she taught doctoral courses in research methods and courses in social and philosophical foundations of education for graduate and undergraduate students.

Susanne Furman is a cognitive scientist in the US National Institute of Standards and Technology's Visualization and Usability Group. She works on and investigates usability for both cybersecurity and biometric devices for agencies such as the US Department of Homeland Security and the Federal Bureau of Investigation. Furman has worked at the US Department of Health and Human Services, and ran its usability program. She has a PhD in applied experimental psychology human factors from George Mason University.