Why Usable Security Matters
and
How Testing Can Help Achieve It
SECTEST Keynote
Paul Ammann
March 31, 2014
1
Outline
• A Poll
• What’s wrong with usable security thinking
• The consequences of unusable security
• Lessons from airplane safety
• The path forward
• Security testing
– What we need to test
– How we need to test
2
What’s Wrong With ‘Usable Security’ Thinking?
Security implementers sometimes invent the user instead of discovering the user
4
Proper Focus: Fit with Users & Activity
• If you want productive & secure users
– and security is usually the secondary task
• Then you need to understand
– Primary user activities
– User motivations
– User behavior
– Impact on bottom line
5
The Consequences of Unusable Security
• Unusable Security Costs Money
• Unusable Security Costs Security
6
Unusable Security Costs Money
7
Standard Security Thinking:
“Users Should Make the Effort”
• Question: how much? It all adds up:
1. Time spent on security tasks: authentication, access control, warnings, security education, …
2. Failure: time spent on errors and error recovery (user and visible organizational costs)
3. Disruption of primary tasks = restart cost
8
Does This Really Help Security?
9
Time is Money
“An hour from each of the US’s 180 million online users is worth approximately US$2.5 billion. A major error in security thinking has been to treat users’ time—an extremely valuable resource—as free.”
C. Herley, IEEE S&P, Jan/Feb 2014
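As a rough sanity check on that figure (the per-hour value is not on the slide; an average value of roughly US$14 per user-hour is assumed here): 180,000,000 users × US$14/hour ≈ US$2.5 billion per hour of user time.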
10
Impact on Productivity – Lost Sales
• Not a particularly effective security measure
• Not usable: failure rate around 40%, so customers go elsewhere
• “CAPTCHAs waste 17 years of human effort every day” (Pogue, Scientific American, March 2012)
11
Authentication ‘Wall of Disruption’
12
Authentication Hate List
1. Why can’t I reuse my old password?
2. Repeated authentication to the same system (e.g. because of 15-minute time-outs)
3. Authenticating to infrequently used systems
– Difficulty recalling the previous password
– The password could have expired in the meantime
– Resetting a password is not easy
4. Creating a valid password (different rules for each system)
13
Authentication Hate List
5. Managing a large number of different credentials
– Different policies mean that strategies for creating & recalling passwords don’t work
– Which credentials to use for which system?
6. Use of RSA tokens
– “It's this extra, again, effortful stuff. I have to dig around in my bag and get the RSA ID token out and then set it on my laptop and then type out the number, make sure that you're not typing it right before changes or as it's changing or whatever.”
14
Impact on Productivity – Long-Term
1. Users opt out of services, return devices
– Improves their productivity, but often reduces organizational productivity (example: email)
– Organization has less control over the alternatives
2. Stifling innovation: new opportunities that would require changes in security
3. Staff leaving the organization to be more productive/creative elsewhere
15
Unusable Security is Ridiculous …
16
One Alternative
17
Technology Should be Smarter than This
• Move from explicit to implicit authentication (a rough sketch follows this list):
1. Proximity sensors to detect user presence
2. Behavioral biometrics: zero-effort, one-step, two-factor authentication
3. Exploit the modality of interaction: use video-based authentication in video, audio in audio, etc.
4. Web fingerprinting can identify users – why not use it for good?
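One way to read the list above, sketched very loosely below: several weak, zero-effort signals are fused into a single decision so the user is never interrupted. The function, signal names, weights, and threshold are all hypothetical illustrations, not from the talk or any real product.

def implicit_auth(proximity_detected: bool,
                  keystroke_score: float,       # similarity to the user's typing profile, 0..1
                  device_fingerprint_match: bool) -> bool:
    # Fuse several implicit signals into one score; no explicit user action needed.
    score = 0.0
    if proximity_detected:              # e.g. a paired phone or token is nearby
        score += 0.4
    score += 0.4 * keystroke_score      # behavioral biometric, zero effort for the user
    if device_fingerprint_match:        # browser/device fingerprint seen before
        score += 0.2
    return score >= 0.7                 # threshold would be tuned from field data

# Example: user is present, types like themselves, on a known device
print(implicit_auth(True, 0.9, True))   # True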
18
‘Green shoots’ 1 – FIDO
a commercial alliance to replace passwords …
www.fido.org
19
‘Green shoots’ 2:
Security that supports user goals: Parental controls
Apparently parents didn’t much care
But business users loved it!
PayPhrase discontinued in February 2012
“Purchase Delegation” introduced for business users
20
The Consequences of Unusable Security
• Unusable Security Costs Money
• Unusable Security Costs Security
21
Unusable Security Costs Security!
1. User errors – even when trying to be secure
2. Non-compliance/workarounds to get tasks done
3. Security policies that cannot be followed make effort seem futile:
“It creates a sense of paranoia and fear, which makes some people throw up their hands and say, “there’s nothing to be done about security,” and then totally ignore it.”
Expert Round Table, IEEE S&P, Jan/Feb 2014
22
User Errors When Trying to be Secure
• Document redaction prone to error
• Is the document really free of confidential data?
• If not:
– Blame the user?
– Or look deeper?
23
Noncompliance
Are these legitimate users?
24
You Can Only Ask For So Much
25
Reasons For Non-Compliance
• Compliance requires ability and willingness
Can’t comply:
Security tasks that are impossible to complete – remove or redesign them (security hygiene)
Could comply, but won’t comply:
Security tasks that can be completed in theory, but require a high level of effort and/or reduce productivity – identify & reduce the friction through better design or better policies
Can comply, and do comply:
Security tasks that staff routinely comply with – these provide examples of what is workable in a particular environment, i.e. a template for security
26
Revocation
• Usability and revocation
• Who identifies unneeded privileges?
– Manager? Employee?
– Answer says a lot about the organization
• Demo environment vs. actual practice
– “How does that work with 1000 privileges?”
27
Old Security, No Longer Usable
• Entering a complex password on a touchscreen keyboard is time-consuming and error-prone
• Users look for passwords that are easy to enter → severely reduced password space
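To make the reduced password space concrete, a rough illustration in Python (the character-set sizes and the 8-character length are assumptions for the example, not figures from the talk):

import math

full_set = 94     # printable ASCII characters reachable on a full keyboard
easy_set = 26     # lowercase letters only: no layer-switching on a touchscreen
length = 8        # assumed password length

print(full_set ** length)              # ~6.1e15 possible passwords
print(easy_set ** length)              # ~2.1e11 – roughly 29,000 times fewer
print(length * math.log2(full_set))    # ~52.4 bits of entropy
print(length * math.log2(easy_set))    # ~37.6 bits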
28
New Security, Unusable Implementation
• Replacing the existing 2FA card with a more secure one – good
• Replacing the 6-digit numeric code with an 8-character alphanumeric password (valid for 1 minute) – not good
29
Impact on Security – Long-Term
1. Increased likelihood of security breaches
2. ‘Noise’ created by habitual non-compliance makes malicious behavior harder to detect
3. Lack of appreciation of and respect for security creates a bad security culture
4. Frustration can lead to disgruntlement: intentional malicious behavior – insider attacks, sabotage
30
Lessons From Airplane Safety
• April 26, 1994, Nagoya, Japan
• Sequence of events on landing:
– F/O inadvertently entered GO AROUND mode
– Subsequent crew actions led to a stall
– The crash killed 264 of 271 on board
• Official cause:
– Crew error
• But usability played a key role:
– The crew’s mental model diverged from the actual airplane state
– The crew’s actions would have been reasonable in a different airplane state
• FAA and aircraft manufacturers have learned from this!
31
Analyzing Aircraft Accidents
• Typical report, as paraphrased by Norman:
Air Force: It was pilot error—the pilot failed to take corrective action.
Inspector General: That’s because the pilot was probably unconscious.
Air Force: So you agree, the pilot failed to correct the problem.
• There is a similar attitude in security
– Fact: Users don’t do what they are supposed to
– Question: Is it their fault?
• Can we learn from aircraft designers?
– They have a multi-decade head start on us!
32
The Path Forward?
33
First Things First: There Has to be a Reason for System Developers to Care
• Some organizations don’t care about usability or usable security
– Not much to do there
– Dangerous invitation to competitors!
• Some do care
Q: How to make it happen?
A: High-level commitment
A: Feedback loops
A: Appropriate personnel
34
Models Have to Include the User
• Modern aircraft design has a critical role for human factors
– The same needs to happen in security
• Security is a secondary task
– Users are trying to do something else
• Modeling human behavior is critical
– The user is part of the system
– We need to understand how things go wrong
35
Notions from Fault Tolerance
• Error management can be viewed in terms of
– Error avoidance
– Error detection
– Error recovery
• Software engineering is well-acquainted with this approach already!
– But we haven’t really applied it to security
36
Classification of Errors (Norman)
• Slips
– goal is correct
– but execution is flawed
• Mistakes
– goal is wrong
Violations are not errors
This looks a lot like mutation analysis!
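A very loose illustration of the mutation-analysis analogy (every name and action below is hypothetical, not from the talk): mutation analysis applies small syntactic changes to a program and asks whether the tests notice; here, small “slip-like” changes are applied to a user’s intended action sequence, and we ask which of them the system ought to detect or tolerate.

intended = ["open_document", "redact_section", "save_as_pdf", "email_file"]

def slip_mutants(actions):
    # Generate variants with one step omitted or two adjacent steps swapped,
    # i.e. the kinds of slips a well-intentioned user might make.
    for i in range(len(actions)):
        yield actions[:i] + actions[i+1:]
    for i in range(len(actions) - 1):
        swapped = list(actions)
        swapped[i], swapped[i+1] = swapped[i+1], swapped[i]
        yield swapped

def dangerous(actions):
    # A slip is "dangerous" here if the file is emailed without being redacted first.
    return "email_file" in actions and \
           "redact_section" not in actions[:actions.index("email_file")]

for mutant in slip_mutants(intended):
    if dangerous(mutant):
        print("slip the system should catch:", mutant)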
37
What We Need To Test
• Security mechanisms in practice
– What actually happens?
– What do users actually do?
• Models that incorporate user behavior
– Can we assess how a given system behaves under various profiles of human behavior? (A rough simulation sketch follows this list.)
• Let’s look at two examples:
– Warnings
– Spear Phishing
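The simulation sketch promised above, in Python: a minimal Monte-Carlo example of checking a warning mechanism against different behavior profiles. The profiles, probabilities, and trial counts are all invented for illustration; real numbers would have to come from user studies or field data.

import random

random.seed(0)

# Hypothetical behavior profiles: probability that a user heeds a warning.
profiles = {"ignores_most": 0.05, "average": 0.30, "cautious": 0.70}

P_MALICIOUS = 0.01   # assumed fraction of downloads that are actually dangerous
TRIALS = 100_000

def compromise_rate(p_heed):
    # Fraction of downloads that end in compromise for a given heed probability.
    compromised = 0
    for _ in range(TRIALS):
        malicious = random.random() < P_MALICIOUS
        heeded = random.random() < p_heed
        if malicious and not heeded:
            compromised += 1
    return compromised / TRIALS

for name, p in profiles.items():
    print(f"{name:13s} -> compromise rate ~ {compromise_rate(p):.4f}")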
38
The Problem With Warnings
• Fact: PDF files are dangerous.
– That’s a usability problem!
– Is a generic warning helpful? Why not?
– Is a detailed warning better?
39
The Problem With Warnings (2)
• How are users supposed to react?
– Standard advice: Only download “trusted” PDFs
– “Please phish me!”
• The user has no way to accurately assess risk
– Hence, the only rational action is to give up
– Which is exactly what users do
• Study results (Krol et al.)
– Very high “ignore” rate
– Risk actually higher for “expert” users
• “Expert” users didn’t understand PDF risks
40
Gone Phishing
• Goal: Train motivated users to avoid spear-phishing
– Send out a bespoke, but fake, phish
– Present “clickers” with training
• Simple notification: “You’ve been spear-phished”
vs.
• Deep training: “Here’s how to recognize an attack”
• Study results (Caputo et al.)
– Deep training doesn’t work
• “Clickers” panic
• And immediately close all windows!
• No one reads the deep training!
• Need to study actual users!!!
41
Skill Set for Phishing
• Assessing the targeting of the phish
– Not too obvious, not too “real”
• Looking for more than a simple hypothesis test
– Case study design is critical
• Analysis of unstructured data
– Why did “clickers” fail to benefit?
• These are not part of the typical CS skill set!
42
How We Need To Test
• Testing human behavior
– Requires experts in human behavior!
– “Security Testing” needs to be interdisciplinary
• We need to understand different approaches to analyzing systems
– Case studies that actually are case studies
– Rigorous understanding of what works – and what doesn’t
43
One Possible Model
• Lots of software is “user tested” by fielding
– Google does this all the time
– Collects data on usage
– Makes decisions about products based on facts
• Security is a bit different
– Harder to monitor difficulty with mechanisms
– Impossible to monitor non-compliance
– But there is still a lot of data to analyze
44
Questions?
• Contact:
– pammann@gmu.edu
• Acknowledgements:
– Angela Sasse has taught me a lot about usable security and shared slides generously!
• Further reading:
– Adams and Sasse: Users are not the enemy (CACM, 1999)
– Krol et al.: Rethinking security warnings (7th CRiSIS, 2012)
– Caputo et al.: Going spear phishing (IEEE S&P magazine, Jan/Feb 2014)
– Herley: More is not the answer (IEEE S&P magazine, Jan/Feb 2014)
– Norman: The Design of Everyday Things (latest edition, 2013)
45