Secure Interaction Design
Cynthia Kuo
1
Overview
• Describe project on Wi-Fi access point
configuration
• Show mockups and design process for
Google Safe Browsing
• Talk about how you can design for
security
2
Wi-Fi
• Also known as 802.11 a/b/g
• October 2006: 4 million units shipped
each week
4
Going Back a Few Years…
• Returns
– ~30% return rate
• Technical Support
– 12 – 20 minutes / call
– ~10% of sales → technical support call
Rough Estimate
• $50 / hr technical
support
• 15 minute call =
$12.50
• $10 materials → $2 profit / unit (assume 20%)
• 1 call = profit from that unit + 5 other units! (see the sketch below)
5
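A quick back-of-the-envelope check of the arithmetic on this slide. This is a minimal sketch; the $50/hour support rate, 15-minute call length, $10 materials cost, and 20% margin are the slide's rough assumptions, not measured figures.

```python
# Back-of-the-envelope support-cost estimate from the slide.
# Assumptions (taken from the slide, not measured data):
#   - technical support costs ~$50 per hour
#   - a typical call lasts ~15 minutes
#   - materials cost ~$10 per unit, with ~20% profit on that cost

SUPPORT_RATE_PER_HOUR = 50.00   # $/hour
CALL_MINUTES = 15               # minutes per call
MATERIALS_COST = 10.00          # $ per unit
PROFIT_FRACTION = 0.20          # 20% of materials cost

cost_per_call = SUPPORT_RATE_PER_HOUR * (CALL_MINUTES / 60)   # $12.50
profit_per_unit = MATERIALS_COST * PROFIT_FRACTION            # $2.00
units_wiped_out = cost_per_call / profit_per_unit             # 6.25

print(f"One call costs ${cost_per_call:.2f}")
print(f"Profit per unit is ${profit_per_unit:.2f}")
print(f"One call erases the profit from ~{units_wiped_out:.1f} units "
      f"(the unit itself plus about {units_wiped_out - 1:.0f} others)")
```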
Research Question
• Why don’t users configure their wireless
networks securely?
– Cannot?
– Choose not to?
– Don’t know to?
6
Traditional Solution
• “Layer” different study techniques
– Interviews: assess values, thought
processes, and level of security knowledge
– Surveys
– Contextual inquiry: observe users
– Usability study: evaluate features
7
Why Not?
• Evaluation of configuration process
must be holistic
– One user study method will not provide
insight into entire process to pinpoint
problems
– Security is not a primary task
• Takes a long time!
– Number of qualified users may be small
8
Designing a User Study
• How do we evaluate a system where the
end goal may be different for every user?
• How can we ask about security concepts
(e.g., encryption) if we don’t know whether
users know what they are?
• People know that they’re supposed to care
about security. How do we design a study
without social acceptability bias?
9
Assumptions
• Textbook study methods make
assumptions that may not hold for
security software
10
Common Assumptions
1. Clear-cut criteria for success
– Good security is risk management
2. Multiple ways to reach end result
– No “undo” for some security breaches
3. Familiarity with underlying concepts
– Task list may unintentionally provide information
4. Tasks are primary goals
– No one wants to “do” security
5. Users respond without bias
– Social acceptability biases
Kuo, Perrig, and Walker, ACM <interactions>, May + June 2006
11
Configuration Process
12
Evaluation Methodology
Evaluating the Whole Process
• What do people know about wireless security?
• What security issues do people care about?
• If users are aware of the security issues and care about them, are users able to configure the access points?
Target Home User
• Uses laptop as primary computer
• Has broadband connectivity at home
• Uses wireless on a daily basis (5+ times/week)
Study Design
• Interview (25 min)
• Questionnaire (5 min)
• Tasks (45 min)
• Questionnaire (5 min)
• Debriefing (10 min)
13
Interview: Broadcasting?
14
Questionnaire
• Opinions & concerns
– Availability
– Reliability
– Connection speed
– Ease of use
– Open networks
– Security
– Privacy
– Health
15
Experimental Setup
• Gradual revelation
• User task
– Set up access point for home
– Explain motivation & understanding of possible consequences
Scenario: “Okay, let’s pretend you just received this 802.11 access point as a gift. You would like to set up and use a wireless network at home today. Just set up the access point as you would if you were at home.”
16
Findings
• Users are reasonably knowledgeable
about wireless technologies
• …but have difficulty translating that
knowledge into security policies and
feature configurations
• Novice users perform significantly
worse than expert users
– Expanding market → more novice users
17
What Does that Mean for Products?
18
Goal-Based Design
• Can “level the playing field” between
novice and expert users
– Start from human goals, not technical
features (see the sketch after this list)
– Do not assume people are familiar with
technical terms or particular technologies
– Anticipate common error states
– Minimize time & human effort required
19
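For illustration only: a minimal, hypothetical sketch of the “start from goals, not features” idea, where the user states a goal in plain language and never sees terms like WPA2 or SSID. The goal strings, setting names, and values below are invented for this sketch and are not taken from the study or any shipping product.

```python
# Hypothetical goal-based configuration: map a plain-language goal to
# technical settings behind the scenes. All names and values here are
# made up for the sketch.

import secrets

GOAL_PRESETS = {
    # goal shown to the user -> settings the tool applies
    "Share my internet with my family, keep neighbors out": {
        "security_mode": "WPA2-PSK",
        "passphrase": None,        # generated below; user never picks a cipher
        "ssid_broadcast": True,    # hiding the SSID adds effort, little security
    },
    "Run an open hotspot for guests": {
        "security_mode": "Open",
        "passphrase": None,
        "ssid_broadcast": True,
    },
}

def configure_from_goal(goal: str) -> dict:
    """Translate a human goal into concrete settings, anticipating the
    common error of a weak or reused passphrase by generating one."""
    settings = dict(GOAL_PRESETS[goal])
    if settings["security_mode"] == "WPA2-PSK":
        settings["passphrase"] = secrets.token_urlsafe(12)  # strong by default
    return settings

print(configure_from_goal("Share my internet with my family, keep neighbors out"))
```

The point is the direction of the mapping: the tool owns the technical vocabulary, and strong defaults absorb the common error states instead of asking the user to avoid them.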
Prototype Design
20
Results
21
Lessons
• More than one user study method may
be needed to evaluate your problem
• Watch out for assumptions in your user
study methods
• Adapt existing methods for your needs
22
Overview
• Describe project on Wi-Fi access point
configuration
• Show mockups and design process for
Google Safe Browsing
• Talk about how you can design for
security
23
Google Safe Browsing
• Anti-phishing alert
• Part of Google Toolbar for Firefox
• http://www.google.com/tools/firefox/safebrowsing/index.html
24
Maps Bubble
• Warning bubble and
icon designed to appear
trustworthy
• Gray background to
emphasize danger
and to catch attention
• Bubble attached to
browser chrome to
convey message
origin
• Active elements on
page disabled
25
Lessons
• Establish trustworthiness of message
– Origin
– Authority
• Match intrusiveness to severity
– No false positives
• Recommend what actions to take
• Provide a feeling of closure
26
Overview
• Describe project on Wi-Fi access point
configuration
• Show mockups and design process for
Google Safe Browsing
• Talk about how you can design for
security
27
Design for Security
• Think like your user
– Use personas
• Stop thinking like yourself
– Design for your personas
• User test, user test, user test
– Watch your users
– Don’t always believe what they say
28
Think Like Your User
• Personas
– A precise description of your user and what
s/he wants to accomplish
– Make up archetypical users
• More specific is better!
– Design for these users
• You may have primary and secondary
personas
• Typically 3 - 12 personas
Cooper (1999)
29
Example Persona
Dan
Dan is a 46-year old sales executive for a
sports magazine. He has never heard of
encryption, Diffie-Hellman, or EKE. Dan
sent 38 emails from his Blackberry 8700c
yesterday. He travels 50% of the time to
meet with clients all over the East Coast.
Using his IBM T41 laptop, he checks his
email from different hotels (he prefers
Wyndham) every night. Dan often needs to
download sensitive documents that contain
his company’s business strategies. After 10
hours of meetings during the day, Dan does
not want to spend any time configuring
anything. Dan likes to play basketball in his
spare time.
30
Stop Thinking Like Yourself
• You are probably not the typical user
• Your user does not think like you
• Your user probably does not know as
much as you do (about security in
general and especially your product)
• Your user is not dumb, but will almost
always make mistakes
31
Common Mistake #1:
Thinking Like an Engineer
“The user might want to disable L2TP Passthrough.”
No! Dan doesn’t know what L2TP is, and he doesn’t ever want to.
32
Common Mistake #2:
Focusing on Tasks & Features, Not Goals
• Users’ Goals
– Not feel stupid
– Not make mistakes
– Get work done
– Have fun (or at least not be too bored)
• False Goals
– Save memory
– Run in a browser
– Safeguard data integrity
– Increase program execution efficiency
– Use cool technology or features
Cooper, Alan. The Inmates are Running the Asylum. Sams, 1999.
33
Software Evaluation
• Inexpensive, “discount” methods
– Low-fidelity
– Cognitive walkthrough
– Heuristic evaluation
• Expensive
– Formal models (e.g., GOMS)
– Formal experiment
34
Discount Methods: Predictive?
                          # Problems that    # Problems that Could
                          Did Occur          Potentially Occur
Lab                       25                 29
Heuristic Evaluation
  Experts                 11 (44%)            9 (31%)
  System Designers         4 (16%)            7 (24%)
  Non-experts              2 (8%)             1 (3%)
Cognitive Walkthrough
  Experts                  7 (28%)            9 (31%)
  System Designers         4 (16%)            6 (21%)
  Non-experts              2 (8%)             2 (7%)
Desurvire, Kondziela, Atwood (1992)
35
Common Mistake #3:
Listening to One Person
• “A customer said we should…”
• 80% rule
• Feature creep
36
Lessons
• Think like your user
• Stop thinking like yourself
• User test, user test, user test
– Be careful about what information you use
37
Thank you!
Questions? Comments?
cykuo@cmu.edu