Design for Privacy
February 28, 2006
Usable Privacy and Security • Carnegie Mellon University • Spring 2006 • Cranor/Hong/Reiter • http://cups.cs.cmu.edu/courses/ups-sp06/ 1
Outline
Brief overview and key points from readings
• Privacy issues and human-computer interaction
• A user-centric privacy space framework
Design of privacy tools
Design for privacy in everyday software
Your turn
Chapter 19: Privacy Issues and Human-Computer Interaction
Mark S. Ackerman and Scott D. Mainwaring
What is privacy?
Many definitions
From an HCI perspective: “Privacy is about individuals’ capabilities in a particular situation to control what they consider to be personal data.”
Key ideas relevant to HCI:
• Control
• Risk perception and risk management
• Ethical, political, and legal issues often need to be addressed when considering privacy
• Individual and context dependent
Usability engineering for privacy
Similar to usability engineering for any other type of design problem, BUT
• Privacy is not the primary task
• Individualized privacy needs
• Privacy failures can be dangerous
• There may be legal requirements related to privacy
Privacy and CSCW
Initially CSCW work ignored privacy because it was assumed that collaboration did not raise privacy issues
In 1990s CSCW researchers started to realize that their work raised lots of privacy issues
Privacy-related CSCW research
• Media space applications
• Other collaborative applications with privacy concerns
• Privacy and awareness
Individual differences w/r/t privacy
Privacy differences
• People have differing types of concerns
• People also differ in their level of concern
Approaches to addressing these differences
• Better interfaces
• Clustering users
• Adaptive systems
• Systems for training users
• User-tailorable systems
Ubicomp privacy
Sensors create privacy concerns
Privacy guidelines can help mitigate concerns
Important to design systems that limit flow of information and allow for access controls
Chapter 20: A User-Centric Privacy Space Framework
Benjamin Brunk
Exoinformation
Information we shed as we go about our normal activities
Exoinformation = data shadow = data exhaust
Privacy Space Framework
Awareness
Detection
Prevention
Response
Recovery
Brunk, Figure 20-2 p. 414
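One way to make the framework concrete is to treat the five categories as an enumeration and classify privacy tools against it. The sketch below is illustrative; the tool-to-category mapping is our own reading, not taken from Brunk:

```python
from enum import Enum

class PrivacySpace(Enum):
    """The five categories of Brunk's privacy space framework."""
    AWARENESS = "awareness"    # learning what threats exist
    DETECTION = "detection"    # noticing that something is happening
    PREVENTION = "prevention"  # stopping exoinformation from leaking
    RESPONSE = "response"      # acting once a problem is found
    RECOVERY = "recovery"      # repairing the damage afterwards

# Hypothetical mapping of familiar tool types onto the framework;
# many tools span more than one category.
TOOL_CATEGORIES = {
    "cookie manager": {PrivacySpace.AWARENESS, PrivacySpace.PREVENTION},
    "anonymizer": {PrivacySpace.PREVENTION},
    "spyware scanner": {PrivacySpace.DETECTION, PrivacySpace.RESPONSE},
    "disk wiper": {PrivacySpace.RECOVERY},
}

def tools_for(category: PrivacySpace) -> list:
    """Return the tools that cover a given framework category."""
    return sorted(t for t, cats in TOOL_CATEGORIES.items() if category in cats)
```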
Design of Privacy Tools
Privacy tool examples
Cookie managers
Anonymizers
Encryption tools
Disk wiping utilities
P3P user agents
Issues to consider
Privacy is a secondary task
• Users of privacy tools often seek out these tools due to their awareness of or concern about privacy
• Even so, users still want to focus on their primary tasks
Users have differing privacy concerns and needs
• One-size-fits-all interface may not work
Most users are not privacy experts
• Difficult to explain current privacy state or future privacy implications
• Difficult to explain privacy options to them
• Difficult to capture privacy needs/preferences
Many privacy tools reduce application performance, functionality, or convenience
Case study: Tor
Internet anonymity system
Allows users to send messages that cannot be traced back to them (web browsing, chat, p2p, etc.)
UI was mostly a command-line interface until recently
2005 Tor GUI competition
• CUPS team won phase 1 with its design for FoxTor
One-size-doesn’t-fit-all problem
Tor is configurable and different users will want to configure it in different ways
• But most users won’t understand configuration options
• Give users choices, not dilemmas
We began by trying to understand our users
• No budget, little time, limited access to users
• So we brainstormed about their needs, tried to imagine them, and developed personas for them
This process led to the realization that our users had 3 categories of privacy needs
• Basic, selective, critical
Instead of asking users to figure out complicated settings, most of our configuration involves figuring out which types of privacy needs they have
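The need-tier idea can be sketched as a lookup from a stated privacy need to a full settings bundle, so the user never touches individual options. The profile names come from the slide; the settings keys and values below are hypothetical, not real Tor or FoxTor configuration options:

```python
# Illustrative sketch: configuration asks which class of privacy need the
# user has and derives all settings from that single answer.
PROFILES = {
    "basic":     {"use_tor": "on_demand", "block_cookies": False, "warn_on_plugins": False},
    "selective": {"use_tor": "per_site",  "block_cookies": True,  "warn_on_plugins": True},
    "critical":  {"use_tor": "always",    "block_cookies": True,  "warn_on_plugins": True},
}

def settings_for(need: str) -> dict:
    """Map a stated privacy need to a complete settings bundle."""
    try:
        return dict(PROFILES[need])
    except KeyError:
        raise ValueError(f"unknown privacy need: {need!r}")
```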
Understand primary task
Anonymity is not a primary task
What are the primary tasks our users are engaged in when they want anonymity?
Lots of them: web browsing, chatting, file sharing, etc., but we speculate that browsing will be the most frequent for most users
So, instead of building anonymity tool that you can use to anonymize web browsing…
… build a web browser with built-in anonymity functions
Metaphors
Because of performance issues and problems accessing some web sites through Tor, some users will want to turn the anonymity function on and off
Important to make it easy for users to determine current state
Communicate through visual symbol and readily understandable metaphor
Brainstormed possibilities: torized/untorized, private/exposed, cloaked/uncloaked, masked/unmasked
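Whatever metaphor is chosen, the underlying design is one boolean of state that is always reflected back to the user in words and imagery, so the current condition is never ambiguous. A minimal sketch using the masked/unmasked metaphor (the labels and icon file names are invented for illustration):

```python
class AnonymityToggle:
    """One bit of state, always mirrored to the user as label + icon."""

    def __init__(self):
        self.masked = False  # start in the normal, non-anonymous state

    def toggle(self) -> None:
        self.masked = not self.masked

    def status(self) -> tuple:
        """Return (label, icon) describing the current state."""
        if self.masked:
            return ("Masked: browsing anonymously", "mask_on.png")
        return ("Unmasked: browsing normally", "mask_off.png")
```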
Next steps
Build or prototype
User studies
Design for privacy in everyday software
Examples
Ecommerce personalization systems
• Concerns about use of user profiles
Software that “phones home” to fetch software updates or refresh content, report bugs, relay usage data, verify authorization keys, etc.
• Concerns that software will track and profile users
Communications software (email, IM, chat)
• Concerns about traffic monitoring, eavesdroppers
Presence systems (buddy lists, shared spaces, friend finders)
• Concerns about limiting when info is shared and with whom
Issues to consider
Similar to issues to consider for privacy tools
PLUS
Users may not be aware of privacy issues up front
• When they find out about privacy issues they may be angry or confused, especially if they view notice as inadequate or defaults as unreasonable
Users may have to give up functionality or convenience, or spend more time configuring system for better privacy
Failure to address privacy issues adequately may lead to bad press and legal action
Amazon.com privacy makeover
Streamline menu navigation for customization
Provide way to set up default rules
Every time a user makes a new purchase that they want to rate or exclude, they have to edit their profile info
• There should be a way to set up default rules
Exclude all purchases
Exclude all purchases shipped to my work address
Exclude all movie purchases
Exclude all purchases I had gift wrapped
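The default rules listed above can be modeled as predicates over a purchase record: a purchase is excluded from the recommender profile if any rule matches. The sketch below is illustrative; the purchase field names are invented, not Amazon's data model:

```python
# Each rule is a (description, predicate) pair evaluated against a
# purchase record represented as a dict.
RULES = [
    ("exclude gift-wrapped purchases", lambda p: p.get("gift_wrapped", False)),
    ("exclude purchases shipped to work", lambda p: p.get("ship_to") == "work"),
    ("exclude movie purchases", lambda p: p.get("category") == "movies"),
]

def excluded(purchase: dict) -> bool:
    """A purchase is excluded if any default rule matches it."""
    return any(pred(purchase) for _, pred in RULES)

def profile_items(purchases: list) -> list:
    """Only non-excluded purchases feed the recommendation profile."""
    return [p for p in purchases if not excluded(p)]
```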
Remove excluded purchases from profile
Users should be able to remove items from profile
If purchase records are needed for legal reasons, users should be able to request that they not be accessible online
Better: options for controlling recent history
Use personae
Amazon already allows users to store multiple credit cards and addresses
Why not allow users to create personae linked to each, with the option of keeping recommendations and purchase history separate?
(This would provide an easy way to separate work/home/gift personae)
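A sketch of the data model behind the personae idea: each persona bundles an address, a payment card, and its own purchase history, so recommendations drawn from one persona never mix with another's. All structure and field names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str        # e.g. "work", "home", "gifts"
    address: str
    card_last4: str
    history: list = field(default_factory=list)  # this persona's purchases only

class Account:
    def __init__(self):
        self.personae = {}

    def add(self, persona: Persona) -> None:
        self.personae[persona.name] = persona

    def record_purchase(self, persona_name: str, item: str) -> None:
        self.personae[persona_name].history.append(item)

    def recommendations_basis(self, persona_name: str) -> list:
        """Only this persona's own history feeds its recommendations."""
        return list(self.personae[persona_name].history)
```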
Allow users to access all privacy-related options in one place
Currently privacy-related options are found with relevant features
Users have to be aware of features to find the options
Put them all in one place
But also leave them with relevant features
I didn’t buy it for myself
How about an “I didn’t buy it for myself” checkbox
(perhaps automatically checked if gift wrapping is requested)
I didn’t buy it for myself
Other ideas for improving Amazon privacy interface?
Your turn
Group problems
Informing users about and configuring phone home features
Configuring release of location information in friend finder service
Phone home features
Many software products contain phone home features, for example, for performing software updates or monitoring usage patterns. In some cases software phones home quite frequently, for example, to update phishing blacklists or check for fresh image files. Users may be concerned that the software company is using these features to track or profile them. Thus it is important that the software is up front about the fact that it is phoning home. Furthermore, some users may wish to disable such features or be prompted every time before they phone home (due to privacy or other concerns), whereas other users are happy to have them operate automatically.
Discuss the various approaches you have seen different software manufacturers take to addressing this problem. What do you like/dislike about them?
How should phone home features be designed so that they facilitate informed consent? Describe an example user interface design and general principles that might be applied to specific cases.
What sort of user studies should be performed to test this user interface design?
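One possible consent model worth discussing: give each phone-home feature a per-feature mode (automatic, ask every time, or never), and gate every network call on that mode. The sketch below is one illustrative design, not a description of any real product; the feature names and defaults are invented:

```python
MODES = ("auto", "ask", "never")

class PhoneHomePolicy:
    """Per-feature consent modes gating phone-home network calls."""

    def __init__(self):
        # Illustrative conservative defaults: nothing leaves silently.
        self.modes = {"software_updates": "ask", "usage_reporting": "never"}

    def set_mode(self, feature: str, mode: str) -> None:
        if mode not in MODES:
            raise ValueError(f"unknown mode: {mode!r}")
        self.modes[feature] = mode

    def may_contact(self, feature: str, user_says_yes=lambda f: False) -> bool:
        """Decide whether a feature may phone home right now.

        "ask" defers to a UI prompt, stubbed here as a callback;
        unknown features default to asking rather than acting silently.
        """
        mode = self.modes.get(feature, "ask")
        if mode == "auto":
            return True
        if mode == "never":
            return False
        return user_says_yes(feature)
```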
Configuring a friend finder
New location-based services are becoming available that allow individuals to use their cell phones to keep track of the location of their friends (or kids or employees, etc.). Imagine a service that allows users to configure their phones to automatically provide their location information to some people but not others under certain conditions.
For example, location disclosure might be limited depending on time of day, the user’s location, or the person requesting the location.
Design a configuration interface to allow a user to set up the conditions under which their location will be disclosed. How will you deal with the small phone screen? What are the basic primitives that will be needed for configuration rules? How will you make it easy for users to manage large numbers of rules and understand the consequences of adding or removing rules?
What kinds of user studies would you perform to test such an interface?
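As a starting point for discussion, the rule primitives named above (requester, time of day, user's location) can be sketched as allow-rules with a default-deny policy: location is disclosed only if some rule explicitly matches. The schema is invented for illustration:

```python
from datetime import time

def make_rule(requesters=None, start=None, end=None, places=None):
    """Build an allow-rule; None for a field means "don't care"."""
    def rule(requester, now, place):
        if requesters is not None and requester not in requesters:
            return False
        if start is not None and end is not None and not (start <= now <= end):
            return False
        if places is not None and place not in places:
            return False
        return True
    return rule

def disclose(rules, requester, now, place) -> bool:
    """Default deny: disclose only if some rule explicitly allows it."""
    return any(r(requester, now, place) for r in rules)
```

A default-deny design errs on the side of privacy: forgetting to write a rule hides the user's location, rather than exposing it.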