Design for Privacy
February 20, 2007
Usable Privacy and Security • Carnegie Mellon University • Spring 2007 • Cranor/Hong • http://cups.cs.cmu.edu/courses/ups-sp06/
Outline
Engineering privacy
Design of privacy tools
Design for privacy in everyday software
Obtaining informed consent
Engineering privacy
[Figure: user privacy concerns mapped onto the system spheres, from the client side through the network edge to the service edge. Client-side concerns: unauthorized collection, unauthorized execution, exposure, and unwanted attention/inflow of data. Data-recipient concerns: improper access, errors, reduced judgments, combining data, and internal or external unauthorized 2nd use. Layer 1 responsibility: control of the personal data collected (inflow/outflow). Layer 2 responsibility: access control.]
[Figure: flow of the primary user's data among parties: the access provider, content/service provider, application/system provider, peers, secondary users, assorted 3rd parties, and external parties such as government or litigation-related parties.]
[Figure: privacy by policy versus privacy by architecture, plotted against two dimensions: privacy by policy corresponds to identified data collection and network-centric architectures, while privacy by architecture corresponds to non-identified data collection and client-centric architectures.]
Privacy stages, from identified (stage 0) to anonymous (stage 3), with the linkability of data to personal identifiers, the approach to privacy protection, and the corresponding system characteristics:

Stage 0 (identified; data linked to personal identifiers; privacy by policy, i.e., notice and choice):
  - unique identifiers across databases
  - contact information stored with profile information

Stage 1 (identified; linkable with reasonable & automatable effort; privacy by policy):
  - no unique identifiers across databases
  - common attributes across databases
  - contact information stored separately from profile or transaction information

Stage 2 (pseudonymous; not linkable with reasonable effort; privacy by architecture):
  - no unique identifiers across databases
  - no common attributes across databases
  - random identifiers
  - contact information stored separately from profile or transaction information
  - collection of long-term personal characteristics at a low level of granularity
  - technically enforced deletion of profile details at regular intervals

Stage 3 (anonymous; unlinkable; privacy by architecture):
  - no collection of contact information
  - no collection of long-term personal characteristics
  - k-anonymity with a large value of k
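To make the stage-3 criterion concrete: a dataset is k-anonymous when every combination of quasi-identifier values it contains is shared by at least k records. A minimal sketch of such a check (the column names and sample data are hypothetical):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every quasi-identifier combination occurs >= k times."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical profile data: no contact info, only coarse-grained attributes
records = [
    {"age_range": "30-39", "zip_prefix": "152", "purchase": "book"},
    {"age_range": "30-39", "zip_prefix": "152", "purchase": "music"},
    {"age_range": "40-49", "zip_prefix": "152", "purchase": "book"},
]
print(is_k_anonymous(records, ["age_range", "zip_prefix"], k=2))
# False: the single 40-49 record is a combination shared by fewer than k people
```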
Design of Privacy Tools
Privacy tool examples
Cookie managers
Anonymizers
Encryption tools
Disk wiping utilities
P3P user agents
Issues to consider
• Privacy is a secondary task
  - Users of privacy tools often seek out these tools because they are aware of or concerned about privacy
  - Even so, users still want to focus on their primary tasks
• Users have differing privacy concerns and needs
  - A one-size-fits-all interface may not work
• Most users are not privacy experts
  - Difficult to explain the current privacy state or future privacy implications
  - Difficult to explain privacy options to them
  - Difficult to capture privacy needs/preferences
• Many privacy tools reduce application performance, functionality, or convenience
Case study: Tor
An Internet anonymity system
Allows users to send messages that cannot be traced back to them (web browsing, chat, p2p, etc.)
The UI was mostly a command-line interface until recently
2005 Tor GUI competition
• The CUPS team won phase 1 with its design for FoxTor
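Under the hood, a running Tor client exposes a local SOCKS proxy (port 9050 by default) that applications route traffic through; GUI work like FoxTor sits on top of that. A minimal sketch of anonymized web fetching, assuming a local Tor daemon on its default port and the requests library installed with SOCKS support (requests[socks]):

```python
import requests

# Route the request through a local Tor client's SOCKS5 proxy.
# "socks5h" (not "socks5") makes DNS resolution happen inside Tor too,
# so the hostname lookup itself does not leak the destination.
TOR_PROXY = {"http": "socks5h://127.0.0.1:9050",
             "https": "socks5h://127.0.0.1:9050"}

resp = requests.get("https://check.torproject.org/",
                    proxies=TOR_PROXY, timeout=30)
print(resp.status_code)
```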
One-size-doesn’t-fit-all problem
• Tor is configurable, and different users will want to configure it in different ways
  - But most users won't understand configuration options
  - Give users choices, not dilemmas
• We began by trying to understand our users
  - No budget, little time, limited access to users
  - So we brainstormed about their needs, tried to imagine them, and developed personas for them
• This process led to the realization that our users had 3 categories of privacy needs
  - Basic, selective, critical
• Instead of asking users to figure out complicated settings, most of our configuration involves figuring out which types of privacy needs they have
Understand primary task
• Anonymity is not a primary task
• What are the primary tasks our users are engaged in when they want anonymity?
• Lots of them: web browsing, chatting, file sharing, etc., but we speculate that browsing will be most frequent for most users
• So, instead of building an anonymity tool that you can use to anonymize web browsing…
• … build a web browser with built-in anonymity functions
Metaphors
• Because of performance issues and problems accessing some web sites through Tor, some users will want to turn the anonymity function on and off
• It is important to make it easy for users to determine the current state
• Communicate the state through a visual symbol and a readily understandable metaphor
• Brainstormed possibilities: torized/untorized, private/exposed, cloaked/uncloaked, masked/unmasked
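Whichever metaphor wins, the indicator should be derived from a single source of truth so the label and icon can never disagree with the actual proxy state. A minimal sketch of that mapping using the masked/unmasked metaphor (the class, icon names, and toggle API are all hypothetical):

```python
from dataclasses import dataclass

@dataclass
class AnonymityIndicator:
    """Keeps the visible metaphor in sync with the actual proxy state."""
    tor_enabled: bool = False

    def toggle(self) -> None:
        self.tor_enabled = not self.tor_enabled

    @property
    def status(self) -> tuple:
        # One source of truth: derive both the label and the icon
        # from the same boolean, so they can never disagree.
        if self.tor_enabled:
            return ("Masked: browsing is anonymized through Tor", "mask_on.png")
        return ("Unmasked: browsing is NOT anonymized", "mask_off.png")

indicator = AnonymityIndicator()
indicator.toggle()
print(indicator.status[0])  # "Masked: browsing is anonymized through Tor"
```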
Design for privacy in everyday software
Examples
• E-commerce personalization systems
  - Concerns about use of user profiles
• Software that "phones home" to fetch software updates or refresh content, report bugs, relay usage data, verify authorization keys, etc.
  - Concerns that software will track and profile users
• Communications software (email, IM, chat)
  - Concerns about traffic monitoring and eavesdroppers
• Presence systems (buddy lists, shared spaces, friend finders)
  - Concerns about limiting when info is shared and with whom
Issues to consider
• Similar to the issues to consider for privacy tools, PLUS:
• Users may not be aware of privacy issues up front
  - When they find out about privacy issues they may be angry or confused, especially if they view notice as inadequate or defaults as unreasonable
• Users may have to give up functionality or convenience, or spend more time configuring the system, for better privacy
• Failure to address privacy issues adequately may lead to bad press and legal action
Amazon.com privacy makeover
Streamline menu navigation for customization
Provide a way to set up default rules
Every time a user makes a new purchase that they want to rate or exclude, they have to edit profile info
• There should be a way to set up default rules, for example (see the sketch below):
  - Exclude all purchases
  - Exclude all purchases shipped to my work address
  - Exclude all movie purchases
  - Exclude all purchases I had gift wrapped
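A default-rule mechanism like the one proposed above could be modeled as predicates applied to each new purchase, where any match keeps the purchase out of the recommendation profile. A minimal sketch (the purchase fields and rule names are hypothetical, not Amazon's actual data model):

```python
# Each rule is a predicate over a purchase record; a purchase is excluded
# from the recommendation profile if any enabled rule matches it.
DEFAULT_RULES = {
    "exclude_all":       lambda p: True,
    "exclude_work_addr": lambda p: p["ship_to"] == "work",
    "exclude_movies":    lambda p: p["category"] == "movies",
    "exclude_gift_wrap": lambda p: p["gift_wrapped"],
}

def profile_purchases(purchases, enabled_rules):
    """Return only the purchases that no enabled rule excludes."""
    active = [DEFAULT_RULES[name] for name in enabled_rules]
    return [p for p in purchases if not any(rule(p) for rule in active)]

purchases = [
    {"item": "DVD", "category": "movies", "ship_to": "home", "gift_wrapped": False},
    {"item": "book", "category": "books", "ship_to": "work", "gift_wrapped": False},
    {"item": "toy", "category": "toys", "ship_to": "home", "gift_wrapped": True},
]
print(profile_purchases(purchases, ["exclude_movies", "exclude_gift_wrap"]))
# -> only the book remains in the profile
```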
Remove excluded purchases from profile
• Users should be able to remove items from their profile
• If purchase records are needed for legal reasons, users should be able to request that they not be accessible online
Better: options for controlling recent history
Use personae
Amazon already allows users to store multiple credit cards and addresses
Why not allow users to create personae linked to each card or address, with the option of keeping recommendations and history separate? This would allow an easy way to separate work, home, and gift personae.
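A minimal sketch of how such a personae model might look: each persona bundles a stored card and address with its own isolated purchase history, so recommendations computed per persona never mix work, home, and gift activity (all names and fields are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """One shopping identity with its own isolated history."""
    name: str
    card: str          # reference to a stored credit card
    address: str       # reference to a stored shipping address
    history: list = field(default_factory=list)

class Account:
    def __init__(self):
        self.personae = {}

    def add_persona(self, persona):
        self.personae[persona.name] = persona

    def record_purchase(self, persona_name, item):
        # Purchases land only in the active persona's history,
        # so recommendations for "work" never see "gift" items.
        self.personae[persona_name].history.append(item)

acct = Account()
acct.add_persona(Persona("work", card="visa-1", address="office"))
acct.add_persona(Persona("gift", card="visa-1", address="home"))
acct.record_purchase("gift", "toy")
print(acct.personae["work"].history)  # [] -- isolated from the gift purchase
```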
Allow users to access all privacy-related options in one place
Currently, privacy-related options are found with the relevant features
Users have to be aware of the features to find the options
Put them all in one place
But also leave them with the relevant features (one way to do both is sketched below)
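One way to satisfy both requirements is a registry pattern: each feature registers its privacy options once, the central privacy page renders everything from the registry, and each feature page renders only its own entries. A minimal sketch (the feature and option names are hypothetical):

```python
# Each feature registers its privacy-related options in one shared registry.
PRIVACY_REGISTRY = {}

def register_privacy_option(feature, option):
    PRIVACY_REGISTRY.setdefault(feature, []).append(option)

# Features register their options where they are defined...
register_privacy_option("recommendations", "Exclude purchases from profile")
register_privacy_option("browsing_history", "Turn off recent history")

# ...the central privacy page renders everything from the registry,
# while each feature page renders only its own entry.
def render_central_page():
    for feature, options in PRIVACY_REGISTRY.items():
        print(feature)
        for opt in options:
            print("  -", opt)

render_central_page()
```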
I didn’t buy it for myself
How about an "I didn't buy it for myself" checkbox, perhaps automatically checked if gift wrapping is requested?
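The proposed default is just a derived initial value that the user can still override before confirming. A minimal sketch (field names hypothetical):

```python
def not_for_myself_default(order):
    """Pre-check the 'I didn't buy it for myself' box when gift wrap is
    requested; the user can still uncheck it before confirming."""
    return bool(order.get("gift_wrap"))

order = {"item": "toy", "gift_wrap": True}
checkbox_checked = not_for_myself_default(order)
print(checkbox_checked)  # True -- purchase would be excluded from the profile
```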
Other ideas for improving Amazon's privacy interface?
Obtaining informed consent
• Many software products contain phone-home features, for example for performing software updates or monitoring usage patterns. In some cases software phones home quite frequently, for example to update phishing blacklists or check for fresh image files. Users may be concerned that the software company is using these features to track or profile them. Thus it is important that the software is up front about the fact that it is phoning home. Furthermore, some users may wish to disable such features or be prompted every time before the software phones home (due to privacy or other concerns), whereas other users are happy to have them operate automatically.
• Discuss the various approaches you have seen different software manufacturers take to addressing this problem. What do you like/dislike about them?
• How should phone-home features be designed so that they facilitate informed consent? Describe an example user interface design and general principles that might be applied to specific cases (one possible consent gate is sketched after these questions).
• What sort of user studies should be performed to test this user interface design?
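As one possible starting point for the design question above, a minimal sketch of a consent gate: the user picks one of three standing modes, and no outbound connection happens until the gate agrees. The mode names and prompt wording are illustrative assumptions, not any vendor's actual API:

```python
from enum import Enum

class PhoneHomeMode(Enum):
    ALWAYS = "always"   # user consented once to automatic operation
    ASK = "ask"         # prompt before every connection (a safe default)
    NEVER = "never"     # feature disabled entirely

def may_phone_home(mode, purpose, prompt=input):
    """Gate every outbound connection on the user's standing choice."""
    if mode is PhoneHomeMode.NEVER:
        return False
    if mode is PhoneHomeMode.ALWAYS:
        return True
    # ASK mode: tell the user exactly what will happen and why, then ask.
    answer = prompt(f"Connect to the update server to {purpose}? [y/N] ")
    return answer.strip().lower() == "y"

if may_phone_home(PhoneHomeMode.ASK, "fetch the latest phishing blacklist"):
    print("...performing update check...")
else:
    print("No connection made.")
```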