Design for Privacy II
05-899 / 17-500 Usable Privacy and Security
Rachel Shipman
March 2nd, 2006
Agenda
• Chapter 21: Five Pitfalls in the Design for Privacy
  – Introduction
  – What is Faces?
  – Design and Evaluation
  – The five pitfalls
    • Description
    • Example of falling into the pitfall
    • Example of avoiding the pitfall
  – Mental models and information flow
• Group Activity
  – Applying what we've learned about the pitfalls
7/17/2016
Rachel Shipman - Design for Privacy II
2
Chapter 21: Five Pitfalls in the Design for Privacy
Introduction
• Why is it difficult to design privacy-sensitive systems?
• "No definition of privacy is possible because privacy issues are fundamentally matters of values, interests and power"
  – Alan F. Westin, legal and policy scholar
• Even without a clear definition, system designers are pressured to design systems that support users' privacy needs
Introduction (cont'd)
• Privacy-affecting = any interactive system whose use has personal privacy implications
• Privacy-sensitive = any privacy-affecting system that reasonably avoids invading or disrupting personal privacy
• By avoiding the five pitfalls…
  – Designers can minimize the number of privacy-affecting systems that are not privacy-sensitive
What are the Pitfalls?
• Understanding Privacy Implications
  – Obscuring potential information flow (1)
  – Obscuring actual information flow (2)
• Socially Meaningful Action
  – Emphasizing configuration over action (3)
  – Lacking coarse-grained control (4)
  – Inhibiting established practice (5)
What is Faces?
• Faces is a user interface prototype for managing personal privacy in ubiquitous computing settings
• What is ubicomp?
  – Wikipedia: ubicomp integrates computation into the environment, rather than having computers which are distinct objects.
  – Mark Weiser: "…ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives"
What is Faces? (cont'd)
• Problem: ubicomp complicates user interaction with the system
  – Users can be unaware of / unable to influence disclosure of personal information as they go about their everyday routines
• Answer: Faces
  – Support disclosure preferences (who can see what, when)
  – Provide feedback about past disclosures (so that users can iteratively refine disclosure preferences)
Design and Evaluation of Faces
• Requirements gathering
  – Literature review
  – Interviewed 12 local residents, surveyed 130 people
  – Iterated through a series of low-fidelity prototypes
• Findings
  – The primary determinant of privacy preferences is who is asking (the inquirer)
  – The disclosure situation is also important
Design and Evaluation of Faces (cont'd)
• Different disclosure preferences for different inquirers
• Optionally add a situation parameter
• Each disclosure preference can be associated with a face
• "If this inquirer wants info when I'm in this situation, show her this face"
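The "inquirer plus optional situation" rule above can be sketched as a simple lookup table. This is a minimal illustration of the idea, not the Faces implementation; the function name `face_for`, the face labels, and the wildcard convention are all assumptions.

```python
# Sketch of a Faces-style preference model: each (inquirer, situation)
# pair maps to a "face" controlling the precision of what is disclosed.

DEFAULT_FACE = "blank"  # assumed fallback when no preference matches
ANY = None              # wildcard: preference applies in any situation

preferences = {
    ("boss", "working"): "precise",
    ("boss", ANY): "vague",
    ("spouse", ANY): "precise",
}

def face_for(inquirer, situation):
    """Return the face shown to this inquirer in this situation."""
    # Prefer the most specific rule, then fall back to the wildcard.
    for key in ((inquirer, situation), (inquirer, ANY)):
        if key in preferences:
            return preferences[key]
    return DEFAULT_FACE

print(face_for("boss", "working"))   # precise
print(face_for("boss", "weekend"))   # vague
print(face_for("stranger", "home"))  # blank
```

The usability problem the evaluation found (next slides) lives exactly here: users must predict, ahead of time, which rules this table should contain.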
Goffman's Self-Presentation Theory
• Erving Goffman, 1956 publication The Presentation of Self in Everyday Life
• We are all performers, viewed by others as characters exhibiting qualities our performance was designed to invoke
• "The self, then, as a performed character, is not an organic thing that has a specific location, whose fundamental fate is to be born, mature, and to die; it is a dramatic effect arising diffusely from a scene that is presented, and the characteristic issue, the crucial concern, is whether it will be credited or discredited"
Design and Evaluation of Faces (cont'd)
[figure]
Design and Evaluation of Faces (cont'd)
• Formative evaluation
  – 5 participants
  – Introduction & tutorial
  – Configure disclosure preferences for 2 inquirers and 2 situations of their choice
• Configuration settings did not match participants' stated preferences during the scenarios
• Participants could not remember the precision settings they had chosen for each face
The pitfalls concerning understanding
• 2 pitfalls concerning understanding privacy implications
• Illumination of
  – The system's potential for information disclosure
  – The actual disclosures made by the system
• Build users' comprehension of system scope, utility, and usage implications
Pitfall 1: Obscuring potential information flow
• Systems should clearly communicate the nature and extent of their potential for information disclosure
  – Types of information
  – Kinds of observers
  – Media through which it is conveyed
  – Length of retention
  – Potential for unintentional disclosure
  – Presence of third-party observers
  – Collection of meta-information ("exo-information" – Benjamin Brunk)
• Making scope clear helps users understand the capabilities and limits of the system
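One hedged way to operationalize the dimensions listed above is a machine-readable "disclosure manifest" a system could surface before first use. The class and field names below are hypothetical illustrations, not an API from the chapter.

```python
# Sketch: enumerate a system's potential information flow up front so
# the user can see scope before opting in (pitfall 1's advice).
from dataclasses import dataclass, field
from typing import List

@dataclass
class DisclosureManifest:
    info_types: List[str]       # types of information disclosed
    observers: List[str]        # kinds of observers
    media: List[str]            # media through which it is conveyed
    retention: str              # length of retention
    third_parties: bool         # presence of third-party observers
    exo_information: List[str] = field(default_factory=list)  # meta-info

    def summary(self) -> str:
        """Render the manifest as a short, user-facing statement."""
        parts = [
            f"Discloses {', '.join(self.info_types)}",
            f"to {', '.join(self.observers)}",
            f"via {', '.join(self.media)};",
            f"retained {self.retention}.",
        ]
        if self.third_parties:
            parts.append("Third-party observers are possible.")
        if self.exo_information:
            parts.append(f"Also collects: {', '.join(self.exo_information)}.")
        return " ".join(parts)

# Hypothetical manifest for an eldercare transponder-badge system:
badge = DisclosureManifest(
    info_types=["location"],
    observers=["facility staff"],
    media=["badge transponder network"],
    retention="30 days",
    third_parties=False,
    exo_information=["timestamps of movement"],
)
print(badge.summary())
```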
Pitfall 1: Falling in…
• MS Internet Explorer
• Eldercare facility using transponder badges
• California school system that placed RFID tags in student ID cards
• Anyone use Campus XPress?
Pitfall 1: Falling in… (cont'd)
• Does the Tor installation wizard fall into or avoid pitfall 1?
• Do the options clearly communicate the nature and extent of their potential for information disclosure?
Pitfall 1: Avoiding the pitfall…
• Clear statements regarding sharing email addresses with third parties during website account creation
• Do major sites follow this example?
Pitfall 2: Obscuring actual information flow
• Designs should make clear the actual disclosure of information through the system
• Disclosure should be obvious to the user as it occurs, or within a reasonable delay
• Provide sufficient feedback to inform but not overwhelm the user
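A minimal sketch of this advice: route every disclosure through one function that both records it and surfaces feedback as it happens. The `notify()` target and the record fields are assumptions for illustration, not from the chapter.

```python
# Sketch: make actual information flow visible (pitfall 2's advice) by
# logging and announcing each disclosure at the moment it occurs.
import time

disclosure_log = []  # history the user can review later

def notify(message):
    # Stand-in for unobtrusive real-time UI feedback (status bar, toast).
    print(message)

def disclose(recipient, info_type, value):
    """Transmit information, logging and announcing the disclosure."""
    disclosure_log.append(
        {"when": time.time(), "to": recipient, "what": info_type}
    )
    notify(f"Shared your {info_type} with {recipient}")
    return value  # the actual transmission is elided in this sketch

disclose("campus directory", "location", "4th floor lounge")
print(len(disclosure_log))  # 1
```

The balance the slide calls for (inform without overwhelming) lives in `notify()`: a real system would batch or summarize rather than interrupt on every event.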
Pitfall 2: Falling in…
• Kazaa (remember chapter 33?)
• Web browsers that do not indicate when a cookie is set or what information is disclosed through its use
• Does Faces fall into this pitfall?
Pitfall 2: Avoiding the pitfall…
• Mozilla web browser: prominent, real-time feedback about cookie placement and characteristics
  – Don't see it in Firefox
• IM systems that inform users when someone adds them to a contact list
The pitfalls concerning action
• 3 pitfalls concerning users' ability to conduct socially meaningful action
  – Everyday privacy management occurs through subtle manipulation of coarse controls
  – Observers discern meaning from actions by accumulating evidence across media
• Help users intuitively shape this evidence to influence the social consequences of their behavior
Pitfall 3: Emphasizing config over action
• Designs should not require excessive configuration to create and maintain privacy
• "…a fine and shifting line between privacy and publicity exists, and it is dependent on social context, intention, and fine-grained coordination between action and the disclosure of that action."
  – Palen and Dourish, Unpacking "Privacy" for a Networked World (2003)
• Because configuration has become a universal UI design pattern, many systems fall into the configuration pitfall
Pitfall 3: Falling in…
• Faces prototype (as illustrated in its usability evaluation)
• Kazaa and other P2P file-sharing systems (chapter 33)
• E-mail encryption software (chapter 34)
• Most software applications, both in and out of the ubicomp realm
Pitfall 3: Avoiding the pitfall…
• Social networking tools
  – Friendster
  – Tribe
  – Dodgeball
• Badge / smart card systems
  – In/Out board at Georgia Tech
  – Smart cards that allow a receptionist limited calendar access for scheduling meetings, and grant meeting partners access to location before a meeting to infer arrival time
    • Cadiz and Gupta, Privacy Interfaces for Collaboration (2001)
Pitfall 4: Lacking coarse-grained control
• Designs should offer an obvious top-level mechanism for halting and resuming information disclosure
  – Users are accustomed to turning something off when they want it to stop
  – A simple power / exit button will do
  – Ordinal controls are another possibility
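The coarse-grained control described above can be sketched as one master switch checked before any fine-grained preference logic runs. The class and method names here are illustrative, not from the chapter.

```python
# Sketch: an obvious top-level "power button" for disclosure (pitfall 4).
class PrivacyService:
    def __init__(self):
        self.disclosing = True  # the coarse-grained master switch

    def toggle(self):
        """Halt or resume all disclosure with a single action."""
        self.disclosing = not self.disclosing

    def disclose(self, info):
        if not self.disclosing:
            return None   # halted: nothing leaves the system
        return info       # fine-grained preference rules would apply here

svc = PrivacyService()
print(svc.disclose("location"))  # location
svc.toggle()                     # user hits the off switch
print(svc.disclose("location"))  # None
```

The design point is that the switch sits above, and overrides, all configuration: like a lens cap, it works even when the user does not understand the finer controls.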
Pitfall 4: Falling in…
• Web sites that maintain shopping histories – could be resolved by adding an "I didn't buy it for myself" option…
  – Cranor, I Didn't Buy It for Myself (2003)
• Buried privacy controls in most web browsers; no option to switch between a "normal" policy and a "block all cookies" policy
Pitfall 4: Avoiding the pitfall…
• Anything with an obvious means of halting and resuming disclosure
  – Lens cap on a camera
  – Power button on a cell phone
  – IM systems with invisible modes
  – Faces prototype
Pitfall 5: Inhibiting established practice
• Privacy is managed through a range of established, nuanced practices
• By supporting roles, expectations, and practices already used in the target context…
• …designs accommodate users' natural efforts to transfer existing skills to new media
Pitfall 5: Falling in…
• Gmail's content-triggered advertising
  – Inconsistent with the long-established expectation that mail is for the eyes of sender and recipient only
  – The issue is not whether private information is disclosed; it is the user's expectation of the mail provider's behavior
Pitfall 5: Avoiding the pitfall…
• Cell phones and IM tools
  – Users can ignore requests without having to explain why
• False data entry on web sites
  – As discussed in Tuesday's lecture
• Tribe.net
  – Partition your social network into tribes so that groups remain connected to the greater network but are still individually represented
Mental Models and Information Flow
• Norman: the designer's goal is to design the system image such that the user's mental model of system operation coincides with the designer's mental model
Mental Models and Information Flow (cont'd)
• Building on Norman's role of mental models in the design process, designers of privacy-affecting applications can aim to harmonize the user's and the observer's understandings of the user's personal information disclosures
Group Activity!
• Come up with your own scenario/example that falls into one of the design pitfalls
• How would you fix it?
Group Activity: Example
• Signing in to Gmail also signs me into Google Talk
  – There are times when I don't want to show up on Talk; I just want to check email
• Falls into pitfall 5 because I don't expect to be signed into chat when I'm logging into web mail
• Possible solution: Add an invisible mode to Google Talk and make it the default anytime I log into Gmail
Now you do it!
• Break into teams of 2-3
• I will assign each team a pitfall
• 15 minutes to work, then present your example and solution to the class
Bibliography
Lederer S, Hong JI, Dey AK, Landay JA (2004) Personal privacy through understanding and action: five pitfalls for designers. Published online: 16 September 2004
Cranor LF (2004) I didn't buy it for myself: privacy and ecommerce personalization. Kluwer Academic Publishers
Goffman E (1956) The presentation of self in everyday life. Doubleday, New York
Palen L, Dourish P (2003) Unpacking "privacy" for a networked world. In: Proceedings of the CHI 2003 conference on human factors in computing systems
Norman DA (1988) The design of everyday things. Basic Books, New York
Westin A (1995) Privacy in America: an historical and socio-political analysis. In: Proceedings of the national privacy and public policy symposium, 1995
Weiser M (1996) Presentation on ubiquitous computing. http://www.ubiq.com/hypertext/weiser/NomadicInteractive/
Brunk B (2005) A user-centric privacy space framework. In: Security and Usability, pp 401-420. O'Reilly Media
Cadiz J, Gupta A (2001) Privacy interfaces for collaboration. Technical report MSR-TR-2001-82, Microsoft Corporation, Redmond, Washington