Privacy and Mobile Ubiquitous Computing A lecture of sorts by Travis Christian

Agenda

Defining some terms

What's the big deal?

From the user's perspective

Privacy concepts

Case studies

Design guidelines

Conclusions
Definitions

Privacy:

“The ability of individuals to control the
terms under which their personal
information is acquired and used.”


Mary J. Culnan, “Protecting Privacy Online: Is Self-Regulation Working?”
Journal of Public Policy and Marketing 19:1 (2000), 20–26.
“The right to be let alone”

Samuel Warren and Louis D. Brandeis
Definitions

Ubiquitous Computing

“third wave in computing, just now beginning […]
or the age of calm technology, when technology
recedes into the background of our lives.”
- Mark Weiser, http://sandbox.xerox.com/ubicomp/

“ubicomp”

“pervasive computing”

“everyware”

“Computing without computers, where information
processing has diffused into everyday life, and
virtually disappeared from view.”

Adam Greenfield, Everyware
What's the big deal?

UbiComp 2009: 11th International Conference

http://www.ubicomp.org/ubicomp2009/

Locaccino, Google Latitude

Project Oxygen (MIT)

RFID

Future: smart homes, wearable computers,
embedded devices...
What's the big deal?

With all of this potential come privacy
risks.

Location → tracking

Aggregation → activity inference

Networking → data farming

Complexity → lack of understanding

Easy to forget → no informed consent

(Chapter 19: Privacy Issues and Human-Computer Interaction)
From the user's perspective


How users view privacy

Research on privacy configuration shows
that most users do not customize settings

It's important, but not a primary task
Different concerns

Unauthorized access (security breach)

Sharing without consent

Collection of personal data

Inability to correct errors

(Chapter 19: Privacy Issues and Human-Computer Interaction)
From the user's perspective

3 consistently observed levels of concern

Marginal (indifferent)

Fundamentalist (uncompromising)

Pragmatic (will make tradeoffs): the
majority across many studies
(Chapter 19: Privacy Issues and Human-Computer Interaction)
Privacy concepts

Different forms of privacy

Access to personal data (a conscious
decision)

“Exoinformation”: data left behind by
interactions

Queries, timestamps, IP addresses, etc.

Used to build aggregate profiles
“Barn Door” property

Once data is left unprotected, there is
no way of knowing whether it has been
seen

(Chapter 20: A User-Centric Privacy Space Framework)
Privacy Concepts

Privacy boundaries

Disclosure boundary

Identity boundary

Temporal boundary

Users need to be aware and manage

(Peripheral Privacy Notifications for Wireless Networks)
Case Study: Faces

Privacy manager for ubicomp environments

Disclosure preferences: user decides rules


WHO sees WHAT info in WHICH
situations
Metaphor: “faces” represent how users
portray themselves to others in a situation

Situation: generic setting for the purpose
of establishing a “face”

Ex: Weekend shopping trip

(Chapter 21: Five Pitfalls in the Design for Privacy)
Case Study: Faces

Levels of precision: undisclosed, vague,
approximate, precise

Types of information: identity, location,
activity, nearby people

Feedback: a log of disclosures used to
iteratively refine preferences
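A Faces-style disclosure rule can be pictured as a lookup from (inquirer, situation) to a "face" that fixes a precision level for each information type. The sketch below is purely illustrative: the `Face` class, the rule table, and all inquirer/situation names and precision choices are hypothetical, not from the Faces system itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Face:
    """One 'face': a precision level (undisclosed, vague,
    approximate, or precise) per information type."""
    identity: str
    location: str
    activity: str
    nearby_people: str

# Hypothetical rule table: (inquirer, situation) -> Face.
rules = {
    ("coworker", "weekend shopping trip"): Face(
        identity="precise", location="vague",
        activity="undisclosed", nearby_people="undisclosed"),
    ("family", "weekend shopping trip"): Face(
        identity="precise", location="precise",
        activity="approximate", nearby_people="vague"),
}

# Conservative default face for inquirers with no explicit rule.
DEFAULT = Face("vague", "undisclosed", "undisclosed", "undisclosed")

def disclose(inquirer: str, situation: str) -> Face:
    """Return the face shown to this inquirer in this situation."""
    return rules.get((inquirer, situation), DEFAULT)

print(disclose("coworker", "weekend shopping trip").location)  # vague
```

The disclosure log described on the slide would then record each `disclose` call, so users can review what was shown and adjust the rule table afterward.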
Case Study: Faces


Testing: 5 participants

Created rules for 2 inquirers in 2 situations

Given a realistic scenario for each situation

Preferences stated in the scenarios did not
match the settings for the associated situations
Conclusion: Separating configuration from
context is a mistake. Users should mold
system behavior through their actions,
instead of thinking abstractly about privacy.
Case Study: Reno

SMS-based location inquiry tool

3-stage experimental process

Pilot study: internal testing

Experience Sampling Method (ESM): who
would disclose what information

User study: tested with 2 families
(Developing Privacy Guidelines for Social Location Disclosure Applications and Services)
Case Study: Reno

Results

Responses based on specific goals

Denial and deception

Automation not popular

Derived design guidelines

Next step: “Boise” map-based successor
Design Guidelines: Faces

Don't obscure potential information flow


Users can make informed use of a
system only when they understand the
scope of its privacy implications.
Don't obscure actual information flow

Users should understand what
information is being disclosed to whom.
Design Guidelines: Faces

Promote user action over configuration

Designs should enable users to practice
privacy as a natural consequence of their
use of the system.

Provide coarse-grained control

Designs should provide an obvious way
to halt and resume disclosure.

Support established practice
Design Guidelines: Reno

Don't start with automation

Start with person-to-person communication

Allow flexible disclosure

Support plausible deniability

Support deception

Support simple evasion (“busy”)

Provide status/away messages
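These guidelines can be made concrete with a toy request handler: an away/status message answers for the user without revealing location, and anything else is queued for a manual reply rather than answered automatically, so silence stays ambiguous. Everything here (the `handle_request` function, the dict-based user record) is a hypothetical sketch, not part of Reno.

```python
def handle_request(requester, user):
    """Handle one Reno-style location request for `user` (a dict)."""
    # A status/away message answers on the user's behalf without
    # revealing location ("simple evasion", e.g. "busy").
    if user.get("status"):
        return user["status"]
    # No automation: queue the request for a manual, person-to-person
    # reply, preserving plausible deniability (silence is ambiguous).
    user.setdefault("pending", []).append(requester)
    return None

alice = {"status": "busy"}
print(handle_request("bob", alice))  # busy
```

A flexible-disclosure reply (precise, vague, or deceptive) would then be composed by the user when working through the `pending` queue, not generated by the system.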
Design Guidelines: Reno

Avoid handling user data

Consider user groups likely to need privacy

Characterize users' use of privacy features

Account for a long learning curve

Account for specific circumstances
Design Guidelines: Proportionality

Principle of proportionality:


“any application, system, tool, or process
should balance its utility with the rights to
privacy (personal, informational, etc.) of
the involved individuals”
Method built on 3 “judgments”

Legitimacy: are the goals useful?

Appropriateness: find the best alternative

Adequacy: justify proper use of
parameters
Summary

Importance of ubicomp

Role of privacy

Users' perspectives

Case studies: prior research

Design guidelines
Conclusion





Ubicomp is an important concept and will
expand rapidly in the near future.

Usable privacy plays a vital role in
real-world ubicomp systems.

Privacy risks are a real threat to end users.

Design for ubicomp is challenging, but
there are guidelines for preserving privacy.

More research is needed.
Sources







Security and Usability, Chapter 19: Privacy Issues and Human-Computer Interaction (M.
Ackerman and S. Mainwaring)
Security and Usability, Chapter 20: A User-Centric Privacy Space Framework (B. Brunk)
Security and Usability, Chapter 21: Five Pitfalls in the Design for Privacy (S. Lederer, J.
Hong, A. Dey, and J. Landay)
S. Warren and L. Brandeis. The Right to Privacy. Harvard Law Review, 1890.
B. Kowitz and L. Cranor. Peripheral Privacy Notifications for Wireless Networks. In
Proceedings of the 2005 Workshop on Privacy in the Electronic Society, Alexandria, VA,
November 7, 2005, pp. 90-96.
G. Iachello, I. Smith, S. Consolvo, M. Chen, and G. Abowd. Developing Privacy
Guidelines for Social Location Disclosure Applications and Services. In Proceedings of
the Symposium On Usable Privacy and Security, Pittsburgh, PA, July 6-8, 2005.
G. Iachello and G. Abowd. Privacy and Proportionality: Adapting Legal Evaluation
Techniques to Inform Design in Ubiquitous Computing. In Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems (CHI '05), Portland, OR, April
2-7, 2005. ACM Press, pp. 91-100.

http://www.ubiq.com/hypertext/weiser/UbiHome.html

http://www.studies-observations.com/everyware/

http://www.ubicomp.org/ubicomp2009/
Questions?