Privacy Risk Models for
Designing Privacy-Sensitive
Ubiquitous Computing Systems
Jason Hong, Carnegie Mellon
Jennifer Ng, Carnegie Mellon
Scott Lederer, University of California, Berkeley
James Landay, University of Washington
Motivation
Ubiquitous Computing is Coming
Advances in wireless networking, sensors, devices
– Greater awareness of and interaction with physical world
“But what about my privacy?”
[Screenshots: Find Friends, E911]
Motivation
But Hard to Design Privacy-Sensitive Ubicomp Apps
Discussions on privacy generate lots of heat but not light
– Big brother, overprotective parents, telemarketers, genetics…
– Many conflicting values
– Often end up talking over each other
– Hard to have reasoned debates and create designs that
  address the issues
Need a design method that helps design teams:
– Identify
– Prioritize
– Manage privacy risks for specific applications
Propose Privacy Risk Models for doing this
Privacy Risk Model Analogy
Security Threat Model
“[T]he first rule of security analysis is this:
understand your threat model. Experience teaches
that if you don’t have a clear threat model –
a clear idea of what you are trying to prevent and
what technical capabilities your adversaries have –
then you won’t be able to think analytically about
how to proceed. The threat model is the starting
point of any security analysis.”
- Ed Felten
Privacy Risk Model
Two Parts: Risk Analysis and Risk Management
Privacy Risk Analysis
– Common questions to help design teams identify potential risks
– Like a task analysis
Privacy Risk Management
– Helps teams prioritize and manage risks
– Like severity rankings in heuristic evaluation
Will present a specific privacy risk model for ubicomp
– Draws on previous work, plus surveys and interviews
– Goal is a reasonable level of protection against foreseeable risks
Outline
• Motivation
• Privacy Risk Analysis
• Privacy Risk Management
• Case Study: Location-enhanced Instant Messenger
Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …
Social and Organizational Context
Who are the users? Who shares info? Who sees it?
Different communities have different needs and norms
– An app appropriate for families might not be for work settings
Affects conditions and types of info willing to be shared
– Location information with spouse vs co-workers
– Real-time monitoring of one’s health
Start with most likely users
– Ex. Find Friends
– Likely sharers are people using mobile phone
– Likely observers are friends, family, co-workers
[Screenshot: Find Friends]
Social and Organizational Context
What kinds of personal info are shared?
Different kinds of info have different risks and norms
– Current location vs home phone# vs hobbies
Some information already known between people
– Ex. Don’t need to protect identity with your friends and family
Different ways of protecting different kinds of info
– Ex. Can revoke access to location, cannot for birthday or name
Social and Organizational Context
Relationships between sharers and observers?
Kinds of risks and concerns
– Ex. Risks w/ friends are unwanted intrusions, embarrassment
– Ex. Risks w/ paid services are spam, secondary use, hackers
Incentives for protecting personal information
– Ex. Most friends don’t have reason to intentionally cause harm
– Ex. Neither do paid services, but want to make more money
Mechanisms for recourse
– Ex. Kindly ask friends and family to stop being nosy
– Ex. Recourse for paid services include formally complaining,
switching services, suing
Social and Organizational Context
Value proposition for sharing personal information?
What incentive do users have for sharing?
Quotes from nurses using locator badges
– “I think this is disrespectful, demeaning and degrading”
– “At first, we hated it for various reasons, but mostly we felt we
couldn’t take a bathroom break without someone knowing
where we were…[but now] requests for medications go right to
the nurse and bedpans etc go to the techs first... I just love
[the locator system].”
When those who share personal info do not benefit in
proportion to the perceived risks, the tech is likely to fail
Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …
Technology
– How is personal info collected?
– Push or pull?
– One-time or continuous?
– Granularity of info?
– …
Technology
How is personal info collected?
Different technologies have different tradeoffs for privacy
Network-based approach
– Info captured and processed by external computers that users
have no practical control over
– Ex. Locator badges, Video cameras
Client-based approach
– Info captured and processed on end-user’s device
– Ex. GPS, beacons
– Stronger privacy guarantees, all info starts with you first
Technology
Push or pull?
Push is when user sends info first
– Ex. you send your location info on an E911 call
– Few people seem to have problems with push
Pull is when another person requests info first
– Ex. a friend requests your current location
– Design space much harder here:
    need to make people aware of requests
    want to provide understandable level of control
    don’t want to overwhelm
[Screenshots: E911, Find Friends]
Technology
One-time or continuous disclosures?
One-time disclosure
– Ex. observer gets snapshot
– Fewer privacy concerns
Continuous disclosure
– Ex. observer repeatedly gets info
– Greater privacy concerns
– “It’s stalking, man.”
[Screenshots: Find Friends, Active Campus]
Granularity of info shared?
Different granularities have different utility and risks
Spatial granularity
– Ex. City? Neighborhood? Street? Room?
Temporal granularity
– Ex. “in Boston last month” vs “in Boston on August 2, 2004”
Identification granularity
– Ex. “a person” vs “a woman” vs “alice@blah.com”
Keep and use coarsest granularity needed
– Least specific data, fewer inferences, fewer risks
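As a concrete illustration of "keep and use the coarsest granularity needed", here is a minimal Python sketch (ours, not from the talk; the Location type and coarsen() are hypothetical names) that strips a location down to a chosen granularity level before disclosure:

# Sketch: disclose location at the coarsest granularity the observer needs.
# The Location type and coarsen() are hypothetical names for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Location:
    city: str
    neighborhood: Optional[str] = None
    street: Optional[str] = None
    room: Optional[str] = None

LEVELS = ["city", "neighborhood", "street", "room"]  # coarsest first

def coarsen(loc: Location, level: str) -> Location:
    """Return a copy of loc with every field finer than `level` blanked out."""
    keep = LEVELS[: LEVELS.index(level) + 1]
    return Location(**{f: getattr(loc, f) if f in keep else None for f in LEVELS})

home = Location("Pittsburgh", "Squirrel Hill", "Forbes Ave", "Room 217")
print(coarsen(home, "city"))          # co-workers might see only the city
print(coarsen(home, "neighborhood"))  # close friends might see the neighborhood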
Outline
• Motivation
• Privacy Risk Analysis
• Privacy Risk Management
• Case Study: Location-enhanced Instant Messenger
Privacy Risk Management
Helps teams prioritize and manage risks
First step is to prioritize risks by estimating:
– Likelihood that unwanted disclosure occurs
– Damage that will happen on such a disclosure
– Cost of adequate privacy protection
Focus on high likelihood, high damage, low cost risks first
– Like heuristic eval: fix high-severity and/or low-cost problems first
– Exact numbers are hard to get; the process matters more
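To make the prioritization step concrete, here is a minimal sketch that scores each risk from coarse high/medium/low estimates and sorts. The scoring scheme is our own illustration, not a formula from the talk:

# Sketch: rank privacy risks by (likelihood, damage, cost of protection).
# The scoring scheme is our own illustration, not a formula from the talk.
SCALE = {"low": 1, "medium": 2, "high": 3}

def priority(likelihood: str, damage: str, cost: str) -> int:
    # Higher likelihood and damage raise priority; higher protection cost lowers it.
    return SCALE[likelihood] * SCALE[damage] - SCALE[cost]

risks = [  # (risk, likelihood, damage, cost); ratings below are illustrative
    ("Over-monitoring by friends and family", "high", "medium", "low"),
    ("Over-monitoring at the workplace", "medium", "medium", "low"),
    ("Being found by a malicious person", "low", "high", "medium"),
]

for name, l, d, c in sorted(risks, key=lambda r: priority(*r[1:]), reverse=True):
    print(f"{priority(l, d, c):2d}  {name}")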
Privacy Risk Management
Helps teams prioritize and manage risks
Next step is to help manage those risks
How does the disclosure happen?
– Accident? Bad user interface? Poor conceptual model?
– Malicious? Inside job? Scammers?
What kinds of choice, control, and awareness are there?
– Opt-in? Opt-out?
– What mechanisms? Ex. Buddy list, Invisible mode
– What are the default settings?
Better to prevent or to detect abuses?
– “Bob has asked for your location five times in the past hour”
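The "Bob has asked for your location five times in the past hour" notification is one way to detect rather than prevent abuse. A minimal sketch of such a detector, assuming a simple sliding one-hour window (the names, threshold, and window size are hypothetical):

# Sketch: detect possible over-monitoring by alerting the user when one
# observer requests their location unusually often. Threshold and window
# are hypothetical illustration values.
import time
from collections import defaultdict, deque
from typing import Optional

WINDOW_SECS = 3600  # one hour
THRESHOLD = 5       # requests per window before alerting

request_log = defaultdict(deque)  # observer -> timestamps of recent requests

def record_request(observer: str, now: Optional[float] = None) -> Optional[str]:
    """Log a location request; return an alert message if the rate looks high."""
    now = time.time() if now is None else now
    log = request_log[observer]
    log.append(now)
    while log and now - log[0] > WINDOW_SECS:
        log.popleft()  # forget requests older than the window
    if len(log) >= THRESHOLD:
        return f"{observer} has asked for your location {len(log)} times in the past hour"
    return None

for i in range(5):
    alert = record_request("Bob", now=1000.0 + i)
print(alert)  # Bob has asked for your location 5 times in the past hour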
Case Study
Location-enhanced Instant Messenger
New features
– Request a friend’s current location
– Automatically show your location
– Invisible mode, reject requests
– Default location is “unknown”
Who are the users?
– Typical IM users
Relationships?
– Friends, family, classmates, …
One-time or continuous?
– One-time w/ notifications
Case Study
Location-enhanced Instant Messenger
Identifying potential privacy risks
– Over-monitoring by friends and family
– Over-monitoring at work place
– Being found by malicious person (ex. stalker, mugger)
Assessing the first risk, over-monitoring by family
– Likelihood depends on family, conservatively assign “high”
– Damage might be embarrassing but not life-threatening,
assign “medium”
Managing the first risk
– Buddy list, notifications for awareness, invisible mode,
  “unknown” if location not disclosed
– All easy to implement, cost is “low”
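Putting those mechanisms together: a minimal sketch of how the messenger client might answer a pull request (the Settings type and function names are hypothetical; the talk gives no implementation):

# Sketch: handling a "pull" location request in the location-enhanced IM.
# Settings, handle_location_request, and notify are hypothetical names.
from dataclasses import dataclass, field

@dataclass
class Settings:
    buddies: set = field(default_factory=set)  # observers allowed to ask
    invisible: bool = False                    # invisible mode rejects all requests
    location: str = "unknown"                  # default disclosure is "unknown"

def notify(msg: str) -> None:
    print("[notification]", msg)  # awareness: the user always sees requests

def handle_location_request(observer: str, s: Settings) -> str:
    notify(f"{observer} requested your location")
    if s.invisible or observer not in s.buddies:
        return "unknown"  # same answer as "not disclosed", so hiding is not revealed
    return s.location

s = Settings(buddies={"alice"}, location="Soda Hall, Berkeley")
print(handle_location_request("alice", s))  # discloses the location
print(handle_location_request("bob", s))    # returns "unknown"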
Discussion
Privacy risk models are only a starting point
– Like task analysis, should try to verify assumptions and answers
– Can be combined with field studies, interviews, low-fi prototypes
Summary
Privacy risk models for helping design teams
identify, prioritize, and manage risks
Privacy risk analysis for identifying risks
– Series of common questions, like a task analysis
Privacy risk management for prioritizing & managing risks
– Like severity ratings in heuristic evaluation
Described our first iteration of privacy risk model
– Help us evolve and advance it!