Privacy in Ubiquitous Computing

Yaser Ghanam

http://www.youtube.com/watch?v=BGrji2bIiG8

What is privacy to you?



Solicited from audience.
Definition.
How does it affect you?



Marketing harassment
Social implications
Political misuse (http://www.youtube.com/watch?v=BGrji2bIiG8)

But some people think it is not a big deal.

Are we aware of information collection?








How can we be aware?



One-time alert (fades out)
On-use alerts (obtrusiveness)
Do people really care?




Mobile phones (call recording, SMS; in the UK, details of web and email usage stay on ISPs' servers for a year)
iTunes
Search engines (http://jumpcut.com/view/?id=E6F1D3F067F411DCB8E0000423CF381C)
Social networks
At work!
RFID tags (http://video.canadiancontent.net/14116142-privacy-and-rfid-chips.html)
Types of information collected
Difference between what people prefer and how people behave.
How much privacy?
Borders of privacy
Towards more privacy

Privacy-driven design: Privacy principles.
“The claim of individuals... to determine for
themselves when, how, and to what extent
information about them is [collected and]
communicated to others.”
 Alan F. Westin. Privacy and Freedom. Atheneum, New York
NY, 1967.

Your information -> Heaven for Marketing
 telemarketers, junk mail, spam email.

Social implications
 Someone gets access to your online dating
account (your wife/husband!!)

Political misuse
 IBM's Hollerith punch card technology was used
to collect census data, later used by the Nazis to
identify Jews for transport to extermination
camps.
Black, E. (2001). IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most
Powerful Corporation. New York: Crown.

Or.. is it?
“All this secrecy is making life harder, more
expensive, dangerous and less serendipitous”
 Peter Cochrane. Head to Head. Sovereign Magazine, pages 56–57, March 2000.

Be “good” and you’d have nothing to worry
about.
 But do people agree on what “good” is?




Phones calls
Internet usage
Search engines
Social networks
 Personal info, educational background, religious
views, activities, friends, friends of friends ... Etc.

RFID technology
Biological identity
• fingerprints, race, colour, gender, physical attributes, DNA
Financial identity
• bank accounts, credit cards, stocks, transfers
Legal identity
• SIN, Passport info, Driver’s licence
Social identity
• membership in religion institutions, auto clubs, ethnicity
Relationships
• child of, parent of, spouse of, friend of
Property Associations
• home address, business address, insurance policies
Blaine A. Price, Karim Adam, Bashar Nuseibeh, Keeping Ubiquitous Computing to Yourself: a practical model for user control of privacy.
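As a rough illustration only, the identity categories above could be modelled as a simple record. A minimal sketch in Python; all class and field names are my own assumptions, not from the cited paper.

from dataclasses import dataclass, field
from typing import List

@dataclass
class IdentityProfile:
    # One list of attributes per category from the slide above.
    biological: List[str] = field(default_factory=list)      # fingerprints, DNA, ...
    financial: List[str] = field(default_factory=list)       # bank accounts, credit cards, ...
    legal: List[str] = field(default_factory=list)            # SIN, passport info, ...
    social: List[str] = field(default_factory=list)           # memberships, ethnicity, ...
    relationships: List[str] = field(default_factory=list)    # "spouse of B", "parent of C", ...
    property_assoc: List[str] = field(default_factory=list)   # home address, insurance policies, ...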
Financial behaviour
• Trends and changes: month-to-month variance against baseline.
Social behaviour
• Behaviour statistics: drug use, violations of law, family traits.
Shopping behaviour
• Purchase of an item in a certain class suggests a desire to buy other items in the same class.
DNA analysis
• DNA linked to a human genome database infers tendency to disease, psychological behaviour.
Data linking
• A device with a given MAC address was seen at a given place/time.
• This MAC address is registered to person A.
• Conclusion: person A was at that place/time (see the sketch below).
Blaine A. Price, Karim Adam, Bashar Nuseibeh, Keeping Ubiquitous Computing to Yourself: a practical model for user control of privacy.
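The data-linking inference above is essentially a join between an observation log and a device registry. A minimal Python sketch; the records and field names are illustrative assumptions, not data or code from the cited paper.

# Join device sightings with a device-ownership registry to infer presence.
sightings = [
    {"mac": "00:1A:2B:3C:4D:5E", "place": "Cafe X", "time": "2008-03-01 14:05"},
]
registry = {
    "00:1A:2B:3C:4D:5E": "Person A",
}

def link(sightings, registry):
    # Every sighting whose MAC address appears in the registry yields a
    # "person was at place/time" inference.
    for s in sightings:
        owner = registry.get(s["mac"])
        if owner:
            yield owner, s["place"], s["time"]

for person, place, when in link(sightings, registry):
    print(f"{person} was at {place} at {when}")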


Always-on, video-based
Awareness of others’ presence & activities
Carman Neustaedter and Saul Greenberg, The Design of a Context-Aware Home Media Space for Balancing Privacy and Awareness,
UbiComp 2003, LNCS 2864.

Exercise:
 I am currently conducting market research as part of my business plan for a new line of products. Are you willing to voluntarily give me permission to have copies of the receipts of all shopping transactions you make for 12 months?
 What if I give you 5% of the overall total reported on these receipts?


A study was conducted to answer this question.
75% of people said they were concerned about their privacy or about commercial profiling.
BUT: in exchange for uncertain, small gains,
87% of participants disclosed large amounts of private information.
"people do not act according to their stated preferences"
User preference vs. behaviour: Spiekermann, Grossklags, Berendt (2001) Stated Privacy Preferences versus Actual
Behaviour in EC environments: a Reality Check, Proc 5th Int Conf Wirtschaftsinformatik.

Individualistic view
 cost vs. benefit. Free flow of info enables us to build better
personalized, proactive systems.
 So, protect only highly sensitive data.

Collectivistic view
 Society as a whole can benefit from less privacy
▪ E.g. exposing criminal acts, encouraging honesty & transparency
 So, value societal benefits over individuals’ privacy.

Reciprocal view
 No watchers and watched, you know as much about
anybody else as they know about you.
 So, deal with privacy based on “egalitarian” knowledge.
Mika Raento, Privacy in ubiquitous computing: Lots of questions, a couple of answers.

Natural borders: Physical limitations on observation
 walls, doors, clothing, darkness;
 also sealed letters, phone calls.

Example:
 A context-aware wearable system
 The feeling of having someone (or something) constantly peeking over our shoulder and second-guessing us
 Laws might be enacted allowing such information to be used to determine:
▪ your physical data (were you at the crime scene?)
▪ your intentions (by assessing the data feed from body sensors)
 This in turn motivates legislation that would make the deletion of such information a crime
▪ just as with recent laws against cybercrime
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Social borders: Expectations about confidentiality
 for members of certain social roles: family, doctors,
lawyers.
 also expectations that your colleagues will not read
your personal fax messages, or material left lying around
the photocopy machine.

Example:
Wearable health monitoring devices improve information flow to your physicians and their personnel.
 Yet, they threaten to facilitate data sharing beyond local clinic staff to include your health insurer and employer.
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Spatial or temporal borders: The expectation that parts of one's life can remain separate.
 a wild adolescent time should not interfere with today's life as a father
 also keeping your work colleagues separate from your friends in your favourite club.

Example:
 Mileage programs allow airlines to increase customer loyalty and provide consumers with free flights and other rewards.
 A different value is assigned to each customer ("gold," "silver")
 Sales agents assess my "net worth" to the company once they have my frequent flyer number, and consequently provide me with special services (if I am "valuable")
 or withhold certain options from me (if I am not)
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Borders due to ephemeral or transitory effects
 an action that we hope gets forgotten soon.
 also old pictures and letters that we put out in our trash.
 our expectations of being able to have information simply
pass away unnoticed or forgotten.

Example:
 Memory Amplifier: Any statement I make during a private conversation could potentially be played back (audio & video).
 Even if this information would never get disclosed to
others:
▪ Do you want to deal with people who have perfect memories?
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Well-established in HCI: users don't change default settings.
 Changing them is not obvious, is too difficult, or is not rewarding.




Build systems and data collection practices that users can understand and give permission for.
Distribute data only to entities the users trust.
Make the setup of trust relations easy enough
for users.
Make the system compelling enough so that the
users are willing to configure it.
Leysia Palen (1999), Social, individual and technological issues for groupware calendar systems.
Mika Raento, Privacy in ubiquitous computing: Lots of questions, a couple of answers.
Boyle, M., Neustaedter, C. and Greenberg, S. Privacy Factors in Video-based Media Spaces. In Harrison, S. (Ed.) Media Space: 20+ Years of Mediated Life. Springer.

Openness and transparency
 Enable the user to have an accurate mental model of the system's workings
 Obtain user’s consent before collecting/using data
 But: Systems are supposed to be invisible
 How can the user be aware of when and what data is being collected?
 How do you get consent for all collection?

Individual participation: subject can see and modify records
 How can the user correct a model built by the system?

Collection limitation: not excessive for purpose
 Build systems that can work without knowing in advance how relevant each attribute is.
 Ensure the quality of the data used.
 Minimize the length of history stored (see the sketch below)

Use limitation: only for stated purposes, access controls
 How do individuals give permission to distribute data to others?
Marc Langheinrich, Privacy by Design - Principles of Privacy-Aware Ubiquitous Systems, UbiComp 2001 Proceedings.
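One concrete reading of "minimize length of history stored", as a hedged sketch: records carry a collection timestamp and are pruned once a declared retention window expires. The field names and the 30-day window are assumptions for illustration, not taken from Langheinrich's paper.

from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # assumed, purpose-bound retention period

def prune_history(records, now=None):
    """Keep only records still inside the retention window."""
    now = now or datetime.utcnow()
    return [r for r in records if now - r["collected_at"] <= RETENTION]

# Example: a year-old location record is dropped, a recent one is kept.
records = [
    {"data": "location", "collected_at": datetime.utcnow() - timedelta(days=365)},
    {"data": "location", "collected_at": datetime.utcnow() - timedelta(days=2)},
]
print(len(prune_history(records)))  # 1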
Capture
• Feedback about: when and what info about me gets into the system.
• Control over: when & when not to give out what info; I can enforce my own preferences for system behaviours with respect to each type of info I convey.
Construction
• Feedback about: what happens to info about me once it gets inside the system.
• Control over: what happens to info about me; I can set automatic default behaviours & permissions.
Accessibility
• Feedback about: which people and what software have access to info about me and what info they see or use.
• Control over: who & what has access to what info about me; I can set automatic default behaviours & permissions.
Purposes
• Feedback about: what people want info about me for; since this is outside the system, it may only be possible to infer purpose from construction & access behaviours.
• Control over: it is infeasible for me to have technical control over purposes; with appropriate feedback, however, I can exercise social control to restrict intrusion, unethical & illegal usage.
Victoria Bellotti and Abigail Sellen. Design for Privacy in Ubiquitous Computing
Environments.
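One way to read the framework above is as a per-information-type policy record: the user gets feedback on capture, construction, and accessibility, and keeps whatever technical control is feasible over each. The structure below is my own hedged encoding of the table, not an implementation by Bellotti and Sellen; all names are assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class InfoPolicy:
    info_type: str                        # e.g. "location", "video"
    capture_allowed: bool = False         # when/what info about me may enter the system
    construction: str = "discard"         # what the system may build or keep from it
    accessible_to: List[str] = field(default_factory=list)  # who/what may see or use it
    stated_purpose: str = ""              # purpose: only socially/legally enforceable

# Example: share coarse presence with family only, for awareness at home.
location_policy = InfoPolicy(
    info_type="location",
    capture_allowed=True,
    construction="aggregate only",
    accessible_to=["family"],
    stated_purpose="presence awareness at home",
)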


Policy matching: provide mechanisms for comparing the user's policy to that of the ubicomp service.
Noise: hide or disguise a user's location or identity by one of the following (see the sketch after this list):
 Anonymizing: hiding the identity of the user
 Hashing: disguising the identity of the user
 Cloaking: making the user invisible
 Blurring: decreasing the accuracy of the location
 Lying: giving intentionally false information about location or time
Blaine A. Price, Karim Adam, Bashar Nuseibeh, Keeping Ubiquitous Computing to Yourself: a practical model for user control of privacy.
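A minimal sketch of two of the mechanisms above: policy matching (accept a service only if it asks for no more than the user allows) and blurring (reduce coordinate precision), plus hashing of an identifier. The matching rule, the granularity, and all names are assumptions for illustration, not the model specified in the cited paper.

import hashlib

def policy_matches(user_policy, service_policy):
    """Service is acceptable if it requests a subset of the allowed data
    and retains it no longer than the user permits."""
    return (set(service_policy["data"]) <= set(user_policy["allowed_data"])
            and service_policy["retention_days"] <= user_policy["max_retention_days"])

def blur_location(lat, lon, decimals=2):
    """Decrease location accuracy by rounding coordinates (~1 km at 2 decimals)."""
    return round(lat, decimals), round(lon, decimals)

def hash_identity(user_id, salt="per-session-salt"):
    """Disguise an identifier: linkable across sightings only by whoever knows the salt."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

user = {"allowed_data": {"location"}, "max_retention_days": 7}
service = {"data": {"location"}, "retention_days": 1}
print(policy_matches(user, service))        # True
print(blur_location(51.0447, -114.0719))    # (51.04, -114.07)
print(hash_identity("alice@example.com"))   # pseudonym, not the raw address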