Privacy in Ubiquitous Computing II

Yaser Ghanam

John has an account with each of the three entities
you have.
 Choose one entity to be YOU (as a group).
 Assume you have unconditional access to the
databases of the other two.
 Using this data, how can you improve your business?
 Example: As eBay, how can I use info from my client’s
PayPal account and his CoverMe insurance account to
improve my business?
“A funeral home obtained the names and
addresses of people diagnosed with cancer,
and contacted a Montreal woman on the list
about buying a burial plot and pre-paid
funeral services. ”
http://www.media-awareness.ca/english/issues/privacy/why_issue_privacy.cfm
“A candy company got hold of the names of
people in a weight watchers program, and
sent them chocolate bars in the mail.”
http://www.media-awareness.ca/english/issues/privacy/why_issue_privacy.cfm
“Participants in a medical research study
reported an increase in difficulties getting
employment and insurance after they gave
genetic samples to the researchers.”
http://www.media-awareness.ca/english/issues/privacy/why_issue_privacy.cfm

John has an account with each of the three entities
you have.
 Choose one entity to be YOU (as a group).
 Assume you have unconditional access to the
databases of the other two.
 Using this data, how can you improve your service?
 Example: As eBay, how can I use info from my client’s
PayPal account and his CoverMe insurance account to
improve my service?







What is privacy?
How does it affect you?
Do we care about privacy?
Borders of privacy
Information collection
Context aware systems
Design for privacy
“The claim of individuals... to determine for
themselves when, how, and to what extent
information about them is [collected and]
communicated to others.”
 Alan F. Westin. Privacy and Freedom. Atheneum, New York
NY, 1967.

“The right to be left alone!”

Your information -> Heaven for Marketing
 telemarketers, junk mail, spam email.

Social implications
 Someone gets access to your online dating
account (your wife/husband!!)

Political misuse
 IBM's Hollerith punch card technology was used
to collect census data, later used by the Nazis to
identify Jews for transport to extermination
camps.
Black, E. (2001). IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most
Powerful Corporation. New York: Crown.

Exercise:
 I am currently conducting market research as part
of my business plan for bringing a new set of products
to market. Are you willing to voluntarily give
me permission to have copies of the receipts of all
shopping transactions you make for 12 months?
 What if I give you 5% of the overall total reported
on these receipts?


A study was conducted to answer this question.
75% of people were concerned about their
privacy or commercial profiling.
BUT: in exchange for uncertain, smallish gains,
87% of participants disclosed large amounts of
private information.
 “people do not act according to their stated
preferences”

User preference vs. behaviour: Spiekermann, Grossklags, Berendt (2001). Stated Privacy Preferences versus Actual
Behaviour in EC Environments: A Reality Check. Proc. 5th Int. Conf. Wirtschaftsinformatik.

Natural borders: Physical limitations of observations
 walls, doors, clothing, darkness,
 also sealed letters, phone calls.

Example:
 A context-aware wearable system
 The feeling of having someone (or something) constantly
peeking over our shoulder and second guessing us
 Such information might be able to determine:
▪ your physical location (were you at the crime scene?)
▪ your intentions (by assessing the data feed from body sensors)
 This motivates legislation that would make the deletion of
such information a crime
▪ just as recent laws criminalize cybercrime
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Social borders: Expectations about confidentiality
 for members of certain social roles: family, doctors,
lawyers.
 also expectations that your colleagues will not read
your personal fax messages, or material left lying around
the photocopy machine.

Example:
 Wearable health monitoring devices improve
information flow to your physicians and their personnel.
 Yet, they threaten to facilitate data sharing beyond local
clinic staff to include your health insurer and employer.
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Spatial or temporal borders: The expectation of people that
parts of their life can remain separate.
 a wild adolescence should not interfere with today’s life as a father
 also keeping your work colleagues separate from your friends at your favourite club.

Example:
 Mileage programs allow airlines to increase customer loyalty and
provide consumers with free flights and other rewards.
 Airlines assign different values to each customer (“gold,” “silver”).
 Sales agents assess my “net worth” to the company, and offer me
special services (if I am “valuable”).
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

Borders due to ephemeral or transitory effects
 an action that we hope gets forgotten soon.
 also old pictures and letters that we put out in our trash.
 our expectations of being able to have information simply
pass away unnoticed or forgotten.

Example:
 Memory Amplifier: Any statement I make during a private
conversation could potentially be played back.
 Even if this information would never get disclosed to
others:
▪ Do you want to deal with people who have perfect memories?
Gary T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001.

In the solution you provided earlier, which
privacy borders might you have crossed?
 Natural
 Social
 Spatial/Temporal
 Ephemeral/Transitory



Phone calls
Internet usage
Social networks
 Personal info, educational background, religious
views, activities, friends, friends of friends, etc.


RFID technology
Search engines
Raw
• Biological identity: fingerprints, race, colour, gender, physical attributes, DNA
• Financial identity: bank accounts, credit cards, stocks, transfers
• Social identity: membership in religious institutions, auto clubs, ethnicity
Derived
• Financial behaviour: saving patterns, tendency to buy/sell stocks
• Shopping behaviour: spending behaviours, attraction to offers
• Data linking: mood detection & browsing behaviour → recommendation systems
Blaine A. Price, Karim Adam, Bashar Nuseibeh. Keeping Ubiquitous Computing to Yourself: A practical model for user control of privacy.
Carman Neustaedter and Saul Greenberg. The Design of a Context-Aware Home Media Space for Balancing Privacy and Awareness.
UbiComp 2003, LNCS 2864.
Boyle, M., Neustaedter, C. and Greenberg, S. Privacy Factors in Video-based Media Spaces. In Harrison, S. (Ed.) Media
Space: 20+ Years of Mediated Life. Springer.
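To make the raw-vs-derived distinction concrete, here is a minimal Python sketch of data linking: joining records from two separate databases on a shared identifier to derive behavioural traits. All names, fields, and thresholds are illustrative assumptions, not part of any cited system.

```python
# Hypothetical raw records held by two separate entities,
# keyed on a shared identifier.
finance = {"john": {"avg_monthly_saving": 120}}
shopping = {"john": {"discount_purchases": 9, "total_purchases": 10}}

def derive_profile(user):
    """Link raw records across databases to derive behavioural traits.

    The traits and thresholds are made up for illustration: a "saver"
    flag from saving patterns, and "attracted to offers" from the
    share of purchases made at a discount.
    """
    f, s = finance[user], shopping[user]
    return {
        "saver": f["avg_monthly_saving"] > 100,
        "attracted_to_offers":
            s["discount_purchases"] / s["total_purchases"] > 0.5,
    }

print(derive_profile("john"))
# → {'saver': True, 'attracted_to_offers': True}
```

Neither database alone reveals these traits; linking them does, which is exactly why unconditional cross-database access is privacy-sensitive.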


Always-on, video-based
Awareness of others’ presence & activities

Camera state: play, pause or stop


Capturing angle: control what is in focus
Video fidelity: blurring, frame rate, frame size


Easy-off button: turn off the camera
Gesture (de)activation: camera & microphone
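As a rough illustration of the video-fidelity control above, the sketch below pixelates a grayscale frame by block averaging: larger blocks reveal less detail, trading awareness for privacy. The frame data and function name are assumptions for illustration, not the actual system's code.

```python
def pixelate(frame, block):
    """Reduce video fidelity by averaging block x block pixel regions.

    frame is a 2D list of grayscale values (0-255). block=1 leaves
    the frame unchanged; larger blocks hide more detail.
    """
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cells = [(y, x)
                     for y in range(by, min(by + block, h))
                     for x in range(bx, min(bx + block, w))]
            avg = sum(frame[y][x] for y, x in cells) // len(cells)
            for y, x in cells:
                out[y][x] = avg
    return out

# A checkerboard-like frame: clearly structured at full fidelity.
frame = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [255, 255, 0, 0],
         [255, 255, 0, 0]]

print(pixelate(frame, 4))
# → four rows of [127, 127, 127, 127]: the pattern is gone.
```

Real systems would use camera-driver or OpenCV filters rather than pure Python, but the privacy trade-off is the same: fidelity down, privacy up.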



Telecommuter detection
Family/Friend detection
Visual/Audio feedback

Context → camera behaviour (reconstructed from the slide diagram):
 Telecommuter is the only person at the computer → more awareness, less privacy.
 Someone else is present in the room → camera angle away from wall, or camera is paused → less awareness, more privacy.


Privacy is a concern with new technologies.
Solutions need to balance privacy with:
 Unobtrusiveness (Wearable)
 Awareness (HMS)
 Convenience (Google)

In the end, it is all up to us.

In the video:
 One guest said: “Google does do some nasty stuff… They
know so much about people!”
 Another said: “Their point is that it does make your
searches better.”

Similarly, health monitoring systems
 One might say: “They expose so much info about your
activities, diets, moods…”
 Another says: “But they improve health awareness!”

Where do you stand?