Usable Privacy and Security

Quick Discussion – based on:
http://cups.cs.cmu.edu/courses/ups-sp08/





Unpatched Windows machines compromised
in minutes
Phishing web sites increasing by 28% each
month
Most PCs infected with spyware (avg. = 25)
Users have more passwords than they can
remember and practice poor password
security
Enterprises store confidential information on
laptops and mobile devices that are
frequently lost or stolen
“Give end-users
security controls they can understand
and privacy they can control for
the dynamic, pervasive computing
environments of the future.”
- Computing Research Association 2003
“Users do not want to be responsible for,
nor concern themselves with, their own
security.”
- Blake Ross


Security experts are concerned about the bad
guys getting in
Users may be more concerned about locking
themselves out




Pick a hard-to-guess password
Don’t use it anywhere else
Change it often
Don’t write it down
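A toy sketch of what checking this advice might look like in code; the thresholds, word list, and function name are illustrative assumptions, not a real strength estimator:

```python
import re

def looks_hard_to_guess(password, previous_passwords=()):
    """Toy heuristics for the advice above (illustrative only --
    real strength estimation needs dictionary and entropy checks)."""
    problems = []
    if len(password) < 12:                          # assumed minimum length
        problems.append("too short")
    if password.lower() in {"password", "letmein", "qwerty", "123456"}:
        problems.append("common password")
    if not (re.search(r"[a-z]", password) and re.search(r"[A-Z]", password)):
        problems.append("use mixed case")
    if not re.search(r"\d", password):
        problems.append("add a digit")
    if password in previous_passwords:              # "don't use it anywhere else"
        problems.append("reused password")
    return problems

print(looks_hard_to_guess("password"))              # flags several problems
print(looks_hard_to_guess("Tr4vel-mug-8-Llamas"))   # passes these checks
```

Note the tension the following slides discuss: the stricter such checks get, the more likely users are to write the password down.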

Make it “just work”
◦ Invisible security

Make security/privacy understandable
◦ Make it visible
◦ Make it intuitive
◦ Use metaphors that users can relate to

Train the user

Developers should
not expect users to
make decisions
they themselves
can’t make
13
- Chris Nodder
(in charge of user
experience for
Windows XP SP2)

Privacy is a secondary task

Users have differing privacy concerns and
needs
◦ Users of privacy tools often seek out these tools due
to their awareness of or concern about privacy
◦ Even so, users still want to focus on their primary
tasks
◦ One-size-fits-all interface may not work

Most users are not privacy experts

Many privacy tools reduce application
performance, functionality, or convenience
◦ Difficult to explain current privacy state or future
privacy implications
◦ Difficult to explain privacy options to them
◦ Difficult to capture privacy needs/preferences




Internet anonymity system
Allows users to send messages that cannot
be traced back to them (web browsing, chat,
p2p, etc.)
UI was mostly command line interface until
recently
2005 Tor GUI competition
◦ CUPS team won phase 1 with design for Foxtor!

Tor is configurable and different users will want to
configure it in different ways
◦ But most users won’t understand configuration options
◦ Give users choices, not dilemmas

We began by trying to understand our users
◦ No budget, little time, limited access to users
◦ So we brainstormed about their needs, tried to imagine them, and developed personas for them
Jim is a current UG at CSM.
Goals:
1. Be sure he’s on track to graduate in 4 years
2. Find some courses that are interesting
3. Get together with friends to study and have fun
Other:
Jim is taking a full course load and also working part
time, so he’s always very busy. He also tends to be
disorganized, so he keeps losing information and
having to look it up again. He is a little shy and
doesn’t know too many people in the department yet.
Susie is a parent researching schools for her son Bob, who will be
graduating from HS soon.
Goals:
1. She wants to find an environment that will be welcoming and
stimulating for Bob
2. She thinks Bob may ultimately want to pursue graduate work,
so she wants to be sure the school has faculty doing interesting
research
3. She wants to find out how expensive the school is and what
type of financial aid is available.
Other:
Susie works full time but considers her family to be a top priority.
It’s very important to her for her son to be happy, so she’s willing to
devote a fair amount of time to the task of selecting a university.
The family has a computer at home, so she’s spending her evenings
visiting websites to collect data. She’s comfortable surfing the web,
but prefers websites that are logical and not too cluttered.

The process led to realization that our users had 3
categories of privacy needs
◦ Basic, selective, critical

Instead of asking users to figure out complicated
settings, most of our configuration involves
figuring out which types of privacy needs they have
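One way to read this: configuration collapses to choosing a category, and the designers map each category to a bundle of settings. A minimal sketch of the idea (the setting names and values are hypothetical, not FoxTor's actual options):

```python
# Map a user's stated privacy-need category to a preset bundle of
# settings chosen by the designers, instead of exposing raw options.
# All setting names/values below are hypothetical illustrations.
PRESETS = {
    "basic":     {"route_via_tor": False,      "warn_on_sensitive_sites": True},
    "selective": {"route_via_tor": "per-site", "warn_on_sensitive_sites": True},
    "critical":  {"route_via_tor": True,       "warn_on_sensitive_sites": False},
}

def configure(category):
    """Choices, not dilemmas: the user picks a category, not settings."""
    try:
        return dict(PRESETS[category])
    except KeyError:
        raise ValueError(f"unknown privacy-need category: {category!r}")

print(configure("selective"))
```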


Privacy laws and regulations vary widely throughout the
world
US has mostly sector-specific laws, with relatively minimal protections - often referred to as a “patchwork quilt”
◦ Federal Trade Commission has jurisdiction over fraud and
deceptive practices
◦ Federal Communications Commission regulates
telecommunications

European Data Protection Directive requires all
European Union countries to adopt similar
comprehensive privacy laws that recognize privacy as
fundamental human right
◦ Privacy commissions in each country (some countries have
national and state commissions)
◦ Many European companies non-compliant with privacy laws
(2002 study found majority of UK web sites non-compliant)









Bank Secrecy Act, 1970
Fair Credit Reporting Act, 1971
Privacy Act, 1974
Right to Financial Privacy Act, 1978
Cable TV Privacy Act, 1984
Video Privacy Protection Act, 1988
Family Educational Rights and Privacy Act, 1974
Electronic Communications Privacy Act, 1986
Freedom of Information Act, 1966, 1991, 1996

HIPAA (Health Insurance Portability and
Accountability Act, 1996)
◦ Protects medical records and other individually identifiable health information

COPPA (Children’s Online Privacy Protection Act, 1998)
◦ Web sites that target children must obtain
parental consent before collecting personal
information from children under the age of 13

GLB (Gramm-Leach-Bliley Act, 1999)
◦ Requires privacy policy disclosure and opt-out
mechanisms from financial service institutions

Direct Marketing Association Privacy Promise
http://www.thedma.org/library/privacy/privacypromise.shtml

Network Advertising Initiative Principles
http://www.networkadvertising.org/

CTIA Location-based privacy guidelines
http://www.wowcom.com/news/press/body.cfm?record_id=907



Policies let consumers know about a site’s privacy practices
Consumers can then decide whether or not
practices are acceptable, when to opt-in or
opt-out, and who to do business with
The presence of privacy policies increases
consumer trust
What are some problems with privacy policies?

BUT policies are often
◦
◦
◦
◦
difficult to understand
hard to find
take a long time to read
change without notice
29


◦ Identification of site, scope, contact info
◦ Types of information collected, including information about cookies
◦ How information is used
◦ Conditions under which information might be shared
◦ Information about opt-in/opt-out
◦ Information about access
◦ Information about data retention policies
◦ Information about seal programs
◦ Security assurances
◦ Children’s privacy

There is lots of information to convey -- but the policy should be brief and easy to read too!

What is opt-in? What is opt-out?

Project organized by Hunton & Williams law firm
◦ Create short version (short notice) of a human-readable
privacy notice for both web sites and paper handouts
◦ Sometimes called a “layered notice” as short version would
advise people to refer to long notice for more detail
◦ Now being called “highlights notice”
◦ Focus on reducing privacy policy to at most 7 boxes
◦ Standardized format but only limited standardization of
language
◦ Proponents believe highlights format may eventually be
mandated by law

Alternative proposals from privacy advocates focus on
check boxes
Interest internationally
◦ http://www.privacyconference2003.org/resolution.asp

Interest in the US for financial privacy notices
◦ http://www.ftc.gov/privacy/privacyinitiatives/ftcfinalreport060228.pdf
WE SHARE [DO NOT SHARE] PERSONAL INFORMATION WITH OTHER WEBSITES OR COMPANIES.

Collection (yes / no for each):
◦ We collect personal information directly from you
◦ We collect information about you from other sources
◦ We use cookies on our website
◦ We use web bugs or other invisible collection methods
◦ We install monitoring programs on your computer

Uses (yes / no for each): We use information about you to:
◦ Send you advertising mail
◦ Send you electronic mail
◦ Call you on the telephone

Sharing (with your consent / without your consent / N/A): We allow others to use your information to:
◦ Maintain shared databases about you
◦ Send you advertising mail
◦ Send you electronic mail
◦ Call you on the telephone

Access: You can see and correct {ALL, SOME, NONE} of the information we have about you.

Choices: You can opt-out of receiving each of the following from us, affiliates, or third parties:
◦ Advertising mail
◦ Electronic mail
◦ Telemarketing

Retention: We keep your personal data for {SIX MONTHS, THREE YEARS, FOREVER}.

Change: We can change our data use policy {AT ANY TIME, WITH NOTICE TO YOU, ONLY FOR DATA COLLECTED IN THE FUTURE}.

Developed by the World Wide Web Consortium
(W3C) http://www.w3.org/p3p/
◦ Final P3P1.0 Recommendation issued 16 April 2002

Offers an easy way for web sites to
communicate about their privacy policies in a
standard machine-readable format
◦ Can be deployed using existing web servers

Enables the development of tools (built into
browsers or separate applications) that
◦ Summarize privacy policies
◦ Compare policies with user preferences
◦ Alert and advise users
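A simplified sketch of the “compare policies with user preferences” step. The dictionaries below are loosely modeled on P3P vocabulary (purpose, recipient, retention), but this is not real P3P or APPEL syntax:

```python
# A site's declared practices, simplified from a P3P-style policy.
site_policy = {
    "purpose":   {"current", "telemarketing"},
    "recipient": {"ours", "unrelated"},
    "retention": "indefinitely",
}

# The user's preferences, as a user agent might store them.
user_prefs = {
    "blocked_purposes":   {"telemarketing"},
    "blocked_recipients": {"unrelated"},
    "max_retention":      "stated-purpose",
}

def evaluate(policy, prefs):
    """Return human-readable warnings where the policy conflicts
    with the user's preferences (the 'alert and advise' step)."""
    warnings = []
    for p in sorted(policy["purpose"] & prefs["blocked_purposes"]):
        warnings.append(f"site may use your data for {p}")
    for r in sorted(policy["recipient"] & prefs["blocked_recipients"]):
        warnings.append(f"site may share your data with {r} parties")
    if policy["retention"] != prefs["max_retention"]:
        warnings.append("site keeps data longer than you prefer")
    return warnings

for w in evaluate(site_policy, user_prefs):
    print("WARNING:", w)
```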





Laboratory study of 28 non-expert computer users
Asked to evaluate 10 web sites, take 15 minute
break, evaluate 10 more web sites
Experimental group read web-based training
materials during break, control group played
solitaire
Experimental group performed significantly better
identifying phish after training
People can learn from web-based training
materials, if only we could get them to read them!



Most people don’t proactively look for
training materials on the web
Many companies send “security notice” emails
to their employees and/or customers
But these tend to be ignored
◦ Too much to read
◦ People don’t consider them relevant

Can we “train” people during their normal
use of email to avoid phishing attacks?
◦ Periodically, people get sent a training email
◦ Training email looks like a phishing attack
◦ If person falls for it, intervention warns and
highlights what cues to look for in succinct and
engaging format
P. Kumaraguru, Y. Rhee, A. Acquisti, L. Cranor, J. Hong, and E.
Nunge. Protecting People from Phishing: The Design and
Evaluation of an Embedded Training Email System. CyLab
Technical Report. CMU-CyLab-06-017, 2006.
http://www.cylab.cmu.edu/default.aspx?id=2253
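The training loop described above can be sketched as follows; the function names and intervention text are hypothetical illustrations, not the system from the paper:

```python
# Sketch of embedded training: deliver a simulated phishing email, and
# show an intervention only to users who actually click the link.
def send_training_email(user):
    # In a real system this is delivered as ordinary email.
    return {"to": user,
            "link": "http://example.test/login",   # simulated phishing URL
            "is_simulated_phish": True}

def intervention():
    # The study found a succinct comic-strip format worked best.
    return ("You just clicked a simulated phishing link. Cues to check: "
            "the real domain name, requests for passwords via email, "
            "urgent threats.")

def on_link_clicked(message):
    """Teachable moment: intervene only when the user falls for it."""
    if message.get("is_simulated_phish"):
        return intervention()
    return None

msg = send_training_email("alice@example.test")
print(on_link_clicked(msg))
```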

Lab study compared two prototype interventions to standard security notice emails from eBay and PayPal
◦ Existing practice of security notices is ineffective
◦ Diagram intervention somewhat better
◦ Comic strip intervention worked best
◦ Interventions most effective when based on real brands

Ecommerce personalization systems
◦ Concerns about use of user profiles

Software that “phones home” to fetch software
updates or refresh content, report bugs, relay
usage data, verify authorization keys, etc.
◦ Concerns that software will track and profile users

Communications software (email, IM, chat)
◦ Concerns about traffic monitoring, eavesdroppers

Presence systems (buddy lists, shared spaces,
friend finders)
◦ Concerns about limiting when info is shared and with
whom


Similar to the issues to consider for privacy tools, PLUS
Users may not be aware of privacy issues up
front
◦ When they find out about privacy issues they may
be angry or confused, especially if they view
notice as inadequate or defaults as unreasonable


Users may have to give up functionality or
convenience, or spend more time
configuring system for better privacy
Failure to address privacy issues adequately
may lead to bad press and legal action

Every time a user makes a new purchase that
they want to rate or exclude they have to edit
profile info
◦ There should be a way to set up default rules




Exclude all purchases
Exclude all purchases shipped to my work address
Exclude all movie purchases
Exclude all purchases I had gift wrapped
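Such default rules are naturally expressed as predicates over a purchase record; a purchase is kept out of the recommender profile if any rule matches. A minimal sketch (the field names are hypothetical):

```python
# Each rule pairs a label with a predicate over a purchase record.
RULES = [
    ("shipped to work address", lambda p: p["ship_to"] == "work"),
    ("movie purchase",          lambda p: p["category"] == "movies"),
    ("gift wrapped",            lambda p: p["gift_wrap"]),
]

def excluded_from_profile(purchase):
    """Return the labels of all exclusion rules the purchase matches."""
    return [label for label, rule in RULES if rule(purchase)]

purchase = {"ship_to": "home", "category": "movies", "gift_wrap": True}
print(excluded_from_profile(purchase))  # ['movie purchase', 'gift wrapped']
```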


Users should be able to
remove items from
profile
If purchase records are
needed for legal
reasons, users should
be able to request that
they not be accessible
online




Currently privacy-related options are found
with relevant features
Users have to be aware of features to find the
options
Put them all in one place
But also leave them with relevant features
How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)?
Desire to avoid unwanted marketing causes
some people to avoid giving out personal
information
The little people inside my computer might
know it’s me…
… and they might tell their friends

“My TiVo thinks I’m a psychopath!”
Everyone wants to be understood.
No one wants to be known.
…but then you started getting personalized
ads for your favorite brand of dog food


Concerns about being charged higher prices
Concerns about being treated differently

Revealing info to family members or coworkers
◦ Gift recipient learns about gifts in advance
◦ Co-workers learn about a medical condition

Revealing secrets that can unlock many
accounts
◦ Passwords, answers to secret questions, etc.
The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!



Stalkers, identity thieves, etc.
People who break into account may be able to
access profile info
People may be able to probe recommender
systems to learn profile information
associated with other users

Records are often subpoenaed in patent
disputes, child custody cases, civil litigation,
criminal cases


Governments increasingly looking for
personal records to mine in the name of
fighting terrorism
People may be subject to investigation even if
they have done nothing wrong



Wireless location tracking
Semantic web applications
Ubiquitous computing