PRIVACY versus UTILITY and CONTROL:
A False Dichotomy?
Giuseppe Bianchi
Roma, August 19, 2008
YICGG 2008
The motivating questions
• Ubiquitous “any time, any place” e-presence
• Pervasive services in a virtual pervasive world
• Body networks, sensor networks, internet of things…
Is pervasive technology forcing users to waive their privacy rights?
And how much of our privacy can be traded for security and/or new service opportunities?
Privacy issues in modern pervasive environments

Tons of e-data gathered from…
• Pervasive always-on devices
• Body-connected devices/tags
• Sensing (smart) environments
• Biometrics
• E-surveillance

…and processed: correlation, profiling, data mining, …
• Location (where I am)
• Access (who I am)
• Communication (who I'm connecting to)
• Application (what I'm doing)
• …

Result:
• precise definitions of the individual's preferences
• personal information on consumers perceived as an asset → often the subject matter of commercial transactions
• individuals often do not consent, or are not even aware!
Achieving Privacy

Privacy: what it should be
• the right of the individual to control his personal information by posing specific obligations on the subjects that process his data

What it is:
• No consent = no service
• Consent → loose specification of how data will be treated
→ Regulation fails to be precise (in most cases)
→ In practice: consent = I accept to lose my privacy
Privacy: a difficult problem

Opposite and conflicting interests:
• User privacy: anonymity, untraceability, … How to guarantee against criminals and abuses?
• Public interest: public authorities and bodies must achieve public interests (governmental, security, economic, tax purposes, etc.)
• Private activities / service operation: information vital for commercial and customer relationships, business activities and strategies, … How to provide effective service without full data disclosure?

→ DICHOTOMY! High privacy levels mean no control on users and services that are hard to run; high security and high service utility demand a high level of control.
The current solution: trust and hope
• Trust that the providers and their employees respect your privacy (and the applicable laws)
• Trust that the security measures set up by the provider are adequate
Privacy vs utility/control: trade-offs set forth only by regulation
When public interest prevails

E.g.: EU directives
• Article 8(2) of the European Convention of Human Rights and Fundamental Freedoms (Dec. 2000): i) national security, ii) public safety, iii) economic well-being of the country, iv) prevention of disorder or crime, v) protection of health or morals, vi) protection of the rights and freedoms of others
• Article 13 of the EU Directive 95/46/EC (Nov. 1995): (a) national security; (b) defense; (c) public security; (d) prevention, investigation, detection and prosecution of criminal offences, or of breaches of ethics for regulated professions; (e) important economic or financial interest of a State, including monetary, budgetary and taxation matters; (f) monitoring, inspection or regulatory function connected with the exercise of official authority; (g) the protection of the data subject or of the rights and freedoms of others
• Also (under certain limits) for scientific research and statistics purposes
What data protection laws say

EC/95/46 European Data Protection Directive
• …
• Data must be processed fairly and lawfully
• Data must be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes
• Data must be adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed
• Data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed
• …
→ Consistent in ALL regulations!!

Clear general principles, but not translated into a detailed set of mandatory security measures:
• a "minimum" of data security measures that must be implemented
• "fair" processing of the data
• "appropriate, adequate, suitable" data security measures that should be adopted
→ Way too generic terms (how to enforce them?)
Why laws and trust may not be the right answer

• The issue of insiders: how to tightly control employees and sysadmins?
• Security of long-term stored data is costly: data retention laws pose long-term storage obligations (6 months to 5 years)
• Many recent privacy-violation scandals:
  • GSM wiretapping in Italy, Greece, etc.
  • Abuse of lawful-interception hooks
  • Profiling abuses by operators
Towards technology-based solutions

Four major directions:
• Empower users with improved control on their data access and disclosure (→ Example 1)
• Rely as much as possible on pseudonymization and anonymization (→ Example 2)
• Divide et impera: distribute data across multiple entities, and permit recombination only by authorities (→ Example 2)
• Permit (selective and in-purpose) operation on data only if specific technically enforceable conditions arise (→ Example 3)
Example 1: RFID Contactless Privacy Manager

Developed in the IST-DISCREET EU project; ref: [deliverables @ www.discreet.org, 2007-2008]

[Figure: everyday RFID-tagged items readable by any nearby reader: faked hair, artificial leg (tag callouts: model #4456, polyester; model #459382), Bakunin letters, 5 x $100 banknotes (serial #: 597387, 389473, …), lingerie (model Penelope bras, size 4)]

CPM: a special RFID tag/reader (a toy sketch of its policy follows):
• selectively jams every access attempt
• permits access only to user-authorized tags
• user-friendly GUI (on a cellular phone) connected via Bluetooth to the CPM
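
To make the allow-or-jam policy concrete, here is a minimal sketch of the decision loop, assuming a hypothetical firmware hook that sees each reader query before a tag can answer; the class and method names are illustrative, not taken from the DISCREET deliverables.

```python
# Hypothetical CPM decision loop: jam by default, answer only for
# tags the owner has explicitly released via the phone GUI.
from dataclasses import dataclass, field

@dataclass
class ContactlessPrivacyManager:
    # Tag IDs the owner has released for reading (set via the GUI).
    authorized_tags: set = field(default_factory=set)

    def authorize(self, tag_id: str) -> None:
        """Called by the Bluetooth-connected GUI on the user's phone."""
        self.authorized_tags.add(tag_id)

    def on_reader_query(self, tag_id: str) -> str:
        """Decide, per access attempt, whether to jam the tag's reply."""
        if tag_id in self.authorized_tags:
            return "ALLOW"   # let the genuine tag respond
        return "JAM"         # emit noise so the reply is unreadable

cpm = ContactlessPrivacyManager()
cpm.authorize("banknote-597387")
print(cpm.on_reader_query("banknote-597387"))    # -> ALLOW
print(cpm.on_reader_query("bras-penelope-4"))    # -> JAM
```

The point of the design is that it fails closed: an item the owner never configured stays jammed, rather than silently readable.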
Example 2: pseudonymization PKI /1

Traditional access:
• Identity = authenticate the user
• Identity = control whether the user is authorized
• Identity = use as an accounting label

A rethinking:
• Authentication: no identity necessary → we only need a way to get back to the user if abuses occur…
• Authorization: no identity necessary → showing a permission does not necessarily imply identity disclosure
• Accounting: a pseudonym is sufficient → we need a stable label (an alias for the user) and payment mechanisms

Key idea: separate the pseudonym from the access permission (a toy sketch follows)
• Pseudonym: a label provided by a public infrastructure, which may be reverted only by authorities
• Access permission: a cryptographic "blind" credential
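
As a rough illustration of what a "blind" credential buys, here is a minimal sketch using textbook Chaum-style RSA blinding; the tiny key and the permission string are toy assumptions, and, as the next slide notes, the actual DISCREET scheme requires a non-trivial extension of this basic idea.

```python
# Chaum-style RSA blind signature as a toy "blind" credential.
# Textbook parameters: NOT secure, illustration only.
import hashlib
import secrets
from math import gcd

# Issuer's toy RSA key pair.
p, q = 61, 53
n = p * q     # public modulus
e = 17        # public exponent
d = 2753      # private exponent: e*d = 1 mod (p-1)*(q-1)

def digest(msg: bytes) -> int:
    """Map a permission string into Z_n."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def blind(m: int):
    """User side: hide m behind a random factor r before issuance."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            return (m * pow(r, e, n)) % n, r

def sign(blinded: int) -> int:
    """Issuer side: sign without ever seeing the underlying m."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """User side: strip r, leaving an ordinary RSA signature on m."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(sig: int, m: int) -> bool:
    """Any verifier: standard RSA check, no issuer interaction."""
    return pow(sig, e, n) == m

m = digest(b"permission: access service X")
blinded, r = blind(m)
credential = unblind(sign(blinded), r)
assert verify(credential, m)   # valid, yet unlinkable to its issuance
```

Because the issuer only ever saw the blinded value, it cannot later link a credential being shown to the registration session that produced it, which is exactly the separation the key idea calls for.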
Example 2: pseudonymization PKI /2

Developed in the IST-DISCREET EU project; ref: [ENTCS 197, 2008]

[Figure: SP server(s) in the SP administrative domain and IR server(s) in the IR administrative domain, with a two-phase message flow (phase 1, phase 2) and a pseudonym-disclosure path reserved to the authorities]

• Multiple servers, chosen by the user
• Pseudonym not assigned by the SP
• At registration, the pseudonym gets "blindly" authorized (the SP sees the real user); a non-trivial crypto extension is needed
• During access, i) the pseudonym gets verified and ii) the authorization is checked; no identity revealed (see the flow sketch below)
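
Tying the two phases together with the same toy RSA blinding as before: at registration the SP authenticates the real user but authorizes only a blinded pseudonym; at access time the credential is checked against the pseudonym and the identity never appears. Every name and parameter below is illustrative, not part of the published protocol.

```python
# Toy two-phase flow (phase 1: registration, phase 2: access).
# Textbook RSA blinding with tiny parameters: NOT secure.
import hashlib
import secrets
from math import gcd

p, q = 61, 53                 # SP's toy RSA signing key
n, e, d = p * q, 17, 2753

def h(b: bytes) -> int:
    return int.from_bytes(hashlib.sha256(b).digest(), "big") % n

# Phase 1: registration. The SP authenticates the real user here,
# but only ever sees a *blinded* version of the pseudonym.
real_identity = "alice@example.org"      # known to the SP in this phase
pseudonym = secrets.token_hex(8)         # label from the IR infrastructure
m = h(pseudonym.encode())
while True:
    r = secrets.randbelow(n - 2) + 2     # user's blinding factor
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n                        # what the SP sees
credential = (pow(blinded, d, n) * pow(r, -1, n)) % n   # user unblinds

# Phase 2: access. Pseudonym verified, authorization checked;
# real_identity plays no role whatsoever in this phase.
assert pow(credential, e, n) == h(pseudonym.encode())
print(f"access granted to pseudonym {pseudonym}; identity never shown")
```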
Example 3: privacy-preserving network monitoring

Developed in the IST-PRISM EU project; ref: [ACM CCS NDA08, 2008]

IDEA: allow decryption and processing of data ONLY IF specific conditions emerge.

[Figure: a network link tap delivers raw-captured IP packets to the FRONT-END (1st-stage flow/feature processing; packet protection through encryption), which forwards the protected packets to the BACK-END (packet meta-data analysis; escrowing if a suspected anomaly; decryption and second-stage data processing)]

Demonstrated on an anomaly-detection monitoring application. Ingredients (sketched below):
• fast processing on the front-end through Bloom filters
• threshold-based decryption (Shamir's approach) only when clearly formalized conditions representative of a suspected anomaly occur
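
A minimal sketch of the two ingredients, assuming a toy numeric "packet key" escrowed via Shamir's K-of-N secret sharing; the class and function names are illustrative and not taken from the PRISM code base.

```python
# Ingredient 1: a Bloom filter for fast front-end flow bookkeeping.
# Ingredient 2: Shamir K-of-N sharing of a toy packet key, so the
# back-end can decrypt only once enough shares are released.
import hashlib
import secrets

class BloomFilter:
    def __init__(self, size: int = 1 << 16, hashes: int = 4):
        self.size, self.hashes = size, hashes
        self.bits = 0                       # big int used as a bit set
    def _indexes(self, item: bytes):
        for i in range(self.hashes):
            d = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(d[:8], "big") % self.size
    def add(self, item: bytes) -> None:
        for j in self._indexes(item):
            self.bits |= 1 << j
    def __contains__(self, item: bytes) -> bool:
        return all((self.bits >> j) & 1 for j in self._indexes(item))

PRIME = 2**127 - 1                          # field modulus for the shares

def split(secret: int, k: int, num: int):
    """Split secret into num shares; any k of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0 over GF(PRIME)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Front-end: remember which flows were seen, at O(1) per packet.
flows = BloomFilter()
flows.add(b"10.0.0.1->10.0.0.2:443")
assert b"10.0.0.1->10.0.0.2:443" in flows

# Escrow: the packet key exists only as K-of-N shares, which monitors
# hand over only when the formalized anomaly condition fires.
packet_key = secrets.randbelow(PRIME)
shares = split(packet_key, k=3, num=5)
assert reconstruct(shares[:3]) == packet_key   # any 3 of 5 suffice
```

The release step here is just the explicit call to reconstruct(); in the slide's design that call is gated by the clearly formalized anomaly condition, so routine traffic stays encrypted end to end.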
Conclusions

Privacy:
• Not only a regulatory issue
• But a TECHNICAL issue!

ICT can solve the (false!) privacy dichotomy!
• Technical solutions do exist to have BOTH privacy AND utility & control!
• Services and technologies themselves should be technically structured so as to comply with data protection

Non-ICT further research directions:
• Law: how to quantitatively account for technical enhancements in next-generation regulation?
• Social/economic: can privacy preservation become an asset?