Privacy-Enhancing Technologies (PETs)

Simone Fischer-Hübner

Note: The "OPTIONAL" tags (for the CS 6910 students) were added by L. Lilien.

Overview
I. Introduction to PETs
II. Anonymous Communication Technologies
III. Anonymous eCash
IV. P3P (Platform for Privacy Preferences)
V. Privacy-Enhanced Identity Management

I. Introduction to PETs

Need for Privacy-Enhancing Technologies
• Law alone is not sufficient for protecting privacy in our networked society
• PETs are needed for implementing the law
• PETs empower users to exercise their rights

Classifications of PETs
1. PETs for minimizing/avoiding personal data (-> Art. 6 I c., e. EU Directive 95/46/EC), providing anonymity, pseudonymity, unobservability, and unlinkability
   At the communication level:
   • Mix nets, Onion Routing
   • DC nets
   • Crowds
   At the application level:
   • Anonymous eCash
   • Anonymous credentials
2. PETs for safeguarding lawful processing (-> Art. 17 EU Directive 95/46/EC)
   • P3P
   • Privacy policy languages
   • Encryption
3. Combinations of 1 & 2
   • Privacy-enhanced identity management

Definitions - Anonymity
Anonymity: the state of not being identifiable within a set of subjects (e.g., the set of senders or recipients), the anonymity set.
[Source: Pfitzmann/Hansen]

Perfect sender/recipient anonymity
Perfect sender (recipient) anonymity: an attacker cannot distinguish the situations in which a potential sender (recipient) actually sent (received) a message from those in which it did not.

Definitions - Unobservability
Unobservability ensures that a user may use a resource or service without others being able to observe that the resource or service is being used.
[Source: Pfitzmann/Hansen]

Definitions - Unlinkability
Unlinkability of two or more items (e.g., subjects, messages, events): within the system, from the attacker's perspective, these items are no more and no less related after the attacker's observation than they were before.
Unlinkability of sender and recipient (relationship anonymity): it is untraceable who is communicating with whom.

Definitions - Pseudonymity
Pseudonymity is the use of pseudonyms as IDs. It allows providing both privacy protection and accountability.
Pseudonym types, ordered from highest to lowest linkability:
• Person pseudonym
• Role pseudonym
• Relationship pseudonym
• Role-relationship pseudonym
• Transaction pseudonym
[Source: Pfitzmann/Hansen]

II. Anonymous Communication Technologies

Mix-nets (Chaum, 1981)
To send msg to Bob via Mix 1, Mix 2, and Mix 3, Alice prepares the layered message
  K1( A2, r1, K2( A3, r2, K3( Bob, r3, msg ) ) )
where Ki is the public key of Mix i, ri a random number, and Ai the address of Mix i. Each mix decrypts one layer, learns only the next hop, and forwards the remainder; Mix 3 finally delivers msg to Bob. (A runnable sketch of this layered encryption follows at the end of this subsection.)

OPTIONAL - Functionality of a Mix Server (Mix i)
• Input: message M_i
• Discard repeated messages
• Collect messages in a batch or pool until sufficiently many messages from many senders have arrived
• Change appearance: decrypt M_i = K_i(A_i+1, r_i, M_i+1) with the private key of Mix i, ignore the random number r_i, and obtain the address A_i+1 and the encrypted message M_i+1
• Reorder the messages
• Output: message M_i+1, forwarded to Mix i+1

OPTIONAL - Why are random numbers needed?
If no random number r_i were used, an attacker who observes an output message M addressed to Mix i+1 could himself compute K_i(M, A_i+1) with the public key of Mix i and compare the result against the messages that entered Mix i, thereby linking input and output. The random number makes each encryption unpredictable and defeats this comparison.

OPTIONAL - Sender Anonymity with Mix-nets (figures)
OPTIONAL - Recipient Anonymity with Mix-nets (figures)
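The layered mix encryption can be sketched in a few lines of Python. This is a minimal illustration, assuming the PyNaCl library (pip install pynacl) for the public-key layers; the function names, addresses, and fixed-width field layout are invented for the example and are not part of Chaum's specification. Note that SealedBox encryption is already randomized internally, so r_i here mirrors the slide's notation rather than being strictly necessary.

import secrets
from nacl.public import PrivateKey, SealedBox

ADDR_LEN, R_LEN = 8, 16                        # fixed-width fields for easy parsing

def build_onion(route, message):
    """route: (next_address, mix_public_key) pairs, ordered Mix 1..Mix 3;
    produces K1(A2, r1, K2(A3, r2, K3(Bob, r3, msg)))."""
    onion = message
    for address, pub in reversed(route):
        r = secrets.token_bytes(R_LEN)         # random number r_i from the slide
        plain = address.ljust(ADDR_LEN, b"\x00") + r + onion
        onion = SealedBox(pub).encrypt(plain)  # add one public-key layer
    return onion

def peel_layer(private_key, onion):
    """One mix's work: decrypt, discard r_i, return next hop and payload."""
    plain = SealedBox(private_key).decrypt(onion)
    return plain[:ADDR_LEN].rstrip(b"\x00"), plain[ADDR_LEN + R_LEN:]

mix_keys = [PrivateKey.generate() for _ in range(3)]
route = [(b"mix2", mix_keys[0].public_key),    # layer for Mix 1 names Mix 2
         (b"mix3", mix_keys[1].public_key),
         (b"bob",  mix_keys[2].public_key)]    # innermost layer names Bob
onion = build_onion(route, b"hello Bob")

for sk in mix_keys:                            # each mix peels one layer
    next_hop, onion = peel_layer(sk, onion)
    print(next_hop)                            # mix2, mix3, bob
print(onion)                                   # b'hello Bob'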
Two-Way Anonymous Conversation (figure)

Protection Properties & Attacker Model for Mix Nets
Protection properties:
• Sender anonymity against recipients
• Unlinkability of sender and recipient
The attacker may:
• Observe all communication lines
• Send his own messages
• Delay messages
• Operate mix servers (all but one)
The attacker cannot:
• Break cryptographic operations
• Attack the user's personal machine

Attacks & Countermeasures
Passive attacks:
• Correlation by content -> all messages to/from a mix must be encrypted and include a random string
• Correlation by message length -> uniform message length (through padding)
• Time correlation ->
  - Output batch: accumulate N messages, forward them in random order
  - Pool: when the (N+1)th message arrives, forward one message from the pool
  - Combination of batch and pool
  - Interval batching: fill the batch/pool with dummy messages at the end of each time interval T
  - Random delay

OPTIONAL - Attacks & Countermeasures (cont.)
Active attacks:
• Isolate & identify ((n-1)-attack) -> dummy messages; check sender IDs
• Message replay attacks -> discard replays; intermix detours; charge eCash per message
• Intersection/partitioning attacks -> mix cascades (always use the same sequence of mixes)

Mix Applications: Anonymous Remailers
• Sender anonymity against recipients
• Servers that strip identifying information from emails and forward them to the recipient (Sender -> Remailer -> Recipient)
• Simple remailers (a single "mix") are single points of trust, with no protection against time/content correlation attacks
• Some use encryption, can be chained, and work like mixes (e.g., Mixmaster)

OPTIONAL - Existing Mix-based Systems for HTTP (real-time)
Simple proxies:
• Anonymizer.com
• ProxyMate.com
Mix-based systems that consider traffic analysis:
• Onion Routing (Naval Research Laboratory)
• Tor (Free Haven project)
• JAP (TU Dresden)

OPTIONAL - Anonymising Proxies: Anonymizer.com
Functionality:
• Web proxy (a single "mix") that forwards requests on the user's behalf
• Does not forward the end user's IP address
• Eliminates information about the user's machine (e.g., previously visited sites)
• Filters out cookies, JavaScript, and active content
Limitations:
• Single point of trust
• The connection itself is not anonymised

Onion Routing
• Onion = object with layers of public-key encryption, used to build an anonymous bi-directional virtual circuit between the communication partners and to distribute symmetric keys
• The initiator's proxy constructs a "forward onion" that encapsulates a route to the responder
• (Faster) symmetric encryption is used for data communication over the circuit

OPTIONAL (see CS 6030 slides) - Forward Onion for Route W-X-Y-Z
Layered structure:
  X: { exp-time_x, Y, Ff_x, Kf_x, Fb_x, Kb_x,
    Y: { exp-time_y, Z, Ff_y, Kf_y, Fb_y, Kb_y,
      Z: { exp-time_z, NULL, Ff_z, Kf_z, Fb_z, Kb_z, PADDING } } }
Each node N receives {exp-time, next-hop, Ff, Kf, Fb, Kb, payload}, encrypted with PK_N (the public key of node N), where:
• exp-time: expiration time
• next-hop: next routing node
• (Ff, Kf): function/key pair for symmetric encryption of data moving forward through the virtual circuit
• (Fb, Kb): function/key pair for symmetric encryption of data moving backward through the virtual circuit
• payload: another onion (or NULL for the responder's proxy)

OPTIONAL - Example: Virtual Circuit with Onion Routing (figure)

OPTIONAL - Onion Routing - Review
Functionality:
• Hides routing information in connection-oriented communication relations
• Nested public-key encryption is used to build up the virtual circuit
• The exp-time field reduces the cost of replay detection
• Dummy traffic between mixes (onion routers)
Limitations:
• First/last-hop attacks based on timing correlations and message length (number of cells sent over a circuit)
(A runnable sketch of the onion construction follows this review.)
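Below is a minimal sketch of the forward-onion construction and of a data cell moving forward through the resulting circuit. It assumes PyNaCl for the per-hop public-key layers and Fernet (from the cryptography package) for the symmetric (Ff, Kf)/(Fb, Kb) pairs; the JSON layer encoding and helper names are illustrative, PADDING and the backward direction are omitted for brevity, and this is not the NRL implementation.

import json, time
from base64 import b64encode, b64decode
from cryptography.fernet import Fernet
from nacl.public import PrivateKey, SealedBox

def build_onion(hops):
    """hops: (name, public_key) pairs in route order, e.g. X, Y, Z.
    Returns the onion plus the per-hop symmetric keys the initiator keeps."""
    onion, keys, next_name = None, {}, None     # next_hop = NULL for the last node
    for name, pub in reversed(hops):
        kf, kb = Fernet.generate_key(), Fernet.generate_key()
        keys[name] = (kf, kb)
        layer = json.dumps({
            "exp_time": time.time() + 60,       # expiration time
            "next_hop": next_name,
            "Kf": kf.decode(), "Kb": kb.decode(),
            "payload": b64encode(onion).decode() if onion else None,
        }).encode()
        onion = SealedBox(pub).encrypt(layer)   # {...} encrypted with PK_N
        next_name = name
    return onion, keys

def process(private_key, onion):
    """What one onion router does with its layer of the forward onion."""
    layer = json.loads(SealedBox(private_key).decrypt(onion))
    assert layer["exp_time"] > time.time()      # stale onions are rejected
    inner = b64decode(layer["payload"]) if layer["payload"] else None
    return layer, inner

# Build a circuit through X, Y, Z and push one data cell forward.
sks = {n: PrivateKey.generate() for n in "XYZ"}
onion, keys = build_onion([(n, sks[n].public_key) for n in "XYZ"])

cell = b"GET /index.html"
for n in reversed("XYZ"):                       # initiator layers Kf_Z innermost
    cell = Fernet(keys[n][0]).encrypt(cell)

node_onion = onion
for n in "XYZ":
    layer, node_onion = process(sks[n], node_onion)   # circuit setup at hop n
    cell = Fernet(layer["Kf"].encode()).decrypt(cell) # hop strips one layer
print(cell)                                     # b'GET /index.html'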
Crowds for Anonymous Web Transactions
1. A user first joins a "crowd" of other users, where he is represented by a "jondo" process on his local machine. ["jondo" is derived from "John Doe", an epitome of an anonymous person.]
2. The user configures his browser to employ the local jondo as a proxy for all new services.
3. The user's request is passed by the jondo to a random member of the crowd.
4. That member either submits the request directly to the web server or forwards it to another randomly chosen member (forwarding with probability pf > 1/2).
-> The request is eventually submitted by a random member.

OPTIONAL - Communications with Crowds (figure: request paths through the crowd)
Communication between jondos is encrypted with keys shared between the jondos.

Anonymity Degrees in Crowds (figure)
OPTIONAL - Anonymity Properties in Crowds (figure)

Crowds - Review
Sender anonymity against:
• end web servers
• other crowd members
• eavesdroppers
Limitations:
• No protection against "global" attackers or timing/message-length correlation attacks
• The web server's log may record the submitting jondo's IP address as the request originator's address
• Request contents are exposed to the jondos on the path
• The anonymising service can be circumvented by Java applets and ActiveX controls
• Performance overhead (increased retrieval time, network traffic, and load on jondo machines)
• No defense against DoS attacks by malicious crowd members

OPTIONAL - DC (Dining Cryptographers) Nets [Chaum 1988]
OPTIONAL - DC-nets: perfect sender anonymity through binary superposed sending and broadcast (figures)
OPTIONAL - Anonymity-preserving multi-access protocols (figures)
OPTIONAL - Implementation example: local-area ring networks (figure)

OPTIONAL - DC Nets - Review
Protection properties:
• Perfect sender anonymity through superposed sending (message bits are hidden by one-time-pad encryption)
• Message secrecy through encryption
• Recipient anonymity through broadcast and implicit addresses (the addressee is the user who can successfully decrypt the message)
Problems:
• Denial-of-service attacks by DC-net participants (defense: trap protocols)
• Distribution of the random key strings

III. Anonymous eCash Based on Blind Signatures

Protocol Overview
Protocol steps for creating and spending untraceable eCash:
Customer (Alice):
• generates a note number (a 100-digit number) at random
• in essence, multiplies it by a blinding (random) factor
• signs the blinded number with a private key and sends it to the bank
Bank:
• verifies and removes Alice's signature
• debits Alice's account (by $1)
• signs the blinded note with a digital signature indicating its $1 value and sends it to Alice
Customer (Alice):
• divides out the blinding factor
• uses the bank note (transfers it to the shop)
Merchant (Bob):
• verifies the bank's digital signature
• transmits the note to the bank
Bank:
• verifies its signature
• checks the note against a list of those already spent
• credits Bob's account
• sends a signed "deposit slip" to Bob
Merchant (Bob):
• hands the merchandise to Alice together with his own signed receipt

OPTIONAL - Mathematical Protocol for Issuing and Spending Untraceable Money [Chaum 1987]
(e, n): bank's public key; (d, n): bank's private key
1. Alice chooses x and r at random and supplies the bank with B = r^e * f(x) (mod n), where x is the serial number of the bank note, r the blinding factor, and f a one-way function.
2. The bank returns B^d (mod n) = (r^e * f(x))^d (mod n) = r * f(x)^d (mod n) and withdraws one dollar from her account.
3. Alice extracts C = B^d / r (mod n) = f(x)^d (mod n) from the bank's reply.
4. To pay Bob one dollar, Alice gives him the pair (x, f(x)^d (mod n)).
5. Bob immediately calls the bank, verifying that this note has not already been deposited.
The bank and the shop do not know the blinding factor, i.e., they cannot relate the bank note to Alice -> Alice can shop anonymously. (A runnable sketch of these steps follows.)
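Steps 1-5 above map directly onto RSA arithmetic. The following runnable sketch uses the cryptography package only to generate the bank's RSA key, with SHA-256 standing in for the one-way function f; the variable names follow the slide, and everything else is illustrative.

import hashlib, secrets
from math import gcd
from cryptography.hazmat.primitives.asymmetric import rsa

bank_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
n = bank_key.public_key().public_numbers().n   # bank's modulus
e = bank_key.public_key().public_numbers().e   # bank's public exponent
d = bank_key.private_numbers().d               # bank's private exponent

def f(x):
    """SHA-256 as a stand-in for the one-way function f."""
    return int.from_bytes(hashlib.sha256(str(x).encode()).digest(), "big")

# 1. Alice picks serial number x and blinding factor r, sends B to the bank.
x = secrets.randbelow(n)
r = secrets.randbelow(n)
while r < 2 or gcd(r, n) != 1:                 # r must be invertible mod n
    r = secrets.randbelow(n)
B = (pow(r, e, n) * f(x)) % n

# 2. The bank signs blindly and debits Alice: B^d = r * f(x)^d (mod n).
signed_blind = pow(B, d, n)

# 3. Alice divides out r, obtaining the note C = f(x)^d (mod n).
C = (signed_blind * pow(r, -1, n)) % n         # pow(r, -1, n) needs Python 3.8+
assert C == pow(f(x), d, n)                    # identical to signing f(x) directly

# 4. Alice pays Bob with (x, C); Bob verifies the bank's signature.
assert pow(C, e, n) == f(x)

# 5. The bank checks its spent list before crediting Bob.
spent = set()
assert x not in spent
spent.add(x)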
OPTIONAL - Why is the one-way function f needed?
Suppose (x, x^d mod n) were valid electronic money. Then money could be forged: choose any y and exhibit the pair (y^e mod n, y), which verifies because (y^e)^d = y (mod n). To forge money of the form (x, f(x)^d mod n), one would instead have to produce (f^-1(y^e mod n), y), i.e., invert the one-way function.

OPTIONAL - Blind Signatures and the "Perfect Crime" (e.g., blackmail) [von Solms et al. 1992]
• Open a bank account, create blinded notes, and send a letter with the threat announcement and the blinded notes
• Have the bank sign the blinded notes and then publish them (e.g., in a newspaper)
• Divide out the blinding factors to create digital money (only the blackmailer knows the blinding factors)
Note: the conditions are worse than in usual kidnapping cases:
• The police cannot register the serial numbers of the bank notes
• No physical contact is needed (to transfer the blackmailed money)

IV. Platform for Privacy Preferences Project (P3P) - Overview
• Developed by the World Wide Web Consortium (W3C); the final P3P 1.0 Recommendation was issued 16 April 2002
• Allows web sites to communicate their privacy policies in a standard computer-readable format
• Does not require web sites to change their server software
• Enables the development of P3P user agent software (built into browsers or separate applications) that can:
  - summarize privacy policies
  - compare privacy policies with user preferences
  - alert and advise users
• P3P helps users understand privacy policies and increases transparency, but it does not set baseline standards or enforce policies
• P3P user agents: Microsoft Internet Explorer 6, Netscape Navigator 7, AT&T Privacy Bird (http://privacybird.com/)
• For more information: http://www.w3.org/P3P/, http://p3ptoolbox.org/, and "Web Privacy with P3P" by Lorrie Faith Cranor (http://p3pbook.com/)
[Source: Lorrie Cranor, lorrie.cranor.org]

Basic Components
• P3P provides a standard XML format in which web sites encode their privacy policies
• Sites also provide XML "policy reference files" to indicate which policy applies to which part of the site (usually at the "well-known location" /w3c/p3p.xml)
• Sites can optionally provide a "compact policy" by configuring their servers to issue a special P3P header when cookies are set
• P3P user agents fetch and read P3P policies, and can inform users about a site's P3P privacy practices and/or compare P3P policies with privacy preferences (in XML) set by the users and take appropriate actions
[Source: Lorrie Cranor, lorrie.cranor.org]

A Simple HTTP Transaction
Browser -> Web server: request web page
  GET /index.html HTTP/1.1
  Host: www.att.com
  ...
Web server -> Browser: send web page
  HTTP/1.1 200 OK
  Content-Type: text/html
  ...
[Source: Lorrie Cranor, lorrie.cranor.org]

... with P3P 1.0 Added
Browser -> Web server: request policy reference file
  GET /w3c/p3p.xml HTTP/1.1
  Host: www.att.com
Web server -> Browser: send policy reference file
Browser -> Web server: request P3P policy
Web server -> Browser: send P3P policy
Browser -> Web server: request web page
  GET /index.html HTTP/1.1
  Host: www.att.com
  ...
Web server -> Browser: send web page
  HTTP/1.1 200 OK
  Content-Type: text/html
  ...
(A sketch of this fetch sequence follows.)
[Source: Lorrie Cranor, lorrie.cranor.org]
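The two extra fetches in this exchange are easy to mimic. Below is a minimal sketch of a P3P user agent's discovery step using only the Python standard library; the element and attribute names (POLICY-REF, about, discuri) come from the P3P 1.0 vocabulary, while the site name is hypothetical and error handling is omitted.

import urllib.request
import xml.etree.ElementTree as ET

P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"

def fetch_xml(url):
    with urllib.request.urlopen(url) as resp:
        return ET.fromstring(resp.read())

def fetch_policy(site):
    """Return the P3P POLICY element for `site`, or None."""
    # Step 1: the policy reference file at the well-known location.
    ref = fetch_xml(f"http://{site}/w3c/p3p.xml")
    about = ref.find(f".//{P3P_NS}POLICY-REF").get("about")
    # Step 2: the policy itself; it may be inline or in a separate file.
    path, _, name = about.partition("#")
    policies = ref if not path else fetch_xml(f"http://{site}{path}")
    for pol in policies.iter(f"{P3P_NS}POLICY"):
        if not name or pol.get("name") == name:
            return pol
    return None

# Hypothetical usage (the site would have to be P3P-enabled):
# policy = fetch_policy("www.example.com")
# print(policy.get("discuri"))   # link to the human-readable policy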
P3P Increases Transparency
• P3P clients can check a privacy policy each time it changes
• P3P clients can check privacy policies on all objects in a web page, including ads and invisible images (e.g., http://www.att.com/accessatt/ and http://adforce.imgis.com/?adlink|2|68523|1|146|ADFORCE)
[Source: Lorrie Cranor, lorrie.cranor.org]

P3P in [MS] IE6
• Automatic processing of compact policies only; third-party cookies without compact policies are blocked by default
• A privacy icon on the status bar indicates that a cookie has been blocked; a pop-up appears the first time the privacy icon appears
• Users can click on the privacy icon for a list of cookies; privacy summaries are available at sites that are P3P-enabled
• The privacy summary report is generated automatically from the full P3P policy
[Source: Lorrie Cranor, lorrie.cranor.org]

AT&T Privacy Bird
• Free download of the beta from http://privacybird.com/
• "Browser helper object" for IE 5.01/5.5/6.0
• Reads P3P policies at all P3P-enabled sites automatically
• Puts a bird icon at the top of the browser window that changes to indicate whether the site matches the user's privacy preferences (a chirping bird is the privacy indicator); clicking on the bird gives more information, including a privacy policy summary and any mismatches
• Users select the warning conditions
• The current version is informational only - no cookie blocking
[Source: Lorrie Cranor, lorrie.cranor.org]

What's in a P3P Policy?
• Name and contact information for the site
• The kind of access provided
• Mechanisms for resolving privacy disputes
• The kinds of data collected
• How (for what purposes) collected data is used, and whether individuals can opt in or opt out of any of these uses
• Whether/when data may be shared, and whether there is opt-in or opt-out
• Data retention policy
[Source: Lorrie Cranor]

P3P/XML Encoding
An example policy (the annotations from the slide are given as XML comments):

<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">   <!-- P3P version -->
  <!-- discuri: location of the human-readable privacy policy; name: policy name -->
  <POLICY discuri="http://p3pbook.com/privacy.html" name="policy">
    <ENTITY>   <!-- site's name and contact info -->
      <DATA-GROUP>
        <DATA ref="#business.contact-info.online.email">privacy@p3pbook.com</DATA>
        <DATA ref="#business.contact-info.online.uri">http://p3pbook.com/</DATA>
        <DATA ref="#business.name">Web Privacy With P3P</DATA>
      </DATA-GROUP>
    </ENTITY>
    <ACCESS><nonident/></ACCESS>   <!-- access disclosure -->
    <STATEMENT>
      <!-- human-readable explanation -->
      <CONSEQUENCE>We keep standard web server logs.</CONSEQUENCE>
      <PURPOSE><admin/><current/><develop/></PURPOSE>   <!-- how data may be used -->
      <RECIPIENT><ours/></RECIPIENT>   <!-- data recipients -->
      <RETENTION><indefinitely/></RETENTION>   <!-- data retention policy -->
      <DATA-GROUP>   <!-- types of data collected -->
        <DATA ref="#dynamic.clickstream"/>
        <DATA ref="#dynamic.http"/>
      </DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>
(A sketch of how a user agent might evaluate such a policy follows.)
[Source: Lorrie Cranor]
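A user agent such as Privacy Bird compares statements like the one above against the user's preferences. The sketch below parses a trimmed copy of the example policy and flags unwanted purposes and recipients; the rule format is invented for illustration (it is not Privacy Bird's actual preference language), and the policy string is shortened to the elements being checked.

import xml.etree.ElementTree as ET

P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"

POLICY_XML = """\
<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY discuri="http://p3pbook.com/privacy.html" name="policy">
    <ACCESS><nonident/></ACCESS>
    <STATEMENT>
      <PURPOSE><admin/><current/><develop/></PURPOSE>
      <RECIPIENT><ours/></RECIPIENT>
      <RETENTION><indefinitely/></RETENTION>
    </STATEMENT>
  </POLICY>
</POLICIES>
"""

# Illustrative user preferences: P3P values the user objects to.
UNWANTED_PURPOSES = {"telemarketing", "contact"}
UNWANTED_RECIPIENTS = {"unrelated", "public"}

def tag(el):
    """Strip the P3P namespace from an element name, e.g. {...}admin -> admin."""
    return el.tag.removeprefix(P3P_NS)

def check(policy_xml):
    warnings = []
    root = ET.fromstring(policy_xml)
    for stmt in root.iter(f"{P3P_NS}STATEMENT"):
        purposes = {tag(p) for p in stmt.find(f"{P3P_NS}PURPOSE")}
        recipients = {tag(r) for r in stmt.find(f"{P3P_NS}RECIPIENT")}
        if purposes & UNWANTED_PURPOSES:
            warnings.append(f"data used for: {purposes & UNWANTED_PURPOSES}")
        if recipients & UNWANTED_RECIPIENTS:
            warnings.append(f"data shared with: {recipients & UNWANTED_RECIPIENTS}")
    return warnings

print(check(POLICY_XML) or "policy matches preferences")  # no warnings here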
P3P - Limitations
P3P alone does not:
• ensure compliance of privacy policies with privacy laws
• guarantee a minimum, non-negotiable level of privacy protection for individuals
Are users forced/pushed to give up privacy?

OPTIONAL - V. Privacy-Enhanced Identity Management - Motivation
Users release different partial identities depending on their current roles and relationships, e.g.:
• Friend Glenn: phone number, interests, diary
• Government: name, income, tax
• Hospital: name, blood group

OPTIONAL - PRIME (EU FP6 Integrated Project)
Vision: users can act securely and safely in the Information Society while keeping sovereignty over their private sphere.
http://www.prime-project.eu.org/

OPTIONAL - Pseudonymous Credential Systems (according to [Chaum 1985]) (figure)

OPTIONAL - PRIME: User Control of Data Release and Linkability
The user controls:
• what personal data is provided to what site under which conditions
• which identities/pseudonyms to use for interactions: person, role, relationship, role-relationship, or transaction pseudonyms, ordered from highest to lowest linkability [Pfitzmann/Hansen]

OPTIONAL - PRIME E-Shopping Application (figure)

OPTIONAL - HCI in PRIME: From Legal Requirements to the PRIME UI - Enforcing Informed Consent
• JITCTAs - "Just-in-Time Click-Through Agreements" (e.g., a dialog box for click-through consent)
• Art. 29 Working Party proposal of multilayered privacy notices
• DADAs - "Drag-and-Drop Agreements" (e.g., a "DADA" to send credit card info)

OPTIONAL - From Legal Requirements to the PRIME UI - Data Track
Increases transparency through:
• intelligible access to previous data disclosures
• scrollable transaction records and search template sentences, e.g., "Who has received my [drop-down list with data]?"
• online help and functions for exercising basic rights

OPTIONAL - PRIME Application Scenario for RFIDs Based on a "Blocker" Tag
A reader trying to inventory a shopper's tags ("1, 2, 3, ..., 2023 pairs of sneakers and ...") fails: the blocker simulates all (billions of) possible tag serial numbers.
[Source: Ari Juels, RSA Laboratories]
A blocker tag system should protect privacy (e.g., the two bottles of Merlot, #458790, in the shopper's bag) but still avoid blocking unpurchased items.

OPTIONAL - Selective Blocker Tags (RSA Laboratories)
A blocker tag can be selective:
• Privacy zones: block only certain ranges of RFID-tag serial numbers
• Zone mobility: allow shops to move items into the privacy zone upon purchase
Example: tags might carry a "privacy bit":
• The blocker blocks all identifiers with the privacy bit on
• Items in the supermarket have the privacy bit off
• On checkout, the leading bit is flipped from off to on (a PIN is required, as for the "kill" operation)
(A simulation of this privacy-bit scheme follows below.)

OPTIONAL - PRIME IDM: A Privacy-Enhanced IDM Solution for RFIDs
• The user's personal device acts as an "RFID proxy" and runs an IDM system: it blocks all tags in a certain range and responds on their behalf
• The personal device controls which RFIDs may be selectively scanned, or can pretend that the user carries certain RFIDs
• Responses are governed by a user-defined privacy policy, evaluated by the IDM system

Questions?
http://www.cs.kau.se/~simone/
SEE CS 6910 WEB PAGE
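The privacy-bit scheme can be simulated with a toy tree-walking reader, as promised above. Everything here is illustrative: 8-bit IDs instead of realistic 96-bit ones, and a simple depth-first singulation loop. With realistic ID lengths, the blocker's simulated subtree is astronomically large, so the reader stalls instead of merely enumerating phantom IDs as it does in this toy run.

BITS = 8                                    # toy ID length; real EPC tags use 96+ bits

def checkout(tag_id):
    """Flip the leading privacy bit on at the point of sale (PIN-protected)."""
    return "1" + tag_id[1:]

def responds(prefix, tag_ids, blocker_on):
    """Does any tag (or the blocker) answer a reader query for `prefix`?"""
    if blocker_on and prefix.startswith("1"):
        return True                         # blocker simulates every ID in the privacy zone
    return any(t.startswith(prefix) for t in tag_ids)

def tree_walk(tag_ids, blocker_on, prefix=""):
    """The reader's depth-first singulation over the binary ID tree."""
    if len(prefix) == BITS:
        return [prefix]
    found = []
    for bit in "01":
        if responds(prefix + bit, tag_ids, blocker_on):
            found += tree_walk(tag_ids, blocker_on, prefix + bit)
    return found

# Unpurchased sneakers (privacy bit 0) and a purchased bottle (bit flipped).
tags = {"00101101", checkout("00011010")}
ids = tree_walk(tags, blocker_on=True)
print([i for i in ids if i[0] == "0"])      # ['00101101']: still readable in the shop
print(len([i for i in ids if i[0] == "1"])) # 128: purchased item hidden among phantoms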
Further Reading
• A. Pfitzmann, M. Hansen, "Anonymity, Unlinkability, Unobservability, Pseudonymity, and Identity Management - A Consolidated Proposal for Terminology", v0.28, http://dud.inf.tu-dresden.de/Anon_Terminology.shtml
• D. Chaum, "Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms", Communications of the ACM, 24(2), 1981, pp. 84-88, http://world.std.com/~franl/crypto/chaum-acm-1981.html
• D. Chaum, "The Dining Cryptographers Problem: Unconditional Sender and Recipient Untraceability", Journal of Cryptology, 1, 1988.
• D. Chaum, "Security without Identification: Transaction Systems to Make Big Brother Obsolete", Communications of the ACM, 28(10), 1985, pp. 1030-1044, http://www.chaum.com/articles/Security_Wthout_Identification.htm
• P. Syverson, D. Goldschlag, M. Reed, "Anonymous Connections and Onion Routing", Proceedings of the 1997 IEEE Symposium on Security and Privacy, Oakland, 1997, http://www.itd.nrl.navy.mil/ITD/5540/projects/onion-routing/OAKLAND_97.ps, http://www.onion-router.net/Publications.html
• M. Reiter, A. Rubin, "Anonymous Web Transactions with Crowds", Communications of the ACM, 42(2), February 1999, pp. 32-38.
• D. Chaum, "Achieving Electronic Privacy", Scientific American, August 1992, pp. 76-81, http://www.chaum.com/articles/Achieving_Electronic_Privacy.htm
• S. Garfinkel, A. Juels, R. Pappu, "RFID Privacy: An Overview of Problems and Proposed Solutions", IEEE Security & Privacy, May/June 2005.
• PRIME Framework V1, April 2005, http://www.prime-project.eu.org/
• S. Fischer-Hübner, "IT-Security and Privacy - Design and Use of Privacy-Enhancing Security Mechanisms", Springer, Lecture Notes in Computer Science, LNCS 1958, May 2001, ISBN 3-540-42142-4 (chapter 4).