Formal Methods for Security Protocols
Catuscia Palamidessi
Penn State University, USA
6 June 2002 - Lecture 3
TU Dresden - Workshop on Proof Theory and Computation

Security Protocols
Contents of previous lectures:
• Brief introduction to security protocols
• Aims and properties
  • authentication, secrecy, integrity, anonymity, etc.
• Brief introduction to cryptographic tools
  • symmetric and asymmetric cryptography
  • one-way functions, trapdoor functions
• Vulnerability of security protocols
Next: introduction to concurrency

Brief Introduction to Concurrency
• The CSP approach
  • Communicating Sequential Processes [Hoare 78]
  • A mathematical framework for the description and analysis of systems consisting of processes that interact via exchange of messages
• Automatic tools are available for proving properties of CSP specifications:
  • the model checker FDR
  • the theorem prover PVS

The CSP Formalism
• A small mathematical language containing the main constructs for specifying concurrency, parallelism, communication, choice, hiding, etc.
• The evolution of a process is described by a sequence of events, or actions
  • Visible actions (Σ): interaction with other processes, communication
  • The invisible action τ: internal computation steps

The CSP Language: Syntax
• Inaction: Stop
  • Termination, deadlock: the incapability of performing any action, either internal or external
• Input: in ? x : A → P(x)
  • Execute an input action on channel in, receiving a message x of type A, then continue as P(x)
• Output: out ! m → P
  • Execute an output action on channel out, sending the message m, then continue as P
• Recursion: P(y1,…,yn) = Body(y1,…,yn)
  • Process definition. P is a process name, y1,…,yn are the parameters, and Body(y1,…,yn) is a process expression
  • Example: Copy = in ? x → out ! x → Copy

The CSP Syntax
• External (aka guarded) choice: P [] Q
  • Execute a choice between P and Q. Do not choose a process which cannot proceed
  • Example: (a ? x → P(x)) [] (b ? x → Q(x))
    Execute one and only one input action. If only one is available, choose that one. If both are available, choose arbitrarily. If none is available, block. The unchosen branch is discarded (commitment)
• Internal choice: P + Q
  • Execute an arbitrary choice between P and Q. It is possible to choose a process which cannot proceed

The CSP Syntax
• Parallel operator with synchronization: P || Q
  • P and Q proceed in parallel and are obliged to synchronize on all the common actions
  • Example: (c ? x → P(x)) || (c ! m → Q)
  • Synchronization: the two processes can proceed only if their actions correspond
  • Handshaking: sending and receiving are simultaneous (clearly an abstraction; buffered communication can anyway be modeled by adding an explicit buffer process)
  • Communication: m is transmitted to the first process, which continues as P(m)
  • Broadcasting: c ! m remains available for other parallel processes
  • Question: what happens with the process ((c ? x → P(x)) [] (d ? y → Q(y))) || (c ! m → R)?
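As an aside, a small sketch may help make the trace semantics of synchronized parallel composition concrete. The sketch below is in Python, not CSP: processes are represented as finite labeled transition systems, and all the names (traces, parallel, LEFT, RIGHT) are ours, chosen only for illustration.

# Illustrative sketch (Python, not CSP): a process is a finite labeled
# transition system, given as a dict  state -> list of (event, next_state).
def traces(lts, state, prefix=()):
    """All maximal traces (sequences of events) of the process from `state`."""
    moves = lts.get(state, [])
    if not moves:
        return {prefix}
    out = set()
    for event, nxt in moves:
        out |= traces(lts, nxt, prefix + (event,))
    return out

def parallel(p, p0, q, q0, shared):
    """Build the transition system of a parallel composition in which the two
    sides must move together on the events in `shared` and interleave on the rest."""
    lts, todo = {}, [(p0, q0)]
    while todo:
        state = todo.pop()
        if state in lts:
            continue
        sp, sq = state
        moves = [(e, (s2, sq)) for e, s2 in p.get(sp, []) if e not in shared]
        moves += [(e, (sp, s2)) for e, s2 in q.get(sq, []) if e not in shared]
        moves += [(e, (s2, t2))                      # handshake moves
                  for e, s2 in p.get(sp, []) if e in shared
                  for f, t2 in q.get(sq, []) if f == e]
        lts[state] = moves
        todo += [s for _, s in moves]
    return lts, (p0, q0)

# (c ? x -> P(x)) || (c ! m -> Q): value passing is agreement on the event c.m.
# The left side is ready to accept any value on c, the right side offers only c.m.
LEFT = {0: [('c.m', 1), ('c.n', 1)], 1: []}
RIGHT = {0: [('c.m', 1)], 1: []}
lts, start = parallel(LEFT, 0, RIGHT, 0, shared={'c.m', 'c.n'})
print(traces(lts, start))   # {('c.m',)}: the handshake transmits m to the left side

The handshake on channel c shows up as a single event c.m that both sides agree on, which is exactly the value-passing reading of synchronization described above.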
The CSP Syntax
• Parallel operator with synchronization and interleaving: P ||A Q
  • P and Q are obliged to synchronize only on the common actions in A
  • They interleave on all the actions not in A
  • Example: (c ? x → P(x)) ||{c} ((c ! m → Q) [] (d ! n → R))
    The two processes can either synchronize on the action on channel c, or the second process can perform an action on d. In the latter case, however, the first process remains blocked until the second one decides to perform (if ever) an output action on c.
  • Question: in what part of the second process could this action on c be performed?
• Abbreviation: P ||| Q stands for P ||∅ Q

The CSP Syntax
• Hiding: P \ A
  • P \ A behaves as P except that all the actions in A are turned into invisible actions, so they can no longer be used to synchronize with other processes.
  • One possible use of this mechanism is to prevent external processes from interfering with the communication channels of P (internalization of communication in P).
• Renaming: P[y/x]
  • P[y/x] behaves as P except that all the occurrences of x are renamed to y.
  • Typically this serves to create different instances of the same process scheme
  • Abbreviation: P[y1,y2 / x1,x2] stands for P[y1/x1][y2/x2]

Modeling Security Protocols in CSP
• Security protocols work through the interaction of a number of processes in parallel that send messages to each other. A formalism for concurrency is therefore a natural notation for describing the participants and their roles in the protocol.
• Example: the Yahalom protocol
  Message 1   a → b : a.na
  Message 2   b → s : b.{a.na.nb}ServerKey(b)
  Message 3   s → a : {b.kab.na.nb}ServerKey(a) . {a.kab}ServerKey(b)
  Message 4   a → b : {a.kab}ServerKey(b) . {nb}kab

Modeling Security Protocols in CSP
• We assume that each process has two channels, receive and send, which it uses for all communications with the other nodes via the medium
• Let us assume that A (Alice) and B (Bob) use the protocol, with A as initiator and B as responder, and that J (Jeeves) is the secure server

Modeling Security Protocols in CSP
• A's view (initiator):
  • Message 1, a sends to b: a.na
  • Message 3, a gets from j: {b.kab.na.nb}ServerKey(a) . {a.kab}ServerKey(b)
  • Message 4, a sends to b: {a.kab}ServerKey(b) . {nb}kab
• In CSP this behavior can be modeled as follows:
  Initiator(a, na) = env ? b : Agent →
                     send.a.b.a.na →
                     [] ( receive.J.a.{b.kab.na.nb}ServerKey(a).m →
                          send.a.b.m.{nb}kab →
                          Session(a, b, kab, na, nb) )
  where the choice [] is indexed over kab ∈ Key, nb ∈ Nonce, m ∈ T

Modeling Security Protocols in CSP
• B's view (responder):
  • Message 1, b gets from a: a.na
  • Message 2, b sends to j: b.{a.na.nb}ServerKey(b)
  • Message 4, b gets from a: {a.kab}ServerKey(b) . {nb}kab
• In CSP this behavior can be modeled as follows:
  Responder(b, nb) = [] ( receive.a.b.a.na →
                          send.b.J.b.{a.na.nb}ServerKey(b) →
                          receive.a.b.{a.kab}ServerKey(b).{nb}kab →
                          Session(b, a, kab, na, nb) )
  where the choice [] is indexed over a ∈ Agent, kab ∈ Key, na ∈ Nonce
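Before completing the picture with the server's view, it may help to see the message flow of one honest Yahalom run written out symbolically. The following Python sketch is illustrative only (it is not the CSP model); the term constructors enc and server_key and the variable names are ours.

# Symbolic sketch of one honest Yahalom run (illustrative, not the lecture's CSP code).
def enc(key, *payload):          # {payload}_key as an opaque symbolic term
    return ('enc', key, payload)

def server_key(agent):           # ServerKey(agent): long-term key shared with Jeeves
    return ('ServerKey', agent)

A, B, NA, NB, KAB = 'Alice', 'Bob', 'na', 'nb', 'kab'

# Message 1, a -> b : a.na
msg1 = (A, NA)
# Message 2, b -> s : b.{a.na.nb}ServerKey(b)
msg2 = (B, enc(server_key(B), A, NA, NB))
# Message 3, s -> a : {b.kab.na.nb}ServerKey(a) . {a.kab}ServerKey(b)
msg3 = (enc(server_key(A), B, KAB, NA, NB), enc(server_key(B), A, KAB))
# Message 4, a -> b : {a.kab}ServerKey(b) . {nb}kab  (a forwards the ticket unopened,
# just as the Initiator process forwards the opaque component m)
ticket = msg3[1]
msg4 = (ticket, enc(KAB, NB))

# b's final check: the ticket names a, and the key it contains encrypts b's nonce
assert msg4[0] == enc(server_key(B), A, KAB) and msg4[1] == enc(KAB, NB)
print('honest run completed:', msg1, msg2, msg3, msg4, sep='\n')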
Modeling Security Protocols in CSP
• J's view (server):
  • Message 2, j gets from b: b.{a.na.nb}ServerKey(b)
  • Message 3, j sends to a: {b.kab.na.nb}ServerKey(a) . {a.kab}ServerKey(b)
• In CSP this behavior can be modeled as follows:
  Server(J, kab) = [] ( receive.b.J.b.{a.na.nb}ServerKey(b) →
                        send.J.a.{b.kab.na.nb}ServerKey(a).{a.kab}ServerKey(b) →
                        Server(J, ks) )
  where the choice [] is indexed over a, b ∈ Agent and na, nb ∈ Nonce
  Server(J) = ||| Server(J, kab), with kab ranging over KeysServer
• Question: why several server processes in parallel?

Modeling an Intruder
• We want to model an intruder that represents all potential intruder behaviors:
  Intruder(X) = learn ? m : messages → Intruder(close(X ∪ {m}))
              [] say ! m : X ∩ messages → Intruder(X)
• close(X) represents all the possible information that the attacker can infer from X. Typically we make the Dolev-Yao assumptions:
  • k, m ⊢ {m}k
  • {m}k, k⁻¹ ⊢ m
  • ⟨x1, …, xn⟩ ⊢ xi
  • x1, …, xn ⊢ ⟨x1, …, xn⟩

Putting the Network Together
  Initiator(Alice, nA)[fake, take / receive, send]
  ||| Responder(Bob, nB)[fake, take / receive, send]
  ||| Server(Jeeves)[fake, take / receive, send]
  ||| Intruder(∅)[take.x.y, fake.x.y / learn, say]
  (Diagram: the send and receive channels of Alice, Bob and Jeeves are renamed to take.x.y and fake.x.y, so every message passes through the intruder Yves, who learns whatever is sent and can fake messages to the others.)

Alternative with Direct Channels
  S = [fake, comm, take, comm / receive, send, receive, send]
  Initiator(Alice, nA)[S] ||| Responder(Bob, nB)[S] ||| Server(Jeeves)[S] ||| Intruder(∅)[S]
  (Diagram: as before, but in addition to the take and fake channels through Yves, the agents are also connected by direct channels such as comm.Alice.Bob; with the relational renaming S, a message can either go directly to its destination or be intercepted and faked by the intruder.)
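Before turning to security properties, here is a small Python sketch of the close(X) operation used in the intruder model above, under the Dolev-Yao assumptions. The term representation and function names are ours, for illustration only; note that the sketch applies only the decomposition rules, since closing blindly under the composition rules would produce an infinite set (one common approach is to apply those only on demand).

# Illustrative Dolev-Yao closure (Python, not CSP). Terms are either atoms
# (strings) or tuples: ('enc', k, m) stands for {m}_k, ('pair', x, y) for <x, y>.
def inverse(k):
    # In this sketch all keys are symmetric, so every key is its own inverse.
    return k

def close(knowledge):
    """Close `knowledge` under projection and decryption (the decomposition rules)."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            new = []
            if isinstance(t, tuple) and t[0] == 'pair':        # <x,y> |- x and y
                new = [t[1], t[2]]
            elif isinstance(t, tuple) and t[0] == 'enc' and inverse(t[1]) in known:
                new = [t[2]]                                    # {m}_k, k^-1 |- m
            for x in new:
                if x not in known:
                    known.add(x)
                    changed = True
    return known

# Intercepting {kab}_kb only helps once the key kb itself becomes known.
X = {('pair', 'Alice', 'na'), ('enc', 'kb', 'kab')}
print(close(X))             # 'kab' is not derivable
print(close(X | {'kb'}))    # now 'kab' is derivable as well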
Expressing Security Properties in CSP
• Security properties: the goals that a protocol is meant to satisfy, relative to specific kinds and levels of threat, i.e. the intruders and their capabilities
• We will consider the following security properties:
  • Secrecy: messages, keys, etc. have not become known
  • Authentication: guarantees about the parties involved in the protocol
  • Non-repudiation: evidence of the involvement of the other party
  • Anonymity: protecting the identity of agents with respect to particular events

Anonymity
• We will model events as consisting of two components: the event itself, x, and the identity of the agent performing it, a, written a.x
• AnUsers: the set of users whose identity should remain secret
• Given x, define A = { a.x | a ∈ AnUsers }
• Definition: a protocol described as a CSP system P provides anonymity if an arbitrary permutation of the events in A, applied to all the traces of P, does not alter the set of all possible traces of P

Anonymity
• Traces of a process: the sequences of visible actions in all its possible runs
• Example: a → b → Stop ||| c → d → Stop
  Traces: a.b.c.d, a.c.b.d, c.a.b.d, a.c.d.b, c.a.d.b, c.d.a.b
• Example: a → b → c → Stop ||{b} d → b → e → Stop
  Traces: a.d.b.c.e, d.a.b.c.e, a.d.b.e.c, d.a.b.e.c

Anonymity
• Let AnUsers = {p1, p2} and A = {p1.m, p2.m}
• Example 1: p1.m → p2.m → Stop
• Example 2: p1.m → Stop ||| p2.m → Stop
• Example 3: p1.m → Stop + p2.m → Stop
• Question: for each system, say whether or not it provides anonymity wrt A

Anonymity
• A more involved example:
  P = ( p1.m → a → Stop [] p2.m → a → Stop )
      ||{p1.m, p2.m}
      ( p1.m → b → Stop [] p2.m → c → Stop )
• Question: does P provide anonymity wrt A = {p1.m, p2.m}?

Anonymity
• Answer: No. P has the traces (p1.m).b.a, (p2.m).c.a, … but not (p2.m).b.a, (p1.m).c.a, … The permutation { p1 → p2, p2 → p1 } changes the set of traces.
• However, if we assume that the observer has no visibility of the actions b and c, then the system does provide anonymity wrt A = {p1.m, p2.m}
• One elegant way to formalize the concept of visibility in CSP is to use the hiding operator: P \ {b, c} provides anonymity wrt A
• Note: hiding A itself would not be correct: it would make even a system like p1.m → Stop appear anonymous, although it clearly reveals who acts

Anonymity
• In general, given P, consider the sets:
  • A = { a.x | a ∈ AnUsers }: the actions that we want to know only partially (we want to know x but not the identity a)
  • B: the actions that we want to observe
  • C = Actions − (B ∪ A): the actions that we want to hide
• The system to consider for the anonymity analysis is P \ C
• Method: for every permutation ρ : A → A, check that ρ(traces(P \ C)) = traces(P \ C)

The Dining Cryptographers
• Three cryptographers share a meal
• The meal is paid for either by the organization (the master) or by one of the cryptographers. The decision on who pays is taken by the master
• Each cryptographer is informed by the master whether or not he is paying
• GOAL: the cryptographers would like to find out whether the organization is paying or not, but without learning the identity of the cryptographer who is paying (if any)
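Before looking at how the coin protocol below achieves this goal, here is a small Python sketch of the anonymity check described above: permute the identities in the events of A in every trace and compare the resulting set of traces with the original one. The trace representation and helper names are ours, for illustration only; for simplicity only the maximal traces are listed.

# Illustrative anonymity check on an explicitly given set of traces (Python, not CSP/FDR).
from itertools import permutations

def hide(trace, hidden):
    """Drop the hidden events from a trace (the trace-level effect of hiding C)."""
    return tuple(e for e in trace if e not in hidden)

def provides_anonymity(trace_set, an_users, x):
    """Check that every permutation of the users leaves the trace set unchanged
    when applied to the events {a.x | a in an_users}."""
    for perm in permutations(an_users):
        relabel = {f'{a}.{x}': f'{b}.{x}' for a, b in zip(an_users, perm)}
        permuted = {tuple(relabel.get(e, e) for e in t) for t in trace_set}
        if permuted != trace_set:
            return False
    return True

# The 'more involved example' above (maximal traces only):
traces_P = {('p1.m', 'a', 'b'), ('p1.m', 'b', 'a'),
            ('p2.m', 'a', 'c'), ('p2.m', 'c', 'a')}

print(provides_anonymity(traces_P, ['p1', 'p2'], 'm'))            # False
print(provides_anonymity({hide(t, {'b', 'c'}) for t in traces_P},
                         ['p1', 'p2'], 'm'))                      # True: P \ {b, c} is anonymous

As argued above, the check fails on the full traces of P and succeeds once b and c are hidden.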
The Dining Cryptographers
• Solution: each cryptographer tosses a coin. Each coin is placed between two cryptographers
• The result of each coin toss is visible to the two adjacent cryptographers, and only to them
• Each cryptographer examines the two adjacent coins
  • If he is not paying, he announces "agree" if the two results are the same, and "disagree" otherwise
  • If he is paying, he announces the opposite
• Claim: if the number of "disagree" announcements is even, then the master is paying; otherwise, one of the cryptographers is paying. In the latter case, the non-paying cryptographers are not able to deduce who exactly is paying

Example: The Dining Cryptographers
  (Diagram: the Master is connected to each Crypt(n) via pays.n and notpays.n; the coins Coin(0), Coin(1), Coin(2) sit between adjacent cryptographers and are read via look channels such as look.2.0; each Crypt(n) announces its result on out.n.)

The Dining Cryptographers
• Specification in CSP: Master and Coins
  Master = +n ( pays.n → notpays.(n+1) → notpays.(n+2) → Stop )
           + ( notpays.0 → notpays.1 → notpays.2 → Stop )
  where +n denotes the internal choice over n ∈ {0, 1, 2}
  Coin(n) = Heads(n) + Tails(n)
  Heads(n) = ( look.n.n.hd → Stop ) ||| ( look.(n-1).n.hd → Coin(n) )
  Tails(n) = ( look.n.n.tl → Stop ) ||| ( look.(n-1).n.tl → Coin(n) )
• Note: the arithmetic operations are modulo 3

The Dining Cryptographers
• Specification in CSP: the cryptographers
  Crypt(n) = notpays.n → Check(n) [] pays.n → Check'(n)
  Check(n) = look.n.n ? x → look.n.(n+1) ? y →
             if (x = y) then out.n.agree → Stop else out.n.disagree → Stop
  Check'(n) = look.n.n ? x → look.n.(n+1) ? y →
              if (x = y) then out.n.disagree → Stop else out.n.agree → Stop

The Dining Cryptographers
• Specification in CSP: the whole system
  Crypts = Crypt(0) ||| Crypt(1) ||| Crypt(2)
  Coins = Coin(0) ||| Coin(1) ||| Coin(2)
  Meal = Master ||{pays, notpays} ( Coins ||{look} Crypts )

The Dining Cryptographers
• The anonymity property:
  • A = { pays.0, pays.1, pays.2 }
  • B = { out }
  • C = Actions − (B ∪ A) = { look, notpays }
• Theorem: for every permutation ρ : A → A, we have ρ(traces(Meal \ C)) = traces(Meal \ C)
• This theorem means that an external observer cannot infer which cryptographer has paid
• The theorem can be proved with the automatic tool FDR

The Dining Cryptographers
• One can argue that the previous result is not strong enough: a cryptographer has more information than an external observer. Let us therefore repeat the analysis from the point of view of one cryptographer, say Crypt(0):
  • A = { pays.1, pays.2 }
  • B = { pays.0, notpays.0, look.0, out }
  • C = Actions − (B ∪ A)
• Theorem: for every permutation ρ : A → A, we have ρ(traces(Meal \ C)) = traces(Meal \ C)
• This means that if Crypt(1) or Crypt(2) pays, then Crypt(0) cannot infer which of them has paid. The same can be shown for the other two cryptographers. So Meal \ C provides the desired anonymity property.

The Dining Cryptographers
• An example of a case in which the anonymity property does not hold
• Assume that Crypt(0) can also access the result of the third coin, i.e. it has visibility of the action look.2.2:
  • A = { pays.1, pays.2 }
  • B = { pays.0, notpays.0, look.0, out } ∪ { look.2.2 }
  • C = Actions − (B ∪ A)
• Then for some permutation ρ : A → A we have ρ(traces(Meal \ C)) ≠ traces(Meal \ C). For instance:
  pays.2 notpays.0 look.0.0.heads look.0.1.heads look.2.2.heads out.2.disagree   is a trace of Meal \ C
  pays.1 notpays.0 look.0.0.heads look.0.1.heads look.2.2.heads out.2.disagree   is not a trace of Meal \ C
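To close, here is a small Python simulation of the dining cryptographers, illustrating the claim above: the number of "disagree" announcements is odd exactly when one of the cryptographers pays, and the announcements alone do not identify the payer. This is an illustrative sketch, not the CSP model; all names are ours.

# Illustrative simulation of the dining cryptographers (three parties).
import random
from collections import defaultdict

def run(payer):
    """One run of the protocol. `payer` is 0, 1, 2, or None (the master pays).
    Returns the tuple of announcements, with True standing for 'disagree'."""
    coins = [random.choice(['hd', 'tl']) for _ in range(3)]
    announcements = []
    for n in range(3):
        same = coins[n] == coins[(n + 1) % 3]   # crypt n looks at coin n and coin n+1
        disagree = not same
        if payer == n:                          # the paying cryptographer says the opposite
            disagree = not disagree
        announcements.append(disagree)
    return tuple(announcements)

# Claim: the number of 'disagree' announcements is odd iff a cryptographer pays.
for _ in range(1000):
    payer = random.choice([None, 0, 1, 2])
    assert (sum(run(payer)) % 2 == 1) == (payer is not None)

# Anonymity, informally: every announcement pattern that can be produced when one
# cryptographer pays can also be produced when either of the other two pays, so the
# announcements alone do not identify the payer.
possible = defaultdict(set)
for _ in range(5000):
    payer = random.choice([0, 1, 2])
    possible[run(payer)].add(payer)
print(all(payers == {0, 1, 2} for payers in possible.values()))  # True (with high probability)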