Chapter 7: Bargaining

"Necessity never made a good bargain."

Economic Markets
• Allocation of scarce resources:
  – Many buyers & many sellers: traditional markets
  – Many buyers & one seller: auctions
  – One buyer & one seller: bargaining

The Move to Game-Theoretic Bargaining
• Baseball – each side submits an offer to an arbitrator, who must choose one of the proposed results.
• Meet-in-the-middle – each side proposes its "worst acceptable offer" and a deal is struck in the middle, if possible.
• Forced final – if an agreement is not reached by some deadline, one party makes a final take-it-or-leave-it offer.

Bargaining & Game Theory
• Art: negotiation
• Science: bargaining
• Game theory's contribution: attention to the rules of the encounter

Outline
• Importance of rules: the rules of the game determine the outcome.
• Diminishing pies: the importance of patience.
• Estimating payoffs: trust your intuition.

Take-it-or-leave-it Offers
• Consider the following bargaining game (over a cake):
  – I name a take-it-or-leave-it split.
  – If you accept, we trade.
  – If you reject, no one eats!
• Faculty senate example: if we can't agree on a recommendation (for premiums in health care), the administration will say "there is no consensus" and do what it wants. We will have no vote.
• Under perfect information, there is a simple rollback equilibrium.

Take-it-or-leave-it Offers
• Game tree: I offer you a share p. If you accept, payoffs are (1−p, p); if you reject, payoffs are (0, 0).
• Second period: accept if p > 0.
• First period: offer the smallest possible p.
• The "offerer" keeps essentially all of the surplus.

Counteroffers and Diminishing Pies
• In general, bargaining takes on a "take-it-or-counteroffer" procedure.
• If time has value, both parties prefer to trade earlier rather than later.
• E.g., labor negotiations – later agreements come at the price of strikes, work stoppages, etc.
• Delays imply less surplus left to be shared among the parties.

Two-Stage Bargaining
• Bargaining over the division of a cake.
• I offer a proportion, p, of the cake to you.
• If rejected, you may counteroffer (and a fraction δ of the cake melts).
• Payoffs:
  – In the first period: (1−p, p)
  – In the second period: ((1−δ)(1−p), (1−δ)p)

Rollback
• Since period 2 is the final period, it is just like a take-it-or-leave-it offer:
  – You will offer me the smallest piece that I will accept, leaving you with all of the remaining 1−δ and leaving me with almost 0.
• What do I do in the first period?

Rollback
• I must give you at least as much surplus as you would get by rejecting.
• Your surplus if you accept in the first period is p.
• Accept if: your surplus in the first period ≥ your surplus in the second period, i.e. p ≥ 1−δ.

Rollback
• If there is a second stage, you get 1−δ and I get 0.
• You will reject any offer in the first stage that does not give you at least 1−δ.
• So in the first period, I offer you 1−δ.
• Note: the more patient you are (the slower the cake melts), the more you receive now!

First or Second Mover Advantage?
• Are you better off being the first to make an offer, or the second?

Example: Cold Day
• If δ = 1/5 (20% of the cake melts per period):
• Period 2: you would offer a division of (1, 0) of what remains – you get all of the remaining cake (0.8 of the original), I get 0.
• So in the first period, I offer you 80%: you get 80% of the whole cake (0.8), I get 20% (0.2).

Example: Hot Day
• If δ = 4/5 (80% of the cake melts per period):
• Period 2: you would offer a division of (1, 0) of what remains – you get all of the remaining cake (0.2 of the original), I get 0.
• So in the first period, I offer you 20%: you get 20% of the whole cake (0.2), I get 80% (0.8).
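To make the rollback concrete, here is a minimal Python sketch of the two-period melting-cake game; it is not from the slides, and the function name and structure are my own.

```python
def two_period_split(delta):
    """Rollback for the two-period melting-cake game.

    delta: fraction of the cake that melts if the first offer is rejected.
    Returns (my_share, your_share) agreed in period 1.
    """
    # Period 2: you make a take-it-or-leave-it offer over the remaining
    # (1 - delta) of the cake, so you would keep all of it and I would get 0.
    your_period2_surplus = 1 - delta
    # Period 1: I must offer you at least your period-2 surplus.
    your_share = your_period2_surplus
    my_share = 1 - your_share
    return my_share, your_share

# Cold day (delta = 1/5): the second mover does better.
print(two_period_split(0.2))   # (0.2, 0.8)
# Hot day (delta = 4/5): the first mover does better.
print(two_period_split(0.8))   # (0.8, 0.2)
```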
First or Second Mover Advantage?
• When players are impatient (hot day), the first mover is better off – rejecting my offer is less credible, since we both lose a lot.
• When players are patient (cold day), the second mover is better off – there is little cost to rejecting the first offer.
• Either way – if both players think it through, the deal is struck in period 1.

Don't Waste Cake
• COMMANDMENT: In any bargaining setting, strike a deal as early as possible!
• Why doesn't this always happen?
  – Reputation building
  – Lack of information

Uncertainty in Civil Trials
• Civil lawsuits.
• If both parties can predict the future jury award, they can settle for the same outcome and save litigation fees and time.
• If both parties are sufficiently optimistic, they do not envision gains from trade.

Plaintiff sues defendant for $1M
• Legal fees cost each side $100K.
• If each agrees that the chance of the plaintiff winning is ½:
  – Plaintiff: $500K − $100K = $400K expected gain
  – Defendant: −$500K − $100K = −$600K expected loss
• If they simply agree on the expected award, $500K, each is better off settling out of court.
• In fact the defendant could just give the plaintiff $400K: the plaintiff does as well as at trial, and the defendant saves $200K.

Uncertainty in Civil Trials
• What if both parties are too optimistic?
• Each thinks that his or her side has a ¾ chance of winning:
  – Plaintiff: $750K − $100K = $650K expected gain
  – Defendant: −$250K − $100K = −$350K expected loss
• No way to agree on a settlement! The defendant is willing to give the plaintiff at most $350K, but the plaintiff won't accept less than $650K.

von Neumann/Morgenstern Utility over Wealth
• How big is the cake?
• Is something really better than nothing?

Lessons
• The rules of the bargaining game uniquely determine the bargaining outcome.
• Which rules are better for you depends on patience and information.
• What is the smallest acceptable piece? Trust your intuition.
• Delays are always less profitable: someone must be wrong.

Non-monetary Utility
• Each side has a reservation price.
• As in the civil suit: the expectation of winning.
• The reservation price is unknown.
• One must:
  – Consider non-monetary payoffs
  – Probabilistically determine the best offer
• But – probability implies a chance that no bargain will be made.

Example: Uncertain Company Value
• Company annual profits are either $150K or $200K per employee.
• Two types of bargaining:
  – The union makes a take-it-or-leave-it offer.
  – The union makes an offer today. If it is rejected, the union strikes, then makes another offer.
• A strike costs the company 10% of annual profits.

Take-it-or-leave-it Offer
• The probability that the company is "highly profitable," i.e. worth $200K per employee, is p.
• If the union offers a wage of $150K:
  – Definitely accepted
  – Expected wage = $150K
• If the union offers a wage of $200K:
  – Accepted with probability p
  – Expected wage = $200K·p

Take-it-or-leave-it Offer, Example I
• p = 9/10: a 90% chance the company is highly profitable.
• Best offer: ask for a $200K wage.
• Expected value of the offer: (0.9)·$200K = $180K.
• But: a 10% chance of no deal!

Take-it-or-leave-it Offer, Example II
• p = 1/10: a 10% chance the company is highly profitable.
• Best offer: ask for a $150K wage.
• If you ask for $200K, the expected value of the offer is (0.1)·$200K = $20K.
• If you ask for $150K, you get $150K.
• Not worth the risk to ask for more.
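A minimal Python sketch (mine, not the slides') comparing the two take-it-or-leave-it wage demands as a function of p; with these numbers the break-even probability is p = 150/200 = 0.75.

```python
def best_wage_demand(p, low=150, high=200):
    """Compare expected wages from demanding the low wage (always accepted)
    versus the high wage (accepted only by the high-profit firm, prob. p)."""
    expected_low = low          # accepted for sure
    expected_high = high * p    # accepted with probability p, else no deal
    if expected_high > expected_low:
        return ("ask high", expected_high)
    return ("ask low", expected_low)

print(best_wage_demand(0.9))   # ('ask high', 180.0)
print(best_wage_demand(0.1))   # ('ask low', 150)
# Indifference at p = 150/200 = 0.75
```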
Two-period Bargaining
• If the first-period offer is rejected, a strike costs the company 10% of annual profits.
• Note: a strike costs a high-value company more than a low-value company!
• Use this fact to screen!

Screening in Bargaining
• What if the union asks for $160K in the first period?
• The low-profit firm ($150K) rejects – it cannot afford to pay it.
• The high-profit firm must guess what will happen if it rejects:
  – Best case: the union strikes and then asks for only $140K (the firm gives up some of the strike cost, but not all).
  – In the meantime, the strike costs the company $20K.
• So the high-profit firm accepts.

Separating Equilibrium
• Only high-profit firms accept in the first period.
• If the offer is rejected, the union knows it is facing a low-profit firm, and asks for $140K in the second period.
• Expected wage: $160K·p + $140K·(1−p).
• For this to be more profitable than the safe offer of $150K:
  $160K·p + $140K·(1−p) > $150K, i.e. 140 + 20p > 150, so if p > 1/2 the screening offer wins.

What's Happening
• The union lowers its demand after a rejection.
• It looks like "giving in"; it looks like bargaining.
• Actually, the union is screening its bargaining partner.
• Different "types" of firms place different values on the future.
• Use these different values to screen: time is used as a screening device.

Bargaining
• Non-cooperative games miss something essential: people can make deals – they can agree to behave in a way that is better for both. Economics is based on the fact that there are many opportunities to "gain from trade".
• With these opportunities, however, comes the possibility of being exploited. Human beings have developed a system of contracts and agreements, as well as institutions that enforce those agreements.
• Cooperative game theory is about games with enforceable contracts.

Strategic Decisions
• Non-strategic decisions are those in which one's choice set is defined irrespective of other people's choices.
• Strategic decisions are those in which the choice set one faces and/or the outcomes of one's choices depend on what other people do. These decisions can be characterised in two general ways:
  1. Cooperative games: the outcome is agreed upon through joint action and enforced by some outside arbitrator.
  2. Non-cooperative games: the outcome arises through separate action, and thus does not rely on outside arbitration.

Cooperative Bargaining
• A bargaining situation can be approached as a cooperative game. All bargaining situations have two things in common:
  1. The total payoff created through cooperation must be greater than the sum of the payoffs each party could achieve separately.
  2. The bargaining is thus over the 'surplus' payoff, since no bargaining party would agree to getting less than what they could get on their own.
• A player's 'outside option' is also known as a BATNA (Best Alternative To a Negotiated Agreement) or disagreement value.

Two people dividing cash
CONSIDER THE FOLLOWING BARGAINING GAME
• Jenny and George have to divide a candy bar.
• They have to agree how to divide it up.
• If they do not agree, they each get nothing.
• They can't divide up more than the whole thing.
• They could leave some candy on the table.
What is the range of likely bargaining outcomes?

Likely range of outcomes
• Clearly neither Jenny nor George can individually get more than 100%.
• Further, neither of them can get less than zero – either could veto and avoid the loss.
• Finally, it would be silly to agree on something that does not divide up the whole 100% – they could both agree to something better.
• But that is about as far as our prediction can go!

Likely range of outcomes
• So our prediction is that Jenny will get %j and George will get %g, where
  1. %j ≥ 0;
  2. %g ≥ 0; and
  3. %j + %g = 100%.
• (A small check of these constraints, which also covers the modified game that follows, is sketched below.)
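A minimal sketch (mine) of this "likely range" prediction as a constraint check; the disagreement payoffs are parameters, so the same hypothetical function also covers the modified game on the next slide, where George's disagreement payoff is 50.

```python
def likely_range(total=100, d_jenny=0, d_george=0):
    """Predicted bargaining outcomes: each side gets at least its
    disagreement payoff and the whole total is divided up.
    Returns the range of shares Jenny could end up with."""
    if d_jenny + d_george > total:
        return None  # no agreement can make both at least as well off
    # Jenny gets at least d_jenny and at most total - d_george.
    return (d_jenny, total - d_george)

print(likely_range())             # (0, 100)  original game
print(likely_range(d_george=50))  # (0, 50)   modified game
```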
Modified bargaining game
• Jenny and George still have to divide 100%.
• They must agree to any split.
• If they do not agree, then Jenny gets nothing and George gets 50%.
• They can't divide up more than 100%.
• They could leave some on the table.
• Now, what is the range of likely bargaining outcomes?

Likely range of outcomes in modified game
• Clearly neither Jenny nor George can individually get more than 100%.
• Further, Jenny would veto anything where she gets less than 0%.
• George will veto anything where he gets less than 50%.
• And it would be silly to agree on something that does not divide up the whole 100%.

Likely range of outcomes for modified game
• So our prediction is that Jenny will get %j and George will get %g, where
  1. %j ≥ 0;
  2. %g ≥ 50; and
  3. %j + %g = 100%.
• Note that by changing George's 'next best alternative' to agreeing with Jenny, we change the potential bargaining outcomes.

Ultimatum Games: Basic Experimental Results
• In a review of numerous ultimatum experiments, Camerer (2003) found:
  – "The results reported … are very regular. Modal and median ultimatum offers are usually 40-50 percent and means are 30-40 percent. There are hardly any offers in the outlying categories of 0, 1-10%, and the hyper-fair category 51-100%. Offers of 40-50 percent are rarely rejected. Offers below 20 percent or so are rejected about half the time."

Ultimatum Bargaining with Incomplete Information
• Player 1 begins the game by drawing a chip from a bag. Inside the bag are 30 chips ranging in value from $1.00 to $30.00. Player 1 then makes an offer to Player 2. The offer can be any amount in the range from $0.00 up to the value of the chip.
• Player 2 can either accept or reject the offer. If accepted, Player 1 pays Player 2 the amount of the offer and keeps the rest. If rejected, both players get nothing.

Experimental Results
Questions:
1) How much should Player 1 offer Player 2?
2) Does the amount of the offer depend on the size of the chip?
3) What should Player 2 do? Should Player 2 accept all offers or only offers above a specified amount? Explain.

  Composition of urn   Mean % of pie offered to receiver
  $0 – $30             31.2%
  $5 – $25             34.2%
  $10 – $20            42.4%

How should Ali & Baba split the pie?
• Ali and Baba have to decide how to split up an ice cream pie.
• The rules specify that Ali begins by making an offer on how to split the pie. Baba can then either accept or reject the offer.
• If Baba accepts the offer, the pie is split as specified and the game is over.
• If Baba rejects the offer, the pie shrinks (since it is ice cream), and Baba must then make an offer to Ali on how to split the pie.
• Ali can either accept or reject this offer.
• If rejected, the pie shrinks again and Ali must then make another offer to Baba.
• This procedure is repeated until an offer is accepted or the pie is gone.

How should Ali & Baba split the pie?
1. How much should Ali offer Baba in the first round?
2. Should Baba accept this offer? Why or why not?

Ali & Baba's Pie Woes
• Initial pie size = 100.
• The pie decreases by 20 each time an offer is rejected.
• Question: What is the optimal split of this pie? That is, how much should Ali offer Baba in the first round so that Baba will accept the offer?

Ali & Baba's Pie Woes – Rollback
  Round   Offerer   Pie size   Ali   Baba
  5       Ali       20         20    0
  4       Baba      40         20    20
  3       Ali       60         40    20
  2       Baba      80         40    40
  1       Ali       100        60    40
• Working backward from round 5, each offerer gives the other side just what that side could get by waiting, so the round-1 split is Ali 60, Baba 40.
• Baba may as well accept the first offer: it never really gets better for him.
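A minimal Python sketch (not from the slides) of the backward induction behind this table; the pie shrinks by a fixed amount each round and the last offerer keeps whatever is left.

```python
def shrinking_pie_rollback(initial=100, shrink=20, rounds=5):
    """Backward induction for alternating offers over a pie that loses a
    fixed amount after each rejection. Ali (index 0) offers in round 1.
    Returns the round-1 split (ali, baba)."""
    # In the last round the offerer keeps the whole remaining pie.
    pie = initial - shrink * (rounds - 1)
    offerer = (rounds - 1) % 2          # 0 = Ali, 1 = Baba
    shares = [pie, 0] if offerer == 0 else [0, pie]
    # Walk backward: each earlier offerer concedes to the responder exactly
    # what the responder would get in the next round, and keeps the rest.
    for r in range(rounds - 1, 0, -1):
        pie = initial - shrink * (r - 1)
        offerer = (r - 1) % 2
        responder = 1 - offerer
        shares[offerer] = pie - shares[responder]
    return tuple(shares)

print(shrinking_pie_rollback(100, 20, 5))   # (60, 40)
print(shrinking_pie_rollback(100, 25, 4))   # (50, 50)
```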
Formulas
• If the number of rounds in the game is even, the pie should be split 50/50.
• If the number of rounds in the game is odd, then the proportion of the pie for each player is:
  – (n + 1)/2n for Ali (who makes the initial offer) – a first-mover advantage!
  – (n − 1)/2n for Baba.
• For example, in this game n = 5, so Ali gets (5 + 1)/(2·5) = 6/10, and 60% of 100 is 60.

Suppose the discount is 25% per round
  Round   Offerer   Pie size   Ali   Baba
  4       Baba      25         0     25
  3       Ali       50         25    25
  2       Baba      75         25    50
  1       Ali       100        50    50
• If Ali offered 50%, Baba would have no reason to object: he never gets more by waiting.

Model for Bargaining – no shrinking pie
Example – two people bargaining over goods:
• Amy has 10 apples and 2 bananas.
• Betty has 1 apple and 15 bananas.
• Before eating their fruit, they meet together.
Questions:
• Can Amy and Betty agree to exchange some fruit?
• If so, how do we characterize the likely set of possible trades between Amy and Betty?

The Edgeworth Box for Amy and Betty
• First – what are they trading over? Amy has 10 apples and 2 bananas; Betty has 1 apple and 15 bananas. So in total they are bargaining over the division of 11 apples and 17 bananas.
• We can represent ALL possible trading outcomes by points in a rectangle, called an Edgeworth box.
• The box is 17 units wide, to represent the 17 bananas in total, and 11 units high, to represent the 11 apples in total.
• [Figure: Amy's bundle is measured from her origin OA at one corner (apples up, bananas across); Betty's bundle is measured from the opposite corner OB.]
• [Figure: the endowment point – Amy holds 10 apples and 2 bananas measured from OA; Betty holds 1 apple and 15 bananas measured from OB.]
• [Figure: the corner allocation where Betty gets all 11 apples and Amy gets all 17 bananas.]

Bargaining and the Edgeworth box
• An allocation is only a feasible outcome of trade between Betty and Amy if it cannot be blocked.
• This means that Betty must be at least as well off with the trade as she is with her endowment.
• Also, Amy must be at least as well off with the trade as she is with her endowment.
• And the allocation must be Pareto optimal for Betty and Amy, so that they cannot BOTH do better.

Amy's indifference curves
• We can draw Amy's indifference curves and then put them in the Edgeworth box.
• [Figure: Amy's indifference curve through her endowment bundle (10 apples, 2 bananas). She will block any allocation that puts her on a lower indifference curve.]
• So ANY bargaining outcome must be in the region on or above this curve – otherwise Amy will block the allocation.

Betty's indifference curves
• And we can put Betty's indifference curves in the Edgeworth box.
• [Figure: Betty's indifference curve through her endowment bundle (1 apple, 15 bananas).]
• ANY outcome of bargaining between Betty and Amy must lead to an allocation inside the area bounded by the two endowment indifference curves. This area is called "the lens of trade".

Definition – the lens of trade
• When two people bargain over allocating goods, any agreed outcome must lie in the lens of trade.
• The lens of trade is the area in the Edgeworth box bounded by each person's indifference curve through the endowment bundle.
• Any allocation outside the lens of trade will be blocked by one of the people.
• We call this "non-blocked" set of choices the core.
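A rough sketch of the blocking test behind the lens of trade. The slides never specify Amy's and Betty's preferences, so this illustration assumes a symmetric Cobb-Douglas utility purely to make "blocked by an individual" computable; the function names and the sample allocations are mine.

```python
import math

# Illustrative only: ASSUMES u = sqrt(apples * bananas) for both women,
# which is not stated anywhere in the slides.
TOTAL_APPLES, TOTAL_BANANAS = 11, 17
ENDOW_AMY = (10, 2)            # (apples, bananas)

def u(apples, bananas):
    return math.sqrt(apples * bananas)

def in_lens_of_trade(amy_apples, amy_bananas):
    """True if the allocation leaves BOTH women at least as well off as at
    their endowment, i.e. neither would block it individually."""
    betty_apples = TOTAL_APPLES - amy_apples
    betty_bananas = TOTAL_BANANAS - amy_bananas
    endow_betty = (TOTAL_APPLES - ENDOW_AMY[0], TOTAL_BANANAS - ENDOW_AMY[1])
    return (u(amy_apples, amy_bananas) >= u(*ENDOW_AMY) and
            u(betty_apples, betty_bananas) >= u(*endow_betty))

print(in_lens_of_trade(10, 2))   # True: the endowment itself is never blocked
print(in_lens_of_trade(6, 8))    # True under the assumed preferences: a mutually beneficial trade
print(in_lens_of_trade(1, 1))    # False: Amy would block this
```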
• Note that we can move to an allocation that is better for BOTH Betty and Amy, like the green bundle in the figure: it puts both Amy and Betty on higher (better) indifference curves. So the brown bundle cannot be Pareto optimal and will be blocked.
• The ONLY situation where we cannot find another bundle that makes both people better off is when we are at a tangency of Amy's and Betty's indifference curves – like the black bundle in the figure. Such a bundle is Pareto optimal.

The contract curve
• [Figure: the curve joining all Pareto optimal bundles for Amy and Betty – the contract curve.]
• An agreed allocation must lie on this curve.

So
• From co-operative game theory we know that an acceptable allocation must be in the core.
• It must lie in the lens of trade, or else either Amy or Betty will block the allocation.
• It must lie on the contract curve, or else the coalition of both Amy and Betty would block the allocation.
• So the core allocations are the part of the contract curve inside the lens of trade.
• [Figure: the contract curve inside the lens of trade is the core. It gives the likely bargaining outcomes for Amy and Betty.]

Summary so far
• The Edgeworth box can be used to model bargaining outcomes for two people over bundles of goods.
• The core is the set of bundles on the contract curve inside the lens of trade.
• We predict that any trade will most likely lead to a core allocation.
• But which allocation?

Bargaining (Chapter 7)
• Feasible alternatives – each person does better than at the disagreement point (d1, d2). S is the set of alternatives; s is the agreement point.
• U = {(u1(s), u2(s)) : s ∈ S} is the set of utility allocations.
• Goals of a solution rule:
  1. Pareto optimality
  2. Independence of irrelevant alternatives
  3. Independence of linear transformations: if utilities are transformed by vi = ai + bi·ui (and disagreement points are transformed by the same functions), the solution is the same. The multipliers and adders can be different for each person; the point is that, relatively speaking, the values keep the same relationship.
• Nash rule: maximize (u1(s) − d1)(u2(s) − d2).
• The Nash rule gives a solution which satisfies the three goals listed! It would be nice if there were only one set of values that maximizes the product.

Compactness
• Compact:
  – bounded: can be contained in a circle or box
  – closed: contains its boundary points
• Continuous functions on compact sets always attain their maximum: if f is continuous on a compact set X, then there exist x1 and x2 in X such that f(x1) ≤ f(x) ≤ f(x2) for all x in X.

Theorem 7.3
• The Nash rule is Pareto optimal, independent of irrelevant alternatives, and independent of linear transformations.

Characteristics of the set of utility allocations
• [Figures: example utility-allocation sets – one symmetric (about the diagonal) but non-convex; one symmetric, compact, and convex; one non-symmetric.]
• Defn 7.4: A set of utility allocations U of a bargaining game is said to be convex if it contains every point on the line segment joining any two of its points.
• A set of utility allocations U is said to be symmetric if (u1, u2) ∈ U implies (u2, u1) ∈ U.
• A solution rule is symmetric if, for every symmetric bargaining game, u1(s) = u2(s) at the agreement point s it selects: both players get the same utility from the deal.
• Thm 7.6: In a convex bargaining game, there exists exactly one utility allocation in the Nash solution. If the game is also symmetric, then the utilities in the Nash solution are equal.

Maximizer curves
• Consider the level curves of the Nash product tangent to S.
• [Figure: level curves x·y = c; in the symmetric, compact, convex set there is a unique maximizer where a level curve is tangent to S; the symmetric but non-convex set is shown for contrast.]
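A rough numerical sketch of the Nash rule. The feasible set here (splitting $100, with one player risk-neutral and the other risk-averse) is my own example, not the text's; the point is simply to show the product (u1 − d1)(u2 − d2) being maximized over a discretized set.

```python
# Maximize the Nash product over splits of $100, assuming player 1 has linear
# utility and player 2 has square-root (risk-averse) utility. Grid search only.
def nash_solution(u1, u2, d1=0.0, d2=0.0, total=100.0, steps=10_000):
    best_x, best_product = None, float("-inf")
    for i in range(steps + 1):
        x = total * i / steps                 # player 1's share
        g1, g2 = u1(x) - d1, u2(total - x) - d2
        if g1 >= 0 and g2 >= 0 and g1 * g2 > best_product:
            best_x, best_product = x, g1 * g2
    return best_x

split = nash_solution(u1=lambda x: x, u2=lambda y: y ** 0.5)
print(round(split, 1))   # about 66.7: the more risk-averse player ends up with the smaller share
```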
• In a strategic game (without cooperation), such as Bach or Stravinsky, either Bach/Bach or Stravinsky/Stravinsky is best, but they are not equal, so we pick a mixed strategy. Under the mixed strategy, you lose whenever Bach/Stravinsky or Stravinsky/Bach is picked.
• In a correlated system, specific options are selected with certain probabilities. Thus, you could pick each of the good choices 50% of the time (or whatever is fair).
• Defn 7.10: A correlated utility allocation with probability distribution (p1, p2, …, pn) over outcomes (s1, …, sn) gives utility (Σi pi·u1(si), Σi pi·u2(si)).

Asymmetric bargaining games
• Many bargaining games are essentially asymmetric, because of
  – differing attitudes towards risk between the players,
  – differences in payoffs in case of a disagreement, or
  – asymmetry in the set of utility allocations.

Monotonicity in Bargaining
• The Nash solution works well when there are asymmetries due to risk aversion or differences in disagreement points.
• When a disagreement point increases (due, say, to an outside option), the amount going to that person increases: maximize (u1(s) − d1)(u2(s) − d2). We agree to a certain distribution, but if my outside options improve, I expect more. In the water example below, we may agree to split the costs down the middle; when my cost of working alone goes down, I expect you to pick up more of the cost of working together.
• Changes in risk attitudes affect the utility function, so the Nash solution still works quite well there.
• [Figure: original bargaining from disagreement point (d1, d2); when d1 increases to d1′, player 1 gets more of the Nash solution and player 2 gets less.]

• The Nash solution may not work well in other asymmetric situations.
• Example: bankruptcy. Assets are less than debts. The Nash solution provides an equal division of the remaining assets – unfair if the sizes of the outstanding debts are different.
• Example: there are K dollars available to pay debts of A1 and A2 owed to two people, with K < A1 + A2.
• [Figure: when the original debts are equal, the Nash solution's equal division along the line u1 + u2 = K looks fair. When the debts are unequal, the Nash solution can pay player 2 in full while player 1 (who is owed more) gets less than full payment – an unfair allocation.]
• What would we consider to be more fair?
  – Each person loses the same amount?
  – Each person gets the same percent of their debt repaid?
• [Figure: two overlapping solution sets; the larger one actually gives player 1 a smaller payoff.] This violates monotonicity, which states that as the solution set grows, your utility does not decrease. So Nash does not satisfy monotonicity: when the set of possible solutions is larger, a person can actually get less.

Kalai-Smorodinsky solution rule (for dealing with asymmetries)
• Take the furthest feasible point on the line from (0, 0) (the disagreement point) to (u1_max, u2_max).
• [Figure: the KS line and the KS utility allocation where it leaves the feasible set.]
• The KS solution is independent of linear transformations, but not of irrelevant alternatives: notice how, if an unchosen part is added to the feasible set, I can earn less.
• If B is a convex and symmetric bargaining game, then the KS and Nash solutions are the same.
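A minimal sketch (mine) of the Kalai-Smorodinsky rule for one simple assumed feasible set: walk out along the line from the disagreement point (taken as the origin here) toward the ideal point until the frontier is reached. The triangular feasible set is an assumption for illustration.

```python
# Feasible set assumed to be { (u1, u2) : u1, u2 >= 0, u1/a + u2/b <= 1 },
# with disagreement point (0, 0) and ideal point (u1_max, u2_max) = (a, b).
def kalai_smorodinsky(a, b, steps=100_000):
    best = (0.0, 0.0)
    for i in range(steps + 1):
        t = i / steps
        u1, u2 = t * a, t * b          # walk out along the KS line
        if u1 / a + u2 / b <= 1.0:     # still feasible?
            best = (u1, u2)
    return best

print(kalai_smorodinsky(100, 50))      # (50.0, 25.0): both get half of their maximum
```

The design choice KS encodes is exactly the one in the figure: each player gets the same fraction of their best attainable utility, which is what restores monotonicity.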
7.3 The Core
• The core captures the minimal requirements that any reasonable agreement must have.
• Consider the coalition of all players.
• An allocation just refers to a split of the total payoff available to all players.
• An allocation is blocked if some coalition (an individual or subgroup) is better off separating and going its own way (i.e. the allocation does not give that coalition its outside option). A blocked allocation will never be agreed to.
• An allocation is in the core if it cannot be blocked by any coalition, including the grand coalition (the coalition of all players). The core is the range of reasonable bargaining outcomes.

Example
• Three firms, x, y and z, are negotiating a joint venture (JV).
• If any firm does not join the JV then it receives nothing.
• Firm y is critical to the JV: if x and z work together they get $0m, so neither x nor z is critical on its own.
• If x and y work together then they get $200m. Similarly, if z and y work together then they get $220m.
• But if all three work together then they get $300m in total.

  X     Y     Z     Value ($m)
  yes         yes   0
  yes   yes         200
        yes   yes   220
  yes   yes   yes   300

Example
• What is the 'range of likely bargaining outcomes' (i.e. the core)?
• Is an equal split blocked? Yes! Under an equal split, x, y and z each get $100m, so y and z together get $200m. But if y and z leave x out of the JV, then they get $220m. So the coalition of y and z will block an even split.
• To be in the core we need a split such that each player gets a non-negative payoff; x and y together get at least $200m; y and z together get at least $220m; and the total of $300m is divided up.
• E.g. x gets $50m, y gets $160m, z gets $90m.
• E.g. x gets $80m, y gets $120m, z gets $100m.

Properties of the core
• The core represents stable outcomes, in the sense that no individual or subgroup can do better by themselves.
• Allocations in the core are Pareto efficient (they involve no waste; otherwise the allocation would be blocked by the 'grand coalition' of all players).
• But – the core may not exist!

Core existence – sharing the cost of water
• Three towns, Amalga, Benson and Cove, are bargaining over new water supplies.
• Each town pays $30m if it builds its own supply.
• Any two towns together pay only $40m.
• All three together pay $66m.
• So to be in the core, an allocation cannot involve any town paying more than $30m, or any two towns together paying more than $40m, yet all three towns in total must pay $66m.

  Amalga   Benson   Cove   Cost ($m)
  yes                      30
           yes             30
                    yes    30
  yes      yes             40
  yes               yes    40
           yes      yes    40
  yes      yes      yes    66

Core existence – sharing the cost of water
• Our models assume it MUST be better for all to work together. But that cannot hold for any allocation here – there is no core for this bargaining problem!
• $40m is an average of $20m per town in a pair, while $66m is an average of $22m per town in the grand coalition, so no one will agree to the grand coalition. If any two towns try to combine, the left-out one will offer a better deal.
• Notice the instability: no one will agree to the grand coalition, as it is worse than the pairs. Even once the grand coalition was formed, a pair would splinter off, as it would be better off.
• The core focuses on the stability of coalitions. However, in many applications it is empty.

Core Existence
• Say Amalga pays $a, Benson pays $b and Cove pays $c.
• Then $a, $b and $c must each be no more than $30m; $a+$b, $a+$c and $b+$c can each be no more than $40m; and $a+$b+$c = $66m.
• But this is impossible! To see this:
  a + b ≤ 40, a + c ≤ 40, b + c ≤ 40.
  Adding these up: 2a + 2b + 2c ≤ 120, so a + b + c ≤ 60.
• One of the two-town coalitions will block the grand coalition unless a + b + c ≤ 60 – but the grand coalition costs $66m, so this is impossible.

Summary
• For multi-person bargaining, we expect that the outcome will be in the core.
• These are the 'stable' outcomes.
• But the core does not always exist.
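A minimal sketch (mine) of the blocking test for a three-player game with transferable value, applied to the joint-venture example above. (For the water cost game the inequality would run the other way: a coalition blocks when it is asked to pay more than its standalone cost.)

```python
from itertools import combinations

def blocked(allocation, value):
    """allocation: dict player -> payoff; value: dict frozenset -> coalition worth.
    Returns the first coalition whose members get less than they could earn
    on their own, or None if the allocation is unblocked (i.e. in the core)."""
    players = list(allocation)
    for r in range(1, len(players) + 1):
        for coalition in combinations(players, r):
            c = frozenset(coalition)
            if c in value and sum(allocation[p] for p in c) < value[c]:
                return c
    return None

# Joint-venture example (values in $m).
jv = {frozenset('x'): 0, frozenset('y'): 0, frozenset('z'): 0,
      frozenset('xz'): 0, frozenset('xy'): 200, frozenset('yz'): 220,
      frozenset('xyz'): 300}
print(blocked({'x': 100, 'y': 100, 'z': 100}, jv))   # frozenset({'y', 'z'}) blocks the equal split
print(blocked({'x': 50, 'y': 160, 'z': 90}, jv))     # None: this split is in the core
```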
Section 7.3
• The characteristic function (or coalition function) of an n-person bargaining game is the function v: 2^N → P(R^n), where 2^N is the set of all subsets of N.
• It maps each coalition to its value (for each agent).
• v(C) is also known as the worth of the coalition C.
• Any outcome of an n-person bargaining game that cannot be blocked is called a core outcome.
• An important issue is whether the game has a non-empty core. Balancedness ensures a non-empty core. Balanced contributions (what I contribute is equivalent to what you contribute) require a unique sharing.
• X_C is the indicator function of C, defined by X_C(k) = 1 if k ∈ C and 0 otherwise.
• Def 7.19: A family of coalitions is said to be balanced if we can assign weighting factors to each coalition so that, when we multiply by the weights and add up, we get the grand coalition.
• A set is comprehensive if, for any vector x in the set of utilities, every vector in which each component is smaller is also in the set.
• In essence, the weights in a balanced collection indicate a player's presence and importance in the coalitions.
• A side-payment game indicates that utility can be transferred.

Bondareva-Shapley theorem
• Different ways to prove non-emptiness of the core:
  – use the definition of the core and construct a core element, or
  – use the following well-known theorem.
• Bondareva-Shapley theorem (Bondareva (1963) and Shapley (1967)): The core of a cooperative game is non-empty if and only if the game is balanced.

Definition balancedness
• Let B be a collection of subsets of N (elements of 2^N). Example: n = 4, B = { {1,2}, {1,3}, {2,3}, {4} }.
• For S ∈ B define the indicator vector e^S ∈ R^n with e^S_i = 1 if i ∈ S and 0 otherwise.
• B is called a balanced collection if there exist weights λ_S ≥ 0 (one for each S ∈ B) such that Σ_{S∈B} λ_S·e^S = e^N, the all-ones vector of the grand coalition.
• Example: λ = (0.5, 0.5, 0.5, 1) works for the B above:
    e^{1,2} = (1, 1, 0, 0)
    e^{1,3} = (1, 0, 1, 0)
    e^{2,3} = (0, 1, 1, 0)
    e^{4}   = (0, 0, 0, 1)
  and 0.5·e^{1,2} + 0.5·e^{1,3} + 0.5·e^{2,3} + 1·e^{4} = (1, 1, 1, 1).
• Definition: A (cost) game is balanced if for every balanced collection B with corresponding weights λ_S:
    Σ_{S∈B} λ_S·c(S) ≥ c(N).
• In other words, it must be more costly to work separately than to work together.
• In the Amalga, Benson, Cove water example, the collection {AB, AC, BC} is balanced with weights ½, ½, ½ (each town appears in exactly two of the pairs). But when we apply those weights to the costs of the coalitions:
    ½(40) + ½(40) + ½(40) = 60 < 66 (the cost of the grand coalition),
  so the balancedness condition fails – which is why the core is empty.

Definition – Added value
• Case study: several bands exist and would like you to join them. Which do you join, and what is your share of the profits?
• We can consider any group of 'players' and ask, "What do you bring to the group?" The answer is your 'added value'. It helps one estimate what share of the whole belongs to each person in the group.

Added Value
• Your added value (the surplus made possible by your joining the group) equals:
  value of the group (with you as a member) minus (value of the group without you plus your value alone).

Added Value – example
• You have an assignment due, and you are allowed to work in groups of four people if you choose.
• Without you, the other three members of your group will be able to get 75 marks (out of 100) each.
• If you work alone then you can get 80 marks.
• But if you work with your group, then each of you will get 85 marks.
• So your added value = (85 × 4) − [(75 × 3) + 80] = 340 − 305 = 35 marks! (5 marks for you and 10 for each of the others.)
• Thus, it is a measure of what your presence is worth, above the minimum you would require for your services.

Added value – Jenny and George
• Divide $100. If they can't agree, both get nothing.
• Added value of George = added value of Jenny = $100 − ($0 + $0) = $100.

Added value – Jenny and George
• Divide $100. If they can't agree, George gets $50.
• Added value of George = $100 − ($0 + $50) = $50.
• Added value of Jenny = $100 − ($50 + $0) = $50.
• Note – these are the same.
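A minimal sketch (mine) of the added-value formula just used, applied to the assignment example and to the two Jenny-and-George games.

```python
def added_value(group_with_you, group_without_you, alone):
    """Added value = value of the group with you
                     - (value of the group without you + your value alone)."""
    return group_with_you - (group_without_you + alone)

# Assignment example: 4 x 85 with you, 3 x 75 without you, 80 alone.
print(added_value(85 * 4, 75 * 3, 80))   # 35

# Jenny and George dividing $100:
print(added_value(100, 0, 0))            # 100 (disagreement: both get nothing)
print(added_value(100, 0, 50))           # 50  George's added value when he gets $50 on his own
print(added_value(100, 50, 0))           # 50  Jenny's added value in the same game
```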
• This is a general result for TWO people bargaining (but ONLY for two people).
• Note that both add $50: if they don't work together, the best the two of them can do is $50 in total, but they earn $100 together.
• [Figure: a bar representing the total payoff if they cooperate, with George's disagreement point (his BATNA) at one end and Jenny's disagreement point (her BATNA) at the other.]
• George's added value = total payoff − George's BATNA − Jenny's BATNA.
• But this clearly equals Jenny's added value.
• So with 2 people: total surplus from agreement = each person's added value.

Predicted outcome for two-person bargaining
• For two-person bargaining, the bargaining is over the added value from agreement.
• Each person gets a share of the added value.
• Each person's TOTAL payoff is their disagreement point PLUS their share of the added value.
• So the least anyone will get is their disagreement point (their BATNA: best alternative to a negotiated agreement).
• The most anyone will get is their outside option PLUS all of the added value.
• In general, the result is in between, as the added value is shared.

Application to a buyer and seller
• So far we have just looked at two people dividing money.
• But the same ideas apply to two people bargaining over a good.
• The trick is to find
  – the outside options (disagreement points), and
  – the added value.

Definition: Willingness-to-pay
• Willingness-to-pay (WTP) is the highest price that a buyer will agree to pay for a good or service.
• In other words, WTP is the price at which the buyer doesn't care whether he buys or walks away – the price at which the economic profit from buying is zero.
• (It is like the "regular price" – you could get that price anytime, so there is no benefit to buying now. Or, it is as if you will use the item in production and just break even: what you sell the item for equals what you paid for the raw goods plus labor.)

Definition: Willingness-to-sell
• Willingness-to-sell (WTS) is the lowest price that a seller will agree to accept in return for a good or service.
• In other words, WTS is the price at which the seller doesn't care whether she sells or walks away – the price at which the economic profit from selling is zero.

When is trade possible?
• The buyer will accept any price below their WTP; the seller will accept any price above their WTS.
• If WTP ≥ WTS, then trade is possible.
• But if WTP < WTS, no trade is possible: there is no price that both will accept!

What is the added value created by trade?
• If the buyer and seller agree to a deal, then the added value is just WTP − WTS.
• The value to the buyer is the buyer's economic profit = WTP − price.
• The value to the seller is the seller's economic profit = price − WTS.
• The price divides the added value.
• [Figure: the added value WTP − WTS is split at the price: the buyer captures WTP − price (consumer surplus) and the seller captures price − WTS (producer surplus).]

Multi-party bargaining
1. Each individual or subgroup should never get less than their outside option – because they can always 'split off' and go their own way.
2. No individual or subgroup can get more than their added value plus their outside option – because all the others can always 'throw you out'!
• The key here is the extension to subgroups of individuals.
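Returning to the buyer-seller split above, here is a minimal sketch (mine) of how a price divides the added value WTP − WTS; the numbers in the calls are made up for illustration.

```python
def trade_surplus(wtp, wts, price):
    """Split of the added value from trade at a given price.
    Trade only happens if wts <= price <= wtp."""
    if wtp < wts:
        return None                      # no price both sides will accept
    if not (wts <= price <= wtp):
        return None                      # this particular price is refused by one side
    return {"added value": wtp - wts,
            "buyer": wtp - price,        # consumer surplus
            "seller": price - wts}       # producer surplus

print(trade_surplus(wtp=100, wts=60, price=75))   # added value 40: buyer 25, seller 15
print(trade_surplus(wtp=50, wts=60, price=55))    # None: WTP < WTS, no trade
```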
7.4 Shapley
• In many cases, the outcomes in the core are not unique, or the core is confusingly large. Which allocation do we pick? In other cases, the core may be empty.
• The Shapley value provides an appealing method of deciding the share of each individual in an n-person game.
• The concept is that of added value: you look at all permutations of the players and ask, if you were added to the group in the order represented by each permutation, what would you bring to the group.
• The reason all orders are used is this. Suppose Ali and Ben can get $10 together, but only $1 and $3 individually. There is a total of $6 of surplus to divide.
• The Shapley value works with what player i brings to the group: v(C ∪ {i}) − v(C), the difference in the coalition's value with i and without i. This value is called the marginal worth of player i when she joins coalition C.
• Ali could say, "I add $7 when I join you, and $1 when I join an empty coalition." The average Ali adds is $4.
• Ben could say, "I add $9 when I join you, and $3 when I join an empty coalition." The average is $6.
• Each person gets their average value. Notice that this is the same as splitting the added value (over the disagreement point).

The Shapley Value (Cont.)
• A well-known value division scheme.
• Aims to distribute the gains in a fair manner.
• A value division that conforms to the following axioms:
  – Dummy players get nothing.
  – Equivalent players get the same.
  – If a game v can be decomposed into two sub-games, an agent gets the sum of its values in the two games.

The Shapley Value
• Given an ordering π of the agents in A, we define S(π, a) to be the set of agents of A that appear before a in π.
• The Shapley value is defined as the marginal contribution of an agent to its set of predecessors, averaged over all possible permutations of the agents:
    Sh(A, a) = (1/|A|!) · Σ over all orderings π of [ v(S(π, a) ∪ {a}) − v(S(π, a)) ].

A Simple Way to Compute The Shapley Value
• Simply go over all the possible permutations of the agents, get the marginal contribution of the agent in each, sum these up, and divide by |A|! (a brute-force sketch appears below).
• Extremely slow.
• Can we use the fact that a game may be decomposed into sub-games, each concerning only a few of the agents?

Defn 7.25
• The Shapley value φ(v) satisfies these properties:
  – efficient: everything is allocated;
  – symmetric: it doesn't depend on labeling;
  – linear: φ(au + bv) = a·φ(u) + b·φ(v);
  – irrelevance of dummy players: if i is a dummy player, φ_i(v) = 0.
• The value v(C ∪ {i}) − v(C) is called the marginal worth of player i when she joins coalition C.
• The Shapley value is best thought of as an allocation rule which gives every player his average, or expected, marginal worth.

The Shapley Value
• Grounded in a set of axioms that a "good" solution should satisfy.
• It is the only concept that conforms to all these axioms.
• Are the axioms desirable? Are there other axioms that are desirable?
• The test is in actual predictive power: what really happens in practice? The Shapley value does pretty well in this regard.

Using Shapley Values – Example (Shapley, Shubik, and Banzhaf)
• Determine the power of a party in a multi-party legislature.
• Say Reds (43 seats), Blues (33), Greens (16) and Browns (8). No party has a majority.
• The power of a party depends on how crucial it is to the formation of a majority coalition.

  Reds 43   Blues 33   Greens 16   Browns 8   Value
  yes       yes                               1
  yes                  yes                    1
  yes                              yes        1
            yes        yes                    0
            yes                    yes        0
                       yes         yes        0
  yes       yes        yes                    1
  yes       yes                    yes        1
  yes                  yes         yes        1
            yes        yes         yes        1
  yes       yes        yes         yes        1
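The brute-force routine mentioned above, as a short sketch (mine); the dictionary-of-coalitions representation is an assumption for illustration. It is applied to the Ali and Ben example from earlier.

```python
from itertools import permutations

def shapley(players, v):
    """Brute-force Shapley value: average each player's marginal worth
    v(C u {i}) - v(C) over all orderings of the players.
    v maps frozensets of players to coalition values (empty set has value 0)."""
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            totals[p] += v.get(with_p, 0) - v.get(coalition, 0)
            coalition = with_p
    return {p: totals[p] / len(orders) for p in players}

# Ali and Ben: $1 and $3 alone, $10 together.
v = {frozenset(): 0, frozenset({'Ali'}): 1, frozenset({'Ben'}): 3,
     frozenset({'Ali', 'Ben'}): 10}
print(shapley(['Ali', 'Ben'], v))   # {'Ali': 4.0, 'Ben': 6.0}
```

The same routine, fed the four-party voting game (value 1 for majority coalitions, 0 otherwise), reproduces the power split of ½ for Red and 1/6 for each of the others that is derived by hand on the following slides.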
Measuring Contributions
• Give value 1 to any majority coalition and 0 otherwise.
• So a party contributes 1 if joining a coalition gives that coalition a majority, and 0 otherwise. Such a party is pivotal.
• There are 2^n − 1 = 15 possible (non-empty) coalitions.
• The majority coalitions:
  – 4 one-party coalitions – none earn points.
  – 6 two-party coalitions – 3 earn points [{R,B}, {R,G}, {R,Br}]; in each, both members are pivotal.
  – 4 three-party coalitions – 3 where Red is pivotal [{R,B,G}, {R,B,Br}, {R,G,Br}] and 1 ({B,G,Br}) where B, G and Br are each pivotal.
  – No party is pivotal in the grand coalition.

Using the Formula
• In the Shapley formula Sh(A, a) = (1/|A|!) · Σ_π [v(S(π, a) ∪ {a}) − v(S(π, a))], the probability weight corresponding to each two-party coalition is (4−2)!(2−1)!/4! = 1/12.
• The probability weight corresponding to each three-party coalition is (4−3)!(3−1)!/4! = 1/12.
• [Table: the 24 possible orderings of the four parties, with the party that is pivotal in each order marked. Who gets credit across the 24 orders: Red in 12 (value 1/2); Blue in 4 (value 1/6); Green in 4 (value 1/6); Brown in 4 (value 1/6).]

The Shapley Values for Each Party
• For Red: 1/12 × 3 + 1/12 × 3 = 1/2.
• For each of the other three: 1/12 × 1 + 1/12 × 1 = 1/6.
• So Red has the most power; the other three have equal power, even though they are widely disparate in size.
• Small parties matter.
• This is not to be used as a precise quantitative measure, because we have assumed that all coalitions are equally likely and that all contributions are 0 or 1.
• So if two of the larger parties are ideologically completely opposed to each other (never in a coalition together), then the smaller parties may have even greater power.

Example 7.27 Setting Landing Fees
• An airport has fixed costs and variable costs that depend on the types of planes that use it. Consider building one runway.
• Who should pay what for its use?
• Let's assume k_i is the runway cost needed to land a plane of type i.
• Order the plane types so that 0 < k_1 < k_2 < … < k_T.
• Let N_t be the number of expected landings of type t.
• In this case, the values added to a coalition are non-positive, as they represent costs.
• We assume a runway of cost $10M can handle any smaller needs.

  Plane type   Cost of runway   Number of landings
  1            $1M              5K
  2            $2M              2K
  3            $3.5M            1K
  4            $7.5M            1K
  5            $10M             1K

• So, to accommodate everyone we need a $10M runway – but what should each plane type pay per landing?
• As a scaled-down illustration of the ordering argument, consider the fleet 1111122345: five type-1 planes, two type-2, and one each of types 3, 4 and 5.
• Consider all possible orderings and make each plane pay what it adds to the cost of the needed runway (on average).
• For example, the second type-2 plane in an order never has to pay anything, since the first type-2 plane (or a costlier type) would already have paid for that much runway.
• The first type-2 plane has to pay only if it is preceded solely by lower-numbered types.
• If we actually ran the numbers we would get:

  Plane type   Charge per landing   Number of landings   Total revenue
  1            $100                 5K                   $500K
  2            $300                 2K                   $600K
  3            $800                 1K                   $800K
  4            $2,800               1K                   $2.8M
  5            $5,300               1K                   $5.3M
                                                  Total:  $10M

• This is computationally complex, so the book shows shortcuts.
• We only really care about the first occurrence of each plane type, so we can simplify by looking at the ordering of the five plane types.
• We need to count all the ways each order could occur to get the proper weight.
• The marginal cost v(C ∪ {i}) − v(C) must be counted multiple times, depending on how many times the pattern occurs.
• Notice that the cost is 0 if anything of equal or higher cost already occurs in C.
• Notice that the cost is the difference between this plane's runway cost and the cost of the highest-cost plane already in C.
• Note that we get exactly the same cost in all of the following cases: C is permuted in any order, followed by i, followed by any permutation of the remaining planes.
  – The elements of C before i can be ordered in |C|! ways.
  – The remaining elements after i can be ordered in (|N| − |C| − 1)! ways.
• We then see the formula: each pattern contributes |C|!·(|N| − |C| − 1)!/|N|! × [v(C ∪ {i}) − v(C)].
• This is still pretty expensive to compute, as there are lots of choices for C.
• In the text, the costs associated with each plane are divided into costs for each level: a type-4 plane has a fee component for each level (1, 2, 3, 4).
• The formula they finally end up with is
    φ_i(v) = Σ_{l=1}^{i} (K_l − K_{l−1}) / (Σ_{t=l}^{T} N_t),   with K_0 = 0,
  where K_l is the runway cost for plane type l and N_t is the number of landings of type t.
• The computation is a bit tricky, but it is just the Shapley value computed using this formula. For our example this means:
  – Planes of type 1 pay 1M/(5K + 2K + 1K + 1K + 1K) = $1,000,000/10,000 = $100.
  – Planes of type 2 pay 1M/10K + 1M/5K = $300.
  – Planes of type 3 pay 1M/10K + 1M/5K + 1.5M/3K = $800.
  – Planes of type 4 pay 1M/10K + 1M/5K + 1.5M/3K + 4M/2K = $2,800.
  – Planes of type 5 pay 1M/10K + 1M/5K + 1.5M/3K + 4M/2K + 2.5M/1K = $5,300.
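A minimal sketch (mine) of the level formula just given: each increment of runway cost is shared equally among all landings that need that level or a longer one. The function name and list representation are my own.

```python
def landing_fees(costs, landings):
    """Per-landing Shapley fees for the runway game via the level formula.
    costs[t], landings[t]: runway cost and number of landings for type t,
    with costs listed in ascending order."""
    fees = []
    fee, prev_cost = 0.0, 0.0
    for t, cost in enumerate(costs):
        users_of_level = sum(landings[t:])      # every type >= t needs this increment
        fee += (cost - prev_cost) / users_of_level
        prev_cost = cost
        fees.append(fee)
    return fees

costs = [1e6, 2e6, 3.5e6, 7.5e6, 10e6]
landings = [5000, 2000, 1000, 1000, 1000]
print(landing_fees(costs, landings))   # [100.0, 300.0, 800.0, 2800.0, 5300.0]
```

Summing fee × landings over the five types recovers the full $10M runway cost, matching the revenue table above.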