CS 7931: Lecture 1
Robert Christensen
September 7, 2014

What is Communication Complexity about?

Communication complexity is a model for understanding how programs must interact to solve a problem. The model was proposed in the late 1970s by Andrew Yao [1]; it extracts out only the communication and ignores all internal computation.
Given that Alice has the data x and Bob has the data y, how much communication is needed in order to compute the function f(x, y)? The function could be almost anything: string comparison, set comparison, numerical computation (mean, median, . . .).
We only care about the communication between Alice and Bob.
The time for internal computation within Alice and Bob is ignored in
this model. For example, in this model Alice might even be allowed
to solve the halting problem before responding to Bob. All that we
consider while using this model is the communication which takes
place between Alice and Bob.
One of the primary uses of this model is to prove lower bounds on algorithms. A lower bound proved here transfers to algorithms in different, more complicated models: if we prove a lower bound on the communication cost of a problem, then no algorithm can run faster than the communication it must perform.
This model is especially useful for studying streaming algorithms
and distributed systems where communication between machines or
processors is much more significant than processing local data.
What Communication Looks Like in This Model
Computing EQ
Let us assume that x and y are n-bit strings and f(x, y) = EQ: is x = y (are the strings identical)? How can we compute this?
The simple solution is for Alice to send all her data to Bob. Bob checks whether x = y locally and responds to Alice with true or false. The communication cost is the length of x plus the single bit returned to indicate true or false:
D(EQ) ≤ |x| + 1 = n + 1
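The trivial protocol can be sketched in a few lines of Python (the lecture gives no code; the function and names here are illustrative only, and we count bits rather than actually transmitting them):

```python
# Sketch of the trivial EQ protocol: Alice ships all of x, Bob answers
# with one bit. Internal computation (the comparison) is free in this model.

def eq_protocol(x: str, y: str) -> tuple[bool, int]:
    """Returns (answer, bits_communicated) for n-bit strings x, y."""
    assert len(x) == len(y)
    bits = len(x)       # Alice -> Bob: all n bits of x
    answer = (x == y)   # Bob compares locally (internal work is ignored)
    bits += 1           # Bob -> Alice: a single true/false bit
    return answer, bits

print(eq_protocol("0110", "0110"))  # (True, 5): n + 1 = 4 + 1 bits
```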
(Figure: Alice has x; Bob has y.)
Computing DISJ
Let the n-bit strings x, y represent the characteristic vectors of sets Sx, Sy ⊆ [n] (i.e. i ∈ Sx ⇔ xi = 1). We define the function DISJ(x, y) = 1 iff Sx ∩ Sy = ∅. A trivial solution to this problem uses the same process as computing EQ: have Alice send x to Bob; Bob computes the function and returns true or false. The deterministic complexity bound for DISJ is therefore the same as for EQ:
D(DISJ) ≤ |x| + 1
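The same trivial protocol, adapted to disjointness on characteristic vectors (again an illustrative Python sketch, not from the lecture):

```python
# Sketch of the trivial DISJ protocol: Alice ships her characteristic
# vector; Bob checks locally whether any coordinate is 1 in both.

def disj_protocol(x: str, y: str) -> tuple[bool, int]:
    """x, y are n-bit characteristic vectors of Sx, Sy ⊆ [n].
    Returns (disjoint?, bits_communicated)."""
    assert len(x) == len(y)
    bits = len(x)  # Alice -> Bob: all n bits of x
    disjoint = not any(a == '1' and b == '1' for a, b in zip(x, y))
    return disjoint, bits + 1  # plus Bob's one-bit answer

print(disj_protocol("1010", "0101"))  # (True, 5): no common element
```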
Protocol Trees
We can describe a communication protocol abstractly by a protocol
tree.
• A non-leaf node is associated with either Alice or Bob. Attached to this node is a function that takes the corresponding player's input as well as all communication so far, and outputs b ∈ {0, 1}. The result of the function indicates which branch of the tree will be pursued next. At the root, for example, no communication has occurred yet, so these functions have the form
a0 : X → {0, 1}
b0 : Y → {0, 1}
• Leaf nodes are labeled with the output of the protocol.
The protocol associated with a particular input ( x, y) corresponds
to a root-to-leaf path in the protocol tree. Therefore, the length of
this path equals the total communication between Alice and Bob.
The height of the protocol tree represents the worst-case complexity
(over all inputs). The goal is of course to design a protocol tree whose
height is minimized, or show that any protocol tree must have at
least a certain height.
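The traversal described above can be sketched in Python. This is a hypothetical encoding, not from the lecture: internal nodes hold an owner and a decision function of that player's input plus the transcript so far, and leaves hold outputs.

```python
# Minimal protocol-tree sketch: running an input pair (x, y) walks a
# root-to-leaf path; the path length is the communication cost.

class Node:
    def __init__(self, owner, fn, children):
        self.owner, self.fn, self.children = owner, fn, children

class Leaf:
    def __init__(self, output):
        self.output = output

def run(tree, x, y):
    """Returns (protocol output, number of bits communicated)."""
    transcript = []
    while isinstance(tree, Node):
        inp = x if tree.owner == 'A' else y
        bit = tree.fn(inp, tuple(transcript))  # b ∈ {0, 1}
        transcript.append(bit)                 # one bit sent per node
        tree = tree.children[bit]
    return tree.output, len(transcript)

# Depth-1 example: Alice announces her single input bit.
tree = Node('A', lambda x, t: x, [Leaf('x=0'), Leaf('x=1')])
print(run(tree, 1, None))  # ('x=1', 1)
```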
Some Examples
Joint Mean Alice and Bob each have a collection of integers drawn from [1 . . . n], and the goal is to compute the mean of the union of the two collections. Note that Alice cannot just send her average to Bob, because that number might not be representable with a finite number of bits (such as 1/7 = 0.142857 . . .). However, Alice can send the average as a ratio of two numbers: the sum and the count. This information is sufficient for Bob to compute the joint average and send back a corresponding pair.
The maximum size of ∑ x is n². In order to describe a value of size n² we need 2 log n bits. Thus, the communication complexity to solve joint mean is
D(Joint Mean) = O(log n)
(Figure: Alice sends the pair (∑ x, #Alice); Bob replies with (∑ x + ∑ y, #Alice + #Bob).)
This is because ∑ x must be communicated, which takes O(log n)
bits to describe.
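The sum-and-count exchange can be sketched as follows (an illustrative Python sketch; `fractions.Fraction` stands in for sending an exact ratio):

```python
from fractions import Fraction

# Sketch of the joint-mean protocol: Alice sends (sum, count), two
# O(log n)-bit numbers; Bob forms the joint pair and sends it back.

def joint_mean(alice: list[int], bob: list[int]) -> Fraction:
    s_a, c_a = sum(alice), len(alice)        # Alice -> Bob
    s, c = s_a + sum(bob), c_a + len(bob)    # Bob -> Alice
    return Fraction(s, c)                    # exact: 1/7 stays 1/7

print(joint_mean([1, 2, 3], [4, 5]))  # 15/5 = 3
```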
Joint Median
A ⊆ {1 . . . n}
B ⊆ {1 . . . n}
The values of A and B might overlap.
Both start out knowing only that the median could be anywhere in the range. Denote this range of uncertainty as I, and let m_I be its midpoint; initially I = [1 . . . n], so m_I = n/2. Alice sends the number of elements of A that are larger than m_I. Bob combines this with his own count and reports back whether the median lies on the 'left' or the 'right' side of m_I, and both exclude the half of I which the median cannot be a part of. Eventually both agree that the median must lie within a specific interval.
In each round, Alice must send her count of elements larger than the midpoint of I, which takes O(log n) bits; Bob returns a single bit. Since we are doing a binary search, O(log n) rounds occur before the answer is computed. The complexity of this is
D(med) = O(log² n)
Using some tricks, we can send less information in later rounds, because the range where the median could possibly lie keeps shrinking. With these tricks the complexity shrinks to
D(med) = O(log n)
This is given as a problem for readers to solve; we discuss the solution on our mailing list.
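The binary search can be sketched in Python. This sketch simulates both players in one function and, as a simplifying assumption, finds the k-th smallest element of A ∪ B (the lower median); the comments mark which player would send what.

```python
# Sketch of the binary-search joint-median protocol. Each loop
# iteration is one round: O(log n) bits from Alice, 1 bit from Bob.

def joint_median(A: list[int], B: list[int], n: int) -> int:
    k = (len(A) + len(B) + 1) // 2   # rank of the (lower) median
    lo, hi = 1, n                    # interval of uncertainty I
    while lo < hi:
        m = (lo + hi) // 2           # midpoint m_I
        # Alice -> Bob: her count of elements <= m (O(log n) bits);
        # Bob adds his own count locally.
        le = sum(a <= m for a in A) + sum(b <= m for b in B)
        # Bob -> Alice: 1 bit saying which half holds the median.
        if le >= k:
            hi = m                   # median is in the left half
        else:
            lo = m + 1               # median is in the right half
    return lo

print(joint_median([1, 9], [3, 7], 10))  # prints 3
```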
Proving Lower Bounds in Communication Cost
We define a combinatorial rectangle as a set
R = A × B,  A ⊆ X,  B ⊆ Y
Another, more useful, characterization: R ⊆ X × Y is a rectangle iff
(x1, y1) ∈ R, (x2, y2) ∈ R ⇒ (x1, y2) ∈ R and (x2, y1) ∈ R
In other words, whenever R contains two 'diagonal' corners, it also contains the 'anti-diagonal' corners.
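The closure condition is easy to check directly for small sets (an illustrative Python sketch; `is_rectangle` is a name introduced here, not from the lecture):

```python
from itertools import product

# Check the rectangle property: for every pair of points in R, the
# two "swapped" points (x1, y2) and (x2, y1) must also be in R.

def is_rectangle(R: set) -> bool:
    return all((x1, y2) in R and (x2, y1) in R
               for (x1, y1), (x2, y2) in product(R, repeat=2))

print(is_rectangle({(0, 0), (0, 1), (1, 0), (1, 1)}))  # True: {0,1} x {0,1}
print(is_rectangle({(0, 0), (1, 1)}))  # False: missing (0,1) and (1,0)
```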
We now define an interesting property of the inputs associated with a leaf of a protocol tree. For any leaf i, let Ri denote the subset of X × Y that causes the protocol to end at that leaf. Note that the sets R1, R2, . . . , R_L form a partition of X × Y.
Lemma 1. For any leaf i, Ri is a combinatorial rectangle.
Alice only knows x and Bob only knows y. Suppose (x1, y1) and (x2, y2) result in the same tree traversal; then the communication transcript is identical between the two inputs. Now run the protocol on (x1, y2). At any node owned by Alice, she sees the input x1 and the same communication so far, so she makes the same decision as she did on (x1, y1). At any node owned by Bob, he sees the input y2 and the same communication so far, so he makes the same decision as he did on (x2, y2). This pattern continues until we reach a leaf, so (x1, y2) arrives at the same leaf; by the symmetric argument, so does (x2, y1). Thus, if (x1, y1) and (x2, y2) arrive in the same combinatorial rectangle, the inputs (x1, y2) and (x2, y1) also arrive in that same combinatorial rectangle.
Having the same answer does not always mean two inputs reach the same leaf, but inputs that reach the same leaf always get the same answer.
For example, consider the decision matrix for equality: the 2^n × 2^n matrix with 1s on the diagonal (where x = y) and 0s everywhere else. There is nowhere we can make a rectangle covering more than one of the '1' entries, since there is no configuration where such a rectangle encompasses no '0's, and every leaf's rectangle must be monochromatic. The following figure shows another example of a configuration which does not work.
Because we cannot create any larger rectangles, any protocol tree for equality must have at least
L ≥ 2^n
leaves, and hence height at least log₂ L = n. This is called a 'fooling set' argument.

(Figure: the 2^n × 2^n equality matrix, with 1s on the diagonal and 0s off the diagonal. All cells of a rectangle must have the same solution, so this does not work.)
(Figure: another example matrix,
1 0 1
0 1 0
0 0 1
showing a configuration of rectangles that does not work.)
References
[1] Andrew Chi-Chih Yao. Some complexity questions related to distributive computing (preliminary report). In Proceedings of the Eleventh Annual ACM Symposium on Theory of Computing, STOC '79, pages 209–213, New York, NY, USA, 1979. ACM.