Products of Functions, Graphs, Games & Problems
Irit Dinur
Weizmann
Products
Why would anyone want to multiply two functions? graphs? problems?
• For fun: to "see what happens"
• For “Hardness Amplification”
(holy grail = prove that things are hard)
Given f that is a little hard,
construct f' that is very hard
Circuit complexity, average case complexity, communication complexity,
Hardness of approximation
By taking f' = f x f x … x f (or, more generally, a product P1 x P2)
We can multiply many different objects:
• Numbers
• Strings
• Functions
• Graphs
• Games
• Computational Problems
Direct Products of Strings / Functions
For example, here is how to multiply two strings:
[Table: the 2-fold direct product of s = 110110; entry (i, j) is the pair s_i s_j]

      1   1   0   1   1   0
  1   11  11  10  11  11  10
  1   11  11  10  11  11  10
  0   01  01  00  01  01  00
  1   11  11  10  11  11  10
  1   11  11  10  11  11  10
  0   01  01  00  01  01  00
In the k-fold product of a string s, for each (i_1, i_2, …, i_k) we have
a k-bit substring corresponding to the restriction of s to i_1, i_2, …, i_k:
f(i_1, i_2, …, i_k) = s_{i_1} s_{i_2} … s_{i_k}
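As a concrete illustration, here is a minimal Python sketch of this definition (the function name direct_product and the 0-indexed positions are my choices, not from the slides):

from itertools import product

def direct_product(s: str, k: int) -> dict:
    """Map every k-tuple of positions to the corresponding k-bit substring of s."""
    n = len(s)
    return {idx: "".join(s[i] for i in idx) for idx in product(range(n), repeat=k)}

f = direct_product("110110", 2)
print(f[(0, 2)])  # '10' -- the bits of s at positions 0 and 2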
Direct sum variant: instead of concatenating the selected bits, sum them:
f(i_1, i_2, …, i_k) = s_{i_1} + s_{i_2} + … + s_{i_k}
(the alphabet stays the same, but harder to analyze)
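A matching sketch for the direct-sum variant; interpreting the sum of bits as a sum mod 2 (XOR) is my assumption, made so that the alphabet indeed stays {0,1}:

from itertools import product

def direct_sum(s: str, k: int) -> dict:
    """Map every k-tuple of positions to the sum (mod 2) of the selected bits."""
    bits = [int(c) for c in s]
    return {idx: sum(bits[i] for i in idx) % 2 for idx in product(range(len(s)), repeat=k)}

g = direct_sum("110110", 2)
print(g[(0, 2)])  # 1, since s_0 + s_2 = 1 + 0 = 1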
Testing Direct Products
Given a table of k-substrings, f : [n]^k → {0,1}^k, is there a local
test that distinguishes between
• f is a direct product
• f is far from a direct product
In [GGR] terms: is the property of being a direct product
locally testable ?
(answer: yes, with 2 queries)
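One natural 2-query consistency test, sketched below: query the table at two random k-tuples that share some coordinates and accept iff the answers agree there. The slides only assert that some 2-query test works; this particular variant, the parameter overlap, and the helper names are illustrative choices of mine.

import random
from itertools import product

def two_query_test(f, n, k, overlap):
    """Query f at two k-tuples sharing `overlap` coordinates; accept iff the
    two answers agree on the shared coordinates."""
    shared = random.sample(range(n), overlap)
    t1 = tuple(shared + [random.randrange(n) for _ in range(k - overlap)])
    t2 = tuple(shared + [random.randrange(n) for _ in range(k - overlap)])
    return f[t1][:overlap] == f[t2][:overlap]

# A genuine direct product passes every time; a table far from one is often caught.
s, k = "110110", 4
f = {idx: "".join(s[i] for i in idx) for idx in product(range(len(s)), repeat=k)}
print(two_query_test(f, n=len(s), k=k, overlap=2))  # True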
Local to Global
Given: a very large and difficult problem (e.g. 3sat)
We will solve it together, by splitting the work into many small
sub-problems, each of (constant) size k. Each k-sub-problem gets a local solution:
On average, the local value is > val
On average, each local solution is consistent with > cons fraction of its neighbors
Question: is there a consistent global solution with value > gval?
Testing Direct Products
[Goldreich-Safra,D.-Reingold, D.-Goldenberg, Impagliazzo-Kabanets-Wigderson]
Theorem [D.-Steurer 2013]
Any collection of local solutions with pairwise consistency 1 − ε must
be (1 − O(ε))-consistent with a global solution.
i.e. the property of being a direct product is testable with 2 queries.
Theorem [David-D.-Goldenberg-Kindler-Shinkar 2013]
The property of being a direct sum is testable with 3 queries.
Multiplying Graphs
There are several natural graph products
In the "strong direct product":
V(G1 x G2) = V(G1) x V(G2)
(u1, u2) ~ (v1, v2) iff u1 ~ v1 and u2 ~ v2
(u ~ v means u = v or u is adjacent to v)
[Example: vertices {1, 2, 3} x {1, 2, 3} give the 9 product vertices 11, 12, 13, 21, 22, 23, 31, 32, 33]
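A small Python sketch of the strong product just defined (the vertex/edge representation and helper names are my own; edges are stored as ordered pairs, so each undirected edge appears twice):

from itertools import product

def strong_product(V1, E1, V2, E2):
    """Strong product: (u1,u2) ~ (v1,v2) iff u1 ~ v1 and u2 ~ v2,
    where ~ means equal-or-adjacent."""
    adj1 = {(a, b) for a, b in E1} | {(b, a) for a, b in E1} | {(v, v) for v in V1}
    adj2 = {(a, b) for a, b in E2} | {(b, a) for a, b in E2} | {(v, v) for v in V2}
    V = list(product(V1, V2))
    E = [(x, y) for x in V for y in V
         if x != y and (x[0], y[0]) in adj1 and (x[1], y[1]) in adj2]
    return V, E

# Example: the strong product of two triangles is the complete graph K9.
tri = ([1, 2, 3], [(1, 2), (2, 3), (1, 3)])
V, E = strong_product(*tri, *tri)
print(len(V), len(E) // 2)  # 9 vertices, 36 undirected edges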
Multiplying Graphs
Basic question: how do natural graph properties
(such as: chromatic number, max-clique, expansion, …)
behave with respect to the product operation?
If clique ( G1 ) = m1 and clique ( G2 ) = m2
then clique ( G1 x G2 ) = m1m2
If independent-set ( G1 ) = m1 and independent-set ( G2 ) = m2
then independent-set ( G1 x G2 ) = ?
Generally, the answer is easy if the maximizing solution is itself a product,
but often it is not, and then the analysis is challenging.
Definition [Shannon 1956]: The Shannon capacity of G
is the limit of (α(G^k))^{1/k} as k → ∞
(α(G) stands for the maximum independent set size)
Consider a transmission scheme of one symbol at a time,
and draw a graph with an edge between each pair of
symbols that might be confusable in transmission.
α(G) = number of symbols transmittable with zero error
α(G^k) = number of such words of length k
(α(G^k))^{1/k} = effective alphabet size
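A brute-force sketch (my own, toy-sized) of the lower bound this gives for C5: computing α of the k-fold strong power for k = 1, 2 recovers α(C5) = 2 and α(C5^2) = 5, hence capacity at least √5 (which Lovász later showed is tight).

from itertools import product

def strong_power_adj(n, k):
    """Vertices and adjacency of the k-fold strong power of the n-cycle."""
    def confusable(a, b):  # equal or adjacent on the cycle
        return a == b or (a - b) % n in (1, n - 1)
    verts = list(product(range(n), repeat=k))
    adj = {v: {u for u in verts if u != v and all(confusable(x, y) for x, y in zip(v, u))}
           for v in verts}
    return verts, adj

def max_independent_set(vertices, adj):
    """Exhaustive maximum independent set; fine only for tiny graphs."""
    if not vertices:
        return 0
    v, rest = vertices[0], vertices[1:]
    skip_v = max_independent_set(rest, adj)
    take_v = 1 + max_independent_set([u for u in rest if u not in adj[v]], adj)
    return max(skip_v, take_v)

for k in (1, 2):
    verts, adj = strong_power_adj(5, k)
    a = max_independent_set(verts, adj)
    print(k, a, a ** (1 / k))  # k=1: alpha=2; k=2: alpha=5, giving sqrt(5) ~ 2.236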
Lovász (1979) computed the Shannon capacity of several
graphs, e.g. C5, by introducing the theta function.
The capacity of C7 is still open (one of the most notorious problems in
extremal combinatorics).
Multiplying Games
Games (2-player 1-round)
Alice has question set U and a strategy A : U → Σ; Bob has question set V and a strategy B : V → Σ.
Referee: picks a random question pair u, v; Alice receives u and answers A(u), Bob receives v and answers B(v); the referee then accepts or rejects.
Value ( G ) = maximal success probability, over all possible strategies
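For intuition, a brute-force sketch of this definition (exponential in the question sets, so toy sizes only; the toy game and all names below are illustrative, and questions are assumed to be 0-indexed so they can index the strategy tuples):

from itertools import product

def game_value(U, V, Sigma, pairs, accept):
    """Best success probability over all deterministic strategies.
    pairs: question pairs (u, v), assumed uniformly distributed;
    accept(u, v, a, b): the referee's predicate."""
    best = 0.0
    for A in product(Sigma, repeat=len(U)):       # Alice's strategy as a tuple
        for B in product(Sigma, repeat=len(V)):   # Bob's strategy as a tuple
            wins = sum(accept(u, v, A[u], B[v]) for (u, v) in pairs)
            best = max(best, wins / len(pairs))
    return best

# Toy game: answers must be equal on question pair (0,0) and differ on (1,1).
val = game_value(U=[0, 1], V=[0, 1], Sigma=[0, 1],
                 pairs=[(0, 0), (1, 1)],
                 accept=lambda u, v, a, b: (a == b) if u == 0 else (a != b))
print(val)  # 1.0 -- e.g. A = (0, 0), B = (0, 1) wins both question pairs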
The 3SAT game [FGLSS]: U = set of variables, V = set of 3SAT clauses.
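A hedged sketch of one standard way to play the clause-variable game above (my rendering of the construction the slide credits to FGLSS): the referee picks a random clause and a random variable occurring in it, Bob answers a satisfying assignment to the clause, Alice answers a bit for the variable, and the referee accepts iff the answers agree on that variable.

import random

def play_3sat_game(clauses, alice, bob):
    """One round; clauses are lists of 3 literals, a literal = (variable, is_positive)."""
    clause = random.choice(clauses)
    var, _ = random.choice(clause)
    a = alice(var)          # Alice: a truth value for her variable
    b = bob(clause)         # Bob: a {variable: truth value} assignment to the clause
    clause_satisfied = any(b[v] == positive for v, positive in clause)
    return clause_satisfied and b[var] == a

# Toy formula (x0 or x1 or not x2), with honest strategies from a satisfying assignment.
clauses = [[(0, True), (1, True), (2, False)]]
assignment = {0: True, 1: False, 2: False}
print(play_3sat_game(clauses,
                     alice=lambda v: assignment[v],
                     bob=lambda c: {v: assignment[v] for v, _ in c}))  # True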
Label-Cover Problem : Given a game G, find value ( G )
Strong PCP Theorem: Label Cover is NP-hard to approximate
[AS, ALMSS 1991] + [Raz 1995]
The PCP Theorem [AS, ALMSS]
PCP theorem: "gap-3SAT is NP-hard"
Proof: By reduction from small gap to large gap,
aka amplification
Start with G and end up with G', s.t.
If val(G) = 1 then val(G') = 1
If val(G) < 1 then val(G') < ½
How?
• by algebraic encoding [AS, ALMSS 1991]; or
• by "multiplying" G with itself, G' = G ⊗ ⋯ ⊗ G repeatedly [D. 2007]
Multiplying Games
A game is specified by its constraint-graph,
so a product of two games can be defined by a product of
two constraint graphs
[Figure: G1 x G2 = G1 ⊗ G2, combining the question sets U1, V1 of G1 with U2, V2 of G2]
In G1 ⊗ G2: Alice's questions come from U1 x U2, Bob's from V1 x V2;
strategies are A : U1 x U2 → Σ1 x Σ2 and B : V1 x V2 → Σ1 x Σ2.
k-fold product of a game
Alice receives (u1, u2, …, uk) ∈ U x … x U and answers with A : U^k → Σ^k;
Bob receives (v1, v2, …, vk) ∈ V x … x V and answers with B : V^k → Σ^k.
Also called: the k-fold parallel repetition of a game
Q1: If value(G1) = α1 and value(G2) = α2,
then what is value(G1 ⊗ G2)?
Q2: If value(G) = α, then what is value(G^⊗k) for k > 1?
One obvious candidate is the direct product strategy.
But it is not, in general, the best strategy.
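A small sketch of the direct product strategy on a k-fold repeated game: answer each coordinate independently with a fixed one-shot strategy, so a round is won iff all k coordinates are won. The toy game below (a CHSH-style game with one-shot value 3/4) is my own choice; the sketch only evaluates this particular strategy, which here achieves exactly (3/4)^k, and says nothing about whether better strategies exist.

from itertools import product

def product_strategy_value(pairs, accept, A, B, k):
    """Success probability of answering each coordinate with the one-shot
    strategies A, B, over all k-tuples of (uniform) question pairs."""
    wins = 0
    for rep in product(pairs, repeat=k):              # a k-tuple of question pairs
        wins += all(accept(u, v, A[u], B[v]) for (u, v) in rep)
    return wins / len(pairs) ** k

# Toy game: accept iff a XOR b = u AND v; its (classical) one-shot value is 3/4.
pairs = [(u, v) for u in (0, 1) for v in (0, 1)]
accept = lambda u, v, a, b: (a ^ b) == (u & v)
A, B = (0, 0), (0, 0)                                 # an optimal one-shot strategy
print(product_strategy_value(pairs, accept, A, B, k=2))  # 0.5625 = (3/4)^2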
Theorem [D.-Steurer 2013]: Let G be a projection game.
• If val(G) < ρ, then val(G^⊗k) ≤ (2√ρ / (1 + ρ))^{k/2}
• If val(G) < ρ (close to 0), then val(G^⊗k) ≤ (4ρ)^{k/4}
(new; implies new hardness results for label-cover &
optimal NP-hardness results for set-cover; cf. the BGLR "sliding scale" conjecture)
• If val(G) < 1 − ε (close to 1), then val(G^⊗k) ≤ (1 − ε²/16)^k
(known; we just improve the constants of [Rao, Holenstein, Raz])
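To get a feel for these bounds, here is a quick numeric evaluation; the choices of ρ, ε and k are arbitrary illustrative values, not from the slides:

from math import sqrt

def bound_projection(rho, k):   return (2 * sqrt(rho) / (1 + rho)) ** (k / 2)
def bound_small_value(rho, k):  return (4 * rho) ** (k / 4)
def bound_near_one(eps, k):     return (1 - eps ** 2 / 16) ** k

print(bound_projection(0.5, 10))    # ~0.745
print(bound_small_value(0.01, 10))  # ~3.2e-4
print(bound_near_one(0.1, 1000))    # ~0.535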
Also: short proof for "strong PCP theorem" or "hardness of label-cover"
Ideas extend to give a parallel repetition theorem for entangled games,
i.e. when the two players share a quantum state [with Vidick & Steurer]
One slide about the new proof
1. View a game as a linear operator acting on (Bob)-assignments
The game value ≈ a natural norm of this operator
2. Define: val+(G) ≔ sup_H ‖G ⊗ H‖ / ‖H‖
( ‖G‖ is the collision value of G, closely related to val(G) )
Think of val+ as an "environmental value" of G: how
much harder is it to play G in parallel with environment
H, compared to playing H alone
3. Show:
Multiplicativity: val+(G ⊗ H) = val+(G) ⋅ val+(H)
Approximation: val+(G) ≈ val(G)
So: val(G^⊗k) ≈ val+(G^⊗k) = val+(G)^k ≈ val(G)^k
Approximation is proven by expressing val+ as an "eigenvalue", enabled by factoring
out H; easy for expanders
Summary
• Direct product of strings & functions
and a related local-to-global lifting theorem
• Direct product of games
and new parallel repetition theorem
• Direct products of computational problems ??
e.g. for graph problems (max-cut, vertex-cover, ... )