Processing Along the Way: Forwarding vs. Coding

Christina Fragouli
Joint work with Emina Soljanin and Daniela Tuninetti
A field with many interesting questions…
• Problem Formulations and Ongoing Work
1. Alphabet size and min-cut tradeoff

• Directed graph with unit capacity edges, coding over F_q.
• What alphabet size q is sufficient for all possible configurations with h sources and N receivers?

If the min-cut to each receiver is h:

    √(2N − 7/4) + 1/2  ≤  q  ≤  √(2N)        (sufficient for h = 2)
An Example

[Figure: two sources and receivers R1, R2, R3, …, RN, connected through intermediate coding points 1, 2, 3, …, k.]

Network Coding: assign a coding vector to each edge so that each receiver has a full-rank set of equations.

Coding vector: vector of coefficients; e.g. the coding vector (1 0) means the edge carries 1·x1 + 0·x2 = x1.
An Example

For h = 2, it is sufficient to consider the q + 1 coding vectors over F_q:

    (1 0), (0 1), (1 1), (1 a), (1 a^2), ...

that is, (0 1) together with (1 x) for every x in F_q. Any two such vectors form a basis of the 2-dimensional space.
Connection with Coloring

The q + 1 coding vectors above serve as q + 1 colors: color the coding points 1, 2, 3, …, k of the network with them.

[Figure: the example network redrawn, with the coding points as the vertices to be colored and the receivers R1, …, RN observing subsets of them.]
Fragouli, Soljanin 2004
If min-cut > 2

Each receiver observes a set of vertices (coding points).
Find a coloring such that every receiver observes at least two distinct colors.

[Figure: receivers R1, R2, …, RN, each observing a subset of the coding points 1, 2, 3, 4, …, k.]
Coloring families of sets

A coloring is legal if no set is monochromatic.

Erdős (1963): Consider a family of N sets of size m. If N < q^(m-1), then the family is q-colorable.

Equivalently, q > N^(1/(m-1)) colors suffice.
2. What if the alphabet size is not large enough?

N receivers, alphabet of size q, min-cut to each receiver m.

If we have q colors, how many sets are going to be monochromatic?

There exists a coloring that colors at most N·q^(1-m) sets monochromatically.
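A minimal sketch of the probabilistic argument behind both statements (the family of sets below is generated at random purely for illustration): under a uniformly random q-coloring, a set of size m is monochromatic with probability q^(1-m), so on average N·q^(1-m) sets are monochromatic, and when N < q^(m-1) some coloring leaves none monochromatic:

import random

def monochromatic_count(family, coloring):
    # number of sets whose vertices all received the same color
    return sum(len({coloring[v] for v in s}) == 1 for s in family)

rng = random.Random(1)
q, m, N, V = 3, 4, 20, 15          # N = 20 < q**(m-1) = 27
family = [rng.sample(range(V), m) for _ in range(N)]

counts, best = [], None
for _ in range(2000):               # try independent random colorings
    coloring = [rng.randrange(q) for _ in range(V)]
    c = monochromatic_count(family, coloring)
    counts.append(c)
    best = c if best is None else min(best, c)

print("bound on the average :", N * q ** (1 - m))   # = 20/27
print("empirical average    :", sum(counts) / len(counts))
print("best coloring found leaves", best, "sets monochromatic")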
And if we know something about the structure?

Erdős–Lovász (1975): If every set intersects at most q^(m-3) other members, then the family is q-colorable.

• If m = 5 and every set intersects at most 9 other sets, three colors suffice; that is, a binary alphabet (q = 2, which gives 3 coding vectors) is sufficient.
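The structural condition suggests a local repair strategy: if each set intersects only a few others, a monochromatic set can be recolored without disturbing most of the family. A minimal sketch in the spirit of such resampling arguments (illustrative only; the random family below is not guaranteed to satisfy the q^(m-3) intersection condition):

import random

def legal_coloring_by_resampling(family, num_vertices, q, rng, max_steps=100_000):
    # start from a random coloring and repeatedly recolor one monochromatic set
    coloring = [rng.randrange(q) for _ in range(num_vertices)]
    for _ in range(max_steps):
        bad = [s for s in family if len({coloring[v] for v in s}) == 1]
        if not bad:
            return coloring            # legal: no set is monochromatic
        for v in rng.choice(bad):      # resample only the offending set
            coloring[v] = rng.randrange(q)
    return None

rng = random.Random(0)
family = [rng.sample(range(30), 5) for _ in range(60)]   # sets of size m = 5
result = legal_coloring_by_resampling(family, 30, q=3, rng=rng)
print("legal 3-coloring found" if result else "no legal coloring found in time")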
What if links are not error free?

Network of Discrete Memoryless Channels

Edges: binary symmetric channels BSC(p), each with capacity C = 1 - H(p).
Vertices: terminals that have processing capabilities, in terms of complexity and delay.

[Figure: BSC(p) transition diagram (0 and 1 are received correctly with probability 1 - p and flipped with probability p), and an example network whose min-cut to the receiver is 2(1 - H(p)).]
We are interested in evaluating possible benefits of intermediate node
processing from an information-theoretic point of view.
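For concreteness, the two quantities used throughout this part can be computed directly (a minimal sketch; the crossover probability p = 0.11 is just an illustrative value):

from math import log2

def h2(p):
    # binary entropy function H(p) in bits
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # capacity of a binary symmetric channel with crossover probability p
    return 1.0 - h2(p)

p = 0.11
print(f"BSC({p}): C = 1 - H(p) = {bsc_capacity(p):.3f} bits per channel use")
print(f"min cut of two parallel BSC(p) edges = 2(1 - H(p)) = {2 * bsc_capacity(p):.3f}")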
Network of Discrete Memoryless Channels

Edges: binary symmetric channels BSC(p), used in blocks of N bits.
Vertices: terminals that have processing capabilities (complexity and delay).
Perfect and Partial Processing

Two cases, depending on the block length N that intermediate nodes are allowed to process:

• N → ∞ : Perfect Processing
• N finite : Partial Processing
Perfect Processing

We can use a capacity-achieving channel code to transform each edge of the network into a practically error-free link.

For a unicast connection, we can then achieve the min-cut capacity.
Network Coding

[Figure: the butterfly network. The source sends X1 and X2; the middle edge carries X1 + X2; Receiver 1 and Receiver 2 each recover both X1 and X2.]

Employing additional coding over the error-free links allows us to better share the available resources when multicasting.

Network Coding: coding across independent information streams.
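A minimal sketch of the multicast example above (it mimics only the noiseless butterfly topology, with a single XOR at the middle node), checking that each receiver recovers both bits from the two edges it observes:

def butterfly_multicast(x1, x2):
    coded = x1 ^ x2                 # the single coding operation in the network
    r1 = (x1, x1 ^ coded)           # receiver 1 sees X1 and X1 + X2
    r2 = (x2 ^ coded, x2)           # receiver 2 sees X2 and X1 + X2
    return r1, r2

for x1 in (0, 1):
    for x2 in (0, 1):
        assert butterfly_multicast(x1, x2) == ((x1, x2), (x1, x2))
print("both receivers decode (X1, X2) for every input pair")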
Partial Processing

With finite block length N at the intermediate nodes, we can no longer think of the links as error free.
Partial Processing

We will show that:

1. Network and channel coding cannot be separated without loss of optimality.

2. Network coding can offer benefits for a single unicast connection. That is, there exist configurations where coding across streams that carry independent information can increase the end-to-end achievable rate.

3. For a unicast connection over the same network, the optimal processing depends on the channel parameters.

4. There exists a connection between the optimal routing over a specific graph and the structure of error-correcting codes.
Simple Example

[Figure: source A connects to B and C; B and C each connect to D and to the receiver E; D connects to E. Each edge is a BSC(p).]

• Nodes B, C and D can process N bits.
• Nodes A and E have infinite complexity processing.
N infinite

[Figure: A sends X1 through B and X2 through C; X1, X2 i.i.d.]

Min Cut = 2(1 - H(p))
N = 0: Forwarding

[Figure: X1 and X2 are simply forwarded; the receiver E obtains more than one noisy copy of the same stream.]

Path diversity: receive multiple noisy observations of the same information stream and optimally combine them to increase the end-to-end rate. (X1, X2 i.i.d.)
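The rate benefit of path diversity is already visible for a single bit observed through two independent BSC(p) channels: the mutual information of the pair of noisy copies exceeds that of one copy. A minimal sketch, exact by enumeration (p = 0.11 is an illustrative value):

from math import log2

def h2(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def mi_two_copies(p):
    # I(X; Y1, Y2) for a uniform bit X seen through two independent BSC(p)
    mi = 0.0
    for x in (0, 1):
        for y1 in (0, 1):
            for y2 in (0, 1):
                pxy = 0.5 * (p if y1 != x else 1 - p) * (p if y2 != x else 1 - p)
                py = sum(0.5 * (p if y1 != t else 1 - p) * (p if y2 != t else 1 - p)
                         for t in (0, 1))
                mi += pxy * log2(pxy / (0.5 * py))
    return mi

p = 0.11
print("one noisy copy  : I(X; Y1)     =", round(1 - h2(p), 4), "bits")
print("two noisy copies: I(X; Y1, Y2) =", round(mi_two_copies(p), 4), "bits")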
N = 1

[Figure: the same network; X1 reaches D through B and X2 reaches D through C.]

• Each edge: BSC(p).
• Nodes B, C and D can process one bit.
• Nodes A and E have infinite complexity processing.
Optimal Processing at node D?

[Figure: D has received (noisy versions of) X1 from B and X2 from C.]

Three choices to send through edge DE:
f1) X1
f2) X1 + X2
f3) X1 and X2
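The three choices can be compared numerically. Below is a minimal sketch of an exact computation of I(X1, X2; Y1, Y2, Y3) at receiver E, under the following modelling assumptions: X1 and X2 are i.i.d. uniform bits, node D applies a one-bit map f to its (possibly noisy) inputs, and the per-edge crossover probabilities are free parameters (0.11 is illustrative). The time-sharing choice f3 needs two uses of edge DE and is not a single one-bit map, so it is left out:

from itertools import product
from math import log2

def mutual_information(f, p_in, p_out):
    # I(X1, X2; Y1, Y2, Y3) in bits, by exhaustive enumeration.
    # p_in : crossover probability of edges B-D and C-D
    # p_out: crossover probability of edges B-E, C-E and D-E
    joint, py = {}, {}
    for x1, x2 in product((0, 1), repeat=2):              # X1, X2 i.i.d. uniform
        for n_bd, n_cd, n_be, n_ce, n_de in product((0, 1), repeat=5):
            pr = 0.25
            for n, p in ((n_bd, p_in), (n_cd, p_in),
                         (n_be, p_out), (n_ce, p_out), (n_de, p_out)):
                pr *= p if n else 1 - p
            y = (x1 ^ n_be, x2 ^ n_ce, f(x1 ^ n_bd, x2 ^ n_cd) ^ n_de)
            joint[(x1, x2, y)] = joint.get((x1, x2, y), 0.0) + pr
            py[y] = py.get(y, 0.0) + pr
    return sum(pr * log2(pr / (0.25 * py[y]))
               for (x1, x2, y), pr in joint.items() if pr > 0)

choices = {"f1: forward X1  ": lambda a, b: a,
           "f2: send X1 + X2": lambda a, b: a ^ b}
for p_in in (0.0, 0.11):                    # noiseless vs. noisy edges into D
    for name, f in choices.items():
        print(f"p_in = {p_in}: {name}  I = {mutual_information(f, p_in, 0.11):.4f} bits")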
All edges: BSC(p)

[Plot: achievable end-to-end rate versus the channel parameter for the three choices on edge DE (R1: X1, R2: X1 + X2, R3: X1 & X2).]

Network coding offers benefits for unicast connections.
All edges: BSC(p)

[Plot: the same comparison of R1, R2 and R3.]

The optimal processing depends on the channel parameters.
Edges BD and CD: BSC(0); all other edges: BSC(p)

[Plot: achievable end-to-end rate for the three choices on edge DE (R1: X1, R2: X1 + X2, R3: X1 & X2).]

Network and channel coding cannot be separated.
Edges AB, AC, BD and CD: BSC(0); edges BE, DE and CE: BSC(p)

[Plot: achievable end-to-end rate for the three choices on edge DE (R1: X1, R2: X1 + X2, R3: X1 & X2).]
Linear Processing

[Figure: B sends Y1, C sends Y2 and D sends Y3 to the receiver E.]

    [Y1]   [1 0] [X1]   [N1]
    [Y2] = [0 1] [X2] + [N2]
    [Y3]   [* *]        [N3]
            (A)

Choose the matrix A to maximize I(X1, X2; Y1, Y2, Y3).
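A minimal brute-force sketch of this optimization (it reuses the exhaustive-enumeration idea above, assuming X1, X2 are i.i.d. uniform bits, the noise terms N_i are i.i.d. Bernoulli(p) added over GF(2), and only the starred third row of A is free):

from itertools import product
from math import log2

def mutual_info(third_row, p):
    # I(X1, X2; Y1, Y2, Y3) for Y = A [X1, X2]^T xor N over GF(2),
    # with A = [[1, 0], [0, 1], third_row] and N_i i.i.d. Bernoulli(p)
    A = [(1, 0), (0, 1), third_row]
    joint, py = {}, {}
    for x in product((0, 1), repeat=2):
        for n in product((0, 1), repeat=3):
            pr = 0.25
            for ni in n:
                pr *= p if ni else 1 - p
            y = tuple((row[0] * x[0] + row[1] * x[1] + ni) % 2
                      for row, ni in zip(A, n))
            joint[(x, y)] = joint.get((x, y), 0.0) + pr
            py[y] = py.get(y, 0.0) + pr
    return sum(pr * log2(pr / (0.25 * py[y])) for (x, y), pr in joint.items())

rows = [(0, 0), (1, 0), (0, 1), (1, 1)]
for p in (0.05, 0.2):
    scores = {r: mutual_info(r, p) for r in rows}
    best = max(scores, key=scores.get)
    print(f"p = {p}: best third row {best}, I = {scores[best]:.4f} bits")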
Connection to Coding

"Equivalent problem": maximize the composite capacity of a BSC(p) that is preceded by a linear block encoder.

This capacity is determined by the weight distribution of the code.
Conclusions