Talk given at DIMACS in 2004

Multiple Description Coding and
Distributed Source Coding:
Unexplored Connections in Information
Theory and Coding Theory
S. Sandeep Pradhan
Department of EECS
University of Michigan, Ann Arbor
(joint work with R. Puri and K. Ramchandran of University of
California, Berkeley)
Transmission of sources over packet networks
[Figure: the source X is encoded into n packets; a packet erasure network delivers a subset i1, i2, ..., im of them to the decoder, which outputs the reconstruction X̂.]
• Best-effort networks: modeled as packet erasure channels
• User Datagram Protocol (UDP)
• Example: the Internet
• Multimedia over the Internet is growing fast
Multiple Descriptions Source Coding
[Figure: the MD Encoder maps X into Description 1 (rate R1) and Description 2 (rate R2). Side Decoder 1 sees only Description 1 (distortion D1), Side Decoder 2 sees only Description 2 (distortion D2), and the Central Decoder sees both (distortion D0).]
Find the set of all achievable tuples (R1,R2,D1,D2,D0)
Prior Work
Information theory (incomplete list):
• Cover-El Gamal '80: achievable rate region for 2-channel MD.
• Ozarow '81: converse for Gaussian sources.
• Berger, Ahlswede, Zhang, '80-'90.
• Venkataramani et al. '01: extension of the Cover-El Gamal region to n channels.
Finite-blocklength codes (incomplete list):
• Vaishampayan '93: MD scalar and vector quantizers.
• Wang-Orchard-Reibman '97: MD transform codes.
• Goyal-Kovacevic '98: frames for MD.
• Puri-Ramchandran '01: FEC for MD.
Main idea in random codes for 2-channel MD
(Cover-El Gamal)
Fix a test channel p(x1, x2 | x).
Generate two codebooks at random, one with codewords drawn according to p(x1) and the other according to p(x2).
Find a pair of codewords that is jointly typical with the source word with respect to p(x, x1, x2).
Possible if:
R1 ≥ I(X; X1), R2 ≥ I(X; X2), and
R1 + R2 ≥ I(X; X1) + I(X; X2) + I(X1; X2 | X)
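A minimal numerical sketch (not from the talk) of evaluating these bounds for a given test channel p(x, x1, x2); the helper names and the toy pmf are illustrative assumptions:

    import numpy as np

    def mutual_info(pab):
        # I(A;B) in bits for a joint pmf given as a 2-D array.
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        mask = pab > 0
        return float((pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])).sum())

    def egc_bounds(p):
        # p[x, x1, x2]: joint pmf q(x) * p(x1, x2 | x).
        I_X_X1 = mutual_info(p.sum(axis=2))   # marginalize out x2
        I_X_X2 = mutual_info(p.sum(axis=1))   # marginalize out x1
        # I(X1; X2 | X) = sum_x p(x) * I(X1; X2 | X = x)
        I_cond = sum(p[x].sum() * mutual_info(p[x] / p[x].sum())
                     for x in range(p.shape[0]) if p[x].sum() > 0)
        return I_X_X1, I_X_X2, I_X_X1 + I_X_X2 + I_cond

    # Toy test channel: X uniform binary, X1 = X2 = X (degenerate descriptions).
    p = np.zeros((2, 2, 2))
    p[0, 0, 0] = p[1, 1, 1] = 0.5
    r1, r2, rsum = egc_bounds(p)
    print(f"R1 >= {r1:.3f}, R2 >= {r2:.3f}, R1 + R2 >= {rsum:.3f}")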
Possible ideas for n-channel MD?
• Extend the Cover-El Gamal random codes from 2 to n channels (Venkataramani et al.)
• Use maximum distance separable (MDS) erasure codes (Albanese et al. '95)
Erasure codes
• Erasure codes (n, k, d): add (n - k) parity symbols
• MDS codes: d = n - k + 1
• MDS => any k channel symbols recover the k source symbols
[Figure: the source is encoded into n packets, sent over the channel, and decoded from a received subset.]
[Plot: distortion vs. # of received packets. Distortion stays high until k packets arrive, then drops abruptly and stays flat: the "cliff" effect.]
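A minimal sketch of the MDS property for the simplest nontrivial case, a (3,2,2) single-parity code over bytes (an illustration of the definition above, not the talk's construction): any k = 2 of the n = 3 symbols recover both source symbols.

    def encode_32(s0: int, s1: int):
        # (3,2,2) MDS code: the parity byte is the XOR of the two source bytes.
        return [s0, s1, s0 ^ s1]

    def decode_32(received: dict):
        # received: {position: byte} for any 2 of the 3 positions.
        if 0 in received and 1 in received:
            return received[0], received[1]
        if 0 in received:                                  # positions 0 and 2
            return received[0], received[0] ^ received[2]
        return received[1] ^ received[2], received[1]      # positions 1 and 2

    c = encode_32(0x5A, 0x3C)
    assert decode_32({0: c[0], 2: c[2]}) == (0x5A, 0x3C)   # one erasure is fine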
Fix: Use many MDS codes
(Albanese et al. '95, Puri-Ramchandran '99)
Example for 3 channels: a successively refinable source-encoded bit stream is split into layers protected by (3,1), (3,2), and (3,3) MDS erasure codes and spread over Descriptions 1, 2, and 3.
[Plot: distortion now decreases in steps as 1, 2, and 3 packets are received, instead of a single cliff.]
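A hedged sketch of this layered packetization for 3 channels (layer sizes and contents are placeholder assumptions): the base layer is protected by a (3,1) repetition code, the middle layer by a (3,2) single-parity code, and the refinement layer by a (3,3) code (no parity).

    def packetize(base: int, mid: tuple, fine: tuple):
        # base: 1 byte; mid: 2 bytes; fine: 3 bytes of a successively
        # refinable bitstream, most important layer first.
        m0, m1 = mid
        return [
            [base, m0,      fine[0]],   # packet 1
            [base, m1,      fine[1]],   # packet 2
            [base, m0 ^ m1, fine[2]],   # packet 3 carries middle-layer parity
        ]

    pkts = packetize(0x11, (0x22, 0x33), (0x44, 0x55, 0x66))
    # Any 1 packet -> base layer; any 2 -> base + middle; all 3 -> everything.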
What is new in our work?
• Symmetric problem, # of descriptions > 2.
• Explore a fundamental connection between MD coding and distributed source coding.
• New rate region for MD: random binning inspired by distributed source coding.
• Constructions for MD: extension of our earlier work (DISCUS) on coset codes for distributed source coding.
Outline of our strategy:
• Start from an MDS erasure code, viewed from a different perspective.
• Connect this code to a distributed source coding problem.
• Construct random codes based on this connection: (n,k) source-channel erasure codes.
• New rate region for MD: a concatenation of these (n,k) source-channel erasure codes.
Idea #1: A new look at (n,1,n) MDS codes
[Figure: n packets, labeled 1, 2, 3, ..., n-1, n, all carrying the same content.]
• (n, 1, n) "bit" code
• All packets are identical (repetition)
• Reception of any one packet enables reconstruction
• Reception of more than one packet does not give better quality
• Parity bits wasted...
Idea #1 (contd): (n,1,n) source-channel erasure code
[Figure: n packets, labeled 1, 2, 3, ..., n-1, n, each carrying a different quantized version of X.]
• Independently quantized versions of X on every packet
• Reception of any one packet enables reconstruction
• Reception of more packets enables better reconstruction (estimation gains due to multiple looks!)
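A small simulation sketch of this estimation gain (an illustration under an assumed subtractive-dither quantizer, not the talk's exact scheme): each packet carries an independently dithered scalar quantization of the same Gaussian source, and averaging whichever m packets arrive cuts the quantization-noise variance roughly by 1/m.

    import numpy as np

    rng = np.random.default_rng(0)
    L, n, step = 100_000, 4, 1.0
    x = rng.standard_normal(L)

    # Independent subtractive dither per packet makes the quantization
    # errors (approximately) independent across packets.
    dither = rng.uniform(-step / 2, step / 2, size=(n, L))
    packets = step * np.round((x + dither) / step) - dither

    for m in range(1, n + 1):                    # decoder receives m packets
        x_hat = packets[:m].mean(axis=0)         # average the received looks
        print(f"{m} packets: MSE = {np.mean((x - x_hat) ** 2):.4f}")
    # MSE falls roughly like (step**2 / 12) / m: more looks, better quality.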
Extensions to (n,k) source-channel codes
• Can we generalize this to (n,k) source-channel codes?
• Yes: a random binning (coset code) approach!
  – using the Slepian-Wolf and Wyner-Ziv theorems
A conceptual leap using binning: (n,1) code => (n,k) code
Idea #2: Consider a (3,2,2) MDS code
There is inherent uncertainty at the encoder about which packets are received by the decoder.
This calls for a coding strategy where the decoder has access to some information that the encoder does not: distributed source coding.
Background: Distributed source coding
(Slepian-Wolf '73, Wyner-Ziv '76, Berger '77)
[Figure: correlated sources X and Y are compressed by separate encoders, with no communication between them; a joint decoder reconstructs (X, Y).]
• Exploiting correlation without direct communication
• Optimal rate region: Slepian-Wolf 1973
Distributed source coding (contd)
Rate region: Rx ≥ H(X | Y), Ry ≥ H(Y | X), Rx + Ry ≥ H(X, Y)
Random partitions of typical sets:
[Figure: the 2^{L·H(X)} typical X-sequences are partitioned at random into 2^{L·Rx} bins, each containing about 2^{L·(H(X) - Rx)} sequences; similarly, the 2^{L·H(Y)} typical Y-sequences are partitioned into 2^{L·Ry} bins of about 2^{L·(H(Y) - Ry)} sequences each.]
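A minimal binning sketch in this spirit (the toy correlation model is an assumption for illustration): X is 3 uniform bits and Y differs from X in at most one position, so H(X | Y) = 2 bits; sending the 2-bit syndrome of the length-3 repetition code (the bin index) instead of all 3 bits lets the decoder recover X exactly from Y.

    import numpy as np
    from itertools import product

    H = np.array([[1, 1, 0],
                  [0, 1, 1]])                    # parity-check of {000, 111}

    def encode(x):
        return tuple(H @ x % 2)                  # bin index = 2-bit syndrome

    def decode(syndrome, y):
        # Pick the sequence in the indexed bin (coset) closest to the side
        # information y; cosets of {000, 111} have minimum distance 3.
        bin_ = [np.array(c) for c in product([0, 1], repeat=3)
                if encode(np.array(c)) == syndrome]
        return min(bin_, key=lambda c: int(np.sum(c != y)))

    x = np.array([1, 0, 0])
    y = np.array([1, 1, 0])                      # differs from x in one place
    assert np.array_equal(decode(encode(x), y), x)   # 2 bits sent, not 3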
Idea #2 (contd):
Are there any telltale signs of symmetric overcomplete partitioning in (3,2,2) MDS codes?
[Figure: for a binary (3,2,2) MDS code, each output bit indexes a two-way partition (bins labeled 0/1) of the four source pairs: {00, 01} vs. {10, 11}, {00, 10} vs. {01, 11}, and {00, 11} vs. {01, 10}; any two bin indices determine the source pair.]
Instead of a single codebook, build 3 different codebooks (quantizers) and then partition them (overcompletely): each codebook contains about 2^{L·I(X;Y1)} codewords, with about 2^{(L/2)·I(Y1;Y2)} codewords per bin.
Idea #2 (contd):
[Plot: distortion vs. # of received packets, from k to n, for the resulting code.]
Problem Formulation
(n,k) source-channel erasure code
[Figure: X is mapped by Encoders 1 through n into n packets sent over a packet erasure channel; the decoder receives packets i1, i2, ..., im, with m ≥ k, and outputs the reconstruction X̂.]
• Decoder starts reconstruction with m ≥ k packets
• Rate of transmission of every packet = same
• Distortion => only a function of # of received packets
• Symmetric formulation, n > 2
Problem Formulation: Notation
• Source X ~ q(x), alphabet X, blocklength L
• Bounded distortion measure d: X × X̂ → R+
• Encoders F_i: X^L → {1, 2, ..., 2^{LR}}, i ∈ {1, 2, ..., n}
• Decoders G_J: {1, 2, ..., 2^{LR}}^{|J|} → X̂^L, for each J ⊆ {1, ..., n} with |J| ≥ k
• Distortion with h packets = D_h
Problem Statement (contd.)
What is the best distortion tuple (D_k, D_{k+1}, ..., D_n) for a rate of R bits/sample/packet?
Main Result
(R, D_k, D_{k+1}, ..., D_n) is achievable if
R ≥ (1/k)·H(Y1, Y2, ..., Yk) - (1/n)·H(Y1, Y2, ..., Yn | X)
for some p.m.f. p(x, y1, y2, ..., yn) = q(x)·p(y1, y2, ..., yn | x)
and a set of functions g_J such that
(1/L)·E[d(X, g_J(Y_{i1}, Y_{i2}, ..., Y_{ih}))] ≤ D_h for every subset {i1, ..., ih} of size h, k ≤ h ≤ n.
Example: (3,2) Code
(3,2) code: the Yi have the same p.d.f.
– 3 codebooks, each of rate I(X; Yi), are constructed randomly
– each is partitioned into 2^{LR} bins
– the # of codewords in a bin is exponential in (L/2)·I(Y1; Y2)
– thus 2R = I(X; Y1) + I(X; Y2) - I(Y1; Y2)
[Figure: a codebook of about 2^{L·I(X;Y1)} codewords, partitioned into bins of about 2^{(L/2)·I(Y1;Y2)} codewords each.]
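A hedged numerical sketch of this rate expression for a toy Gaussian test channel Yi = X + Ni with independent noise of variance s2 (an assumption; the talk's Gaussian example may use a different, correlated test channel):

    import numpy as np

    def rate_32(s2):
        # 2R = I(X;Y1) + I(X;Y2) - I(Y1;Y2) for X ~ N(0,1), Yi = X + Ni.
        I_X_Y1 = 0.5 * np.log2(1 + 1 / s2)
        rho = 1 / (1 + s2)                       # correlation of Y1 and Y2
        I_Y1_Y2 = -0.5 * np.log2(1 - rho ** 2)
        return 0.5 * (2 * I_X_Y1 - I_Y1_Y2)      # R per packet

    def distortion(s2, h):
        return s2 / (s2 + h)                     # MMSE of X from h looks

    for s2 in (0.25, 0.1):
        print(f"s2={s2}: R = {rate_32(s2):.3f} bits/sample/packet, "
              f"D2 = {distortion(s2, 2):.4f}, D3 = {distortion(s2, 3):.4f}")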
Example of a Gaussian Source: (3,2,2) code
At a rate of 1 bit/sample/packet:
• any two packets received => distortion 1/16 (for each of the three pairs)
• all three packets received => distortion 1/23.5
n-Channel Symmetric MD: Idea #3
Concatenation of (n,1), (n,2), ..., (n,n) source-channel erasure codes.
• Base layer: reception of any one packet => decode the (3,1) code.
• Middle layer: (3,2) code => side information includes one part of the middle layer (source-channel erasure codes!), plus two parts of the base layer.
• Final layer: (3,3) code => refine everything.
• Every part of the bitstream contributes to source reconstruction.
[Figure: packet layout for n = 3, with one column per layer code:
Packet 1: Y11 (3,1) | Y21 (3,2) | Y3 (3,3)
Packet 2: Y12 (3,1) | Y22 (3,2) | Y3 (3,3)
Packet 3: Y13 (3,1) | Y23 (3,2) | Y3 (3,3)]
Key Concepts:
• Multiple quantizers which can introduce correlated quantization noise: MD lattice VQ (Vaishampayan, Sloane, Diggavi '01)
• Computationally efficient multiple binning schemes: symmetric distributed source coding using coset codes (Pradhan-Ramchandran '00; Schonberg, Pradhan, Ramchandran '03)
• Note: different from single binning schemes (Zamir-Shamai '98, Pradhan-Ramchandran '99)
A (3,2) Source-Channel Lattice Code
• Y1, Y2, Y3 are correlated quantized versions of the source X, with d(Yi, Yj) ≤ 2.
• A code of distance 5 overcomes correlation noise of 2.
• Partitioning through cosets: the constructive counterpart of "random bins".
A (3,2) Source-Channel Lattice Code (contd)
Suppose 2 observations, Y1 and Y2:
• Asymmetric case: Y2 is available at the decoder; a code that combats the correlation noise ensures decoding.
• Symmetric case: split the generator vectors of the code; description 1 gets the rows, description 2 gets the columns.
A (3,2) Source-Channel Lattice Code (contd)
• Y1, Y2, Y3 are independently quantized versions of the source X, with d(Yi, Yj) ≤ 2.
• Find 3 generator vectors such that any two generate the code (any two are linearly independent).
• Description 1 gets the rows, description 2 gets the columns, description 3 gets the diagonal.
Constructions for general n and k
• Choose a code (generator matrix G) that combats the correlation noise, e.g.,
  G = [ 5 0
        0 5 ]
• Split the rows of G into k submatrices (k generator sets S1, ..., Sk), e.g., G1 = [5 0] and G2 = [0 5].
• Need a way to generate n generator sets out of the k such that any k of them are equivalent to G.
• Choose a generator matrix M (dim. k × n) of an (n,k) MDS block code; it has the property that any k columns are independent, e.g.,
  M = [ 1 0 1
        0 1 1 ]
Constructions for general n and k (contd)
• Using the weights from the n columns of M, one column at a time, linearly combine the k generator sets (S1, ..., Sk) to come up with n encoding matrices, e.g., G1 = [5 0], G2 = [0 5], G3 = [5 5].
• Efficient algorithms for encoding and decoding using the coset code framework (Forney 1991).
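A small numpy sketch of this construction for n = 3, k = 2, checking (rather than proving) the "any k generate G" property for the example matrices above:

    import numpy as np
    from itertools import combinations

    G_sets = np.array([[5, 0], [0, 5]])          # k generator sets (rows of G)
    M = np.array([[1, 0, 1],
                  [0, 1, 1]])                    # any k = 2 columns independent

    # Combine the generator sets with the weights in each column of M.
    encoders = [M[:, j] @ G_sets for j in range(3)]   # [5 0], [0 5], [5 5]

    for i, j in combinations(range(3), 2):
        B = np.vstack([encoders[i], encoders[j]])
        # B generates the same lattice 5Z^2 as G iff its entries lie in 5Z
        # and |det B| = |det G| = 25.
        assert np.all(B % 5 == 0) and abs(round(np.linalg.det(B))) == 25
        print(f"encoders {i + 1} and {j + 1} generate 5Z^2")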
Conclusions
• A new rate region for the n-channel MD problem
• A new connection between the MD problem and the distributed source coding problem
• A new application of multiple binning schemes
• Constructions based on coset codes
• A nice synergy between quantization and MDS erasure codes