Information Rates for Two-Dimensional ISI Channels

Jiangxin Chen and Paul H. Siegel
Center for Magnetic Recording Research
University of California, San Diego
DIMACS Workshop
March 22-24, 2004
Outline
• Motivation: Two-dimensional recording
• Channel model
• Information rates
• Bounds on the Symmetric Information Rate (SIR)
  • Upper bound
  • Lower bound
  • Convergence
• Alternative upper bound
• Numerical results
• Conclusions
Two-Dimensional Channel Model
• Constrained input array $x[i,j]$
• Linear intersymbol interference $h[i,j]$
• Additive, i.i.d. Gaussian noise $n[i,j] \sim \mathcal{N}(0, \sigma^2)$

$$y[i,j] = \sum_{k=0}^{n_1-1} \sum_{l=0}^{n_2-1} h[k,l]\, x[i-k,\, j-l] + n[i,j]$$
Two-Dimensional Processes
• Input process: $X = \{X[i,j]\}$
• Output process: $Y = \{Y[i,j]\}$
• Array $Y_{i,j}^{i+m-1,\, j+n-1}$:
  upper left corner: $Y[i,j]$
  lower right corner: $Y[i+m-1, j+n-1]$
Entropy Rates
• Output entropy rate: $H(Y) = \lim_{m,n\to\infty} \frac{1}{mn} H\left(Y_{1,1}^{m,n}\right)$
• Noise entropy rate: $H(N) = \frac{1}{2}\log(\pi e N_0)$
• Conditional entropy rate:
$$H(Y \mid X) = \lim_{m,n\to\infty} \frac{1}{mn} H\left(Y_{1,1}^{m,n} \mid X_{1,1}^{m,n}\right) = H(N)$$
Mutual Information Rates
• Mutual information rate:
$$I(X;Y) = H(Y) - H(Y \mid X) = H(Y) - H(N)$$
• Capacity: $C = \max_{P(X)} I(X;Y)$
• Symmetric information rate (SIR): inputs $X = \{x[i,j]\}$ are constrained to be independent, identically distributed, and equiprobable binary.
Capacity and SIR
• The capacity and SIR are useful measures of
the achievable storage densities on the two-dimensional channel.
• They serve as performance benchmarks for
channel coding and detection methods.
• So, it would be nice to be able to compute
them.
Finding the Output Entropy Rate
• For the one-dimensional ISI channel model:
$$H(Y) = \lim_{n\to\infty} \frac{1}{n} H\left(Y_1^n\right)$$
and
$$H\left(Y_1^n\right) = -E\left[\log p\left(y_1^n\right)\right]$$
where $Y_1^n = \left(Y[1], Y[2], \ldots, Y[n]\right)$.
Sample Entropy Rate
• If we simulate the channel N times, using inputs with specified (Markovian) statistics, and generate output realizations
$$y^{(k)} = \left(y[1]^{(k)}, y[2]^{(k)}, \ldots, y[n]^{(k)}\right), \quad k = 1, 2, \ldots, N,$$
then
$$-\frac{1}{N} \sum_{k=1}^{N} \log p\left(y^{(k)}\right)$$
converges to $H\left(Y_1^n\right)$ with probability 1 as $N \to \infty$.
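A sketch of this estimator (Python; assumes the per-realization log-probabilities log p(y^(k)) have already been computed, e.g. by the forward recursion on the next slide):

    import numpy as np

    def sample_entropy_estimate(log_probs):
        # log_probs[k] = log p(y^(k)) for the k-th simulated realization;
        # the average of -log p converges a.s. to H(Y_1^n) as N grows.
        return -np.mean(log_probs)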
Computing Sample Entropy Rate
• The forward recursion of the sum-product (BCJR) algorithm can be used to calculate the probability $p\left(y_1^n\right)$ of a sample realization of the channel output.
• In fact, we can write
$$-\frac{1}{n} \log p\left(y_1^n\right) = -\frac{1}{n} \sum_{i=1}^{n} \log p\left(y_i \mid y_1^{i-1}\right)$$
where the quantity $p\left(y_i \mid y_1^{i-1}\right)$ is precisely the normalization constant in the (normalized) forward recursion, as sketched below.
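A minimal sketch of the normalized forward recursion for a one-dimensional ISI channel (Python; the function name and the {-1, +1} input alphabet are assumptions):

    import numpy as np
    from itertools import product

    def log_prob_output(y, h, sigma):
        """Return log p(y_1^n) for y[i] = sum_k h[k] x[i-k] + n[i],
        with x i.i.d. equiprobable in {-1, +1}, as the sum of the
        log normalization constants log p(y_i | y_1^{i-1})."""
        m = len(h) - 1                                    # channel memory
        states = list(product([-1, 1], repeat=m))         # state = previous m inputs
        alpha = np.full(len(states), 1.0 / len(states))   # uniform initial distribution
        logp = 0.0
        for yi in y:
            nxt = np.zeros(len(states))
            for si, s in enumerate(states):
                for x in (-1, 1):
                    mean = h[0] * x + sum(h[k] * s[k - 1] for k in range(1, m + 1))
                    like = np.exp(-(yi - mean) ** 2 / (2 * sigma ** 2)) \
                           / (sigma * np.sqrt(2 * np.pi))
                    ns = states.index(((x,) + s)[:m])     # shift x into the state
                    nxt[ns] += 0.5 * alpha[si] * like
            c = nxt.sum()              # c = p(y_i | y_1^{i-1})
            logp += np.log(c)
            alpha = nxt / c            # normalized forward recursion
        return logp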
Computing Entropy Rates
• The Shannon-McMillan-Breiman theorem implies
$$-\frac{1}{n} \log p\left(y_1^n\right) \to H(Y) \quad \text{a.s.}$$
as $n \to \infty$, where $y_1^n$ is a single long sample realization of the channel output process.
SIR for Partial-Response Channels

[Figure]
Capacity Bounds for Dicode

[Figure]
Markovian Sufficiency
Remark: It can be shown that optimized Markovian processes whose states are determined by their previous r symbols can asymptotically achieve the capacity of finite-state intersymbol interference channels with AWGN as the order r of the input process approaches ∞. (J. Chen and P. H. Siegel, ISIT 2004)
Capacity and SIR in Two Dimensions
• In two dimensions, we could estimate $H(Y)$ by calculating the sample entropy rate of a very large simulated output array.
• However, there is no counterpart of the BCJR algorithm in two dimensions to simplify the calculation.
• Instead, we use conditional entropies to derive upper and lower bounds on $H(Y)$.
Array Ordering
• Permuted lexicographic ordering:
  • Choose a vector $k = (k_1, k_2)$, a permutation of $(1, 2)$.
  • Map each array index $(t_1, t_2)$ to $(t_{k_1}, t_{k_2})$.
  • Then $(s_1, s_2)$ precedes $(t_1, t_2)$ if $s_{k_1} < t_{k_1}$, or $s_{k_1} = t_{k_1}$ and $s_{k_2} < t_{k_2}$.
• Therefore (see the sketch below),
  $k = (1, 2)$: row-by-row ordering
  $k = (2, 1)$: column-by-column ordering
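A minimal sketch of the precedence test (Python; the function name is an assumption):

    def precedes(s, t, k):
        """True if index s precedes index t under the permuted
        lexicographic ordering with permutation k (1-based)."""
        s_perm = (s[k[0] - 1], s[k[1] - 1])
        t_perm = (t[k[0] - 1], t[k[1] - 1])
        return s_perm < t_perm   # Python tuple comparison is lexicographic

    assert precedes((0, 5), (1, 0), k=(1, 2))   # row-by-row: earlier row first
    assert precedes((5, 0), (0, 1), k=(2, 1))   # column-by-column: earlier column first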
Two-Dimensional “Past”
• Let $l = (l_1, l_2, l_3, l_4)$ be a non-negative vector.
• Define $\mathrm{Past}_{k,l}\{Y[i,j]\}$ to be the elements preceding $Y[i,j]$ (with permutation $k$) inside the region
$$Y_{i-l_1,\, j-l_3}^{i+l_2,\, j+l_4}$$
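This region-and-ordering definition is easy to enumerate directly; a sketch (Python; reuses the precedes helper above):

    def past(i, j, k, l):
        """Indices of Past_{k,l}{Y[i,j]}: the points of the window
        rows i-l1 .. i+l2, columns j-l3 .. j+l4 that precede (i, j)."""
        l1, l2, l3, l4 = l
        return [(a, b)
                for a in range(i - l1, i + l2 + 1)
                for b in range(j - l3, j + l4 + 1)
                if precedes((a, b), (i, j), k)]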
Examples of Past{Y[i,j]}

[Figure]
Conditional Entropies
• For a stationary two-dimensional random field Y on the integer lattice, the entropy rate satisfies
$$H(Y) = H\left(Y[i,j] \mid \mathrm{Past}_{k,\infty}\{Y[i,j]\}\right)$$
(The proof uses the entropy chain rule. See [5-6].)
• This extends to random fields on the hexagonal lattice, via the natural mapping to the integer lattice.
Upper Bound on H(Y)
• For a stationary two-dimensional random field Y,
$$H(Y) \le \min_{k,l} H_{k,l}^{U1}(Y)$$
where
$$H_{k,l}^{U1}(Y) = H\left(Y[i,j] \mid \mathrm{Past}_{k,l}\{Y[i,j]\}\right)$$
Two-Dimensional Boundary of Past{Y[i,j]}
• Define $\mathrm{Strip}_{k,l}\{Y[i,j]\}$ to be the boundary of $\mathrm{Past}_{k,l}\{Y[i,j]\}$.
• The exact expression for $\mathrm{Strip}_{k,l}\{Y[i,j]\}$ is messy, but the geometrical concept is simple.
Two-Dimensional Boundary of Past{Y[i,j]}

[Figure]
Lower Bound on H(Y)
• For a stationary two-dimensional hidden Markov field Y,
$$H(Y) \ge \max_{k,l} H_{k,l}^{L1}(Y)$$
where
$$H_{k,l}^{L1}(Y) = H\left(Y[i,j] \mid \mathrm{Past}_{k,l}\{Y[i,j]\},\, X_{St_{k,l}}\{Y[i,j]\}\right)$$
and $X_{St_{k,l}}\{Y[i,j]\}$ is the “state information” for the strip $\mathrm{Strip}_{k,l}\{Y[i,j]\}$.
Sketch of Proof
• Upper bound: note that
$$\mathrm{Past}_{k,l}\{Y[i,j]\} \subseteq \mathrm{Past}_{k,\infty}\{Y[i,j]\}$$
and that conditioning reduces entropy.
• Lower bound: Markov property of $Y[i,j]$, given the “state information” $X_{St_{k,l}}\{Y[i,j]\}$.
Convergence Properties
• The upper bound $H_{k,l}^{U1}$ on the entropy rate is monotonically non-increasing as the size of the array defined by $l = (l_1, l_2, l_3, l_4)$ increases.
• The lower bound $H_{k,l}^{L1}$ on the entropy rate is monotonically non-decreasing as the size of the array defined by $l = (l_1, l_2, l_3, l_4)$ increases.
Convergence Rate
• The upper bound $H_{k,l}^{U1}$ and lower bound $H_{k,l}^{L1}$ converge to the true entropy rate $H(Y)$ at least as fast as $O(1/l_{\min})$, where
$$l_{\min} = \begin{cases} \min(l_1, l_3, l_4) & \text{for row-by-row ordering } k \\ \min(l_1, l_2, l_3) & \text{for column-by-column ordering } k \end{cases}$$
Computing the SIR Bounds
• Estimate the two-dimensional conditional entropies $H(A \mid B)$ over a small array.
• Calculate $P(A, B)$ and $P(B)$ to get $P(A \mid B)$ for many realizations of the output array (see the sketch below).
• For column-by-column ordering, treat each row $Y_i$ as a variable and calculate the joint probability $P(Y_1, Y_2, \ldots, Y_m)$ row-by-row using the BCJR forward recursion.
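A sketch of the resulting Monte Carlo estimate (Python; assumes the joint and marginal log-probabilities are supplied, e.g. by a row-wise BCJR forward recursion):

    import numpy as np

    def conditional_entropy_estimate(log_p_AB, log_p_B):
        """Estimate H(A|B) = -E[log P(A|B)] from per-realization
        log P(A,B) and log P(B) over many simulated output arrays."""
        return -np.mean(np.asarray(log_p_AB) - np.asarray(log_p_B))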
2x2 Impulse Response
• “Worst-case” scenario, large ISI:
$$h_1[i,j] = \begin{bmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{bmatrix}$$
• Conditional entropies computed from 100,000 realizations.
• Upper bound: $\min\left\{ H_{(2,1),(7,7,3,0)}^{U1} - \frac{1}{2}\log(\pi e N_0),\ 1 \right\}$
• Lower bound: $H_{(2,1),(7,7,3,0)}^{L1} - \frac{1}{2}\log(\pi e N_0)$
  (corresponds to the element in the middle of the last column)
Two-Dimensional “State”

[Figure]
SIR Bounds for 2x2 Channel

[Figure]
Computing the SIR Bounds
• The number of states for each variable increases
exponentially with the number of columns in the
array.
• This requires that the two-dimensional impulse
response have a small support region.
• It is desirable to find other approaches to computing
bounds that reduce the complexity, perhaps at the
cost of weakening the resulting bounds.
Alternative Upper Bound
• The modified BCJR approach is limited to a small impulse response support region.
• Introduce an “auxiliary ISI channel” and bound
$$H(Y) \le H_{k,l}^{U2}$$
where
$$H_{k,l}^{U2} = -\int p\left(y[i,j], \mathrm{Past}_{k,l}\{y[i,j]\}\right) \log q\left(y[i,j] \mid \mathrm{Past}_{k,l}\{y[i,j]\}\right) dy$$
and $q\left(y[i,j] \mid \mathrm{Past}_{k,l}\{y[i,j]\}\right)$ is an arbitrary conditional probability distribution (see the sketch below).
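Since $H_{k,l}^{U2}$ is a cross-entropy, it can be estimated by averaging $-\log q$ over outputs simulated from the true channel; a minimal sketch (Python; the q evaluator is an assumed input):

    import numpy as np

    def aux_channel_upper_bound(samples, log_q):
        """Monte Carlo estimate of H^{U2}: average -log q(y | past)
        over (y, past) pairs drawn from the TRUE channel, where
        log_q is the log conditional law of the auxiliary channel."""
        return -np.mean([log_q(y, past) for (y, past) in samples])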
Choosing the Auxiliary Channel
• Assume $q\left(y[i,j] \mid \mathrm{Past}_{k,l}\{y[i,j]\}\right)$ is the conditional probability distribution of the output from an auxiliary ISI channel.
• A one-dimensional auxiliary channel permits a calculation based upon a larger number of columns in the output array.
• Conversion of the two-dimensional array into a one-dimensional sequence should “preserve” the statistical properties of the array.
• Pseudo-Peano-Hilbert space-filling curves can be used on a rectangular array to convert it to a sequence (see the sketch below).
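A sketch of a standard Hilbert-curve visit order for a square 2^order x 2^order array (Python; the talk's pseudo-Peano-Hilbert curves extend this idea to general rectangles):

    def hilbert_curve(order):
        """Return the (row, col) visit order of a 2^order x 2^order
        array along a Hilbert curve (standard index-to-(x, y) mapping)."""
        n = 2 ** order
        coords = []
        for d in range(n * n):
            x = y = 0
            t = d
            s = 1
            while s < n:
                rx = 1 & (t // 2)
                ry = 1 & (t ^ rx)
                if ry == 0:                       # rotate the quadrant
                    if rx == 1:
                        x, y = s - 1 - x, s - 1 - y
                    x, y = y, x
                x += s * rx
                y += s * ry
                t //= 4
                s *= 2
            coords.append((x, y))
        return coords

    # Serialize an array along the curve:
    # seq = [a[i][j] for (i, j) in hilbert_curve(order)]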
Pseudo-Peano-Hilbert Curve

[Figure: curve traversing $Y[i,j] \cup \mathrm{Past}_{(2,1),(7,8,7,l_4)}\{Y[i,j]\}$]
SIR Bounds for 2x2 Channel

[Figure; the alternative upper bounds are indicated]
3x3 Impulse Response
• Two-DOS transfer function:
$$h_2[i,j] = \frac{1}{10} \begin{bmatrix} 0 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix}$$
• Auxiliary one-dimensional ISI channel with memory length 4.
• Useful upper bound up to $E_b/N_0 = 3$ dB.
SIR Upper Bound for 3x3 Channel

[Figure]
Concluding Remarks
• Upper and lower bounds on the SIR of two-dimensional finite-state ISI channels were presented.
• Monte Carlo methods were used to compute the bounds for channels with a small impulse response support region.
• The bounds can be extended to multi-dimensional ISI channels.
• Further work is required to develop computable, tighter bounds for general multi-dimensional ISI channels.
References
1. D. Arnold and H.-A. Loeliger, “On the information rate of binary-input channels with memory,” Proc. IEEE International Conference on Communications, Helsinki, Finland, June 2001, vol. 9, pp. 2692-2695.
2. H. D. Pfister, J. B. Soriaga, and P. H. Siegel, “On the achievable information rate of finite state ISI channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2992-2996.
3. V. Sharma and S. K. Singh, “Entropy and channel capacity in the regenerative setup with applications to Markov channels,” Proc. IEEE International Symposium on Information Theory, Washington, DC, June 2001, p. 283.
4. A. Kavcic, “On the capacity of Markov sources over noisy channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2997-3001.
5. D. Arnold, H.-A. Loeliger, and P. O. Vontobel, “Computation of information rates from finite-state source/channel models,” Proc. 40th Annual Allerton Conf. Commun., Control, and Computing, Monticello, IL, October 2002, pp. 457-466.
6. Y. Katznelson and B. Weiss, “Commuting measure-preserving transformations,” Israel J. Math., vol. 12, pp. 161-173, 1972.
7. D. Anastassiou and D. J. Sakrison, “Some results regarding the entropy rates of random fields,” IEEE Trans. Inform. Theory, vol. 28, no. 2, pp. 340-343, March 1982.