Maximum-likelihood (ML) decoding

[Block diagram: message u → ECC encoder → codeword v → channel (adds noise: r = v + n) → ECC decoder → estimates v′, u′]
• Decoder: must determine v′ to minimize P(E|r) = P(v′ ≠ v | r)
• The probability of error is P(E) = ∑r P(E|r)P(r)
• P(r) is independent of decoding ⇒ optimum decoding must
  • minimize P(v′ ≠ v | r) for all r
  • maximize P(v′ = v | r) for all r
  • choose v′ as the codeword v that maximizes
    P(v|r) = P(r|v)P(v) / P(r)
  • i.e. (if P(v) is the same for all v) that maximizes P(r|v)
ML decoding (cont.)
Memoryless channel:
• ML decoder: maximize P(r|v) = Πj P(rj|vj)
• Alternatively, choose v to maximize log P(r|v) = ∑j log P(rj|vj) (as in the sketch below)
• The ML decoder is optimal if and only if all v are equally probable as input vectors. Otherwise, P(r|v) must be weighted by the codeword probabilities P(v)
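A minimal sketch of this rule: brute-force ML decoding simply scores every codeword by ∑j log P(rj|vj). The codebook, channel law, and function names below are illustrative assumptions, not from the slides.

import math

def ml_decode(r, codebook, log_p):
    """Return the codeword v maximizing sum_j log P(r_j | v_j)."""
    return max(codebook,
               key=lambda v: sum(log_p(rj, vj) for rj, vj in zip(r, v)))

# Illustrative channel law: BSC with crossover probability p = 0.1
p = 0.1
def log_p_bsc(rj, vj):
    return math.log(1 - p) if rj == vj else math.log(p)

codebook = [(0, 0, 0), (1, 1, 1)]                   # [3,1] repetition code
print(ml_decode((1, 0, 1), codebook, log_p_bsc))    # -> (1, 1, 1)

Brute force works only for small codebooks; the trellis methods later in these slides exist precisely to avoid scoring all 2^k codewords.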
ML decoding on the BSC
BSC:
• P(rj|vj) = 1 − p if rj = vj, and p otherwise
• log P(r|v) = ∑j log P(rj|vj)
• Hamming distance: let r and v differ in d(r,v) positions. Then
  ∑j log P(rj|vj) = d(r,v) log p + (n − d(r,v)) log(1−p)
                  = d(r,v) log(p/(1−p)) + n log(1−p)
• log(p/(1−p)) < 0 for p < 0.5, so an ML decoder for a BSC must choose v to minimize d(r,v) (see the sketch below)
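As a quick check of this equivalence, the sketch below ranks the codewords of the [3,2] even-weight code (also used as an example later in these slides) by Hamming distance and by log-likelihood; for p < 0.5 the two rankings agree. The received word is an arbitrary illustrative choice.

import math

p = 0.1
codebook = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # [3,2] even-weight code
r = (1, 1, 1)                                            # illustrative received word
n = len(r)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

for v in codebook:
    d = hamming(r, v)
    # log P(r|v) = d*log(p/(1-p)) + n*log(1-p): decreasing in d when p < 0.5
    loglik = d * math.log(p) + (n - d) * math.log(1 - p)
    print(v, d, round(loglik, 3))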
Channel capacity
Shannon (1948)
• Every channel has a capacity C (determined by the noise and the input power and bandwidth constraints)
• Eb(R) and Ec(R) are positive functions of R for R < C
• There exists a block code of length n such that with ML decoding
  P(E) ≤ 2^(−n Eb(R))
• Similarly for convolutional codes: P(E) ≤ 2^(−(m+1)n Ec(R))
• In fact, the average code performs like this. Nonconstructive proof using random coding
• But ML decoding for long random codes is infeasible!
Performance measures
Trade-off of main parameters:
• Code rate
• Error probability, for a given channel and channel quality:
  • Word error rate (WER, FER, BLER)
  • Bit error rate (BER)
• Decoding complexity

Performance is often displayed as a curve of an error rate as a function of channel quality
Error rate curves
[Figure: error rate (log ER) vs. SNR Eb/N0 (dB), where Eb = Es/R. Uncoded and coded curves are compared, showing the coding threshold, the coding gain, and the Shannon limit]
Asymptotic coding gain
[Figure: error rate (log ER) vs. SNR Eb/N0 (dB) for uncoded and coded transmission, showing the coding gain at a given error rate and the asymptotic coding gain]
Asymptotic coding gain
For high SNR (BPSK modulation and AWGN channel):

  p_uncoded ≈ (1/2) e^(−Eb/N0)

and with soft-decision decoding:

  p_coded,SD ≈ K_code e^(−dmin R Eb/N0)

Asymptotic coding gain:

  (Eb/N0)_uncoded / (Eb/N0)_coded = R dmin

or 10 log10(R dmin) in dB. For HD decoding: 10 log10(R dmin/2). Thus, SD gives 10 log10 2 ≈ 3 dB better ACG than HD
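A worked example (my choice, not from the slides): the extended [24,12] Golay code has R = 1/2 and dmin = 8, so the formulas above give

import math

R, dmin = 12 / 24, 8                     # extended Golay code, example only
acg_sd = 10 * math.log10(R * dmin)       # 10*log10(4) ≈ 6.02 dB
acg_hd = 10 * math.log10(R * dmin / 2)   # 10*log10(2) ≈ 3.01 dB
print(f"ACG (SD): {acg_sd:.2f} dB, ACG (HD): {acg_hd:.2f} dB")

The 3 dB gap between the two printed values is exactly the SD-versus-HD difference stated above.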
Performance close to the Shannon limit
[Figure: error rate (log ER) vs. SNR Eb/N0 (dB) for uncoded transmission, a classical code, and a turbo/LDPC code; the turbo/LDPC curve approaches the Shannon limit]
Coded modulation
• Encoding + modulation:
  • Need a modulation mapping that preserves distance between different codewords
  • Thus, we can view codes also in the modulation domain
• Combined coding and modulation: design codes specifically to increase distance
  • Exploit that in a large signal constellation some points are further apart
  • No bandwidth expansion with coding, but the constellation is expanded compared to uncoded modulation
Coded modulation
• Some schemes work on a signal constellation as follows (see the sketch after this list):
  1. Encode some input bits by an ECC. Let the output bits determine a subconstellation
  2. Let the remaining input bits determine a point in the subconstellation
• TCM – coded modulation based on a convolutional ECC
• BCM – coded modulation based on a block ECC
• Also, coded modulation with turbo codes and LDPC codes
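A minimal sketch of the two-step mapping above, assuming an Ungerboeck-style set partitioning of 8-PSK into four antipodal pairs. The specific mapping is my illustrative choice; in TCM the coded pair would come from a convolutional encoder rather than being passed in directly.

import cmath, math

def psk8(k):
    """Return the k-th 8-PSK point on the unit circle (k = 0..7)."""
    return cmath.exp(2j * math.pi * k / 8)

def map_symbol(coded_pair, uncoded_bit):
    c1, c0 = coded_pair
    subset = 2 * c1 + c0            # step 1: coded bits select a subconstellation
    k = subset + 4 * uncoded_bit    # step 2: remaining bit selects a point in it
    return psk8(k)

# Each subconstellation {k, k+4} is an antipodal pair: intra-set distance 2.0,
# versus 2*sin(pi/8) ≈ 0.765 between adjacent 8-PSK points
d_adjacent = abs(psk8(0) - psk8(1))
d_in_subset = abs(map_symbol((0, 0), 0) - map_symbol((0, 0), 1))
print(round(d_adjacent, 3), round(d_in_subset, 3))   # 0.765 2.0

This illustrates the point from the previous slide: the partition isolates constellation points that are further apart, so the bits mapped within a subconstellation see a much larger distance.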
Trellises of linear block codes (CH 9)
A representation that facilitates soft-decision decoding
Recall:
• A linear block code is the row space of a generator matrix
• Example: the [3,2] even-weight code {000, 011, 101, 110}

[Trellis diagram: paths from the sender side (initial state) to the receiver side (final state) through parity states E (even) and O (odd); each path spells out one of the four codewords]
Trellises
A trellis is a directed graph:
• A set Γ of depths or time instants, ordered (usually) from 0 to n
• At each time instant, a set of nodes (vertices) representing the (code) states at that time instant. Usually (in an ordinary block code) there is one initial state s0 at time 0 and one final state sf at time n
• Edges can go from a state at time i to a state at time i+1
• Each edge is labeled by one (or more) symbol(s) from the code alphabet (usually binary)
• A sequence of edge labels obtained by traversing the trellis from s0 to sf is a codeword (see the sketch below)
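To make the definition concrete, here is one way to represent such a trellis in code, using the [3,2] even-weight code from the earlier example with parity states E and O. The data layout is an illustrative choice; enumerating all label sequences on paths from s0 to sf recovers exactly the codewords.

# trellis[i] maps state -> list of (label, next_state) for the section i -> i+1
trellis = [
    {'E': [(0, 'E'), (1, 'O')]},             # section 0 -> 1 (initial state E)
    {'E': [(0, 'E'), (1, 'O')],
     'O': [(0, 'O'), (1, 'E')]},             # section 1 -> 2
    {'E': [(0, 'E')], 'O': [(1, 'E')]},      # section 2 -> 3 (forces even parity)
]

def codewords(i=0, state='E', prefix=()):
    """Yield the label sequences of all paths from s0 to sf."""
    if i == len(trellis):
        yield prefix
        return
    for label, nxt in trellis[i].get(state, []):
        yield from codewords(i + 1, nxt, prefix + (label,))

print(list(codewords()))   # [(0,0,0), (0,1,1), (1,0,1), (1,1,0)]

Soft-decision ML decoding then amounts to finding the best-metric path through this graph (the Viterbi algorithm), which is the use of trellises developed in this chapter.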
Linear trellises
• Necessary (but not sufficient) conditions for the corresponding code to be linear (illustrated in the sketch after this list):
  • There exists an output function Oi = fi(si, Ii), where
    • fi(si, Ii) ≠ fi(si, I′i) for Ii ≠ I′i
    • Oi is the output block from time i to time i+1
    • Ii is the input block from time i to time i+1
    • si is the state at time i
  • There exists a state transition function si+1 = gi(si, Ii)
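A concrete instance of these conditions (my own illustration, using the [3,2] even-weight code whose trellis appears above, with state 0 = E and 1 = O): sections 0 and 1 each consume one input bit, while section 2 consumes none and emits the accumulated parity, so the functions here are time-varying.

def f01(s, I): return I       # output function for sections 0 and 1
def g01(s, I): return s ^ I   # state transition for sections 0 and 1
def f2(s):     return s       # section 2 outputs the accumulated parity
def g2(s):     return 0       # and returns to the single final state

def encode(u):
    """Encode two information bits with the [3,2] even-weight code via f/g."""
    s, out = 0, []
    for I in u:               # sections 0 and 1
        out.append(f01(s, I))
        s = g01(s, I)
    out.append(f2(s))         # section 2
    s = g2(s)
    return tuple(out)

print([encode(u) for u in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

Note that f01 satisfies the injectivity condition above: distinct inputs from the same state produce distinct output labels.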
More properties of linear trellises
• In the trellis of a linear code, the set of states Σi at time i is called the state space
• A trellis is time-invariant iff there exist
  • a finite initial delay ν ∈ Γ
  • an output function f and a state transition function g
  • a "template" state space Σ
  such that
  • Σi ⊂ Σ for 0 ≤ i < ν, and Σi = Σ for i ≥ ν
  • fi = f and gi = g for all i ∈ Γ
Bit-level trellises of linear block codes
• [n,k] linear block code C
• Bit-level trellis: n+1 time instants and n trellis sections
• One initial state s0 at time 0 and one final state sf at time n
• For each time i > 0, there is a fixed number Incoming(i) of incoming branches per state. For all i, Incoming(i) is 1 or 2. Two branches going to the same state have different labels
• For each time i < n, there is a fixed number Outgoing(i) of outgoing branches per state. For all i, Outgoing(i) is 1 or 2. Two branches coming from the same state have different labels
• Each codeword corresponds to a distinct path from s0 to sf
Bit-level trellises of linear block codes
• The number |Σi| is (sometimes) called the state space complexity at time instant i. For a linear code, we will show that |Σi| is a power of 2 for all i ∈ Γ
• Thus, we can define the state space dimension
  ρi = log2 |Σi|
• The sequence
  (ρ0 = 0, ρ1, ..., ρi, ..., ρn = 0)
  is called the state space dimension profile, and determines the complexity of an ML (soft-decision) decoder for the code
Generator matrix: TO form
G =
  [ 1 1 1 1 1 1 1 1 ]
  [ 0 1 0 1 0 1 0 1 ]
  [ 0 0 1 1 0 0 1 1 ]
  [ 0 0 0 0 1 1 1 1 ]

G′ =
  [ 1 1 1 1 0 0 0 0 ]
  [ 0 1 0 1 1 0 1 0 ]
  [ 0 0 1 1 1 1 0 0 ]
  [ 0 0 0 0 1 1 1 1 ]

G generates the [8,4] Reed-Muller code; G′ is its trellis-oriented (TO) form, obtained from G by row operations: in G′, the leading 1s of the rows appear in distinct columns, and so do the trailing 1s.
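From a TO-form generator matrix, the state space dimension profile defined on the previous slide can be read off directly: ρi is the number of rows whose span (first 1 to last 1) is active across time i. A minimal sketch, assuming the G′ above; the span rule is the standard one for TO forms, while the code layout is my own.

G_prime = [
    [1, 1, 1, 1, 0, 0, 0, 0],
    [0, 1, 0, 1, 1, 0, 1, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 1, 1, 1, 1],
]

def dimension_profile(G):
    """State space dimension rho_i = #rows whose span is active across time i."""
    n = len(G[0])
    spans = [(row.index(1), n - 1 - row[::-1].index(1)) for row in G]
    return [sum(a < i <= b for a, b in spans) for i in range(n + 1)]

print(dimension_profile(G_prime))   # [0, 1, 2, 3, 2, 3, 2, 1, 0]

The maximum state space dimension here is 3, i.e. at most 8 states at any time instant, which bounds the work an ML (Viterbi) decoder does per trellis section.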