Computer Vision – Compression (1)
Hanyang University
Jong-Il Park
Image Compression
- The problem of reducing the amount of data required to represent a digital image
- Underlying basis
  - Removal of redundant data
- Mathematical viewpoint
  - Transforming a 2-D pixel array into a statistically uncorrelated data set
Topics to be covered
- Fundamentals
  - Basic concepts of the source coding theorem
- Practical techniques
  - Lossless coding
  - Lossy coding
  - Optimum quantization
  - Predictive coding
  - Transform coding
- Standards
  - JPEG
  - MPEG
- Recent issues
History of image compression
- Theoretic foundation
  - C. E. Shannon's work in the 1940s
- Analog compression
  - Aiming at reducing video transmission bandwidth ("bandwidth compression")
  - E.g., subsampling methods, subcarrier modulation, ...
- Digital compression
  - Owing to the development of ICs and computers
  - Early 70s: facsimile transmission (2-D binary image coding)
  - Academic research in the 70s and 80s
  - Rapidly matured around 1990 → standardization such as JPEG, MPEG, H.263, ...
Data redundancy
- Data vs. information
- Data redundancy
- Relative data redundancy (a numeric check follows this list):
  $R_D = 1 - \frac{1}{C_R}$, where $C_R = n_1 / n_2$ (compression ratio)
- Three basic redundancies:
  1. Coding redundancy
  2. Interpixel redundancy
  3. Psychovisual redundancy
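A quick numeric check of the definitions above; the bit rates are illustrative values, not from the slides:

```python
# An 8-bit/pixel image recoded at an average of 4 bits/pixel (illustrative).
n1, n2 = 8, 4          # bits per pixel before and after coding
CR = n1 / n2           # compression ratio C_R = 2.0
RD = 1 - 1 / CR        # relative data redundancy R_D = 0.5
print(CR, RD)          # 2.0 0.5 -> half of the data is redundant
```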
Coding redundancy
- Code: a system of symbols used to represent a body of information or set of events
- Code word: a sequence of code symbols
- Code length: the number of symbols in each code word
- Average number of bits:
  $L_{avg} = \sum_{k=0}^{L-1} l(r_k) \, p_r(r_k)$
E.g. Coding redundancy
- Reduction by variable-length coding (as in the sketch above)
Correlation
- Cross-correlation:
  $f(x, y) \circ h(x, y) = \frac{1}{MN} \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} f^*(m, n) \, h(x + m, y + n)$
  $f(x, y) \circ h(x, y) \Leftrightarrow F^*(u, v) \, H(u, v)$
- Autocorrelation (see the sketch below):
  $f(x, y) \circ f(x, y) \Leftrightarrow F^*(u, v) \, F(u, v) = |F(u, v)|^2$
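A minimal sketch verifying the (circular) autocorrelation theorem above with NumPy; the array size and random data are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.standard_normal((8, 8))
F = np.fft.fft2(f)
M, N = f.shape

# Frequency domain: conj(F) * F = |F|^2, inverse DFT, and the 1/MN factor
# from the definition above
auto_freq = np.fft.ifft2(np.conj(F) * F).real / (M * N)

# Direct circular autocorrelation from the spatial-domain definition
auto_direct = np.zeros((M, N))
for x in range(M):
    for y in range(N):
        auto_direct[x, y] = sum(
            f[m, n] * f[(x + m) % M, (y + n) % N]
            for m in range(M) for n in range(N)
        ) / (M * N)

print(np.allclose(auto_freq, auto_direct))   # True
```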
E.g. Correlation
Interpixel redundancy
- Spatial redundancy
- Geometric redundancy
- Interframe redundancy
E.g. Interpixel redundancy
E.g. Run-length coding
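The slide's example figure is not reproduced here; as a stand-in, here is a minimal run-length coder for one binary scan line (a generic sketch, not the fax-standard MH/MR codes):

```python
def rle_encode(bits):
    """Return (first_bit, run_lengths) for a list of 0/1 pixels."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def rle_decode(first_bit, runs):
    out, bit = [], first_bit
    for r in runs:
        out.extend([bit] * r)
        bit ^= 1               # runs of 0s and 1s alternate
    return out

line = [0, 0, 0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1]
first, runs = rle_encode(line)
print(first, runs)             # 0 [4, 2, 3, 5]
assert rle_decode(first, runs) == line
```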
Psychovisual redundancy
Image compression models
- Communication model
- Source encoder and decoder
Basic concepts in information theory
- Self-information: $I(E) = -\log P(E)$
- Source alphabet $A$ and its symbols $a_j$
- Probabilities of the events: $\mathbf{z}$
- Ensemble $(A, \mathbf{z})$
- Entropy (= uncertainty): $H(\mathbf{z}) = -\sum_{j=1}^{J} P(a_j) \log P(a_j)$
- Channel alphabet $B$
- Channel matrix $Q$: $\mathbf{v} = Q\mathbf{z}$
  (a numeric sketch of these quantities follows this list)
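A numeric sketch of the quantities above; the alphabet size, probabilities, and channel matrix are illustrative assumptions:

```python
import numpy as np

z = np.array([0.5, 0.25, 0.125, 0.125])   # P(a_j) for a 4-symbol source

I = -np.log2(z)                  # self-information in bits: [1, 2, 3, 3]
H = float(np.sum(z * I))         # entropy H(z) = 1.75 bits/symbol
print(I, H)

# Channel matrix: Q[k, j] = P(b_k | a_j); each column sums to 1
Q = np.array([[0.9, 0.1, 0.0, 0.0],
              [0.1, 0.8, 0.1, 0.0],
              [0.0, 0.1, 0.8, 0.1],
              [0.0, 0.0, 0.1, 0.9]])
v = Q @ z                        # output symbol probabilities v = Q z
print(v, v.sum())                # a valid distribution (sums to 1)
```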
Mutual information and capacity
- Equivocation: $H(\mathbf{z} \mid \mathbf{v})$
- Mutual information:
  $I(\mathbf{z}, \mathbf{v}) = H(\mathbf{z}) - H(\mathbf{z} \mid \mathbf{v})$
- Channel capacity:
  $C = \max_{\mathbf{z}} I(\mathbf{z}, \mathbf{v})$
  - The minimum possible $I(\mathbf{z}, \mathbf{v})$ is 0
  - The maximum of $I(\mathbf{z}, \mathbf{v})$ over all possible choices of source probabilities $\mathbf{z}$ is the channel capacity
  (a sketch computing $I(\mathbf{z}, \mathbf{v})$ follows this list)
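A sketch computing the mutual information from a channel matrix and a source distribution, using the identity $H(\mathbf{z} \mid \mathbf{v}) = H(\mathbf{z}, \mathbf{v}) - H(\mathbf{v})$; the two-symbol channel below is an illustrative assumption:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(Q, z):
    joint = Q * z                     # joint[k, j] = P(b_k | a_j) P(a_j)
    v = joint.sum(axis=1)             # output distribution v = Q z
    H_z_given_v = entropy(joint.ravel()) - entropy(v)   # equivocation
    return entropy(z) - H_z_given_v   # I(z, v) = H(z) - H(z|v)

Q = np.array([[0.9, 0.2],
              [0.1, 0.8]])            # a noisy binary channel (illustrative)
z = np.array([0.5, 0.5])
print(mutual_information(Q, z))       # ~0.397 bits per channel use
```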
E.g. Binary Symmetric Channel (BSC)
[Figure: BSC with source probabilities $p_{bs}$ and $1 - p_{bs}$, correct-transmission probability $1 - p_e$, and error probability $p_e$; plots of the entropy, the mutual information, and the channel capacity.]
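For the BSC, the capacity plotted in the figure has the standard closed form $C = 1 - H_b(p_e)$, with $H_b$ the binary entropy function; a short sketch of a few points on that curve:

```python
import numpy as np

def binary_entropy(p):
    """H_b(p) in bits, with H_b(0) = H_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# The maximum over source distributions is attained at p_bs = 1/2.
for pe in (0.0, 0.1, 0.5):
    print(pe, 1 - binary_entropy(pe))
# 0.0 -> C = 1.0    (noiseless channel)
# 0.1 -> C ~ 0.531
# 0.5 -> C = 0.0    (output independent of input)
```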
Noiseless coding theorem
- Shannon's first theorem, for a zero-memory source:
  $H(\mathbf{z}) \le \frac{L'_{avg}}{n} \le H(\mathbf{z}) + \frac{1}{n}$
- It is possible to make $L'_{avg}/n$ arbitrarily close to $H(\mathbf{z})$ by coding infinitely long extensions of the source
- Efficiency = entropy / $L'_{avg}$
- E.g. Extension coding → better efficiency (next slide)
Extension coding
- Coding single symbols of the binary source {A, B}:
  Efficiency = 0.918 / 1.0 = 0.918
- Coding the second extension (pairs of A and B) gives better efficiency:
  Efficiency = (0.918 × 2) / 1.89 = 0.97
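A sketch reproducing these numbers. The slide does not state the symbol probabilities; P(A) = 2/3, P(B) = 1/3 is an assumption that matches H ≈ 0.918 bits:

```python
import numpy as np

pA, pB = 2 / 3, 1 / 3                 # assumed source probabilities
H = -(pA * np.log2(pA) + pB * np.log2(pB))
print(H)                              # ~0.918 bits/symbol

# Single symbols, 1-bit code (A -> 0, B -> 1): L_avg = 1.0
print(H / 1.0)                        # efficiency 0.918

# Second extension with a prefix code over pairs:
# AA -> 0, AB -> 10, BA -> 110, BB -> 111
p2 = {"AA": pA * pA, "AB": pA * pB, "BA": pB * pA, "BB": pB * pB}
length = {"AA": 1, "AB": 2, "BA": 3, "BB": 3}
L2 = sum(p2[s] * length[s] for s in p2)
print(L2, 2 * H / L2)                 # ~1.89 bits/pair, efficiency ~0.97
```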
Noisy coding theorem
- Shannon's second theorem, for a zero-memory channel:
  For any R < C, there exists an integer r and a code of block length r and rate R such that the probability of a block decoding error is arbitrarily small.
- Rate-distortion theory:
  [Figure: rate-distortion curve $R(D)$ with feasible and never-feasible operating points marked.]
  The source output can be recovered at the decoder with an arbitrarily small probability of error provided that the channel has capacity $C > R(D) + \varepsilon$.
Using mappings to reduce entropy
- 1st-order estimate of entropy
  > 2nd-order estimate of entropy
  > 3rd-order estimate of entropy
  > ...
- The (estimated) entropy of a properly mapped image (e.g., a "difference source") is in most cases smaller than that of the original image source (see the sketch below)
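A minimal sketch of this effect on a synthetic image; the image and the horizontal-difference mapping are illustrative choices, not the slides' example:

```python
import numpy as np

def first_order_entropy(values):
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# A smooth synthetic image: a horizontal ramp with mild noise
rng = np.random.default_rng(0)
img = (np.tile(np.arange(256), (64, 1)) + rng.integers(0, 4, (64, 256))) % 256

# "Difference source": horizontal pixel differences
diff = np.diff(img.astype(np.int16), axis=1)

print(first_order_entropy(img))    # near 8 bits/pixel
print(first_order_entropy(diff))   # much smaller for this smooth image
```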
How to implement this? The topic of the next lecture!