Image Processing

Image Processing in Freq. Domain
• Restoration / Enhancement
• Inverse Filtering
• Matched Filtering / Pattern Detection
• Tomography
Enhancement vs. Restoration
• Image Enhancement:
  – A process which aims to improve bad images so they will "look" better.
• Image Restoration:
  – A process which aims to invert known degradation operations applied to images.
Enhancement vs. Restoration
Enhancement:
• "Better" visual representation
• Subjective
• No quantitative measures
Restoration:
• Remove effects of sensing environment
• Objective
• Mathematical, model-dependent quantitative measures
Typical Degradation Sources
• Low illumination
• Optical distortions (geometric, blurring)
• Sensor distortion (quantization, sampling, sensor noise, spectral sensitivity, de-mosaicing)
• Atmospheric attenuation (haze, turbulence, …)
Image Preprocessing
• Restoration (freq. domain / spatial domain):
  – Denoising
  – Inverse filtering
  – Wiener filtering
• Enhancement (freq. domain / spatial domain):
  – Filtering
  – Point operations
  – Spatial operations
Examples
• Hazing
• Echo image
• Motion blur
• Blurred image
• Blurred image + additive white noise
Reconstruction as an Inverse Problem
The original image f passes through a distortion H and picks up additive noise n, giving the measurements g:
  $g = Hf + n$
A reconstruction algorithm estimates $\hat{f}$ from g.
So what is the problem?
• Naively inverting the distortion gives
  $\hat{f} = H^{-1} g = f + H^{-1} n$
• Typically:
  – The distortion H is singular or ill-posed.
  – The noise n is unknown; only its statistical properties can be learnt.
Key point: Stat. Prior of Natural Images
• MAP estimation:
  $\hat{f} = \arg\max_f P(f \mid g) = \arg\max_f P(g \mid f)\, P(f)$
  where $P(g \mid f)$ is the likelihood and $P(f)$ is the prior.
Bayesian Reconstruction (MAP)
• From amongst all possible solutions, choose the one that maximizes the a-posteriori probability:
  $P(f \mid g) \propto P(g \mid f)\, P(f)$
[Figure: the most probable solution in the image space, given the measurements; the prior P(f) and the likelihood P(g|f) are sketched]
Bayesian Denoising
• Assume an additive noise model:
  g = f + n
• A MAP estimate for the original f:
  $\hat{f} = \arg\max_f P(f \mid g)$
• Using Bayes' rule and taking the log-likelihood:
  $\hat{f} = \arg\max_f \frac{P(g \mid f)\, P(f)}{P(g)} = \arg\min_f \big[ -\log P(g \mid f) - \log P(f) \big]$
Bayesian Denoising
• If the noise component is white Gaussian:
  g = f + n,  where n ~ N(0, σ²)
  $\hat{f} = \arg\min_f \|g - f\|^2 + \lambda R(f)$
  (data term + prior term)
• R(f) is a penalty for improbable f.
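To make this concrete, here is a minimal NumPy sketch (not from the slides) of the Gaussian-noise MAP denoiser for one particular prior, the quadratic smoothness penalty R(f) = ‖∇f‖². The function name, the lam parameter, and the choice of prior are illustrative assumptions; with this prior and periodic boundaries the minimizer has a closed form in the frequency domain.

```python
import numpy as np

def map_denoise_gaussian(g, lam=0.1):
    """MAP denoising of g = f + n (white Gaussian noise), assuming a
    quadratic smoothness prior R(f) = ||grad f||^2 (illustrative choice).
    The quadratic objective ||g - f||^2 + lam*R(f) is minimized per
    frequency: F = G / (1 + lam * |D(u,v)|^2), where D is the transfer
    function of the circular forward-difference gradient."""
    G = np.fft.fft2(g)
    u = np.fft.fftfreq(g.shape[0])[:, None]
    v = np.fft.fftfreq(g.shape[1])[None, :]
    # |D(u,v)|^2 = |e^{2*pi*i*u} - 1|^2 + |e^{2*pi*i*v} - 1|^2
    D2 = (2 - 2 * np.cos(2 * np.pi * u)) + (2 - 2 * np.cos(2 * np.pi * v))
    F = G / (1 + lam * D2)
    return np.real(np.fft.ifft2(F))
```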
Inverse Filtering
• Degradation model:
  g(x,y) = h(x,y) * f(x,y)
  G(u,v) = H(u,v) F(u,v)
  ⟹ F(u,v) = G(u,v) / H(u,v)
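A minimal NumPy sketch of this naive inverse filter (not from the slides). The function name, the assumption that h acts by circular convolution, and the eps guard are illustrative choices.

```python
import numpy as np

def inverse_filter(g, h, eps=1e-3):
    """Naive inverse filtering: F(u,v) = G(u,v) / H(u,v).
    g: degraded image, h: known blur kernel (assumed to act by circular
    convolution). eps is an illustrative guard against division by a
    tiny |H(u,v)|; without it the estimate blows up wherever H ~ 0,
    which is exactly the problem discussed on the next slide."""
    H = np.fft.fft2(h, s=g.shape)   # zero-pad the kernel to the image size
    G = np.fft.fft2(g)
    H_safe = np.where(np.abs(H) < eps, eps, H)
    return np.real(np.fft.ifft2(G / H_safe))
```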
Inverse Filtering (Cont.)
Two problems with the above formulation:
1. H(u,v) might be zero for some (u,v).
2. In the presence of noise, the noise might be amplified:
   $\hat{F}(u,v) = \frac{G(u,v)}{H(u,v)} = F(u,v) + \frac{N(u,v)}{H(u,v)}$
Solution: Use prior information
  $\hat{F} = \arg\min_F \|HF - G\|^2 + \lambda R(F)$
  (data term + prior term)
Option 1: Prior Term
• Use a penalty term that restrains high |F| values:
  $\hat{F} = \arg\min_F E(F)$,  where  $E(F) = \|HF - G\|^2 + \lambda \|F\|^2$
• Solution:
  $\frac{\partial E(F)}{\partial F} = 2 H^* (HF - G) + 2 \lambda F = 0$
  $\hat{F} = \frac{H^*}{H^* H + \lambda}\, G$
• $|H(u,v)| \gg 1 \;\Rightarrow\; \hat{F} \approx G/H$
• $|H(u,v)| \ll 1 \;\Rightarrow\; \hat{F} \approx 0$
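A NumPy sketch of this Option 1 filter, under the same circular-convolution assumption as before; the function name and the default lam value are illustrative.

```python
import numpy as np

def regularized_inverse_filter(g, h, lam=1e-3):
    """Option 1: penalize large |F| with lam*||F||^2, which gives the
    closed-form filter F = H* G / (|H|^2 + lam). Where |H| is large the
    filter behaves like G/H; where |H| is small it pushes F toward 0."""
    H = np.fft.fft2(h, s=g.shape)
    G = np.fft.fft2(g)
    F = np.conj(H) * G / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(F))
```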
Degraded Image (echo)
• Naive inverse filter: F = G/H
• Regularized: $\hat{F} = \frac{H^*}{H^* H + \lambda}\, G$
Degraded Image (echo + noise)
• Regularized: $\hat{F} = \frac{H^*}{H^* H + \lambda}\, G$
• The inverse filter is $C(H) = \frac{H^*}{H^* H + \lambda}$
• At some range of (u,v): S(u,v)/N(u,v) < 1 ⟹ noise amplification.
[Plot: C(H) as a function of H, for λ = 10⁻³]
Option 2: Prior Term
1. Natural images tend to have low energy at high frequencies.
2. White noise tends to have constant energy across frequencies.
  $\hat{F} = \arg\min_F E(F)$,  where  $E(F) = \|HF - G\|^2 + \lambda \sum_{u,v} (u^2 + v^2)\, |F(u,v)|^2$
• Solution: E F   2 H * HF  G   2 u 2  v 2 F  0
F
Fˆ 
*
H
G
*
2
2
H H  u v


• This solution is known as the Wiener filter.
• Here we assume N(u,v) is constant.
• If N(u,v) is not constant:
  $\hat{F} = \frac{H^*}{H^* H + \lambda (u^2 + v^2)\, N(u,v)}\, G$
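A NumPy sketch of the Option 2 (Wiener-style) filter with the frequency-dependent penalty λ(u² + v²); the function name and the default lam value are illustrative, and the (u,v) grid uses normalized DFT frequencies.

```python
import numpy as np

def wiener_like_filter(g, h, lam=1e-2):
    """Option 2: F = H* G / (|H|^2 + lam*(u^2 + v^2)). The quadratic
    weight penalizes energy at high frequencies, where natural images
    are weak and white noise is not; replacing lam*(u^2 + v^2) with an
    estimated noise spectrum N(u,v) gives the non-constant-noise form.
    Assumes circular convolution and H(0,0) != 0 (e.g. a normalized
    blur kernel), so the DC term stays well defined."""
    H = np.fft.fft2(h, s=g.shape)
    G = np.fft.fft2(g)
    u = np.fft.fftfreq(g.shape[0])[:, None]
    v = np.fft.fftfreq(g.shape[1])[None, :]
    F = np.conj(H) * G / (np.abs(H) ** 2 + lam * (u ** 2 + v ** 2))
    return np.real(np.fft.ifft2(F))
```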
Degraded Image (echo + noise): Wiener filtering vs. the previous result
Degraded Image (blurred + noise): inverse filtering, using the prior (Option 1), Wiener filtering
Matched Filter in Freq. Domain
• Pattern Matching:
– Finding occurrences of a particular pattern in an image.
• Pattern:
– Typically a 2D image fragment.
– Much smaller than the image.
Image Similarity Measures
• Image Similarity Measure:
  – A function that assigns a nonnegative real value to two given images.
  – Small measure ⟺ high similarity.
  – Preferably a metric distance (non-negativity, identity, symmetry, triangle inequality): d(·, ·) ≥ 0.
• Can be combined with thresholding:
  $f(P, Q) = \begin{cases} 1 & d(P, Q) \le \text{threshold} \\ 0 & \text{otherwise} \end{cases}$
The Matching Approach
• Scan the entire image pixel by pixel.
• For each pixel, evaluate the similarity between its local neighborhood and the pattern.
The Euclidean Distance as a Similarity Measure
• Given:
  – A k×k pattern P
  – An n×n image I
  – The k×k window of I located at (x,y): I_{x,y}
• For each pixel (x,y), we compute the distance:
  $d_E^2(I, P)_{x,y} = \frac{1}{k^2}\, \|I_{x,y} - P\|^2 = \frac{1}{k^2} \sum_{i,j=0}^{k-1} \big( I(x+i,\, y+j) - P(i,j) \big)^2$
• Complexity: O(n²k²)
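A direct NumPy sketch of this O(n²k²) computation (not from the slides); the function name is illustrative.

```python
import numpy as np

def naive_ssd_match(I, P):
    """Normalized squared Euclidean distance between the k-by-k pattern P
    and every k-by-k window of the image I, by explicit summation:
    O(n^2 k^2) operations."""
    k = P.shape[0]
    out_h, out_w = I.shape[0] - k + 1, I.shape[1] - k + 1
    D = np.empty((out_h, out_w))
    for x in range(out_h):
        for y in range(out_w):
            window = I[x:x + k, y:y + k]
            D[x, y] = np.sum((window - P) ** 2) / k ** 2
    return D
```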
FFT Implementation
d I , P x , y
2
E

k 1
1
 2 I x, y  P
k
2
2
2


i, j   2I x  i, y  j Pi, j 
I
x

i
,
y

j

P

i , j 0
I 2 *  k  k mask of 1's 
Fixed
2( I * P)
• Convolution can be applied rapidly using FFT.
• Complexity: O(n2 log n)
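A sketch of the same distance computed with FFT-based convolutions, following the three-term expansion above; scipy.signal.fftconvolve does the convolutions, and the function name is illustrative. Up to floating-point error it agrees with the naive version sketched earlier.

```python
import numpy as np
from scipy.signal import fftconvolve

def fft_ssd_match(I, P):
    """Squared Euclidean distance between P and every window of I,
    expanded as sum(I^2 over window) - 2*corr(I, P) + sum(P^2).
    The first two terms are convolutions evaluated via the FFT in
    O(n^2 log n); the third is a constant."""
    k = P.shape[0]
    ones = np.ones_like(P)
    win_sq_sum = fftconvolve(I ** 2, ones, mode='valid')    # sum of I^2 per window
    cross = fftconvolve(I, P[::-1, ::-1], mode='valid')     # correlation of I with P
    p_sq_sum = np.sum(P ** 2)                               # fixed term
    return (win_sq_sum - 2 * cross + p_sq_sum) / k ** 2
```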
Naïve vs. FFT
Performance for a 1024×1024 image, on a 1.8 GHz PC:

                      Naïve        FFT
  Time complexity     O(n²k²)      O(n² log n)
  Space               n²           n²
  Integer arithmetic  Yes          No
  Run time, 16×16     1.33 sec     3.5 sec
  Run time, 32×32     4.86 sec     3.5 sec
  Run time, 64×64     31.30 sec    3.5 sec
Normalized Cross Correlation
• NCC:
  – A similarity measure, based on a normalized cross-correlation function.
  – Maps two given images to [0,1] (in absolute value).
  – Measures the angle between the vectors I_{x,y} and P.
  – Invariant to intensity scale and offset.
  $d_{NC}(I, P)_{x,y} = \frac{(I_{x,y} - \bar{I}_{x,y}\mathbf{1}) \cdot (P - \bar{P}\mathbf{1})}{\|I_{x,y} - \bar{I}_{x,y}\mathbf{1}\|\; \|P - \bar{P}\mathbf{1}\|}$
Efficient Implementation
• Note that
  $(X - \bar{X}\mathbf{1}) \cdot (Y - \bar{Y}\mathbf{1}) = X \cdot Y - n \bar{X} \bar{Y}$
• Thus,
  $d_{NC}(I, P)_{x,y} = \frac{I_{x,y} \cdot P - k^2\, \bar{I}_{x,y} \bar{P}}{\sqrt{\|I_{x,y}\|^2 - k^2 \bar{I}_{x,y}^2}\; \sqrt{\|P\|^2 - k^2 \bar{P}^2}}$
• The above expression can be implemented efficiently using 3 convolutions.
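A NumPy/SciPy sketch of this three-convolution NCC (function name and eps guard are illustrative): one convolution gives the correlation with the mean-subtracted pattern, and the other two give the window sums of I and I², from which each window's mean and variance follow.

```python
import numpy as np
from scipy.signal import fftconvolve

def ncc_match(I, P, eps=1e-8):
    """Normalized cross-correlation between pattern P and every k-by-k
    window of image I, using three FFT convolutions."""
    k2 = P.size
    ones = np.ones_like(P)
    Pc = P - P.mean()                                          # mean-subtracted pattern
    cross = fftconvolve(I, Pc[::-1, ::-1], mode='valid')       # sum over window of I*(P - mean P)
    win_sum = fftconvolve(I, ones, mode='valid')               # sum of I over each window
    win_sq = fftconvolve(I ** 2, ones, mode='valid')           # sum of I^2 over each window
    win_var = np.maximum(win_sq - win_sum ** 2 / k2, 0)        # ||I_xy - mean(I_xy)||^2 per window
    denom = np.sqrt(win_var) * np.sqrt(np.sum(Pc ** 2)) + eps  # product of the two norms
    return cross / denom
```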
[Figures: Euclidean distance similarity measure vs. NCC similarity measure]
Computed Tomography using FFT
CT Scanners
• In 1901 W.C. Roentgen won the Nobel Prize (the first in Physics) for his discovery of X-rays.
Wilhelm Conrad Röntgen
CT Scanners
• In 1979 G. Hounsfield & A. Cormack won the Nobel Prize for developing computed tomography.
• The invention revolutionized medical imaging.
Allan M. Cormack
Godfrey N. Hounsfield
Tomography: Reconstruction from Projection
1
f(x,y)
2
Projection & Sinogram
• Projection: All ray-sums in a direction
• Sinogram: collects all projections
P(t)

y
t

p
x
f(x,y)
X-rays
t
Sinogram
CT Image & Its Sinogram
K. Thomenius & B. Roysam
The Slice Theorem
[Figure: a projection of f(x,y) in the spatial domain (x,y) corresponds, via the Fourier transform, to a slice of F(u,v) through the origin of the frequency domain (u,v)]
The Slice Theorem
f(x,y) = object
g(x) = projection of f(x,y) parallel to the y-axis:
  $g(x) = \int f(x,y)\, dy$
Fourier transform of f(x,y):
  $F(u,v) = \iint f(x,y)\, e^{-2\pi i (ux + vy)}\, dx\, dy$
Fourier transform at v = 0:
  $F(u,0) = \iint f(x,y)\, e^{-2\pi i ux}\, dx\, dy = \int \Big[ \int f(x,y)\, dy \Big] e^{-2\pi i ux}\, dx = \int g(x)\, e^{-2\pi i ux}\, dx = G(u)$
i.e. the 1D Fourier transform of g(x).
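A small NumPy check of this derivation (not from the slides): for a discrete image, the 1D DFT of the projection along y equals the v = 0 slice of the 2D DFT. The toy image and array layout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.random((64, 64))              # a toy object f[y, x]

g = f.sum(axis=0)                     # projection parallel to the y-axis: g(x)
G = np.fft.fft(g)                     # 1D Fourier transform of the projection

F = np.fft.fft2(f)                    # 2D Fourier transform of the object
central_slice = F[0, :]               # the slice F(u, v = 0)

print(np.allclose(G, central_slice))  # True: the slice theorem holds exactly here
```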
Interpolation Method
• Interpolate (linear, quadratic, etc.) in frequency space to obtain all values of F(u,v).
• Perform an inverse Fourier transform to obtain the image f(x,y).
[Figure: radial slices of F(u,v) filling the (u,v) plane]
THE END