Image Quality Metrics (MIT 2.717)
• Image quality metrics
• Mutual information (cross-entropy) metric
• Intuitive definition
• Rigorous definition using entropy
• Example: two-point resolution problem
• Example: confocal microscopy
• Square error metric
• Receiver Operator Characteristic (ROC)
• Heterodyne detection
Linear inversion model

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]

g = H f

Inversion problem: determine f, given the measurement g.

noise-to-signal ratio (NSR) = (noise variance)/(average signal power) = σ²/1 = σ²,
normalizing the signal power to 1.
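A minimal numerical sketch of this model (the matrix H, the dimensions, and the noise level below are arbitrary placeholders, not values from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8                                 # number of object samples (assumed)
H = rng.standard_normal((n, n))       # stand-in for the hardware channel
f = rng.standard_normal(n)
f *= np.sqrt(n) / np.linalg.norm(f)   # normalize average signal power to 1

sigma = 0.1                           # noise standard deviation (assumed)
g = H @ f + sigma * rng.standard_normal(n)   # measurement: g = H f + noise

nsr = sigma**2 / np.mean(f**2)        # = sigma^2, since mean power is 1
print(f"NSR = {nsr:.4f}")
```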
Mutual information (cross-entropy)

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]

C = \frac{1}{2} \sum_{k=1}^{n} \ln\left(1 + \frac{\mu_k^2}{\sigma^2}\right)

μ_k: eigenvalues of H
The significance of eigenvalues

C = \frac{1}{2} \sum_{k=1}^{n} \ln\left(1 + \frac{\mu_k^2}{\sigma^2}\right)

[Plot: the eigenvalues of H, ordered μ₁² ≥ μ₂² ≥ ... ≥ μₙ², compared against the noise level σ². The number of eigenvalues lying above σ² is the rank of the measurement (aka how many dimensions the measurement is worth).]
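A short sketch of this rank counting, using a made-up eigenvalue spectrum:

```python
import numpy as np

mu = np.array([1.0, 0.5, 0.2, 0.05, 0.01])   # eigenvalues of H (made up)
sigma2 = 1e-3                                 # noise variance (made up)

# C = (1/2) * sum_k ln(1 + mu_k^2 / sigma^2)
C = 0.5 * np.sum(np.log(1.0 + mu**2 / sigma2))

# rank of the measurement: eigenvalues whose square exceeds the noise floor
rank = int(np.sum(mu**2 > sigma2))
print(f"C = {C:.3f} nats, measurement worth {rank} of {mu.size} dimensions")
```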
Precision of measurement

Suppose the noise floor falls between two eigenvalues: μ_t² < σ² < μ_{t-1}². Then

C = \frac{1}{2} \sum_{k=1}^{n} \ln\left(1 + \frac{\mu_k^2}{\sigma^2}\right)
  = \frac{1}{2}\left[\cdots + \ln\left(1 + \frac{\mu_{t-2}^2}{\sigma^2}\right) + \ln\left(1 + \frac{\mu_{t-1}^2}{\sigma^2}\right) + \ln\left(1 + \frac{\mu_t^2}{\sigma^2}\right) + \cdots\right]

• the μ_{t-2} term ≈ precision of the (t-2)th measurement
• the μ_{t-1} term is ≤ 1
• the μ_t term is ≈ 0

E.g., in the value 0.5470839348, the trailing digits are worthless if σ ≈ 10⁻⁵.
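A sketch of this digits-of-precision reading, converting each term from nats to decimal digits by dividing by ln 10 (the μ and σ values are illustrative):

```python
import numpy as np

sigma = 1e-5   # noise standard deviation (illustrative)

# each term ln(1 + mu^2/sigma^2), divided by ln 10, counts reliable digits
for mu in [1.0, 1e-2, 3e-5, 1e-5, 1e-6]:
    digits = np.log(1.0 + mu**2 / sigma**2) / np.log(10.0)
    print(f"mu = {mu:.0e}: {digits:5.2f} reliable decimal digits")
```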
Formal definition of cross-entropy (1)

Entropy in thermodynamics (discrete systems):
• log₂[how many possible states does the system have?]

E.g., two-state system: fair coin, outcome = heads (H) or tails (T)
Entropy = log₂ 2 = 1

Unfair coin: it seems more reasonable to "weigh" the two states
according to their frequencies of occurrence (i.e., probabilities):

Entropy = -\sum_{\text{states}} p(\text{state}) \log_2 p(\text{state})
Formal definition of cross-entropy (2)

• Fair coin: p(H) = 1/2, p(T) = 1/2

Entropy = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1 bit

• Unfair coin: p(H) = 1/4, p(T) = 3/4

Entropy = -\frac{1}{4}\log_2\frac{1}{4} - \frac{3}{4}\log_2\frac{3}{4} = 0.81 bits

Maximum entropy ⇔ Maximum uncertainty
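The two coin examples can be checked with a few lines of code (a generic entropy helper, not anything specific to the lecture):

```python
import numpy as np

def entropy_bits(p):
    """Entropy = -sum p log2 p over the states, skipping p = 0 states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.5, 0.5]))     # fair coin   -> 1.0 bit
print(entropy_bits([0.25, 0.75]))   # unfair coin -> ~0.811 bits
```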
Formal definition of cross-entropy (3)

Joint Entropy
• log₂[how many possible states does a combined variable, obtained from the Cartesian product of two variables, have?]

\text{Joint Entropy}(X, Y) = -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log_2 p(x, y)

E.g., Joint Entropy(F, G) = ?

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]
Formal definition of cross-entropy (4)

Conditional Entropy
• log₂[how many possible states does a combined variable have, given the actual state of one of the two variables?]

\text{Cond. Entropy}(Y \mid X) = -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log_2 p(y \mid x)

E.g., Cond. Entropy(G | F) = ?

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]
Formal definition of cross-entropy (5)

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]

Noise adds uncertainty to the measurement with respect to the object
⇔ noise eliminates information from the measurement with respect to the object
Formal definition of cross-entropy (6)

[Venn diagram (representation by Seth Lloyd, 2.100): two overlapping circles.
• Entropy(F): information contained in the object
• Entropy(G): information contained in the measurement
• their overlap C(F, G): the cross-entropy (aka mutual information)
• Cond. Entropy(F | G), the part of Entropy(F) outside the overlap: uncertainty added due to noise
• Cond. Entropy(G | F), the part of Entropy(G) outside the overlap: information eliminated due to noise]
Formal definition of cross-entropy (7)

[Venn diagram: Joint Entropy(F, G) is the union of the circles Entropy(F) and Entropy(G); it decomposes into Cond. Entropy(F | G), C(F, G), and Cond. Entropy(G | F).]
Formal definition of cross-entropy (8)

[Channel diagram: information source F (object) → Physical Channel (transform), fed by a corruption source (noise) → information receiver G (measurement)]

C(F, G) = Entropy(F) − Cond. Entropy(F | G)
        = Entropy(G) − Cond. Entropy(G | F)
        = Entropy(F) + Entropy(G) − Joint Entropy(F, G)
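A sketch verifying the three equivalent forms numerically, on a made-up 2×2 joint distribution p(f, g):

```python
import numpy as np

p_fg = np.array([[0.4, 0.1],        # made-up joint distribution p(f, g);
                 [0.1, 0.4]])       # rows: states of F, columns: states of G

def H(p):
    """Entropy in bits of any (flattened) probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_f = p_fg.sum(axis=1)              # marginal of F
p_g = p_fg.sum(axis=0)              # marginal of G

# conditional entropies straight from the definitions
H_g_given_f = -np.sum(p_fg * np.log2(p_fg / p_f[:, None]))
H_f_given_g = -np.sum(p_fg * np.log2(p_fg / p_g[None, :]))

C1 = H(p_f) - H_f_given_g
C2 = H(p_g) - H_g_given_f
C3 = H(p_f) + H(p_g) - H(p_fg)
print(f"{C1:.4f} {C2:.4f} {C3:.4f}")   # three identical values
```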
Entropy & Differential Entropy

• Discrete objects (can take values among a discrete set of states)
  – definition of entropy:

    Entropy = -\sum_k p(x_k) \log_2 p(x_k)

  – unit: 1 bit (= entropy value of a YES/NO question with 50% uncertainty)

• Continuous objects (can take values from among a continuum)
  – definition of differential entropy:

    \text{Diff. Entropy} = -\int_{\Omega_X} p(x) \ln p(x)\, dx

  – unit: 1 nat (= differential entropy value of a significant digit in the representation of a random number, divided by ln 10)
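A quick sanity check of the differential entropy definition, assuming a Gaussian density so the integral has the known closed form ½ ln(2πeσ²):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
x = rng.normal(0.0, sigma, size=200_000)     # samples from p(x)

# differential entropy = E[-ln p(X)], estimated by a sample average
log_p = -0.5 * (x / sigma)**2 - 0.5 * np.log(2 * np.pi * sigma**2)
h_mc = -np.mean(log_p)
h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(f"Monte Carlo: {h_mc:.4f} nats, closed form: {h_exact:.4f} nats")
```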
Image Mutual Information (IMI)

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g]

Assumptions:
(a) F has Gaussian statistics
(b) white additive Gaussian noise (waGn), i.e.

    g = H f + w

where w is a Gaussian random vector with diagonal correlation matrix.

Then

C(F, G) = \frac{1}{2} \sum_{k=1}^{n} \ln\left(1 + \frac{\mu_k^2}{\sigma^2}\right)

μ_k: eigenvalues of H
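A sketch of the IMI formula under these assumptions; the symmetric H below is a random stand-in, used only so the eigenvalues are real:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
H = (A + A.T) / 2                    # symmetric stand-in, real eigenvalues

sigma2 = 0.01                        # noise variance (assumed)
mu = np.linalg.eigvalsh(H)           # eigenvalues mu_k of H
C = 0.5 * np.sum(np.log(1.0 + mu**2 / sigma2))
print(f"C(F, G) = {C:.3f} nats")
```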
Mutual information & degrees of freedom
rank of
measurement
n
...
n-1
1
0
σ2
µ n2
mutual
information
µ n2−1
1 n  µ k2 
C = ∑ ln1 + 2 
2 k =1  σ 
...
µ 22
µ12
As noise increases
• one rank of H is lost whenever
σ2 overcomes a new eigenvalue
• the remaining ranks lose precision
σ2
MIT 2.717
Image quality metrics p-16
Example: two-point resolution

Finite-NA imaging system, unit magnification.

[Diagram: two point-sources f_A and f_B (object), separated by x, imaged onto two point-detectors at Ã and B̃ (measurement), which read g_A and g_B.

Classical view: the intensities emitted at A and B produce overlapping, noiseless intensity patterns at the detector plane; g_A and g_B are the intensities measured at Ã and B̃.]
Cross-leaking power

g_A = f_A + s f_B
g_B = s f_A + f_B

s = sinc²(x̃)

[Diagram: each source's intensity pattern leaks a fraction s of its power onto the neighboring detector at Ã or B̃.]
IMI for two-point resolution problem

H = \begin{pmatrix} 1 & s \\ s & 1 \end{pmatrix}, \quad \det(H) = 1 - s^2

\mu_1 = 1 + s, \quad \mu_2 = 1 - s

H^{-1} = \frac{1}{1 - s^2} \begin{pmatrix} 1 & -s \\ -s & 1 \end{pmatrix}

C(F, G) = \frac{1}{2} \ln\left(1 + \frac{(1 - s)^2}{\sigma^2}\right) + \frac{1}{2} \ln\left(1 + \frac{(1 + s)^2}{\sigma^2}\right)
IMI vs source separation

[Plot: IMI as a function of the source separation, for SNR = 1/σ²; the limits s → 1 and s → 0 are marked.]
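A sketch evaluating the two-point IMI formula above for a few cross-leak values s (the value of σ² is chosen arbitrarily):

```python
import numpy as np

def imi_two_point(s, sigma2):
    """C(F, G) for the 2x2 channel with eigenvalues 1 + s and 1 - s."""
    return 0.5 * (np.log(1 + (1 - s)**2 / sigma2) +
                  np.log(1 + (1 + s)**2 / sigma2))

sigma2 = 0.01                        # i.e. SNR = 1/sigma2 = 100
for s in [0.0, 0.5, 0.9, 0.99, 1.0]:
    print(f"s = {s:4.2f}: C = {imi_two_point(s, sigma2):.3f} nats")
```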
IMI for rectangular matrices (1)

[Diagram: a wide H (underdetermined: more unknowns than measurements) and a tall H (overdetermined: more measurements than unknowns).]

Eigenvalues cannot be computed for a rectangular matrix; instead we compute its singular values.
IMI for rectangular matrices (2)

Recall the pseudo-inverse:

\hat{f} = (H^T H)^{-1} H^T g

H^T H is a square matrix, and the inversion operation is associated with its rank:

eigenvalues(H^T H) = [singular values(H)]²
IMI for rectangular matrices (3)

[Block diagram: object f ("physical attributes") → hardware channel H (field propagation, detection) → measurement g; under/over determined]

C = \frac{1}{2} \sum_{k=1}^{n} \ln\left(1 + \frac{\tau_k^2}{\sigma^2}\right)

τ_k: singular values of H
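A sketch of the rectangular case, with the singular values τ_k obtained from numpy's SVD (the example H is random):

```python
import numpy as np

rng = np.random.default_rng(3)
H = rng.standard_normal((6, 4))      # overdetermined example: 6 x 4

tau = np.linalg.svd(H, compute_uv=False)   # singular values tau_k
sigma2 = 0.01
C = 0.5 * np.sum(np.log(1.0 + tau**2 / sigma2))
print(f"singular values: {np.round(tau, 3)}, C = {C:.3f} nats")
```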
Confocal microscope

[Diagram: object illuminated through a beam splitter; light from the virtual slice passes through a pinhole to an intensity detector.]

Small pinhole: better depth resolution, lower light efficiency
Large pinhole: worse depth resolution, higher light efficiency
Depth "resolution" vs. noise

[Diagram: the object structure consists of mutually incoherent point sources spaced along the optical axis at a given sampling distance. Imaging method: the confocal microscope (CFM) scans along the axis, with a correspondence between object points and intensity measurements.]
Depth "resolution" vs. noise & pinhole size

[Plot: NA = 0.2; units: Rayleigh distance.]
IMI summary
• It quantifies the number of possible states of the object that the
imaging system can successfully discern; this includes
– the rank of the system, i.e. the number of object dimensions that
the system can map
– the precision available at each rank, i.e. how many significant
digits can be reliably measured at each available dimension
• An alternative interpretation of IMI is the game of “20 questions:” how
many questions about the object can be answered reliably based on the
image information?
• IMI is intricately linked to image exploitation for applications, e.g.
medical diagnosis, target detection & identification, etc.
• Unfortunately, it can be computed in closed form only for additive
Gaussian statistics of both object and image; other more realistic
models are usually intractable
Other image quality metrics

• Mean Square Error (MSQ) between object and image:

E = \sum_{\text{object samples } k} \left( f_k - \hat{f}_k \right)^2, \quad \hat{f}_k = \text{result of inversion}

  – e.g., the pseudoinverse minimizes MSQ in an overdetermined problem
  – obvious problem: most of the time, we don't know what f is!
  – more when we deal with Wiener filters and regularization

• Receiver Operator Characteristic
  – measures the performance of a cognitive system (human or computer program) in a detection or estimation task based on the image data
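A sketch of the pseudoinverse reconstruction and the resulting MSQ error, on a made-up overdetermined problem:

```python
import numpy as np

rng = np.random.default_rng(4)
H = rng.standard_normal((10, 4))     # more measurements than unknowns
f = rng.standard_normal(4)           # "true" object (unknown in practice!)
g = H @ f + 0.05 * rng.standard_normal(10)

f_hat = np.linalg.pinv(H) @ g        # pseudoinverse: (H^T H)^{-1} H^T g
E = np.sum((f - f_hat)**2)           # MSQ error over the object samples
print(f"E = {E:.5f}")
```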
Receiver Operator Characteristic

Target detection task. Example: medical diagnosis:
• H0 (null hypothesis) = no tumor
• H1 = tumor

TP = true positive (i.e., correct identification of a tumor)
FP = false positive (aka false alarm)
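A sketch of how an ROC curve is traced for a simple threshold detector; the Gaussian score distributions for H0 and H1 below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
h0 = rng.normal(0.0, 1.0, 1000)      # detector scores under H0 (no tumor)
h1 = rng.normal(1.5, 1.0, 1000)      # detector scores under H1 (tumor)

thresholds = np.linspace(-4.0, 6.0, 101)
tpr = np.array([(h1 > t).mean() for t in thresholds])  # true-positive rate
fpr = np.array([(h0 > t).mean() for t in thresholds])  # false-alarm rate

auc = -np.trapz(tpr, fpr)            # area under the (FPR, TPR) curve
print(f"AUC = {auc:.3f}")
```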