Modern Spectral Estimation Methods Applied to
FOPEN SAR Imagery
by
Leiter Kang
Submitted to the Department of Electrical Engineering and Computer
Science
in partial fulfillment of the requirements for the degree of
Master of Engineering in Electrical Engineering and Computer Science
at the
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
September 2000
© Leiter Kang, MM. All rights reserved.
The author hereby grants to MIT permission to reproduce and
distribute publicly paper and electronic copies of this thesis document
in whole or in part.
Author ..........................................................
Department of Electrical Engineering and Computer Science
August 11, 2000
Certified by ......................................................
Leslie M. Novak
Senior Staff, MIT Lincoln Laboratory
Thesis Supervisor

Certified by ......................................................
Jeffrey H. Shapiro
Julius A. Stratton Professor of Electrical Engineering
Thesis Supervisor
Accepted by ......................................................
Arthur C. Smith
Chairman, Department Committee on Graduate Students
Modern Spectral Estimation Methods Applied to FOPEN
SAR Imagery
by
Leiter Kang
Submitted to the Department of Electrical Engineering and Computer Science
on August 11, 2000, in partial fulfillment of the
requirements for the degree of
Master of Engineering in Electrical Engineering and Computer Science
Abstract
The automatic target recognition (ATR) of targets obscured by forest canopy in
FOliage PENetration (FOPEN) synthetic aperture radar (SAR) imagery is difficult
due to poor resolution and the electromagnetic distortion introduced by the forest
canopy. In this thesis we have investigated the application of modern spectral estimation methods, which reduce mainlobe width and lower sidelobe amplitude, to FOPEN
SAR imagery in the hope that improved resolution will lead to improved ATR performance. We applied the modern spectral estimation methods to images polarimetrically processed by methods such as the SPAN filter, the polarimetric matched filter
(PMF), and the polarimetric whitening filter (PWF). The discrimination performance
of each modern spectral estimation method was tested on Gaussian classifiers that
discriminate using geometric features and on Gaussian classifiers that discriminate
using polarimetric ratio features. Initial results indicate that the modern spectral
estimation methods investigated in this thesis do not provide significantly improved
discrimination performance using geometric features to discriminate targets from clutter. Discrimination performance was improved in some cases using polarimetric ratio
features.
Thesis Supervisor: Leslie M. Novak
Title: Senior Staff, MIT Lincoln Laboratory
Thesis Supervisor: Jeffrey H. Shapiro
Title: Julius A. Stratton Professor of Electrical Engineering
Acknowledgments
The author wishes to acknowledge the following people for their help during the thesis:
Les Novak - for his guidance throughout the course of the thesis.
Jeff Shapiro - for his advice and his kindness during the writing of the
thesis.
Marcel Schneeberger and Eric Haywiser - for their technical help with
ATR.
Serpil Ayasli - for the opportunity to work on an interesting project, from
which I have learned so much.
Roy, Jepras, Hee Jun, and Donny - for transportation to and from Lincoln
Lab and for their encouragement in the Gospel.
My family - for their enduring love and their many sacrifices.
The LORD - for accepting me as a son at the cost of rejecting His true
Son.
"My son, do not make light of the Lord's discipline, and do not lose heart
when he rebukes you,
because the Lord disciplines those he loves, and he punishes everyone he
accepts as a son." - Hebrews 12:5-6 (NIV)
This work was sponsored by the Defense Advanced Research Projects Agency under
Air Force Contract F19628-00-C-0002. Opinions, interpretations, conclusions, and
recommendations are those of the author and are not necessarily endorsed by the
United States Government.
Contents

1 Introduction . . . . . 23

2 Synthetic Aperture Radar . . . . . 27
  2.1 Principles of SAR . . . . . 27
    2.1.1 Scattering . . . . . 27
    2.1.2 Geometry of SAR . . . . . 29
    2.1.3 SAR Cross-Range Resolution . . . . . 32
  2.2 Polarimetric Processing . . . . . 34
    2.2.1 Clutter Model . . . . . 34
    2.2.2 Span Filter . . . . . 36
    2.2.3 Polarimetric Matched Filter . . . . . 36
    2.2.4 Polarimetric Whitening Filter . . . . . 37
  2.3 Sample Images . . . . . 39

3 Superresolution . . . . . 42
  3.1 Introduction . . . . . 42
    3.1.1 System Overview . . . . . 43
    3.1.2 The Estimation of Covariance Matrices . . . . . 46
  3.2 Mathematical Algorithms . . . . . 48
    3.2.1 Bandlimited Interpolation . . . . . 48
    3.2.2 Minimum Variance Method . . . . . 48
    3.2.3 Eigenvector Method . . . . . 50
    3.2.4 Multiple Signal Classification . . . . . 51
    3.2.5 Pisarenko's Method . . . . . 52
    3.2.6 Spatially Varying Apodization . . . . . 52
  3.3 Sample Images . . . . . 55

4 Automatic Target Recognition . . . . . 60
  4.1 Introduction . . . . . 60
  4.2 The Detector . . . . . 61
  4.3 The Discriminator . . . . . 63
    4.3.1 Pattern Classifier . . . . . 63
    4.3.2 ROC Curve Areas and Feature Selection . . . . . 65
    4.3.3 Lincoln Laboratory ATR Features . . . . . 68
  4.4 The Classifier . . . . . 71
  4.5 Summary . . . . . 72

5 Experiments and Results . . . . . 73
  5.1 Introduction . . . . . 73
  5.2 Parameters for Image Processing . . . . . 75
    5.2.1 FOPEN SAR Imagery . . . . . 75
    5.2.2 Polarimetric Processing . . . . . 77
    5.2.3 Superresolution . . . . . 79
  5.3 Computation of Features . . . . . 80
    5.3.1 Geometric Features . . . . . 80
    5.3.2 Polarimetric Features . . . . . 80
  5.4 The Modified Feature Selection Algorithms . . . . . 80
    5.4.1 Feature Normalization . . . . . 81
    5.4.2 Geometric Features . . . . . 81
    5.4.3 Polarimetric Features: Trained on Open Targets . . . . . 82
    5.4.4 Polarimetric Features: Trained on Obscured Targets . . . . . 82
  5.5 Results . . . . . 83
    5.5.1 Geometric Features . . . . . 83
    5.5.2 Polarimetric Features: Tested on Obscured Targets . . . . . 86
    5.5.3 Polarimetric Features: Tested on Open Targets . . . . . 90

6 Conclusions and Recommendations . . . . . 95

A Plots of Mean ROC Curve Areas . . . . . 97
  A.1 Geometric Features . . . . . 98
  A.2 Polarimetric Features: Tested on Obscured Targets . . . . . 102
  A.3 Polarimetric Features: Tested on Open Targets . . . . . 106
    A.3.1 Trained on Open Targets . . . . . 106
    A.3.2 Trained on Obscured Targets . . . . . 110

B Best Feature Sets . . . . . 114
  B.1 Geometric Features . . . . . 115
  B.2 Polarimetric Features: Trained on Obscured Targets . . . . . 118
    B.2.1 Linear Classifier . . . . . 119
    B.2.2 Quadratic Classifier . . . . . 123
  B.3 Polarimetric Features: Trained on Open Targets . . . . . 127
    B.3.1 Linear Classifier . . . . . 127
    B.3.2 Quadratic Classifier . . . . . 132

C ROC Curve Areas . . . . . 136
  C.1 Geometric Features . . . . . 136
    C.1.1 Linear Classifier . . . . . 136
    C.1.2 Quadratic Classifier . . . . . 138
  C.2 Polarimetric Features: Trained on Obscured Targets . . . . . 139
    C.2.1 Linear Classifier . . . . . 139
    C.2.2 Quadratic Classifier . . . . . 141
  C.3 Polarimetric Features: Trained on Open Targets . . . . . 143
    C.3.1 Linear Classifier . . . . . 143
    C.3.2 Quadratic Classifier . . . . . 144

D ROC Curves . . . . . 145
  D.1 Geometric Features . . . . . 146
    D.1.1 Linear Classifier . . . . . 146
    D.1.2 Quadratic Classifier . . . . . 149
  D.2 Polarimetric Features: Tested on Obscured Targets . . . . . 152
    D.2.1 Linear Classifier . . . . . 152
    D.2.2 Quadratic Classifier . . . . . 156
  D.3 Polarimetric Features: Tested on Open Targets . . . . . 160
    D.3.1 Trained on Open Targets . . . . . 160
    D.3.2 Trained on Obscured Targets . . . . . 168

E Performance of the Modified Feature Selection Algorithm . . . . . 176
  E.1 Geometric Features . . . . . 177
    E.1.1 Linear Classifier . . . . . 177
    E.1.2 Quadratic Classifier . . . . . 179
  E.2 Polarimetric Features: Trained on Obscured Targets . . . . . 181
    E.2.1 Linear Classifier . . . . . 181
    E.2.2 Quadratic Classifier . . . . . 184
  E.3 Polarimetric Features: Trained on Open Targets . . . . . 187
    E.3.1 Linear Classifier . . . . . 187
    E.3.2 Quadratic Classifier . . . . . 189
List of Figures

1-1 The FOPEN System . . . . . 24
2-1 Range vs. Cross-Range (after [1]) . . . . . 30
2-2 Range Resolution (after [1]) . . . . . 31
2-3 Cross-Range Resolution (after [1]) . . . . . 33
2-4 Polarimetrically Processed Images (dB scale): HH, SPAN, PMF, and PWF . . . . . 41
3-1 Superresolution System . . . . . 44
3-2 Mosaicking . . . . . 46
3-3 Possible Subapertures (after [2]) . . . . . 47
3-4 Dual Apodization . . . . . 53
3-5 Superresolved Images (dB scale): Baseline, Interpolated, MVM, and EV . . . . . 58
3-6 Superresolved Images (dB scale): MUSIC, Pisarenko, Joint I/Q SVA, and Separate I/Q SVA . . . . . 59
4-1 Flow of Data in an Automatic Target Recognition System (after [3]) . . . . . 60
4-2 CFAR Window (after [16]) . . . . . 62
4-3 An Example of a ROC Curve . . . . . 66
5-1 The P-3 SAR . . . . . 76
5-2 A Test Site at Grayling, MI . . . . . 77
5-3 An Example of a Full-Sized SAR Image . . . . . 78
5-4 Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier . . . . . 85
5-5 Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier . . . . . 85
5-6 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 89
5-7 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 89
5-8 Plot of Means of ROC Curve Areas: Polarimetric Features: Trained on Open Targets: Linear Classifier . . . . . 93
5-9 Plot of Means of ROC Curve Areas: Polarimetric Features: Trained on Open Targets: Quadratic Classifier . . . . . 93
5-10 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 94
5-11 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 94
A-1 Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier . . . . . 98
A-2 Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier . . . . . 98
A-3 Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: Baseline and Upsampled . . . . . 99
A-4 Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier: Baseline and Upsampled . . . . . 99
A-5 Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: MVM, EV, and Pisarenko . . . . . 100
A-6 Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier: MVM, EV, and Pisarenko . . . . . 100
A-7 Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 101
A-8 Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 101
A-9 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 102
A-10 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 102
A-11 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: Baseline and Upsampled . . . . . 103
A-12 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: Baseline and Upsampled . . . . . 103
A-13 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: MVM, EV, and Pisarenko . . . . . 104
A-14 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: MVM, EV, and Pisarenko . . . . . 104
A-15 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 105
A-16 Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 105
A-17 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Linear Classifier . . . . . 106
A-18 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Quadratic Classifier . . . . . 106
A-19 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Linear Classifier: Baseline and Upsampled . . . . . 107
A-20 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: Baseline and Upsampled . . . . . 107
A-21 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Linear Classifier: MVM, EV, and Pisarenko . . . . . 108
A-22 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: MVM, EV, and Pisarenko . . . . . 108
A-23 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 109
A-24 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 109
A-25 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 110
A-26 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 110
A-27 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: Baseline and Upsampled . . . . . 111
A-28 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: Baseline and Upsampled . . . . . 111
A-29 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: MVM, EV, and Pisarenko . . . . . 112
A-30 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: MVM, EV, and Pisarenko . . . . . 112
A-31 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 113
A-32 Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: Joint-I/Q SVA and Separate-I/Q SVA . . . . . 113
D-1 Sample ROC Curves: Geometric Features: Linear Classifier: HH . . . . . 146
D-2 Sample ROC Curves: Geometric Features: Linear Classifier: HV . . . . . 146
D-3 Sample ROC Curves: Geometric Features: Linear Classifier: VV . . . . . 147
D-4 Sample ROC Curves: Geometric Features: Linear Classifier: SPAN . . . . . 147
D-5 Sample ROC Curves: Geometric Features: Linear Classifier: PMF . . . . . 148
D-6 Sample ROC Curves: Geometric Features: Linear Classifier: PWF . . . . . 148
D-7 Sample ROC Curves: Geometric Features: Quadratic Classifier: HH . . . . . 149
D-8 Sample ROC Curves: Geometric Features: Quadratic Classifier: HV . . . . . 149
D-9 Sample ROC Curves: Geometric Features: Quadratic Classifier: VV . . . . . 150
D-10 Sample ROC Curves: Geometric Features: Quadratic Classifier: SPAN . . . . . 150
D-11 Sample ROC Curves: Geometric Features: Quadratic Classifier: PMF . . . . . 151
D-12 Sample ROC Curves: Geometric Features: Quadratic Classifier: PWF . . . . . 151
D-13 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: A . . . . . 152
D-14 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: B . . . . . 152
D-15 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: C . . . . . 153
D-16 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: D . . . . . 153
D-17 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: E . . . . . 154
D-18 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: F . . . . . 154
D-19 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: G . . . . . 155
D-20 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Linear Classifier: H . . . . . 155
D-21 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: A . . . . . 156
D-22 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: B . . . . . 156
D-23 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: C . . . . . 157
D-24 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: D . . . . . 157
D-25 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: E . . . . . 158
D-26 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: F . . . . . 158
D-27 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: G . . . . . 159
D-28 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Obscured Targets: Quadratic Classifier: H . . . . . 159
D-29 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: A . . . . . 160
D-30 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: B . . . . . 160
D-31 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: C . . . . . 161
D-32 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: D . . . . . 161
D-33 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: E . . . . . 162
D-34 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: F . . . . . 162
D-35 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: G . . . . . 163
D-36 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: H . . . . . 163
D-37 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: A . . . . . 164
D-38 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: B . . . . . 164
D-39 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: C . . . . . 165
D-40 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: D . . . . . 165
D-41 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: E . . . . . 166
D-42 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: F . . . . . 166
D-43 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: G . . . . . 167
D-44 Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Linear Classifier: H . . . . . 167
D-45 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: A . . . . . 168
D-46 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: B . . . . . 168
D-47 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: C . . . . . 169
D-48 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: D . . . . . 169
D-49 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: E . . . . . 170
D-50 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: F . . . . . 170
D-51 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: G . . . . . 171
D-52 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: H . . . . . 171
D-53 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: A . . . . . 172
D-54 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: B . . . . . 172
D-55 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: C . . . . . 173
D-56 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: D . . . . . 173
D-57 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: E . . . . . 174
D-58 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: F . . . . . 174
D-59 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: G . . . . . 175
D-60 Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: H . . . . . 175
List of Tables

3.1 Algorithm Complexity . . . . . 45
5.1 Candidate Feature Sets for Polarimetric Features . . . . . 75
5.2 Parameters for the P-3 UWB SAR [4] . . . . . 76
B.1 List of Geometric Features . . . . . 115
B.2 Best Feature Sets for Geometric Features: Linear Classifier . . . . . 116
B.3 Best Feature Sets for Geometric Features: Quadratic Classifier . . . . . 117
B.4 List of Polarimetric Features . . . . . 118
B.5 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: A . . . . . 119
B.6 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: B . . . . . 119
B.7 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: C . . . . . 120
B.8 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: D . . . . . 120
B.9 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: E . . . . . 121
B.10 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: F . . . . . 121
B.11 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: G . . . . . 122
B.12 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: H . . . . . 122
B.13 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: A . . . . . 123
B.14 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: B . . . . . 123
B.15 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: C . . . . . 124
B.16 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: D . . . . . 124
B.17 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: E . . . . . 125
B.18 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: F . . . . . 125
B.19 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: G . . . . . 126
B.20 Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: H . . . . . 126
B.21 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: A . . . . . 127
B.22 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: B . . . . . 128
B.23 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: C . . . . . 129
B.24 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: D . . . . . 129
B.25 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: E . . . . . 130
B.26 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: F . . . . . 130
B.27 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: G . . . . . 131
B.28 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Linear Classifier: H . . . . . 131
B.29 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: A . . . . . 132
B.30 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: B . . . . . 132
B.31 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: C . . . . . 133
B.32 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: D . . . . . 133
B.33 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: E . . . . . 134
B.34 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: F . . . . . 134
B.35 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: G . . . . . 135
B.36 Best Feature Sets for Polarimetric Features: Trained on Open Targets: Quadratic Classifier: H . . . . . 135
C.1 Mean of ROC Curve Areas: Geometric Features: Linear Classifier . . . . . 136
C.2 Standard Deviation of ROC Curve Areas: Geometric Features: Linear Classifier . . . . . 137
C.3 Mean of ROC Curve Areas: Geometric Features: Quadratic Classifier . . . . . 138
C.4 Standard Deviation of ROC Curve Areas: Geometric Features: Quadratic Classifier . . . . . 138
C.5 Mean of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 139
C.6 Standard Deviation of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 139
C.7 Mean of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 140
C.8 Standard Deviation of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 140
C.9 Sum of Mean of ROC Curve Areas for Both Obscured Targets and Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier . . . . . 140
C.10 Mean of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 141
C.11 Standard Deviation of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 141
C.12 Mean of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 142
C.13 Standard Deviation of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . 142
C.14 Sum of Mean of ROC Curve Areas for Both Obscured Targets and
Open Targets: Polarimetric Features: Trained on Obscured Targets:
Quadratic Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . .
142
C.15 Mean of ROC Curve Areas: Polarimetric Features: Trained on Open
Targets: Linear Classifier . . . . . . . . . . . . . . . . . . . . . . . . . 143
C.16 Standard Deviation of ROC Curve Areas: Polarimetric Features: Trained
on Open Targets: Linear Classifier
. . . . . . . . . . . . . . . . . . . 143
C.17 Mean of ROC Curve Areas: Polarimetric Features: Trained on Open
Targets: Quadratic Classifier . . . . . . . . . . . . . . . . . . . . . . .
144
C.18 Standard Deviation of ROC Curve Areas: Polarimetric Features: Trained
on Open Targets: Quadratic Classifier: Trained on Open Targets .
20
.
.
144
E.1
Ratio of the Mean ROC Curve Area of the Best Feature Set to the
Highest Mean ROC Curve Area: Geometric Features: Linear Classifier 177
E.2
Feature Set Size: Geometric Features: Linear Classifier . . . . . . . . 178
E.3 Ratio of the Mean ROC Curve Area of the Best Feature Set Mean ROC
to the Highest Mean ROC Curve Area: Geometric Features: Quadratic
C lassifier . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 179
E.4 Feature Set Size: Geometric Features: Quadratic Classifier . . . . . . 180
E.5 Ratio of the Obscured Target Mean ROC Curve Area of the Best Feature Set to the Highest Obscured Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Linear Classifier .
181
E.6 Ratio of the Open Target Mean ROC Curve Area of the Best Feature
Set Mean ROC Curve Area to the Highest Open Target Mean ROC
Curve Area: Polarimetric Features: Trained on Obscured Targets: Linear C lassifier . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
E.7 Ratio of the Combined Obscured Target and Open Target Mean ROC
Curve Areas of the Best Feature Set to the Highest Combined Obscured Target and Open Target Mean ROC Curve Areas: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier . . . . . . . 182
E.8 Feature Set Size: Polarimetric Features: Trained on Obscured Targets:
Linear Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
E.9 Ratio of the Obscured Target Mean ROC Curve Area of the Best Feature Set to the Highest Obscured Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier 184
E.10 Ratio of the Open Target Mean ROC Curve Area of the Best Feature Set Mean ROC Curve Area to the Highest Open Target Mean
ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier . . . . . . . . . . . . . . . . . . . . . . . . .
21
184
E.11 Ratio of the Combined Obscured Target and Open Target Mean ROC
Curve Areas of the Best Feature Set to the Highest Combined Obscured Target and Open Target Mean ROC Curve Areas: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier . . . . . . .
185
E.12 Feature Set Size: Polarimetric Features: Trained on Obscured Targets:
Quadratic Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . .
186
E.13 Ratio of the Mean ROC Curve Area of the Best Feature Set to the
Highest Mean ROC Curve Area: Polarimetric Features: Trained on
Open Targets: Linear Classifier . . . . . . . . . . . . . . . . . . . . .
187
E.14 Feature Set Size: Polarimetric Features: Trained on Open Targets:
Linear Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
188
E.15 Ratio of the Mean ROC Curve Area of the Best Feature Set to the
Highest Mean ROC Curve Area: Polarimetric Features: Trained on
Open Targets: Quadratic Classifier . . . . . . . . . . . . . . . . . . .
189
E.16 Feature Set Size: Polarimetric Features: Trained on Open Targets:
Quadratic Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . .
22
190
Chapter 1
Introduction
This thesis research is part of a continuing effort to improve the automatic target
recognition (ATR) subsystem of the FOliage PENetration (FOPEN) Advanced Technology Demonstration (ATD) system [5]. The FOPEN ATD is the prototype of a
system funded by the Defense Advanced Research Projects Agency (DARPA) that
is designed to locate stationary military targets such as tanks hidden under forest
canopy. The FOPEN system will be one of many reconnaissance subsystems aboard
the Global Hawk Unmanned Aerial Vehicle (UAV) which will collect data from a vast
area of terrain. Figure 1-1 shows the processing flow of the FOPEN system.
The synthetic aperture radar (SAR) is the image formation subsystem. It consists
of a physical sensor (an antenna) used to collect data from terrain via backscattered
electromagnetic (EM) radiation and a signal processor to synthesize images from the
collected data. The SAR sensor has its own source of illumination; it transmits the
EM radiation that returns as backscatter from the terrain to the sensor. The FOPEN
SAR transmits radiation at frequencies lower than the X-band frequencies typically
used for SAR. These lower frequencies allow the radiation to penetrate through the
forest canopy. The SAR sensor measures the magnitude and the phase of the backscattered electric field and records data for different polarimetric channels (polarizations).
The SAR signal processor uses the magnitude and phase data (i.e. complex data) to
produce a complex-valued image for each of the polarimetric channels.
These images can be combined into a single composite image via polarimetric
processing.

[Figure 1-1: FOPEN system processing flow: SAR antenna and image formation, polarimetric processing, superresolution processing, ATR, and an annotated list of potential targets.]

Polarimetric processing algorithms such as the polarimetric whitening
filter (PWF) and the polarimetric matched filter (PMF) are often used to produce
output images with certain properties, such as minimum speckle imaging (PWF) or
maximum target-to-clutter ratio imaging (PMF). For our research we have used the
following three polarimetric processing methods: PWF, PMF, and SPAN filtering.
Polarimetric processing precedes superresolution processing unless the images are
processed using the superresolution method of spatially varying apodization.
Superresolution, which is not normally part of the FOPEN system, is the novel element of our research. Superresolution methods mitigate the effects of the finite bandwidth of the backscattered EM radiation by narrowing mainlobes and suppressing sidelobes.
Modern spectral estimation methods are the particular superresolution
methods studied in this thesis. The modern spectral estimation methods investigated in this thesis are: 1) the minimum variance method (MVM), 2) the eigenvector
method (EV), 3) the MUltiple SIgnal Classification method (MUSIC), 4) Pisarenko's
method, and 5) spatially varying apodization (SVA). We have also investigated bandlimited interpolation, which is not a superresolution method, because it is simple
and it has been shown to improve ATR performance for X-band SAR imagery. All
of these superresolution methods receive complex-valued input imagery. The first
four methods return real-valued (magnitude-squared) imagery; SVA and bandlimited
interpolation return complex-valued imagery. We take the magnitude squared of the
images produced by these latter two methods to have consistent outputs for all the
methods.
The ATR subsystem is a collection of computer algorithms used to detect and
identify potential targets embedded within the images.
This subsystem cuts out
small subsections of the image known as chips, which contain potential targets. The
ATR system annotates each chip with its possible classification (e.g. tank, howitzer)
and passes this information on to a team of human image analysts who check the
accuracy of the results and provide a report detailing the ground order of battle
vehicles contained in the SAR image.
An ATR system recognizes targets by measuring features of objects (specific properties of an object useful for discriminating between targets and clutter) found in the
image. The greater the differences between the feature values of targets and the feature values of clutter, the better the performance of the ATR in correctly identifying
targets when they are present and rejecting clutter. Improvements in image quality
can enhance the differences between feature values of the two different classes and improve the performance of the ATR system. Novak et al. [6] have shown that modern
spectral estimation methods such as EV and high-definition imaging (HDI), a variant
of MVM, can improve ATR performance when applied to X-band SAR.
Since the application of modern spectral estimation methods has enhanced ATR
performance for X-band SAR data, we would like to extend these methods to low
frequency FOPEN SAR data. Targets in FOPEN SAR imagery, however, do not
necessarily look similar to targets in X-band SAR imagery. Since foliage distorts
and attenuates EM radiation, targets obscured by forest canopy look different from
targets imaged in open areas. Also tree trunks are visible to the FOPEN SAR; their
brightness and many of their other feature values are similar to those of targets, which
makes discrimination harder for the ATR. Thus we need to do research to determine
the effectiveness of previous methods on the low frequency FOPEN SAR data. The
goal of this thesis is to study the performance of the ATR system when modern
spectral estimation methods are applied to FOPEN SAR imagery.
The remainder of the thesis is organized as follows: Chapter 2 covers basic principles of SAR; the physical principles of the SAR sensor and the mathematics of
polarimetric processing methods are described. The superresolution methods studied
in this thesis, particularly the mathematical equations defining the various modern
spectral estimation algorithms and other ancillary algorithms needed for superresolution processing, are described in Chapter 3. Chapter 4 discusses the three stages of the
ATR subsystem, paying particular attention to the second stage, the discriminator.
Chapter 5 discusses the new research done in this thesis and presents experimental
results and possible explanations of the data. Chapter 6 is a short conclusion of the
thesis research; it reviews results and makes suggestions for future work.
Chapter 2
Synthetic Aperture Radar
2.1 Principles of SAR

2.1.1 Scattering
Synthetic aperture radar (SAR) is a modified version of an imaging radar known as side-looking real aperture radar (SLAR). A SLAR is mounted on a moving platform, such as an airplane or a satellite, that flies above the surface of the earth to image
the terrain below. An imaging radar images an object by illuminating it with electromagnetic radiation and by processing the returns radiated back from the object.
The interaction of the transmitted wave with the surface of an object being imaged
determines the nature of the backscattered radiation. The radar cross section σ of an object is a gross measure of its backscattering properties. The scalar radar cross section is defined by the radar equation

P_r = P_t G_t A_e σ / (4πR²)²   (2.1)

where P_r is the power received at the antenna, P_t is the power transmitted by the antenna, G_t is the gain of the antenna, A_e is the effective area of the antenna, and R is the distance between the antenna and the illuminated object; all of these quantities
are either known or can be measured. The larger the radar cross section, the more
power is reflected back toward the antenna and the brighter the object appears in
SAR imagery.
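As a quick sanity check on Equation (2.1), the sketch below inverts the radar equation for σ and round-trips a known cross section. The function and the numeric values are illustrative (mine, not the thesis's):

```python
import math

def radar_cross_section(p_r, p_t, g_t, a_e, r):
    """Invert Equation (2.1), P_r = P_t G_t A_e sigma / (4 pi R^2)^2,
    for the scalar radar cross section sigma."""
    return p_r * (4 * math.pi * r ** 2) ** 2 / (p_t * g_t * a_e)

# Round trip with made-up numbers: a 10 m^2 scatterer at 10 km range.
sigma, p_t, g_t, a_e, r = 10.0, 1e3, 30.0, 1.0, 10e3
p_r = p_t * g_t * a_e * sigma / (4 * math.pi * r ** 2) ** 2
print(radar_cross_section(p_r, p_t, g_t, a_e, r))  # 10.0 (to float precision)
```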
Polarimetric SARs transmit and receive polarized waves. These SARs measure a
more general, complex-valued version of the radar cross section with both magnitude
and phase, and the polarimetric data are stored as complex numbers. FOPEN SAR, in
particular, transmits and measures linearly polarized radiation for two polarizations:
H (the electric field vector is parallel to the ground) and V (the electric field vector
is perpendicular to both the H polarization and the direction of propagation). The
FOPEN SAR collects data from four polarimetric channels, one for each possible
combination of polarizations between the transmitted and received radiation. The
four channels are HH (H transmit, H receive), HV (H transmit, V receive), VH (V
transmit, H receive) and VV (V transmit, V receive). For a monostatic radar such as
the FOPEN SAR, HV = VH.
Freeman and Durden [7] have proposed a simple three-component scattering model
for the polarimetric SAR imagery of forests; the three scattering mechanisms are volume scattering from the forest canopy, double-bounce specular scattering from tree
trunks, and Bragg scattering from the forest floor. The primary scattering mechanisms of targets include specular scattering (single-bounce and triple-bounce scattering) and the dominant scattering mechanism, which is double-bounce scattering due
to the dihedral formed by the side of the target and the ground plane.
Tree trunks and targets (double-bounce scatterers) are the brightest objects in
FOPEN SAR imagery. The double-bounce returns are from the incident radiation
reflecting off the ground, to the object's surface, and back to the radar. The brightness
of targets varies with aspect angle. If we assume that targets are rectangular when
viewed from the air and that the front of a target corresponds to one of its shorter
sides, the aspect angle is defined as the angle between the direction of the moving
radar and the axis through the front of the target. Targets have their brightest returns
when imaged at cardinal aspect angles (0°, 90°, 180°, and 270°) and their weakest returns when imaged at noncardinal aspect angles, especially 45°, 135°, 225°, and 315°. The phase difference between HH and VV for the double-bounce returns from targets is 180°.

The magnitude of tree trunk returns is relatively constant with aspect angle because tree trunks are rotationally symmetric. Studies have shown that their HH returns are much stronger than the VV returns and that the phase difference between HH and VV is only 100°. Although many tree trunks are as bright as targets,
these differences in polarimetric properties may help distinguish between tree trunks
and targets [8].
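These polarimetric differences suggest a simple discriminant on the HH-VV phase difference. The sketch below is illustrative only: the pixel values are hypothetical, and the thesis's discriminator (Chapter 4) uses full feature sets, not this one-line test.

```python
import cmath
import math

def hh_vv_phase_deg(hh, vv):
    """Magnitude of the HH-VV phase difference, in degrees."""
    return abs(math.degrees(cmath.phase(hh * vv.conjugate())))

# Hypothetical single-pixel returns: a dihedral-like target (HH and VV out
# of phase, 180 deg) versus a trunk-like return (~100 deg difference).
target = hh_vv_phase_deg(1 + 0j, -1 + 0j)
trunk = hh_vv_phase_deg(1 + 0j, cmath.exp(1j * math.radians(100)))
print(target, trunk)  # 180.0 100.0 (to float precision)
```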
2.1.2 Geometry of SAR
We now discuss the geometry of a SAR. (The discussions in this Section and in
Section 2.1.3 are taken from Curlander et al. [1]). To simplify analysis, we assume
that the SAR moves in a straight line with constant velocity Vt and constant altitude
relative to the surface. The antenna of the SAR is directed down toward the surface
of the earth, perpendicular to the direction of motion. The look angle γ, which is shown in Figure 2-2, is the angle between a line perpendicular to the surface and the direction of the transmitted radar beam, and the depression angle is the angle complementary to the look angle.
The antenna beamwidth determines the area imaged by each pulse. The angular beamwidths are nominally defined as

θ_H = λ / L_A   (2.2)
θ_V = λ / W_A   (2.3)

where θ_H is the angular beamwidth in the cross-range (azimuth) direction, θ_V is the angular beamwidth in the range direction, λ is the wavelength of the center frequency of the radar beam, and L_A and W_A are the corresponding aperture dimensions. These directions and parameters are shown in Figure 2-1.
Figure 2-1: Range vs. Cross-Range (after [1])

The swath width W_G is the width of the region imaged in the range direction; by basic trigonometry we get

W_G ≈ θ_V R_m / cos η = λ R_m / (W_A cos η)   (2.4)

where the variables are as shown in Figure 2-2.
For a simple SAR with no processing of the radar returns, the duration of the transmitted pulse, τ_p, determines the resolution in the range direction. We can resolve
two points if the leading edge of the return from the further point arrives after the
trailing edge of the return from the nearer point. The range resolution for unprocessed
returns is
δR_G = c τ_p / (2 sin η)   (2.5)
where c is the speed of light. For more sophisticated systems we use pulse compression and signal processing to improve resolution for a given pulse duration. The resolution for these systems depends only on the bandwidth B_R of the pulse:

δR_G = c / (2 B_R sin η)   (2.6)

Figure 2-2: Range Resolution (after [1])

For the P-3 Ultra-Wideband (UWB) SAR, which is the SAR that collected the imagery for this thesis, the bandwidth B_R is 509 MHz [4]. Assuming a depression angle of 30° (η ≈ 60°), the range resolution for the P-3 SAR is 0.34 m.
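Equation (2.6) with the P-3 numbers can be verified directly (the helper name is mine, not the thesis's):

```python
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def ground_range_resolution(bandwidth_hz, incidence_deg):
    """Equation (2.6): delta_R_G = c / (2 B_R sin(eta))."""
    return C_LIGHT / (2 * bandwidth_hz * math.sin(math.radians(incidence_deg)))

# P-3 UWB SAR: B_R = 509 MHz at eta ~ 60 degrees.
print(round(ground_range_resolution(509e6, 60.0), 2))  # 0.34
```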
For the real aperture SLAR the cross-range resolution (the minimum distance
needed to resolve two points at the same distance in range) is determined by the
cross-range beamwidth and the slant range distance R (the distance between the
platform and either of the two points). Only points that are more than one angular
beamwidth apart can be resolved. Thus the cross-range resolution is
δx = R θ_H   (2.7)

For the P-3 UWB SAR, the minimum cross-range beamwidth θ_H is 25° (0.4363 radians) [9] and the minimum slant range distance is 6 km [4]. Thus the best cross-range resolution for a real aperture P-3 is 2.6 km; this is very poor resolution, and without the use of SAR processing, surveillance radar aboard a UAV is impractical.
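The 2.6 km figure follows directly from Equation (2.7); a small numeric check (the helper name is mine):

```python
import math

def real_aperture_cross_range(slant_range_m, beamwidth_deg):
    """Equation (2.7): delta_x = R * theta_H, with theta_H in radians."""
    return slant_range_m * math.radians(beamwidth_deg)

# P-3 at its minimum slant range of 6 km with a 25 degree beam.
print(round(real_aperture_cross_range(6e3, 25.0)))  # 2618 (about 2.6 km)
```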
2.1.3 SAR Cross-Range Resolution

Cross-range resolution can be improved by using signal processing to exploit the Doppler frequency properties of radar returns. Consider two points P_0 and P_1, as shown in Figure 2-3, which are at the same distance in range. The zero-Doppler point P_0 is on the line that connects the radar with the ground and lies perpendicular to the direction of motion. The point P_1 is at some angle θ in cross-range relative to P_0. For small θ, the Doppler shift of P_1 relative to the moving radar is approximately proportional to its distance x from the zero-Doppler point P_0,

f_D = 2 V_t sin θ / λ ≈ 2 V_t x / (λ R)   (2.8)

and for any two points P_1 and P_2 at the same distance in range, the difference in their cross-range distance is proportional to the difference in their Doppler shifts. The SAR cross-range resolution δx (the minimum distance resolvable using signal processing) is thus proportional to the minimum Doppler frequency resolution δf_D:

δx = (λ R / (2 V_t)) δf_D   (2.9)
Since signal processing allows us to achieve fine Doppler resolutions, the SAR cross-range resolution can be made better than the SLAR cross-range resolution.
We now derive an expression for the SAR cross-range resolution as a function of the integration angle (the cross-range angular beamwidth) θ_H. First, notice that the Doppler frequency resolution is the reciprocal of the integration time S, the time that the SAR images a point on the ground.

δf_D = 1 / S   (2.10)

Figure 2-3: Cross-Range Resolution (after [1])

A point stays within the radar beam for the duration of one integration angle or, equivalently, one spatial beamwidth. Since the spatial beamwidth is R θ_H and the velocity of the SAR platform is V_t, the integration time is

S = R θ_H / V_t   (2.11)

and is proportional to the integration angle. Substituting Equations (2.10) and (2.11) into Equation (2.9), we find that the SAR cross-range resolution as a function of the integration angle is

δx = λ R / (2 V_t S)   (2.12)
   = λ / (2 θ_H)   (2.13)
The cross-range resolution is thus inversely proportional to the integration angle,
and for good cross-range resolution we must have a large integration angle. This
result is the opposite of the real aperture result, Equation (2.7), which says that good
cross-range resolution is achieved at a small integration angle.
The previous derivation for the cross-range resolution of SAR used many approximations. A more accurate formula for cross-range resolution [4], which has been
derived from a more rigorous model, is
δx = λ k_A / (4 sin(θ_H / 2))   (2.14)

where k_A is the impulse response broadening factor due to aperture weighting. The broadening factor results from applying various signal processing filters to the SAR data. For uniform weighting (which is what we have assumed), k_A = 0.89. Notice that for this k_A and small θ_H, Equation (2.14) becomes

δx ≈ 0.89 λ / (2 θ_H)   (2.15)

which is approximately equal to our previous result, Equation (2.13). For the FOPEN SAR, λ = 0.64 m, k_A = 1.13 (for -30 dB Taylor weighting), and θ_H = 31.7°, so the cross-range resolution for FOPEN SAR is 0.66 m.
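Equation (2.14) with the FOPEN parameters reproduces the 0.66 m figure (illustrative helper, names are mine):

```python
import math

def sar_cross_range(wavelength_m, k_a, integration_angle_deg):
    """Equation (2.14): delta_x = lambda k_A / (4 sin(theta_H / 2))."""
    half_angle = math.radians(integration_angle_deg) / 2
    return wavelength_m * k_a / (4 * math.sin(half_angle))

# FOPEN SAR: lambda = 0.64 m, k_A = 1.13 (-30 dB Taylor), theta_H = 31.7 deg.
print(round(sar_cross_range(0.64, 1.13, 31.7), 2))  # 0.66
```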
2.2 Polarimetric Processing

2.2.1 Clutter Model
As mentioned in Section 2.1.1, FOPEN SAR can collect four channels of polarimetric
data: HH, HV, VH, and VV. Ignoring VH (since HV contains the same information),
we form a complex-valued vector x of three elements representing all the polarimetric
data for a pixel.
x = [HH, HV, VV]^T = [HH_I + jHH_Q, HV_I + jHV_Q, VV_I + jVV_Q]^T   (2.16)

where the subscript I denotes the in-phase (real) component and the subscript Q denotes the quadrature (imaginary) component.
This vector x is the sum of a clutter component and a noise component,

x = x_C + x_N   (2.17)

and has a covariance matrix Σ that is the sum of the clutter covariance matrix and the noise covariance matrix:

Σ = Σ_C + Σ_N   (2.18)
The clutter covariance matrix Σ_C is assumed to have the form

Σ_C = σ [ 1      0    ρ√γ
          0      ε    0
          ρ*√γ   0    γ  ]   (2.19)

where

σ = E{|HH|²}   (2.20)
ε = E{|HV|²} / E{|HH|²}   (2.21)
γ = E{|VV|²} / E{|HH|²}   (2.22)
ρ = E{HH·VV*} / √(E{|HH|²} E{|VV|²})   (2.23)
  = E{HH·VV*} / (√γ E{|HH|²})   (2.24)
We model the clutter as a complex, zero-mean Gaussian random vector x_C with a probability density function (PDF) of

p(x_C) = (1 / (π³ det(Σ_C))) exp(−x_C^H Σ_C^{-1} x_C)   (2.25)

and covariance matrix Σ_C = E{x_C x_C^H} (H denotes conjugate transpose).
2.2.2 Span Filter
The span filter is the noncoherent (magnitude-squared) sum of the power in each of
the four channels; it is a simple method of forming a composite polarimetric image.
y_SPAN = |HH|² + |HV|² + |VH|² + |VV|²   (2.26)
       = |HH|² + 2|HV|² + |VV|²   (2.27)
       = x^H [ 1  0  0
               0  2  0
               0  0  1 ] x   (2.28)
To obtain the superresolved span image, we superresolve the HH, HV, and VV images
separately and sum the superresolved images to form the single, final (SPAN) image.
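A minimal sketch of the span filter (the function and the pixel values are mine, assumed NumPy arrays), checking that Equation (2.27) and the quadratic form of Equation (2.28) agree:

```python
import numpy as np

def span_image(hh, hv, vv):
    """Equation (2.27): |HH|^2 + 2|HV|^2 + |VV|^2 (HV counted twice
    because VH = HV for a monostatic radar)."""
    return np.abs(hh) ** 2 + 2 * np.abs(hv) ** 2 + np.abs(vv) ** 2

# One made-up pixel as the 3-vector x of Equation (2.16); the quadratic
# form x^H diag(1, 2, 1) x of Equation (2.28) must give the same value.
x = np.array([1 + 1j, 2 - 1j, 0.5j])
quad = (x.conj() @ np.diag([1, 2, 1]) @ x).real
print(span_image(x[0], x[1], x[2]), quad)  # 12.25 12.25
```

The same `span_image` call works elementwise on full HH, HV, and VV image arrays.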
2.2.3 Polarimetric Matched Filter
The polarimetric matched filter (PMF) is an alternate method of combining the polarimetric channels [10]; the PMF is a linear processor designed to produce an intensity image with maximum average target-to-clutter ratio (T/C). The output of the linear processor for a given pixel vector x is a complex-valued scalar z_PMF whose squared magnitude y_PMF is the pixel value of x in the new intensity image.

z_PMF = w^H x   (2.29)
y_PMF = |z_PMF|²   (2.30)
      = w^H x x^H w   (2.31)
To find the optimal weight vector w' that maximizes the output T/C ratio

(T/C)_out = E{w^H x_T x_T^H w} / E{w^H x_C x_C^H w}   (2.32)
          = (w^H Σ_T w) / (w^H Σ_C w)   (2.33)

where Σ_T is the target polarization covariance matrix, we solve the eigenvector equation

A w = λ w   (2.34)
A = Σ_C^{-1} Σ_T   (2.35)

and assign w' to the eigenvector corresponding to the largest eigenvalue λ_max. Computation of the weighting vector w' requires a priori knowledge of the target covariance
matrix and the clutter covariance matrix. To obtain the superresolved PMF image, we superresolve the complex output of the linear processor z_PMF.
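A sketch of the weight computation under the eigenvector formulation of Equations (2.34)-(2.35). The toy covariance matrices are illustrative, not estimated from imagery:

```python
import numpy as np

def pmf_weights(sigma_t, sigma_c):
    """Equations (2.34)-(2.35): the eigenvector of A = inv(Sigma_C) Sigma_T
    with the largest eigenvalue maximizes the output T/C ratio."""
    vals, vecs = np.linalg.eig(np.linalg.inv(sigma_c) @ sigma_t)
    return vecs[:, np.argmax(vals.real)]

# Toy covariances (not measured data): target power concentrated in the
# first channel against white clutter, so the optimal weight vector should
# be the first basis vector (up to a phase).
w = pmf_weights(np.diag([10.0, 1.0, 1.0]), np.eye(3))
print(np.abs(w))  # [1. 0. 0.]
```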
2.2.4 Polarimetric Whitening Filter
The polarimetric whitening filter (PWF) is a quadratic processor that produces an
output SAR intensity image with minimum speckle [11]. Like the PMF, the PWF
requires a priori knowledge of the clutter covariance matrix. For a given pixel vector x, the quadratic processor outputs a real, nonnegative scalar y_PWF,

y_PWF = x^H A x   (2.36)

which is the pixel value of x in the output image.
The PWF processor is derived assuming that there are no targets in the image. The result is then applied to imagery with targets. We assume that the clutter is spatially inhomogeneous and modify our clutter model by multiplying the Gaussian clutter variable by a spatially varying texture variable g to obtain a new clutter variable

c = √g · x_C   (2.37)

The random variable g has a gamma PDF:

p_G(g) = g^(ν−1) exp(−g) / Γ(ν)   (2.38)

E{g} = ν   (2.39)
E{g²} = ν(ν + 1)   (2.40)
Our quadratic processor is now

y_PWF = c^H A c   (2.41)

To minimize speckle we need a quantitative measure of speckle. We choose to minimize the ratio of the standard deviation of the output image pixel intensities to the mean of those intensities,

s / m = std. dev.(y_PWF) / mean(y_PWF)   (2.42)

Using constrained optimization techniques, we find that

A = Σ_C^{-1}   (2.43)
Applying this result to the target plus clutter model, the PWF intensity output is

y_PWF = x^H Σ_C^{-1} x   (2.44)
      = (1/σ) [ |HH|² + |HV|²/ε + |VV − ρ*√γ·HH|² / (γ(1 − |ρ|²)) ]   (2.45)

which can also be written as

y_PWF = z_PWF^H z_PWF   (2.46)
z_PWF = Σ_C^{-1/2} x   (2.47)
To obtain the superresolved PWF image for the MVM, EV, MUSIC, and Pisarenko
methods, we superresolve each of the three channels of z_PWF and then sum the superresolved images to form the final image. To obtain the superresolved PWF image
for SVA, which can return complex-valued imagery, we superresolve each of the three
channels of x and then apply PWF processing. Since the order of polarimetric processing and superresolution has been reversed, SVA requires its own clutter covariance matrix, which is computed after superresolution.
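The whitening of Equation (2.47) can be checked numerically. The parameter values below are illustrative, not measured FOPEN clutter statistics; the sketch confirms that Σ_C^(-1/2) equalizes the channel powers and decorrelates the channels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Clutter covariance of the form (2.19), with illustrative parameter values.
eps, gam, rho = 0.2, 1.1, 0.5
sigma_c = np.array([[1.0, 0.0, rho * np.sqrt(gam)],
                    [0.0, eps, 0.0],
                    [rho * np.sqrt(gam), 0.0, gam]])

# Sigma_C^(-1/2) via the eigendecomposition (Equation 2.47: z = Sigma_C^(-1/2) x).
vals, vecs = np.linalg.eigh(sigma_c)
inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T

# Draw zero-mean clutter with covariance Sigma_C, then whiten it; the sample
# covariance of z should be close to the identity (equal power, uncorrelated).
x = np.linalg.cholesky(sigma_c) @ rng.standard_normal((3, 20000))
z = inv_sqrt @ x
cov = z @ z.T / x.shape[1]
print(np.round(cov, 1))
```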
Novak et al. [11] have found two interesting properties for the PWF processor: 1) when there are no targets in an image, the PWF processor is a whitening filter, whose output z_PWF has components that have equal expected power and that are independent of (white with respect to) each other; and 2) various estimators for g are functions of the PWF intensity y_PWF: the maximum-likelihood estimator is a scaled version of y_PWF, the maximum a posteriori estimator is a nonlinear function of y_PWF, and the conditional mean estimator is also a nonlinear function of y_PWF.

2.3 Sample Images
Figure 2-4 shows twelve images of targets: four different polarimetric versions (HH,
SPAN, PMF, and PWF) of three different targets. Each of the images is a dB intensity image that has been normalized to have a maximum value of 0 dB. The vertical direction is the range direction, with range increasing from top to bottom, and the horizontal direction is the cross-range direction, with the SAR moving from left to right.

The top row shows Target 1, an end-on target in the open (aspect angle ≈ 0° or 180°); the middle row shows Target 2, a noncardinal target in the open (aspect angle ≈ 57°); and the bottom row shows Target 3, a broadside target obscured by foliage (aspect angle ≈ 90° or 270°). Target 1 is the brightest and most clearly defined
target, because it is in the open (there is no foliage attenuation) and it is oriented
at a cardinal angle (thus most of the incident energy is reflected back toward the
radar). Target 2 is not as bright or as well-defined as Target 1; it reflects most of the
incident energy away from the radar because it is not oriented at a cardinal aspect
angle. Target 3 is the dimmest and most poorly defined target, because its returns
are attenuated by the forest canopy.
Each column of the figure corresponds to a specific polarimetric processing
method. From left to right, the columns are: HH, SPAN, PMF, and PWF. The
HH images have considerable speckle and poorly defined targets. The PMF images
look similar to the HH images, having considerable speckle and poorly defined targets. The PMF images, however, have somewhat dimmer clutter background than
the other polarimetric images. Also the PMF image of Target 2 does not have the
relatively bright upper left corner that the other images do. The SPAN images and
the PWF images look similar to each other, and they have less speckle and more
clearly defined targets than the HH images.
40
0
-5
-- 10
-15
-20
-25
-30
-35
-40
Figure 2-4: Polarimetrically Processed Images (dB scale): HH, SPAN, PMF, and
PWF
Chapter 3
Superresolution
3.1 Introduction
Improving image resolution beyond the sensor limits requires information beyond that
provided by the sensors. We cannot gather any more information from other sensors,
because if we could, then we would use those sensors to improve resolution. Instead we
must assume that our data obeys certain properties. Every superresolution method
has a collection of such assumptions called a signal model. If the data obeys these
properties, we have extra information about the image, which can be used to improve
resolution.
One common assumption for SAR imagery is the point-scatterer model [2]: a SAR
image is a superposition of weighted and shifted impulses known as point scatterers;
these point scatterers are embedded in noise. An equivalent way of expressing the
point-scatterer model is to say that the SAR data has a sinusoidal signal history. The
signal history is the processed SAR-return sequence from which the complex SAR
image is ultimately formed and is, in fact, the inverse Fourier transform of the SAR
image. Since the inverse Fourier transform of an impulse is a complex sinusoid, the
signal history is a superposition of complex sinusoids. The noise in the signal history
domain is assumed to be wide-sense stationary (WSS).
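The equivalence between a point scatterer and a sinusoidal signal history can be checked in one line (a 1-D numerical sketch; the array sizes and names are mine):

```python
import numpy as np

# A unit impulse (point scatterer) in a 1-D "image" of n pixels.
n = 64
image = np.zeros(n, dtype=complex)
image[5] = 1.0

# Its signal history is the inverse DFT of the image: the complex sinusoid
# exp(2j*pi*5*m/n) / n, so every sample has the same magnitude 1/n.
history = np.fft.ifft(image)
print(np.allclose(np.abs(history), 1 / n))  # True
```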
A point-scatterer has nonzero mainlobe width and considerable sidelobes because SAR data is bandlimited as a result of being collected from a finite aperture. Since
sidelobes degrade image quality, the signal history is often weighted with a Taylor
window or a Kaiser window to suppress them. These windows are linear, spatially
invariant filters which trade off wider mainlobe width for lower sidelobe amplitude.
Since mainlobe width determines resolution (and narrower mainlobes imply better
resolution), these windows degrade resolution.
Superresolution methods are nonlinear spatially varying methods that both narrow mainlobes and suppress sidelobes. For our research we have investigated the
superresolution methods known as modern spectral estimation methods. The modern spectral estimation methods investigated in this thesis are: the minimum variance method (MVM), the eigenvector method (EV), the MUltiple SIgnal Classification method (MUSIC), Pisarenko's method, and spatially varying apodization (SVA).
We have also investigated bandlimited interpolation, which is not a superresolution
method. The modern spectral estimation methods are related historically and mathematically to the MVM [2]. They are called spectral estimation methods, because
they estimate the power in each of the frequency bands of an input image; i.e. they
compute the power spectral density (PSD) of an image. They are denoted "modern", because they are not based on the older periodogram-like methods. Although
these estimators are normally used to compute power in the frequency domain from
an input image, we have used these estimators to compute an intensity image from
an input signal history. Since the PSD of a signal history is its intensity SAR image, applying a modern spectral estimator to the signal history data will result in an
estimate of the intensity SAR image.
3.1.1 System Overview
A straightforward method of superresolving images is shown in Figure 3-1 and is
explained below. If we are given a signal history instead of an image, we begin at
Step 3.
1. Transform the image into its signal history via the 2-D inverse discrete Fourier
transform (IDFT).
Figure 3-1: System Overview (Image → IDFT → Signal History → Remove Taylor Weighting → Superresolve and Interpolate → Superresolved Image)
2. Filter to remove the Taylor weighting applied to suppress sidelobes.
3. Superresolve and interpolate to produce the superresolved image of the desired
size.
METHOD        COMPLEXITY
Upsampling    P log P
MVM           P^3
EV            P^3
MUSIC         P^3
Pisarenko     P^3
SVA           P

Table 3.1: Algorithm Complexity
Since the superresolution methods are computationally intensive, we use in practice a slightly more complicated approach. Table 3.1 shows the complexity of each
method versus the total number of pixels P. Most of the methods require on the order of P^3 operations. Thus a 400 x 200 pixel image requires on the order of 10^15 operations. A 40 x 20 pixel image, however, requires only on the order of 10^9 operations - a substantial reduction! Therefore we can construct a superresolved image in
a reasonable amount of time if we piece together smaller superresolved images. This
technique is called mosaicking.
To mosaic an image, divide the original, unresolved image into smaller subimages;
superresolve each of these subimages; and then piece together, or mosaic, these subimages to form the final image. We usually break the original image into
overlapping regions to minimize edge effects in the final image. The subimages are
mosaicked together either by averaging the subimages together or by extracting the
centers of the subimages and placing these into the composite image.
For our research we have chosen to extract the centers of the subimages. Now our final method
of superresolution, with mosaicking (see Figure 3-2), is:
1. Extract subimages from original image.
2. Transform each subimage into its signal history via 2-D IDFT.
Figure 3-2: Mosaicking (subimages of the original image are extracted, superresolved, and recombined into the final superresolved image)
3. Remove the Taylor weighting from each subimage.
4. Superresolve and interpolate to produce the superresolved subimage.
5. Mosaic the subimages together to form the final composite image.
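The five steps above can be sketched in NumPy; the function name, block size, and overlap below are illustrative, and the identity `process` function stands in for whatever superresolution method is applied to each subimage:

```python
import numpy as np

def mosaic(image, block=40, overlap=10, process=lambda sub: sub):
    """Piecewise processing with mosaicking: extract overlapping subimages,
    process each one, and place only the subimage centers into the output."""
    step = block - 2 * overlap              # stride so that centers tile exactly
    rows, cols = image.shape
    out = np.zeros_like(image)
    for r in range(0, rows - block + 1, step):
        for c in range(0, cols - block + 1, step):
            sub = process(image[r:r + block, c:c + block])
            # keep only the center region to minimize edge effects
            out[r + overlap:r + block - overlap,
                c + overlap:c + block - overlap] = \
                sub[overlap:block - overlap, overlap:block - overlap]
    return out
```

With the identity `process`, the mosaicked interior reproduces the original image, which is a convenient sanity check before plugging in a real superresolution method.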
3.1.2 The Estimation of Covariance Matrices
Each of the modern spectral estimation methods (except spatially varying apodization) requires a sample covariance matrix of the signal history data. There are at
least four methods that estimate this matrix - the covariance method (subaperture
averaging), the modified covariance method (forward-backward subaperture averaging), the biased or unbiased correlation method (block-Toeplitz enforcement), and, for oversampled data, decimation averaging [2] - and we have used the forward-backward
estimate for our research.
Although the estimate of the covariance matrix requires multiple samples of the
signal history, we have only one signal history. Fortunately we can obtain A = (K_x - k_x + 1)(K_y - k_y + 1) extra samples of the signal history using the WSS property of the signal history. We obtain the extra samples by breaking the full aperture (full signal history) of size K_x × K_y into smaller subapertures of size k_x × k_y as shown
Figure 3-3: Possible Subapertures (after [2])
in Figure 3-3. We then compute the forward (covariance method) estimate RF of
the aperture covariance matrix by rearranging each subaperture into a 1-D vector xij
(either by raster-scanning or some other lexicographically convenient manner) and
then substituting the subaperture vectors into the formula
R_F = (1/A) Σ_{i,j} x_ij x_ij^H    (3.1)

The size of this matrix R_F is k_x k_y × k_x k_y, and the rank is min(A, k_x k_y).
We can improve the estimate of the matrix using the forward-backward method to obtain an extra A signal history samples, in addition to those already used for the forward estimate. Recall that we have assumed a sinusoidal signal history model. Since reversing and conjugating a complex exponential yields the same complex exponential ((e^{-jωn})* = e^{jωn}), reversing and conjugating the signal history vectors yields A extra signal history samples. Thus the improved estimate is

R_FB = (1/(2A)) [ Σ_{i,j} x_ij x_ij^H + Σ_{i,j} (J x_ij*)(J x_ij*)^H ]    (3.2)

     = (1/2) (R_F + J R_F^T J)    (3.3)

where J is a square matrix of size k_x k_y × k_x k_y with 1's on the upper-right-to-lower-left diagonal and 0's elsewhere. The rank of R_FB is min(2A, k_x k_y). Since the superresolution methods we are investigating require an invertible covariance matrix, we require full rank; i.e., 2A ≥ k_x k_y.
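A NumPy sketch of the forward-backward estimate (the function name is ours); the backward samples are the reversed, conjugated subaperture vectors, and the resulting matrix is Hermitian and satisfies R_FB = J R_FB* J:

```python
import numpy as np

def forward_backward_covariance(X, kx, ky):
    """Estimate the covariance matrix of a Kx-by-Ky signal history X from
    all kx-by-ky subapertures, augmented with the reversed-and-conjugated
    (backward) samples, as in Eqs. (3.2)-(3.3)."""
    Kx, Ky = X.shape
    A = (Kx - kx + 1) * (Ky - ky + 1)              # number of subapertures
    R = np.zeros((kx * ky, kx * ky), dtype=complex)
    for i in range(Kx - kx + 1):
        for j in range(Ky - ky + 1):
            x = X[i:i + kx, j:j + ky].reshape(-1, 1)   # lexicographic vector
            xb = np.conj(x[::-1])                      # backward sample J x*
            R += x @ x.conj().T + xb @ xb.conj().T
    return R / (2 * A)
```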
3.2 Mathematical Algorithms

3.2.1 Bandlimited Interpolation
Bandlimited interpolation, also known as upsampling, is not a superresolution
method; it does not narrow mainlobes and it does not suppress sidelobes. We, however, include it as part of our study because Owirka et al. [12] have shown that
bandlimited interpolation can improve ATR performance. Since interpolation does
not add information, this result seems counterintuitive. The completeness of information in a test image, however, does not guarantee good ATR performance. An
ATR system extracts, through features, information relevant for the discrimination
of targets and clutter. Although the features used in an ATR system are chosen with
care and intelligence, the science of feature design is still immature and features might
not optimally extract information from the test image. Bandlimited interpolation can
improve ATR performance by reformulating the information so that the extraction
of information through features is improved.
Bandlimited interpolation (upsampling) of an image by a factor k is done as
follows:
1. Apply a 2-D IDFT to transform the image into its signal history.
2. Zero-pad the signal history to create a new signal history k times larger in each
dimension.
3. Apply a 2-D DFT to get the interpolated image.
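The three steps can be sketched with NumPy FFTs (the function name is illustrative, and the fftshift bookkeeping assumes even image dimensions); for k = 2 the original pixel values reappear at every second sample of the output:

```python
import numpy as np

def upsample(image, k=2):
    """Bandlimited interpolation of a complex image by an integer factor k:
    2-D IDFT to the signal history, zero-pad the aperture, 2-D DFT back."""
    N1, N2 = image.shape
    hist = np.fft.fftshift(np.fft.ifft2(image))        # centered signal history
    padded = np.zeros((k * N1, k * N2), dtype=complex)
    r0, c0 = (k * N1 - N1) // 2, (k * N2 - N2) // 2
    padded[r0:r0 + N1, c0:c0 + N2] = hist              # zero-pad to k times the size
    return np.fft.fft2(np.fft.ifftshift(padded))       # interpolated image
```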
3.2.2 Minimum Variance Method
The minimum variance method (MVM) is the solution to a constrained optimization
problem. As mentioned in Section 3.1, MVM was originally designed to estimate the
power spectra of an input image; we are estimating the power in an image using an
input signal history. Our exposition of MVM, EV, MUSIC, and Pisarenko's method
will follow the exposition for the original usage. Thus the spatial variables in our
exposition will correspond to signal history variables and the frequency variables will
correspond to SAR spatial variables.
MVM adaptively estimates the power at each frequency (ω1, ω2) of the 2-D input signal x[n1, n2] by filtering the input 2-D signal through a narrowband filter b[n1, n2; ω1, ω2] centered about that frequency and measuring the power of the output signal y[n1, n2] = b[n1, n2; ω1, ω2] * x[n1, n2] [13]. Since there are an infinite number of ways to create a narrowband filter, we specify that the filter b must pass the signal at the specific frequency (ω1, ω2) unaffected while minimizing the expected power from the other frequency components. Mathematically these specifications are

B(ω1, ω2) = Σ_{n1} Σ_{n2} b[n1, n2; ω1, ω2] e^{-jω1 n1} e^{-jω2 n2} = 1    (3.4)

P_MVM(ω1, ω2) = min_b E{|y[n1, n2]|²}    (3.5)
We simplify this problem by writing the equations in vector-matrix form. If we assume that y has zero mean, the right-hand side of Equation (3.5) is the variance of y[n1, n2]. Since y is a WSS process, the variance of y is the zero-delay value of its autocovariance function, and we can rewrite Equation (3.5) as:

P_MVM(ω1, ω2) = E{|y[0, 0]|²}    (3.6)

Define the column vectors x, b, and w as the values of x[n1, n2], b*[n1, n2; ω1, ω2], and w[n1, n2] = e^{jω1 n1} e^{jω2 n2}, respectively, where the samples have been ordered lexicographically. Since y[0, 0] = b^H x, we can rewrite our constrained optimization equations as:
B(ω1, ω2) = w^H b = 1    (3.7)

P_MVM(ω1, ω2) = min_b E{b^H x x^H b}    (3.8)

= min_b b^H R b    (3.9)
where R = E{x x^H}. The E{·} represents an ensemble average, which we will ultimately approximate (of necessity) by a sample average over our data, the forward-backward covariance matrix. Using the method of Lagrange multipliers, we find that the solution is

b = R^{-1} w / (w^H R^{-1} w)    (3.10)

P_MVM(ω1, ω2) = 1 / (w^H R^{-1} w)    (3.11)
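A minimal NumPy sketch of Equation (3.11): evaluate 1 / (w^H R^{-1} w) over a grid of frequencies, with R a k_x k_y-by-k_x k_y covariance estimate and the steering vectors ordered lexicographically to match the subaperture vectors. The function name and frequency grids are illustrative:

```python
import numpy as np

def mvm_image(R, kx, ky, w1_grid, w2_grid):
    """MVM power estimate P(w1, w2) = 1 / (w^H R^-1 w) on a frequency grid."""
    Rinv = np.linalg.inv(R)
    n1, n2 = np.arange(kx), np.arange(ky)
    P = np.zeros((len(w1_grid), len(w2_grid)))
    for a, w1 in enumerate(w1_grid):
        for b, w2 in enumerate(w2_grid):
            # steering vector w[n1, n2] = exp(j w1 n1) exp(j w2 n2), flattened
            w = np.exp(1j * (w1 * n1[:, None] + w2 * n2[None, :])).reshape(-1, 1)
            P[a, b] = 1.0 / np.real(w.conj().T @ Rinv @ w)[0, 0]
    return P
```

For white noise (R = I) the estimate is flat, P = 1/(k_x k_y), at every frequency, which is a quick sanity check.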
3.2.3 Eigenvector Method
The eigenvector method (EV) is a variation on MVM that uses signal subspace decomposition [2]. Assume that two orthogonal subspaces - the signal subspace and
the noise subspace - make up the signal history domain. For any point scatterer w in the signal subspace and any noise component n we should have orthogonality; i.e. n^H w = 0. We can use this orthogonality as a measure of how signal-like a signal history sample is. The real signal s is assumed to be a collection of superimposed point scatterers. If an input signal history vector is x = s + n', where n' is some vector in the noise subspace, the quantity

1 / |n^H x|    (3.12)

is larger the more signal-like the signal history is.
modifying the MVM through an eigenvector decomposition.
Let λ_i be the ith eigenvalue of the covariance matrix R and v_i the corresponding eigenvector. Also let the eigenvalues be ordered as λ1 ≤ λ2 ≤ ... ≤ λN, where N is the number of eigenvectors. The eigenvector decomposition of R is:

R = Σ_{i=1}^{N} λ_i v_i v_i^H    (3.13)
Now assume that the noise subspace is spanned by the eigenvectors corresponding to
the M smallest eigenvalues. We can express the inverse covariance matrix as the sum
of a noise subspace matrix and a signal subspace matrix:
R^{-1} = Σ_{i=1}^{N} (1/λ_i) v_i v_i^H    (3.14)

= Σ_{i=1}^{M} (1/λ_i) v_i v_i^H + Σ_{i=M+1}^{N} (1/λ_i) v_i v_i^H    (3.15)

= R#_noise + R#_signal    (3.16)
where # denotes the pseudoinverse (per the singular value decomposition). Now we obtain the EV result from MVM by substituting the noise pseudoinverse covariance matrix for the inverse covariance matrix:

P_EV(ω1, ω2) = 1 / (w^H R#_noise w)    (3.17)

= 1 / (Σ_{i=1}^{M} (1/λ_i) |v_i^H w|²)    (3.18)

This substitution results in a quantity similar to that of Equation (3.12).
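Forming the noise-subspace pseudoinverse is the only new ingredient beyond MVM; a NumPy sketch with illustrative function names (the `music` flag anticipates the MUSIC variant of the next section):

```python
import numpy as np

def noise_pseudoinverse(R, M, music=False):
    """R#_noise built from the eigenvectors of the M smallest eigenvalues;
    for MUSIC each noise eigenvalue is replaced by their arithmetic mean."""
    lam, V = np.linalg.eigh(R)             # eigenvalues in ascending order
    lam_n, V_n = lam[:M], V[:, :M]         # noise subspace
    if music:
        lam_n = np.full(M, lam_n.mean())
    return (V_n / lam_n) @ V_n.conj().T    # sum_i (1/lam_i) v_i v_i^H

def ev_power(Rn_pinv, w):
    """P(w1, w2) = 1 / (w^H R#_noise w) of Eq. (3.17) for a steering vector w."""
    w = w.reshape(-1, 1)
    return 1.0 / np.real(w.conj().T @ Rn_pinv @ w)[0, 0]
```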
3.2.4 Multiple Signal Classification
The MUltiple SIgnal Classification method (MUSIC) is similar to EV, except we assume that the noise is white, so all the noise eigenvalues are equal: λ_i = λ. This eigenvalue is normally selected to be the arithmetic mean of the measured noise eigenvalues. Thus we have:
R#_noise = (1/λ) Σ_{i=1}^{M} v_i v_i^H    (3.19)

P_MUSIC(ω1, ω2) = 1 / (w^H R#_noise w)    (3.20)

3.2.5 Pisarenko's Method
Pisarenko's method was the forerunner to EV and MUSIC [13]. Let h(x) be a strictly monotone continuous function over (0, ∞), H(x) be the inverse function such that H(h(x)) = x, and h(R) = Σ_{i=1}^{N} h(λ_i) v_i v_i^H. Pisarenko [14] defined a family of PSD's:

P_PIS(ω1, ω2) = H(w^H h(R) w)    (3.21)

For our research h(x) = x^{-2} and the Pisarenko estimate is:

P_PIS(ω1, ω2) = 1 / √(w^H R^{-2} w)    (3.22)
This estimate has properties similar to EV and MUSIC. Assume again that signal
history is spanned by two subspaces, the signal subspace and the noise subspace, that
are orthogonal to each other, and also assume that the noise subspace is spanned by
the eigenvectors corresponding to the M smallest eigenvalues. Since
R^{-2} = Σ_{i=1}^{N} (1/λ_i²) v_i v_i^H    (3.23)
the signal eigenvectors are weighted less than the noise eigenvectors. EV and MUSIC
are the extreme cases of a similar weighting scheme where the signal eigenvectors are
weighted so much less than the noise eigenvectors that they are not even considered.
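With h(x) = x^{-2}, Equation (3.22) is nearly a one-liner; a NumPy sketch with an illustrative function name:

```python
import numpy as np

def pisarenko_power(R, w):
    """Pisarenko estimate P(w1, w2) = 1 / sqrt(w^H R^-2 w) of Eq. (3.22)."""
    w = w.reshape(-1, 1)
    R2inv = np.linalg.matrix_power(np.linalg.inv(R), 2)
    return 1.0 / np.sqrt(np.real(w.conj().T @ R2inv @ w)[0, 0])
```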
3.2.6 Spatially Varying Apodization
Spatially varying apodization (SVA) is the generalized version of dual apodization,
a nonlinear spatially-varying technique used to reduce sidelobes in optics. SVA can
be derived from MVM [2] but differs in several ways from the other MVM-based
methods. SVA does not superresolve an image since it does not narrow mainlobes, but
it does suppress sidelobes. SVA outputs complex-valued images instead of intensity
images. We will not derive SVA from MVM; instead we will follow the more intuitive
exposition of its creators [15]. We will also develop SVA for the 1-D joint I/Q version
and simply present the results for the other versions.
In 1-D dual apodization we suppress sidelobes through the following nonlinear
procedure:
1. Filter the original image x[n] through a rectangular window and a Hanning
window to create two different images; i.e. multiply the signal history X[k] by
the two different windows.
2. Compare the two versions of the image in the image domain. For each point in
the output image, select the complex value from the image with the minimum
magnitude.
Figure 3-4: Dual Apodization (left: rectangular window; center: Hanning window; right: apodized output; dB scale)
SVA is similar to dual apodization except an infinite set of windows is used instead
of two. The set of windows is the set of cosine-on-pedestal windows; these windows
have the form
B_a[k] = 1 + 2a cos(2πk/N),    k = 0, 1, ..., N - 1    (3.24)

b_a[n] = a δ[n + 1] + δ[n] + a δ[n - 1],    n = 0, 1, ..., N - 1    (3.25)
where 0 ≤ a ≤ 0.5, B_a[k] is the window in the signal history domain and b_a[n] is
the window in the image domain. We have assumed that the images are sampled
at the Nyquist rate and that N is both the aperture length and the discrete Fourier
transform length. A rectangular window corresponds to a = 0, and a Hanning window corresponds to a = 0.5. As with dual apodization, we filter the original image x[n] through these filters and pick the complex values which minimize the intensity of the output |y[n]|² at each point, n, in the image. The set of filter outputs is
y_a[n] = b_a[n] * x[n]    (3.26)

= a x[n + 1] + x[n] + a x[n - 1]    (3.27)

For each n, we must set y[n] to the y_a[n] with minimum magnitude. Using conventional constrained optimization techniques, we find that the solution for y[n] is

a'[n] = - Re{x*[n](x[n - 1] + x[n + 1])} / |x[n - 1] + x[n + 1]|²    (3.28)

y[n] = { x[n],                                   a'[n] < 0
         x[n] + a'[n](x[n - 1] + x[n + 1]),      0 ≤ a'[n] ≤ 0.5
         x[n] + 0.5(x[n - 1] + x[n + 1]),        a'[n] > 0.5 }    (3.29)
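Equations (3.28)-(3.29) give a simple per-sample procedure; a NumPy sketch (the function name is ours, and border samples are left untouched):

```python
import numpy as np

def sva_1d(x):
    """1-D joint I/Q SVA: clamp the optimal cosine-on-pedestal weight a'
    to [0, 0.5] at each interior sample, per Eqs. (3.28)-(3.29)."""
    y = x.astype(complex)
    for n in range(1, len(x) - 1):
        q = x[n - 1] + x[n + 1]
        if abs(q) == 0:
            continue                                    # a' undefined; keep x[n]
        a = -np.real(np.conj(x[n]) * q) / abs(q) ** 2   # unconstrained minimizer
        if a <= 0:
            y[n] = x[n]                                 # rectangular window
        elif a >= 0.5:
            y[n] = x[n] + 0.5 * q                       # Hanning window
        else:
            y[n] = x[n] + a * q                         # interior optimum
    return y
```

For a real-valued sidelobe sample the interior case drives the output exactly to zero.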
For the separate I/Q version of SVA, we perform joint I/Q SVA twice: once on the
real (in-phase) part of the image, and once on the imaginary (quadrature) part of the
image.
2-D SVA is conceptually similar to 1-D SVA but more complicated; the joint I/Q
version is, in fact, so complicated that it is not presented in the original paper [15].
For 2-D separate I/Q SVA, the set of filter outputs in Equation (3.27) becomes
y_a[n1, n2] = x[n1, n2] + a1 a2 P + a1 Q1 + a2 Q2    (3.30)

where x[n1, n2] is either the in-phase or quadrature part of the image but not both, and

Q1 = x[n1 - 1, n2] + x[n1 + 1, n2]    (3.31)

Q2 = x[n1, n2 - 1] + x[n1, n2 + 1]    (3.32)

P = x[n1 - 1, n2 - 1] + x[n1 + 1, n2 + 1] + x[n1 - 1, n2 + 1] + x[n1 + 1, n2 - 1]    (3.33)
For any given a2, y_a is a linear function of a1 and has a maximum with respect to a1 located at one of the boundary points of a1 (a1 = 0 and a1 = 0.5) and a minimum with respect to a1 located at the other boundary point. Similarly, for any given a1, y_a is a linear function of a2 and has a maximum with respect to a2 located at one of the boundary points of a2 (a2 = 0 and a2 = 0.5) and a minimum with respect to a2 located at the other boundary point. If we allow y_a to vary as a function of both a1 and a2, the maximum of y_a with respect to both a1 and a2 must be located at one of the four boundary points of (a1, a2): (a1, a2) = (0, 0), (0, 0.5), (0.5, 0), and (0.5, 0.5); and the minimum of y_a with respect to both a1 and a2 must be located at one of the remaining three boundary points. If the maximum and the minimum have opposite signs, then there is some (a1, a2) such that y_a = 0. Since y_a is a real-valued function, y_a = 0 is the smallest possible magnitude of y_a. Thus we set y = 0 whenever some y_a = 0.
Knowing this we can state the following algorithm for separate I/Q SVA:

1. Compute y_a[n1, n2] for (a1, a2) = (0, 0), (0, 0.5), (0.5, 0), and (0.5, 0.5). If the maximum and minimum have opposite signs, set y[n1, n2] = 0. This rule can be simplified if we notice that y_a[n1, n2] = x[n1, n2] when (a1, a2) = (0, 0). Noting this fact, we write the rule as: compute y_a[n1, n2] for (a1, a2) = (0, 0.5), (0.5, 0), and (0.5, 0.5); if the sign of any of the three is opposite that of x[n1, n2], set y[n1, n2] = 0.

2. Otherwise, for the four values of (a1, a2) = (0, 0), (0, 0.5), (0.5, 0), and (0.5, 0.5), set y[n1, n2] to the y_a[n1, n2] with the minimum magnitude.
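The two-step rule can be sketched per pixel in NumPy for one real channel (I or Q); the function name is illustrative and border pixels are left unchanged:

```python
import numpy as np

def sva_2d_separate_channel(x):
    """2-D separate-I/Q SVA on one real channel: evaluate the four corner
    weights (a1, a2); zero the output where the candidates straddle zero,
    otherwise keep the candidate of minimum magnitude."""
    y = x.copy()
    rows, cols = x.shape
    for n1 in range(1, rows - 1):
        for n2 in range(1, cols - 1):
            q1 = x[n1 - 1, n2] + x[n1 + 1, n2]
            q2 = x[n1, n2 - 1] + x[n1, n2 + 1]
            p = (x[n1 - 1, n2 - 1] + x[n1 + 1, n2 + 1]
                 + x[n1 - 1, n2 + 1] + x[n1 + 1, n2 - 1])
            cand = [x[n1, n2] + a1 * a2 * p + a1 * q1 + a2 * q2
                    for a1, a2 in ((0, 0), (0, 0.5), (0.5, 0), (0.5, 0.5))]
            if min(cand) < 0 < max(cand):
                y[n1, n2] = 0.0          # a zero crossing exists in (a1, a2)
            else:
                y[n1, n2] = min(cand, key=abs)
    return y
```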
3.3 Sample Images
Figure 3-5 and Figure 3-6 show 24 images of targets: eight sets of images (seven
sets of superresolved images and one set of unsuperresolved images) for each of the
targets in Figure 2-4. The unsuperresolved images are the PWF-processed, baseline
images used in the Lincoln Laboratory ATR system, prior to this thesis research.
The superresolved images correspond to the seven superresolution methods discussed
in this chapter: upsampling, MVM, EV, MUSIC, Pisarenko's method, 2-D joint-I/Q
SVA, and 2-D separate-I/Q SVA. Prior to superresolution, all of the images were
PWF-processed. After superresolution, all of the images (except the images whose
superresolution method is upsampling) are upsampled by a factor of two.
Both the figures and the images have the same format as Figure 2-4: the top
row of images shows Target 1, the middle row shows Target 2, and the bottom row
shows Target 3; each of the images is a dB intensity image normalized to have a
maximum value of 0 dB; the vertical direction corresponds to the range direction,
and the horizontal direction corresponds to the cross-range direction.
Each column of the figures contains images processed by a specific superresolution
method. From left to right, the columns of Figure 3-5 are baseline, upsampled, MVM,
and EV, respectively, and the columns of Figure 3-6 are MUSIC, Pisarenko, joint-I/Q
SVA, and separate-I/Q SVA. Descriptions for all the columns are written below.

- Since the baseline images are not upsampled, the baseline images have much
coarser resolution than their corresponding superresolved images. Since a 1 x
1 pixel in a baseline image maps to a 2 x 2 pixel box in its corresponding
superresolved image, the baseline images have been magnified by a factor of 2
using nearest neighbor interpolation.

- The upsampled images have noticeable finite-aperture induced sidelobes that
appear as bright horizontal and vertical streaks. These streaks make the obscured target very difficult to see. Also the upsampled images have considerable
speckle, even though they have been PWF-processed.

- The open targets in the MVM images are better defined than in the baseline images. The sidelobes have been eliminated, and the speckle has been considerably reduced. The obscured target is not clearly visible, but at the bottom
of the target, two point scatterers, not visible in the upsampled image, are now
visible.
- The EV images look very similar to the MVM images. The contrast between the
targets and the background clutter, however, is better, and more point scatterers
have been resolved; for example, in the background clutter of the lower half of
Target 2, at least four weak point scatterers, not visible in the MVM image, are
now visible.
" The MUSIC algorithm severely degrades the target images and preserves only
the brightest point scatterers. The contrast between the target point scatterers
and the background clutter is poor, and the edge effects of mosaicking, evidenced
by the boxes present in the images, are severe.

- The Pisarenko images look similar to the MVM and the EV images. The Pisarenko images, however, have poorer contrast between the target and the clutter
background and noticeable edge effects from mosaicking.

- The joint-I/Q SVA method has successfully eliminated the sidelobes; for example, in the image of Target 2, the sidelobes of the bright point scatterer are
no longer visible, and in the image of Target 3, the two point scatterers at the
bottom of the target are now visible. The joint-I/Q SVA imagery, however, has
poorer contrast between targets and clutter than the upsampled images.

- The separate-I/Q SVA images look very similar to the joint-I/Q SVA images.
The separate-I/Q SVA images have better contrast between the targets and the
clutter background but dimmer overall target area.
Figure 3-5: Superresolved Images (dB scale, 0 to -40 dB): Baseline, Interpolated, MVM, and EV
Figure 3-6: Superresolved Images (dB scale, 0 to -40 dB): MUSIC, Pisarenko, Joint I/Q SVA, and Separate I/Q SVA
Chapter 4
Automatic Target Recognition
4.1 Introduction
An automatic target recognition (ATR) system is a collection of computer algorithms
that scan for, detect, and recognize targets within an image. The ATR system cuts
out chips (small subsections of the image) containing possible targets and annotates
these chips with their most probable classifications. These chips either contain actual
targets or they contain clutter, i.e. other objects that are imaged, such as trees or
buildings, which are not of military interest. Target chips correctly identified as
targets are called detections, and clutter chips incorrectly identified as targets are
called false alarms. Two measures of ATR performance are PD, the probability of
detection, and PFA, the probability of false alarm; a good ATR system has high PD
and low PFA.
Figure 4-1: Flow of Data in an Automatic Target Recognition System (after [3]): Input Image → Detector (rejects imagery without potential targets) → Discriminator (rejects natural clutter false alarms) → Classifier (rejects man-made clutter) → Classified Targets
The ATR must process large amounts of data as it tries to find targets embedded
in relatively large images, so the typical SAR ATR is divided into three stages, as
shown in Figure 4-1, to help manage the flow of data through the ATR system [3]. The
detector and the discriminator act as filters, which reduce the amount of data passed
on to the succeeding stages. The detector uses a simple statistical test to locate
possible targets within a full SAR image, cuts out chips of these possible targets
and their surrounding regions, and passes the chips on to the discriminator. The
discriminator further narrows this set of potential targets through more sophisticated
tests designed to reject natural clutter false alarms (trees, grass, and other topological
features). The discriminator passes the refined set of chips on to the classifier. The
classifier tries to match each chip to a known target type, such as a truck, a howitzer,
or a particular type of tank, but declares a chip to be man-made clutter (a building,
a civilian vehicle, etc.) if it cannot be classified. Chips declared clutter are removed
from the list of targets. Human image analysts then examine the final collection of
chips and their classifications and make reports for use by military users.
4.2 The Detector
We assume that targets are brighter (have greater radar cross-sections) than their
background, so the detector scans the SAR image looking for pixels bright relative to
their background. Prior to the detector, the SAR image is converted into an intensity
dB image (x → 10 log10 |x|²). The detector then maps each pixel in the dB image
to a brightness measure known as the CFAR (constant false-alarm rate) statistic.
If the CFAR value of a pixel is greater than a set threshold, this pixel is declared a
target pixel; otherwise it is declared a clutter pixel. Targets often have multiple pixels
with CFAR values greater than the threshold, so the detector groups together pixels
which fall within some target-size area of each other [16]. After this clustering, the
detector cuts out a chip of this candidate target, centering the chip on the centroid
of the cluster of pixels, and then passes the chip on to the discriminator for further
processing.
Figure 4-2: CFAR Window (after [16]): a central test cell surrounded by a guard area (large enough to contain a target) and an outer ring of window cells
The CFAR statistic can be defined as either a one-parameter statistic or a two-parameter statistic. Figure 4-2 shows the CFAR window used to compute the CFAR statistics. The window pixels are at some specified distance from the test pixel, separated by a guard area large enough so that a target cannot appear in both the test pixel and the window pixels simultaneously. Let X_t be the dB value of the test pixel, μ_c the mean of the window (background) pixels, and σ_c the standard deviation of the window pixels. The one-parameter CFAR statistic, also known as the target-to-clutter ratio (TCR), is defined as

K_CFAR = X_t - μ_c    (4.1)
The two-parameter CFAR statistic is defined as

K_CFAR = (X_t - μ_c) / σ_c    (4.2)
Regardless of whether we use the one-parameter or the two-parameter statistic, a
pixel is declared a target pixel if its CFAR value is greater than the threshold KT
and is declared a clutter pixel otherwise. The threshold KT is chosen to achieve a
specified P_FA:

K_CFAR > K_T => TARGET, otherwise CLUTTER    (4.3)
When the clutter in the intensity dB image is Gaussian distributed (equivalently,
when the clutter in the original image is lognormal distributed), the decision rule for
the two-parameter CFAR produces a constant false-alarm rate detector, which has
a constant PFA over some unit area. Although the dB-intensity clutter is usually
not Gaussian distributed, the CFAR detector has been empirically shown to be an
effective algorithm for detecting targets in a SAR image.
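The two-parameter statistic of Equation (4.2) can be sketched as follows; the guard and window half-widths are illustrative (not the values used in the thesis), and no image-boundary handling is attempted:

```python
import numpy as np

def cfar_statistic(db_image, n1, n2, guard=5, window=7):
    """Two-parameter CFAR statistic (X_t - mean) / std for the test pixel at
    (n1, n2), computed over a square ring of window pixels outside a guard
    area; guard and window are half-widths in pixels."""
    patch = db_image[n1 - window:n1 + window + 1, n2 - window:n2 + window + 1]
    mask = np.ones(patch.shape, dtype=bool)
    g = window - guard                     # thickness of the window ring
    mask[g:-g, g:-g] = False               # exclude guard area and test pixel
    clutter = patch[mask]
    return (db_image[n1, n2] - clutter.mean()) / clutter.std()
```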
4.3 The Discriminator

4.3.1 Pattern Classifier
The discriminator classifies the chips from the detector into target chips and clutter
chips using a pattern classifier (which is not to be confused with the ATR classifier,
the third stage of the ATR). The discriminator first measures features of a chip, which
are properties effective in distinguishing targets from clutter, collects these features
into a feature vector, and then converts this vector into a set of scalar scores using
a pattern classifier. The set of features used to form the feature vector is known as
the feature set. The vector is mapped into the set of scores via a set of discriminant
functions, with one discriminant function per class; the pattern classifier then assigns
the chip to the class corresponding to the best score. Chips declared targets are passed
on to the classifier stage for further processing; chips declared clutter are removed
from further consideration.
For our research we have used the simplest classifier, the Gaussian classifier. When
both the target class and the clutter class have Gaussian-distributed features and have
known a priori probabilities, P_T and P_C, respectively, this classifier returns the maximum a posteriori probability (MAP) decision (i.e. target or clutter) for an observed
feature vector x. The derivation for the classifier's decision rule is straightforward, so
we will not show it here but will only present the results. The reader is encouraged
to look up the derivation in a basic text on statistics or pattern recognition such as
Fukunaga [17].
The general Gaussian classifier, also known as the quadratic discriminant classifier (QDC), has the decision rule

y_QDC(x) > γ => TARGET, otherwise CLUTTER    (4.4)

where

y_QDC(x) = -(1/2)[(x - m_T)^T Σ_T^{-1}(x - m_T) - (x - m_C)^T Σ_C^{-1}(x - m_C)] - (1/2) ln(|Σ_T| / |Σ_C|)    (4.5)
the scalar γ is the classifier threshold, the parameters m_T and Σ_T are the sample mean and the sample covariance, respectively, of the target class, and the parameters m_C and Σ_C are the sample mean and the sample covariance, respectively, of the clutter class. If the target class and the clutter class have equal covariance matrices (Σ = Σ_T = Σ_C), in addition to being Gaussian-distributed, we obtain a simpler discriminant function for our classifier. The corresponding decision rule is
y_LDC(x) > γ => TARGET, otherwise CLUTTER    (4.6)

where

y_LDC(x) = (m_T - m_C)^T Σ^{-1} x - (1/2)(m_T^T Σ^{-1} m_T - m_C^T Σ^{-1} m_C)    (4.7)
This classifier is known as the linear discriminant classifier (LDC) or the Fisher classifier. For both classifiers the threshold is set to PC/PT if PT and Pc are known or set
to the value that achieves a specified PFA, and the sample parameters are estimated
from the training set, a set of target vectors and clutter vectors representative of their
respective classes.
The QDC is said to have a quadratic discriminant function, because the classifier
score yQDC(x) is a quadratic function of the elements of the vector x. Similarly,
the LDC is said to have a linear discriminant function, because the classifier score
is a linear function of the elements of x. The classifier threshold of the QDC maps
to a quadratic decision boundary in the feature space (the space whose axes are
the elements of the feature vector). The quadratic decision boundary is a quadratic surface formed by constraining the elements of x to satisfy y_QDC(x) = γ. The classifier threshold of the LDC maps to a linear decision boundary. For a 2-D feature space the quadratic decision boundary is a conic section (a circle, an ellipse, or a hyperbola), and the linear decision boundary is a planar surface (a line, a plane, or, in higher dimensions, a hyperplane), which is the simplest possible decision boundary for the given space.
Because most features are usually not Gaussian distributed, other classifiers besides the Gaussian classifiers may provide better discrimination between targets and
clutter. These other classifiers can create more sophisticated (and possibly nonlinear)
decision boundaries that take better advantage of the distribution of the data samples
than do the Gaussian classifiers. We, however, use the Gaussian classifiers, because
they are simple and because any feature sets good enough for use on a Gaussian
classifier should give good performance on more sophisticated classifiers.
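A minimal NumPy sketch of the LDC of Equation (4.7); the class name is ours, and the pooled sample covariance is one reasonable choice for the shared Σ:

```python
import numpy as np

class LinearDiscriminantClassifier:
    """Gaussian classifier with shared covariance (LDC): score
    y(x) = (mT - mC)^T S^-1 x - 0.5 (mT^T S^-1 mT - mC^T S^-1 mC),
    declaring TARGET when the score exceeds the threshold gamma."""
    def fit(self, targets, clutter):
        # sample means and pooled sample covariance from the training set
        self.mT, self.mC = targets.mean(axis=0), clutter.mean(axis=0)
        centered = np.vstack([targets - self.mT, clutter - self.mC])
        self.Sinv = np.linalg.inv(np.atleast_2d(np.cov(centered.T)))
        return self

    def score(self, x):
        w = self.Sinv @ (self.mT - self.mC)
        b = 0.5 * (self.mT @ self.Sinv @ self.mT - self.mC @ self.Sinv @ self.mC)
        return x @ w - b

    def classify(self, x, gamma=0.0):
        return np.where(self.score(x) > gamma, "target", "clutter")
```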
4.3.2 ROC Curve Areas and Feature Selection
Although many features are available for discrimination, having many of these features in our feature set does not necessarily improve discrimination performance. In
fact, having too many features in a feature set can even decrease performance [18].
Thus we need some way of selecting features for a feature set and some way of comparing the discrimination performance of possible feature sets. In our research we
have used the area under the receiver operating characteristic (ROC) curve to measure discrimination performance and the forward-backward search algorithm to select
features for our feature set.
Figure 4-3: An Example of a ROC Curve (P_D on the vertical axis vs. P_FA on the horizontal axis)
The ROC curve is a plot of P_D vs. P_FA for a given classifier, where P_D is the vertical axis and P_FA is the horizontal axis. Since both P_D and P_FA are monotonically decreasing functions of the classifier threshold γ (the probability of chips, either target or clutter, whose scores are greater than γ decreases as γ increases), we trace out the ROC curve by increasing γ from -∞ to ∞. The curve has endpoints at (P_D, P_FA; γ) = (1, 1; -∞) and (0, 0; ∞) and lies within the square formed by the points (P_D, P_FA) = (0, 0), (0, 1), (1, 0) and (1, 1). Using the ROC curve, we can define two measures of performance: 1) the area under the ROC curve and 2) the minimum distance between the ROC curve and the perfect performance point (P_D, P_FA) = (1, 0). The
area under the ROC curve measures the performance of the ROC curve over all PFA
and should increase if PD increases for some PFA. We say that one feature set is
better than another if the better set has greater ROC curve area. The minimum
distance between the ROC curve and the perfect performance point measures how
close the feature set comes to achieving the ideal of high PD and low PFA; this distance
should be smaller as the feature set provides better discrimination. We do not use
the minimum distance measure in feature selection but only suggest it as a measure
of judging the quality of the ROC curves in Appendix D.
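The area measure can be computed directly from classifier scores. The sketch below is ours, not from the thesis, and the function and variable names are hypothetical; it sweeps the threshold γ over all observed scores and integrates PD over PFA with the trapezoidal rule:

```python
import numpy as np

def roc_curve_area(target_scores, clutter_scores):
    """Trace the ROC curve by sweeping the threshold gamma over all scores,
    then return the area under the curve via the trapezoidal rule."""
    gammas = np.sort(np.concatenate([target_scores, clutter_scores]))
    # P_D and P_FA are the fractions of scores exceeding gamma; both fall
    # monotonically as gamma rises, from the (1, 1) endpoint down to (0, 0).
    pd = np.array([1.0] + [(target_scores > g).mean() for g in gammas] + [0.0])
    pfa = np.array([1.0] + [(clutter_scores > g).mean() for g in gammas] + [0.0])
    # Trapezoidal area of P_D over P_FA (the points run from high P_FA to low).
    return float(np.sum((pfa[:-1] - pfa[1:]) * (pd[:-1] + pd[1:]) / 2.0))
```

Perfectly separated score lists give an area of 1.0, while identically distributed target and clutter scores give an area near 0.5; the minimum-distance measure could be obtained from the same (PFA, PD) points by minimizing (1 − PD)² + PFA².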
We now use the area under a ROC curve to help us generate (select features for)
a "best" feature set from a given list of features called the candidate set. Feature
selection algorithms use trial and error to pick the "best" feature set; each feature
search algorithm creates numerous possible "best" feature sets and tests each one
to see which has the best performance.
The particular feature search algorithm
determines which particular feature sets are tested. To test a feature set, first compute
the feature values for features in the feature set and collect these values into feature
vectors. Second, separate the vectors into two mutually exclusive sets, the training
set and the testing set. Third, train a classifier using the vectors in the training set.
Fourth, compute a ROC curve for the feature set by processing the vectors in the
testing set on the classifier. Finally, declare the feature set with the largest area under
its ROC curve to be the "best" feature set.
For our research we have used a suboptimal but quick feature search algorithm
called the forward-backward search. We have not used the optimal and most straightforward feature search algorithm of testing every possible combination of features,
because it is often computationally prohibitive. We have also not used the computationally simple search algorithm of collecting features that have good individual
performance, because this algorithm only works well for uncorrelated features [18].
The forward-backward search iteratively generates a feature set by adding one
feature per iteration to the feature set. This addition of one feature, however, is the
net result of adding two features to and subtracting one feature from the feature set.
The forward-backward search runs until all the features in the candidate feature set
have been used. We then compare the ROC curve areas at the end of each iteration
and declare the "best" feature set to be the set of features found in the generated set
at the end of the iteration with the highest ROC curve area.
The details of the forward-backward search algorithm are as follows:
1. Initialization
(a) Add the feature from the candidate set with the best individual performance to the empty generated set.
(b) Add another feature from the candidate set to the generated set. The
added feature should produce the generated set with the best performance.
Record the ROC curve area and the list of features in the generated set.
2. Iteration
(a) Add another feature from the candidate set to the generated set, the feature
that produces the generated set with the best performance.
(b) If the candidate set is empty, record the ROC curve area, record the list of features in the generated set, and go to step 3a.
(c) Again, add the feature from the candidate set to the generated set that
yields the best performance.
(d) Remove one feature from the generated set and return it to the candidate set, choosing the feature whose removal leaves the remaining set with the best performance.
Record the ROC curve area, record the list of features in the generated
set, and return to step 2a.
3. Termination
(a) Find the highest ROC curve area, and declare its corresponding set of
features as the "best" feature set.
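The steps above can be sketched as follows (our illustration; `score` is assumed to be a callable that trains and tests a classifier on a feature list and returns its ROC curve area):

```python
def forward_backward_search(candidates, score):
    """score(features) -> ROC curve area.  Returns (best_area, best_set)."""
    remaining, chosen, history = list(candidates), [], []

    def add_best():
        # Move the candidate that most improves the generated set.
        f = max(remaining, key=lambda f: score(chosen + [f]))
        remaining.remove(f)
        chosen.append(f)

    add_best()                                   # 1a: best individual feature
    add_best()                                   # 1b: best companion feature
    history.append((score(chosen), list(chosen)))
    while True:
        add_best()                               # 2a
        if not remaining:                        # 2b: candidates exhausted
            history.append((score(chosen), list(chosen)))
            break
        add_best()                               # 2c
        # 2d: drop the feature whose removal leaves the best remaining set,
        # returning it to the candidate set (net gain of one feature).
        drop = max(chosen, key=lambda f: score([g for g in chosen if g != f]))
        chosen.remove(drop)
        remaining.append(drop)
        history.append((score(chosen), list(chosen)))
    return max(history, key=lambda h: h[0])      # 3a
```

Because a dropped feature re-enters the candidate set, each full iteration still shrinks the candidate set by exactly one feature, so the search always terminates.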
4.3.3
Lincoln Laboratory ATR Features
Tree trunks are the most numerous form of clutter processed by the discriminator of the FOPEN SAR ATR, because they have high CFAR values, comparable to those of targets, and because they occur in great numbers in the forest, with hundreds or even thousands of them appearing in an image. The volume of tree trunk false alarms
passing through the ATR can potentially overwhelm the ATR, so we hope to build
feature sets that effectively discriminate targets from tree trunks. We will build our
feature sets from two categories of features: 1) geometric features and 2) polarimetric
ratio features.
Geometric Features
Geometric features may be useful in discriminating targets from clutter, because military targets and tree trunks differ in size and shape. We have a basic set of 7 geometric
features available for our research. These features do not extract information from an
entire chip but instead measure the properties of a cluster of pixels near the center
of the chip. To create this cluster, first create multiple clusters of pixels that have
values above a certain threshold, called the low threshold, and at least one pixel in
each cluster that has a value greater than the high threshold, which is greater than
the low threshold. Initialize each cluster with a seed pixel, which has a pixel value
greater than the high threshold. Build the cluster by adding the neighbors of this
original seed which have values greater than the low threshold. Add the neighbors
of these added pixels that have pixel values greater than the low threshold, and
repeat until there are no more neighbors with values greater than the low threshold.
If two clusters meet as we build clusters, combine the two clusters into one cluster.
Pick the largest cluster from the multiple clusters we have formed that has some
pixels that lie within some specified region around the center of the chip. Use this
cluster for the extraction of feature data.
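A minimal sketch of this double-threshold region growing follows (our reading of the procedure; the function name, 4-neighbor connectivity, and Euclidean center region are assumptions):

```python
import numpy as np
from collections import deque

def center_cluster(img, low, high, radius):
    """Grow clusters from seed pixels above `high`, adding 4-neighbors above
    `low` (touching clusters merge automatically, since growth fills a whole
    connected low-threshold region), then return the largest cluster with a
    pixel within `radius` of the chip center."""
    labels = np.zeros(img.shape, dtype=int)
    clusters, next_label = {}, 1
    for seed in zip(*np.nonzero(img > high)):   # seeds exceed the high threshold
        if labels[seed]:
            continue                             # already absorbed by a cluster
        members, queue = [], deque([seed])
        labels[seed] = next_label
        while queue:                             # breadth-first region growing
            r, c = queue.popleft()
            members.append((int(r), int(c)))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                        and not labels[nr, nc] and img[nr, nc] > low):
                    labels[nr, nc] = next_label
                    queue.append((nr, nc))
        clusters[next_label] = members
        next_label += 1
    cr, cc = img.shape[0] / 2, img.shape[1] / 2
    near = [m for m in clusters.values()
            if any((r - cr) ** 2 + (c - cc) ** 2 <= radius ** 2 for r, c in m)]
    return max(near, key=len) if near else []
```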
We now define the following geometric features:
1. Number of Points - the number of points in the cluster
2. Major Length - the largest Euclidean distance between any pair of cluster points
3. Minor Length - Draw a line connecting the two cluster points with the largest Euclidean distance. Find the point furthest to the right of the line and the point furthest to the left of the line. The minor length is the Euclidean distance between these two points.
4. The Ratio of the Major Length to the Minor Length
5. The Ratio of the Mean of the Points in the Cluster to the Standard Deviation
of the Points in the Cluster
6. Perimeter Length - the length of the perimeter around the cluster
7. Ratio of the Perimeter to the Number of Points
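Features 2 through 4 can be computed from the cluster coordinates as sketched below (our code under our reading of the definitions; the function name is hypothetical):

```python
import numpy as np

def length_features(cluster):
    """Major length, minor length, and their ratio for a pixel cluster
    given as a list of (row, col) coordinates."""
    pts = np.asarray(cluster, dtype=float)
    # Major length: the largest pairwise Euclidean distance.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    major = d[i, j]
    # Signed perpendicular distance of every point from the major-axis line
    # picks out the points furthest right and furthest left of the line.
    axis = pts[j] - pts[i]
    normal = np.array([-axis[1], axis[0]]) / max(major, 1e-12)
    side = (pts - pts[i]) @ normal
    minor = np.linalg.norm(pts[np.argmax(side)] - pts[np.argmin(side)])
    return major, minor, major / max(minor, 1e-12)
```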
Polarimetric Features
Phenomenological studies by Bessette et al. [8] suggest that polarimetric features
(features which measure differences between polarimetric channels) might be useful for
discrimination. For example, the phase difference between the HH and VV channels of the radar return from targets is 180°, but the phase difference between the HH and VV channels from tree trunks is 100°. This may be a useful discriminant. Also, the
difference between the magnitudes of the HH and VV channels from tree trunks is
~ 10 dB. This difference is attributed to the physical dimensions of tree trunks relative
to the radar pulse wavelength - the tree trunk is small compared to the wavelength
in the horizontal plane but large compared to the wavelength in the vertical plane.
We expect a different magnitude difference for targets, because they are longer in the
horizontal plane and smaller or roughly the same size in the vertical plane. This may
also be a useful discriminant.
Our research focuses on magnitude-comparing polarimetric features, because most
of the superresolution methods destroy phase information. These features require
superresolved chips for each of the channels compared. Each of the chips from the
various channels must be of the same region and aligned with the chips in the other
channels. The channels being compared need not all be pure channels such as HH,
HV, or VV; we can also include polarimetrically processed imagery in our set since
these are combinations of the information in the various channels. For convenience we
will now refer to a chip as the entire set of chips across all the polarimetric channels.
We measure 36 features from each chip. Each feature is computed from pixels
that fall within some specified rectangle centered on the center of the chip. The 36
features represent six groups of six features. Let (n1, n2) denote some pixel within the specified region around the center of the chip, C1[n1, n2] denote the value of the first channel at that pixel, C2[n1, n2] denote the value of the second channel at that pixel, E{·} denote the mean operator, and σ{·} denote the standard deviation operator.
Our six types of features are:
1. E{C1[n1, n2]} / E{C2[n1, n2]} - the ratio of the means
2. E{C1[n1, n2] / C2[n1, n2]} - the mean of the ratios
3. σ{C1[n1, n2]} / σ{C2[n1, n2]} - the ratio of the standard deviations
4. σ{C1[n1, n2] / C2[n1, n2]} - the standard deviation of the ratios
5. max{C1[n1, n2]} / max{C2[n1, n2]} - the ratio of the maxima
6. max{C1[n1, n2] / C2[n1, n2]} - the maximum of the ratios
These six features are measured for six pairs of channels: (C1, C2) = (HH, VV), (HH, HV), (VV, HV), (SPAN, HH), (PMF, HH), and (PWF, HH).
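The six ratios for one channel pair can be sketched as follows (our code; the feature names and the small eps guarding division by zero are our additions):

```python
import numpy as np

def polarimetric_features(c1, c2, eps=1e-12):
    """The six magnitude-comparing features for one channel pair (C1, C2);
    c1 and c2 hold the pixel values inside the centered rectangle."""
    c1, c2 = np.asarray(c1, dtype=float), np.asarray(c2, dtype=float)
    ratios = c1 / (c2 + eps)                      # pixelwise C1/C2 ratios
    return {
        "ratio_of_means":  c1.mean() / (c2.mean() + eps),
        "mean_of_ratios":  ratios.mean(),
        "ratio_of_stds":   c1.std() / (c2.std() + eps),
        "std_of_ratios":   ratios.std(),
        "ratio_of_maxima": c1.max() / (c2.max() + eps),
        "max_of_ratios":   ratios.max(),
    }
```

Note that the ratio of the means and the mean of the ratios coincide only when the two channels are proportional pixel by pixel; in general the six features carry distinct information.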
4.4
The Classifier
The classifier used in the ATR system receives the chips which have passed through the discriminator and tries to classify these into known target classes such as BMP2 and M2 (tanks) and BTR60 and BTR70 (armored personnel carriers). As an example of a classifier used for X-band SAR, we consider the Lincoln Laboratory classifier, a mean-squared error (MSE) classifier that compares each chip against template images in the ATR classifier database. (A template is an image of a target formed by averaging target images over a limited range of aspect angles (~5°), and there are multiple templates for a single target type.)
Each image, both test and template, is a dB image and is normalized by subtracting out the mean of the background clutter. The classifier computes a MSE score between the chip and each template image according to the formula

MSE = (1/N) Σ_{i=1}^{N} (T_i − t_i)²    (8)

where T_i is the ith pixel in the template image, t_i is the ith pixel of the test image, and N is the maximum number of pixels in either image. The chip is classified as the target of the template with the lowest score; if the score is above some maximum value,
the chip is declared clutter. After being classified, the chips and their classifications
are passed on to a team of image analysts.
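The scoring rule can be sketched as follows (our code; it assumes the chip and templates have already been normalized and registered to a common size, and the names are hypothetical):

```python
import numpy as np

def classify_chip(chip_db, templates, labels, clutter_threshold):
    """MSE-classify a dB-domain chip (clutter-mean already subtracted)
    against a stack of equally normalized dB templates."""
    # Mean-squared error between the chip and each template.
    scores = [float(np.mean((t - chip_db) ** 2)) for t in templates]
    best = int(np.argmin(scores))
    # A chip matching no template well enough is declared clutter.
    if scores[best] > clutter_threshold:
        return "clutter", scores[best]
    return labels[best], scores[best]
```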
4.5
Summary
Because we have surveyed a vast amount of material, we will briefly review it here to keep things organized.
" The ATR system is divided into three stages: the detector, the discriminator,
and the classifier.
" The detector receives a single, large SAR image and uses a simple statistical test
to find targets contained in the image. Chips of potential targets are extracted
from the image and passed on to the discriminator.
" The discriminator receives candidate chips from the detector. The discriminator
classifies a chip as target or clutter using a pattern classifier. Each chip is first
converted into a feature vector. The pattern classifier converts this feature
vector into a score. Any chip with a score higher than the threshold is sent to
the classifier.
" The classifier matches chips to very specific target categories.
Each chip is
compared to a set of templates through MSE scoring. The chip is matched
to the class with the lowest MSE score. If the score is greater than a certain
threshold, the chip is classified as clutter.
" The results of the ATR system are passed on to a team of image analysts who
review the results of the classifier. A final report is prepared for use by military
users.
Chapter 5
Experiments and Results
5.1
Introduction
In this chapter we discuss three experiments that were performed to investigate the
benefits of polarimetric processing methods and superresolution processing methods
on: 1) Gaussian classifiers that discriminate between obscured targets and clutter
using geometric features, 2) Gaussian classifiers that discriminate between obscured
targets and clutter using polarimetric features, and 3) Gaussian classifiers that discriminate between open targets and clutter using polarimetric features.
Prior to
performing these experiments we applied 48 different processing methods, each a
combination of a polarimetric method and a superresolution method, to FOPEN
SAR imagery to generate 48 sets of images. From these sets of images we computed the features listed in Section 4.3.3. The first six sets of images obtained were
the polarimetric baseline images: HH, HV, VV, SPAN, PMF, and PWF (the term
baseline refers to an image with no superresolution processing, regardless of the polarimetric processing method that may have been applied). The HH, HV, and VV
images were obtained directly from the SAR signal processor, and the SPAN, PMF,
and PWF images were processed from the HH, HV, and VV images. The remaining 42 sets of images were generated by superresolving each set of baseline images
with each of the seven superresolution methods discussed in Chapter 3: upsampling,
MVM, EV, MUSIC, Pisarenko, joint-I/Q SVA, and separate-I/Q SVA. The baseline
images and the superresolution algorithm parameters used in processing the baseline
images are described in Section 5.2, and the parameters used to compute the features
from the various images are described in Section 5.3.
Once the features were computed, we performed three feature selection experiments; for each experiment we modified the forward-backward search algorithm according to the changes described in Section 5.4. In the first experiment we generated
"best" feature sets of geometric features for classifiers that discriminate between obscured targets and clutter. The classifiers were trained using obscured target feature
vectors and clutter feature vectors. A "best" feature set was generated for each type
of classifier (both linear and quadratic) for each of the 48 sets of images. The performance of these "best" feature sets (and their superresolution methods) was compared
using ROC curves.
In the second experiment we generated "best" feature sets for classifiers that
discriminate between obscured targets and clutter using polarimetric features. The
classifiers were trained using obscured target feature vectors and clutter feature vectors. Since polarimetric features are defined in terms of pairs of polarimetric channels,
we did not generate "best" feature sets for each of the 48 processing methods. Instead
we generated eight "best" feature sets for each superresolution method (including the
baseline) and for each type of classifier. The eight "best" feature sets were generated
using eight different candidate feature sets to initialize the feature search algorithm.
Recall that our 36 polarimetric features are six sets of six features, with one set per
pair of channels. The candidate feature sets do not necessarily contain all 36 features
but only the sets of features whose channel pairs are marked with an "x" in Table 5.1.
Again the performance of the "best" feature sets was compared using ROC curves.
In the third experiment we generated "best" feature sets for classifiers that discriminate between open targets and clutter using polarimetric features. For the first
part of the experiment, we generated "best" feature sets as we did in the previous
experiment except that we trained on open targets and clutter instead of obscured
targets and clutter. For the second part of the experiment, we used the "best" feature sets of the second experiment to generate classifiers trained on obscured targets
NAME          A   B   C   D   E   F   G   H
(HH, VV)      x   x   x   x   x   x   x   x
(HH, HV)      x   x   x   x   x   x   x   x
(VV, HV)      x   x   x   x   x   x   x   x
(SPAN, HH)        x   x   x               x
(PMF, HH)             x       x       x   x
(PWF, HH)                 x       x   x   x

Table 5.1: Candidate Feature Sets for Polarimetric Features
and clutter, and we tested the performance of these classifiers on open targets. We
wanted the classifiers trained only on obscured targets to also work well against open
targets, because this would eliminate the computational cost of training and using
two different classifiers. The "best" feature sets of the first part and the second part
of the experiment were compared using ROC curve areas.
5.2
Parameters for Image Processing

5.2.1
FOPEN SAR Imagery
The FOPEN SAR imagery to which the processing methods were applied was collected during a 1995 phenomenology experiment at Grayling, MI, by a U.S. Navy
P-3 Ultra-Wideband (UWB) SAR [8], which is shown in Figure 5-1. The antenna
of the SAR was oriented at a 45° depression angle, and the parameters of the SAR
are listed in Table 5.2. The test sites at Grayling (see Figure 5-2) contained targets
(tanks, armored personnel carriers, and military trucks) deployed within open areas
of grassy terrain and under the forest canopy of deciduous trees and coniferous trees.
The SAR sensor produced full-sized images covering 0.963 km in range x 1.638 km
in cross-range and sampled at 0.235 m/pixel in range x 0.4 m/pixel in cross-range.
Figure 5-1: The P-3 SAR

PARAMETER                          VALUE
Frequency Regime                   215 - 900 MHz
Transmitted Bandwidth              509 MHz
Processed Bandwidth                509 MHz
Waveform                           Linear FM; dechirp on receive
Range Resolution                   0.33 m
Cross-Range Resolution             0.66 m
Recorded Swath Width               930 m
Center Squint Angle (Broadside)    90°
Depression Angle                   15° - 60°
Standoff Range                     6 - 14 km
Aircraft Ground Speed              135 m/s (nominal)
Polarizations                      VV, VH, HV, HH

Table 5.2: Parameters for the P-3 UWB SAR [4]

Figure 5-3 shows an example of a full-sized image that contains open targets. 224 chips of open targets, 121 chips of obscured targets, and 500 chips of clutter false alarms, all of size 200 pixels in range x 100 pixels in cross-range, were extracted from these full-sized images. The target chips were extracted from several different full-sized images
and were located within the images using their recorded geographic locations. The
clutter chips were extracted from a single full-sized image and were selected because
they had the highest CFAR values for clutter in that image.
Figure 5-2: A Test Site at Grayling, MI
5.2.2
Polarimetric Processing
The PMF images were processed using the weight vector calculated directly from the target and clutter polarization covariances with the polarimetric matched filter design equations from [10]:
w = [ 0.0541 + j0.2903 ]
    [ 0.2113 + j0.1937 ]    (5.1)
    [ 0.9114           ]
Figure 5-3: An Example of a Full-Sized SAR Image

The PWF images that were not superresolved with SVA were processed using the clutter covariance matrix

Σ_C = 0.0799 × [ 1.0000              0.0378 - j0.0008    0.0338 - j0.1098
                 0.0378 + j0.0008    0.2874             -0.0037 - j0.0191    (5.2)
                 0.0338 + j0.1098   -0.0037 + j0.0191    0.5179           ]
which was obtained by cutting out chips of forested regions from a full-sized image
and by averaging the covariance matrices of all of the pixels in each of these chips.
The PWF images that were superresolved with joint-I/Q SVA were processed using
the clutter covariance matrix
Σ_C,SVA-J = 0.01084 × [ 1.0000              0.0120 - j0.0073    0.0254 - j0.0073
                        0.0120 + j0.0073    0.2372             -0.0130 - j0.0079    (5.3)
                        0.0254 + j0.0073   -0.0130 + j0.0079    0.5172           ]
and the PWF images that were superresolved with separate-I/Q SVA were processed
using the clutter covariance matrix

Σ_C,SVA-S = 0.005927 × [ 1.0000              0.0148 - j0.0065    0.0259 - j0.0134
                         0.0148 + j0.0065    0.2379             -0.0130 - j0.0081    (5.4)
                         0.0259 + j0.0134   -0.0130 + j0.0081    0.4834           ]
The PWF covariance matrices for the SVA images were obtained by superresolving
the chips used to compute the original PWF covariance matrix and by averaging the
covariance matrices of all of the pixels in each of these superresolved chips.
Notice that the matrix entries Σ1,2, Σ2,1, Σ2,3, and Σ3,2 in all of the PWF covariance matrices are nonzero, although they were assumed to be identically zero in Section 2.2.1. The zero assumption, however, is still valid, because Σ1,2, Σ2,1, Σ2,3, and Σ3,2 have magnitudes small enough relative to the diagonal matrix entries.
5.2.3
Superresolution
The superresolved chips were generated using the following parameters:

• The upsample factor for bandlimited interpolation is k = 2.

• For MVM, EV, MUSIC, and Pisarenko, the size of the mosaicked subimages Kx × Ky is 10 × 10 pixels, and the subaperture size kx × ky used for the estimate of the full aperture covariance matrix is 5 × 5. The subaperture size was chosen to satisfy DeGraaf's constraint [2] of 0.4 ≤ kx/Kx ≤ 0.5; for our parameters kx/Kx = 0.5.

• For EV and MUSIC, the combined signal and noise subspace is spanned by 25 eigenvectors, because the rank kxky of the aperture covariance matrix is 25. The signal eigenvectors for EV are the two eigenvectors corresponding to the two largest eigenvalues, and the signal eigenvectors for MUSIC are the ten eigenvectors corresponding to the ten largest eigenvalues.
After superresolution processing, all of the target and clutter chips (except those whose superresolution method is upsampling) were upsampled by a factor of two to form chips with size 400 pixels in range x 200 pixels in cross-range.
5.3
Computation of Features

5.3.1
Geometric Features
The geometric features of each chip were computed from the dB image of the chip.
The search region for the clustering algorithm is a circle of radius 8 pixels concentric
with the chip. The low threshold and the high threshold for each chip were specified
using the mean m of the pixel values in the entire chip and the standard deviation σ of the pixel values in the entire chip. The high thresholds were chosen so that for each set of images 95% or more of the target chips would have at least one pixel greater than the high threshold. Chips without a pixel greater than the high threshold were removed from our experiments. For the baseline, upsampled, MVM, EV, MUSIC, and Pisarenko images, the low threshold was m + 1.5σ and the high threshold was m + 2σ. For the joint-I/Q SVA and the separate-I/Q SVA images, the low threshold was m + 1.5σ and the high threshold was m + 1.75σ.
5.3.2
Polarimetric Features
The polarimetric features for each chip were computed from the magnitude-squared,
non-dB image of the chip. For the baseline images, the polarimetric features were
computed from an 11 x 11 pixel region, which has a real physical size of 2.5850 m in
range x 4.4 m in cross-range. For the superresolved images, the polarimetric features
were computed from a 21 x 21 pixel region, which has a real physical size of 2.4675
m in range x 4.2 m in cross-range.
5.4
The Modified Feature Selection Algorithms
In our research, we discovered two problems with the forward-backward feature search: 1) the mean values of different features are often separated by many orders of magnitude, and this separation can cause computational problems for the classifier, especially when inverting a covariance matrix; 2) the search algorithm often produces unstable feature sets, which work well only when the classifier is trained on the same data used to generate the feature sets. Section 5.4.1 addresses our solution to the first problem. Sections 5.4.2, 5.4.3, and 5.4.4 address our solutions to the second problem.
5.4.1
Feature Normalization
To eliminate orders of magnitude differences, we normalize the values for each feature
prior to feature selection as follows: compute the mean and the standard deviation
of each feature over the sample space that includes both targets and clutter in both
the training sets and the testing sets. For each feature subtract out the mean from
each sample value of the feature and divide these samples by the standard deviation.
This forces the mean and the standard deviation of each feature, over all the samples
in the training and testing sets, to be 0 and 1, respectively. We record the mean and
the standard deviation for each feature to normalize any new testing data that have
yet to be normalized.
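A sketch of this normalization step (our code; the guard against zero-variance features is our addition):

```python
import numpy as np

def fit_normalizer(samples):
    """samples: rows are chips, columns are features, pooled over targets
    and clutter in both the training and testing sets.  Returns the
    recorded per-feature (mean, std) used to normalize any new data."""
    return samples.mean(axis=0), samples.std(axis=0)

def normalize(samples, mean, std):
    # Zero-mean, unit-standard-deviation per feature; a constant feature
    # (std of 0) is left centered rather than divided by zero.
    return (samples - mean) / np.where(std > 0, std, 1.0)
```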
5.4.2
Geometric Features
To generate stable feature sets of geometric features, we have modified the forward-backward search algorithm as follows: run the forward-backward search algorithm
30 times, using a different training set of randomly selected samples each time, to
generate 30 possible "best" feature sets. Use each feature set to form 100 different
classifiers, each classifier using a different training set of randomly selected feature
vectors. Compute the ROC curve area for each of the classifiers using the remaining
feature vectors as the testing set. Calculate the mean of the ROC curve areas for
each feature set over all 100 classifiers. From the five feature sets with the highest
mean ROC curve areas, pick the feature set with the smallest number of features as
the "best" feature set. We introduced this smallest size constraint, because many
of the feature sets with the highest mean ROC curve areas differ only slightly (see
Appendix E for exact details) and because having fewer features speeds up the ATR
system.
The training set used for each classifier contained 50% of the obscured targets (61
obscured targets) and 50% of the clutter chips (250 clutter chips). The training set
samples were randomly chosen from the complete set of obscured targets and clutter
chips. The testing set used for each classifier contained the remaining 50% of the
obscured targets (60 obscured targets) and the remaining 50% of the clutter chips
(250 clutter chips).
5.4.3
Polarimetric Features: Trained on Open Targets
For the polarimetric classifiers trained on open targets, we used the modified forward-backward algorithm for geometric features. Each training set contained 50% of the
open targets (112 open targets) and 50% of the clutter chips (250 clutter chips). The
training samples were randomly chosen from the complete set of open targets and
clutter chips. Each testing set contained the remaining 50% of the open targets (112
open targets) and the remaining 50% of the clutter chips (250 clutter chips).
5.4.4
Polarimetric Features: Trained on Obscured Targets
For the polarimetric classifiers trained on obscured targets, we further modified the
forward-backward feature search algorithm to select "best" feature sets that perform
well for both obscured targets and open targets. The new algorithm is: as before,
run the forward-backward search algorithm 30 times to generate 30 possible "best"
feature sets. Test each of the feature sets on 100 different classifiers using testing sets
of obscured targets and clutter and compute the mean ROC curve areas for each of
the feature sets. Perform a second test on the 100 classifiers now using testing sets
of open targets and clutter, and again compute the mean ROC curve area for each of
the feature sets. Add the two mean ROC curve areas together for each feature set.
From the five feature sets with the largest sum of mean ROC curve areas, pick the
feature set with the smallest number of features as the "best" feature set.
Each training set contained 50% of all the obscured targets (61 obscured targets)
and 50% of the clutter chips (250 clutter chips).
The training set samples were
randomly selected from the complete set of obscured targets and clutter chips. The
obscured target testing set contained the remaining 50% of the obscured targets (60
targets) and the remaining 50% of the clutter false alarms (250 false alarms). The
open target testing set contained all of the open targets (224 open targets) and the
clutter chips used for the obscured target testing set (250 clutter chips).
5.5
Results
This section presents a brief analysis of the experiment data using plots of mean
ROC curve areas, which are also included in Appendix A. The following Appendices
provide details about other information from the experiments: Appendix B contains
the "best" feature sets generated by our experiments. Appendix C contains charts
of the means of the ROC curve areas and the standard deviations of the ROC curve
areas for the "best" feature sets. Appendix D contains plots of sample ROC curves
for the "best" feature sets.
5.5.1
Geometric Features
Figure 5-4 and Figure 5-5 show the mean ROC curve areas of the "best" feature sets
of geometric features. Each curve in either figure is the mean ROC curve area of a
superresolution method as a function of polarimetric "channel". Figure 5-4 displays
the mean ROC curve areas for the linear classifier, and Figure 5-5 displays the mean
ROC curve areas for the quadratic classifier. Figures A-3 through A-8 redisplay the
data in Figures 5-4 and 5-5 to clarify the shapes of the curves. From these figures we
observe the following trends about geometric features:
• The quadratic classifier provides about the same performance as the linear classifier: the curves in the quadratic classifier plot and the curves in the linear classifier plot are quite similar. Assuming that the features are Gaussian distributed and that the quadratic classifier uses the same features as the linear classifier, the similarity of the two plots implies that the target class and the clutter class have approximately equal feature covariance matrices and that they differ only in their means. As seen in Appendix B, however, a best feature set for the quadratic classifier shares some features with the best feature set for the linear classifier, but it does not share all of its features. The hypothesis of equal covariance matrices can still hold if the features common to both feature sets are more important to discrimination than the features unique to the two different feature sets. The hypothesis can be checked easily by examining the statistics of the feature values.
• Many curves have peaks at PMF and VV. Since PMF is usually the higher peak, PMF is the best polarimetric "channel" to use with geometric features for superresolution methods, and VV is the second best polarimetric "channel".
" MUSIC is, by far, the worst superresolution method for geometric features: the
curves of MUSIC are far below those of the other superresolution methods.
We believe that MUSIC is the worst superresolution method, because MUSIC
severely degrades the shapes of the objects that it superresolves, reducing both
targets and tree trunks to collections of their brightest point scatterers (see
Figure 3-6). For a single target or clutter object, the clustering algorithm will
select one of these point scatterers as the cluster from which geometric features
will be computed.
Since a point scatterer from a target looks similar to a
point scatterer from a tree trunk, the classifier will not discriminate effectively
between the two.
" Superresolution does not greatly improve discrimination over baseline processing for geometric features: baseline processing is one of the best methods for
each polarimetric "channel" and is also part of the processing combination of
baseline and PMF, which has the highest mean ROC area for either the linear
classifier or the quadratic classifier.
Figure 5-4: Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier
Figure 5-5: Plot of Means of ROC Curve Areas: Geometric Features: Quadratic Classifier
5.5.2
Polarimetric Features: Tested on Obscured Targets
Figure 5-6 and Figure 5-7 show the mean ROC curve areas of the "best" feature
sets of polarimetric features used to discriminate obscured targets from clutter. Each
curve in either figure is the mean ROC curve area of a superresolution method as a function of candidate feature set. Figure 5-6 displays the mean ROC curve areas
for the linear classifier, and Figure 5-7 displays the mean ROC curve areas for the
quadratic classifier. To aid the reader, Figures A-11 through A-16 redisplay the data
in Figures 5-6 and 5-7 in Appendix A. From the figures we observe the following
trends:
" Polarimetric features provide better discrimination performance than geometric
features: many mean ROC curve areas for the best feature sets selected from
candidate feature sets D, F, G, and H are greater than 0.8758, which is the best
mean ROC curve area obtained using geometric features.
" The quadratic classifier again only provides about the same performance as the
linear classifier: the curves in the quadratic classifier plot and the curves in the
linear classifier plot are similar, possibly because the target class and the clutter
have equal covariance matrices.
" Features from the channel pair (SPAN, HH) do little to improve discrimination:
for many of the superresolution methods, the mean ROC curve areas for the
best feature sets selected from candidate feature sets C, D, and G are approximately equal to the mean ROC curve areas for the best feature sets selected
from candidate feature sets E, F, and H, respectively.
" Features from the channel pairs (PMF, HH) and (PWF, HH) provide improved
discrimination over features from the channel pairs of candidate feature set B;
for example, the best feature sets selected from candidate feature sets C and
D often have higher mean ROC curve areas than the best feature sets selected
from candidate feature sets A and B. The features from (PWF, HH) seem to
offer better improvement than the features from (PMF, HH), as the mean ROC
curve areas of the best feature sets selected from candidate feature sets D and
F are higher than the mean ROC curve areas of the best feature sets selected
from candidate feature sets C and E, respectively. Since the feature sets selected
from candidate feature sets G and H have the highest mean ROC curve areas,
the best discrimination is obtained when features from (PMF, HH) and (PWF,
HH) are used together.
" MUSIC is the worst superresolution method for discriminating obscured targets
from clutter using polarimetric features: the curves for MUSIC are below those
of all other superresolution methods. Recall that the polarimetric features of
an object are computed from the pixels that fall within a specified area (a rectangle) within the chip of the object and that MUSIC reduces an object that
it superresolves to a collection of its brightest point scatterers while suppressing the rest of the object. Because of this suppression, fewer pixels within the
rectangle will differ for targets and clutter, which, we believe, results in poor
discrimination of targets from clutter in MUSIC images. This discrimination,
however, is still remarkable when compared with the best discrimination performance of any classifier using geometric features. The highest mean ROC curve
area for MUSIC (0.8520 for the feature set selected from candidate feature set
G for the quadratic classifier) is comparable to the highest mean ROC curve
area for geometric features (0.8758 for PMF/baseline processing for the linear
classifier). This effectiveness of polarimetric features even with MUSIC is further evidence that polarimetric features are more useful than geometric features
in discrimination.
* EV is the best superresolution method for discriminating obscured targets from
clutter using polarimetric features: it has the highest mean ROC curve area for
many of the candidate feature sets, including the best candidate feature sets,
which have the highest overall mean ROC curve areas, and when it does not
have the highest mean ROC curve area for these best candidate feature sets, it
has a very close, second-best mean ROC curve area.
* When maximum polarimetric information is available to the feature selection
algorithm (the feature selection is free to choose features from all of the features
of candidate feature set H) superresolution does not greatly improve discrimination over baseline processing: the highest mean ROC curve area for both the
linear classifier and the quadratic classifier (i.e., the mean ROC curve area of
Pisarenko's method for the linear classifier) is only marginally better than the
mean ROC curve area of baseline processing on the quadratic classifier. Also, a
significant disadvantage of using superresolution for the candidate feature set H
is the requirement for the superresolution processing of seven images (one each
for HH, HV, VV, and PMF, and three for PWF).
* When limited polarimetric information is available to the feature selection algorithm (when we cannot use either PMF or PWF, because both the target
covariance matrix and the clutter covariance matrix are unknown), superresolution does improve discrimination over baseline processing: baseline processing
is one of the worst methods for candidate feature sets A and B, for both the
linear classifier and the quadratic classifier. Since baseline processing is one of
the better methods for candidate feature sets E and F on the quadratic classifier, we can see that superresolution only marginally improves discrimination
when either (PMF, HH) features or (PWF, HH) features are used.
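The earlier observation that the quadratic classifier offers little over the linear classifier is consistent with Gaussian discriminant analysis: when the two classes share a covariance matrix, the quadratic and log-determinant terms cancel from the score difference, leaving a linear rule. A minimal numerical check of this cancellation, using illustrative means and a covariance matrix rather than the thesis data:

```python
import numpy as np

def gaussian_discriminant(x, mean, cov):
    """Per-class Gaussian log-likelihood score (terms common to
    both classes are dropped)."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d)) - 0.5 * logdet

# Illustrative means and a shared covariance (not the thesis data).
cov = np.array([[2.0, 0.3],
                [0.3, 1.0]])
mu_t = np.array([1.0, 0.0])    # "target" class mean
mu_c = np.array([-1.0, 0.0])   # "clutter" class mean

# With a shared covariance C, the score difference reduces to
#   g(x) = (mu_t - mu_c)^T C^{-1} x - 0.5 (mu_t^T C^{-1} mu_t - mu_c^T C^{-1} mu_c)
w = np.linalg.solve(cov, mu_t - mu_c)
b = -0.5 * (mu_t @ np.linalg.solve(cov, mu_t) - mu_c @ np.linalg.solve(cov, mu_c))

rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.normal(size=2)
    quad = gaussian_discriminant(x, mu_t, cov) - gaussian_discriminant(x, mu_c, cov)
    assert abs(quad - (w @ x + b)) < 1e-9   # identical decision statistic
```

With unequal class covariances the quadratic term survives, which is the situation in which the quadratic classifier can genuinely differ from the linear one.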
Figure 5-6: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier
Figure 5-7: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier
5.5.3
Polarimetric Features: Tested on Open Targets
Trained on Open Targets
Figure 5-8 and Figure 5-9 show the mean ROC curve areas of the "best" feature sets of
polarimetric features used to discriminate open targets from clutter. The classifiers in
this section have been trained on open targets and clutter. Each curve in either figure
is the mean ROC curve area of a superresolution method as a function of the candidate
feature set. Figure 5-8 displays the mean ROC curve areas for the linear classifier, and
Figure 5-9 displays the mean ROC curve areas for the quadratic classifier. To aid the
reader, the data in both figures have been redisplayed in Appendix A in Figures A-19
through A-24. From the results shown in the figures we observe the following trends:
" Polarimetric features provide better discrimination of open targets from clutter
than of obscured targets from clutter. For example, most of the mean ROC
curve areas are greater than 0.9473, which is the highest mean ROC curve
area obtained for classifiers that discriminate between obscured targets and
clutter (compare Figures 5-8 and 5-9 to Figures 5-6 and 5-7). We conjecture
that the discrimination of open targets is better than the discrimination of obscured targets, because the differences in polarimetric features between targets
and clutter, already very useful in discriminating between obscured targets and
clutter, are further enhanced by removing the electromagnetic distortion that
comes from the forest canopy.
" The quadratic classifier performs noticeably better than the linear classifier for
the best feature sets selected from candidate feature sets A, D, and F, for some
of the superresolution methods.
" MUSIC is by far the worst superresolution method for discriminating open
targets from clutter using polarimetric features.
" Pisarenko's method is the best superresolution method for discriminating open
targets from clutter using polarimetric features: if it does not have the highest
mean ROC curve area for a particular candidate feature set, it has the second
highest mean ROC curve area.
" When maximum polarimetric information is available to the feature selection
algorithm (when the candidate feature set H is used), superresolution does not
improve discrimination over baseline processing: the mean ROC curve area of
baseline processing for the quadratic classifier is the highest mean ROC curve
area for both the linear classifier and the quadratic classifier. Again, superresolution significantly increases the computational burden without any benefit to
discrimination performance.
" When limited polarimetric information is available to the feature selection algorithm (when we cannot use PWF because the clutter covariance matrix is unknown), superresolution does improve discrimination compared with the baseline: the baseline processing is one of the worst methods for candidate feature
sets A, B, C, D, and E, for both the linear classifier and the quadratic classifier (baseline processing is one of the best methods for candidate feature set
F for the quadratic classifier, so superresolution does improve discrimination
compared with the baseline when PWF is used).
Trained on Obscured Targets
Figure 5-10 and Figure 5-11 show the mean ROC curve areas as functions of their
candidate feature sets. The mean ROC curve areas for the linear classifier are displayed in Figure 5-10, and the mean ROC curve areas for the quadratic classifier are
displayed in Figure 5-11. The data has been replotted in Figures A-27 through A-32
to aid the reader.
The performance of the classifiers that are trained on obscured targets and clutter,
but are used to discriminate open targets from clutter is very erratic with respect
to superresolution methods and candidate feature sets: sometimes these classifiers
outperform the classifiers that are trained on open targets and clutter, and sometimes
these classifiers perform much worse. The classifiers trained on obscured targets tend
to perform better for best feature sets selected from larger candidate feature sets
(candidate feature sets with more polarimetric information) than for best feature sets
selected from smaller candidate feature sets. The erratic behavior of the curves shows
the extra difficulty of selecting training samples for a classifier that is to work well
against both obscured targets and open targets. Perhaps if the modified forward-backward search algorithm had generated more feature sets from which we could
select, we would have found a "best" feature set that was robust enough to use, i.e.
one whose classifier performance was not highly dependent on the choice of training
data.
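The modified forward-backward search used in this work is not reproduced here, but a generic sequential forward selection with backward elimination steps might be sketched as follows, with a stand-in `score` function where the classifier's mean ROC curve area would actually be used:

```python
def forward_backward_search(features, score, max_size):
    """Generic forward selection with backward elimination (a simplified
    sketch, not the thesis's modified algorithm). `score` maps a tuple of
    feature indices to a goodness value, e.g. mean ROC curve area."""
    selected = []
    best_score = float("-inf")
    while len(selected) < max_size:
        candidates = [f for f in features if f not in selected]
        if not candidates:
            break
        # Forward step: add the single feature that helps most.
        f_best = max(candidates, key=lambda f: score(tuple(selected + [f])))
        if score(tuple(selected + [f_best])) <= best_score:
            break                      # no forward improvement: stop
        selected.append(f_best)
        # Backward step: drop any feature whose removal improves the score.
        improved = True
        while improved and len(selected) > 1:
            improved = False
            for f in list(selected):
                reduced = tuple(x for x in selected if x != f)
                if score(reduced) > score(tuple(selected)):
                    selected.remove(f)
                    improved = True
        best_score = score(tuple(selected))
    return tuple(selected)

# Toy separability score: rewards features in {0, 2}, penalizes the rest.
target = {0, 2}
toy_score = lambda s: len(target & set(s)) - 0.1 * len(set(s) - target)
print(forward_backward_search(range(5), toy_score, max_size=3))  # -> (0, 2)
```

Generating and retaining more candidate subsets from such a search (rather than a single "best" set per size) is one way to obtain the larger pool of feature sets the paragraph above suggests.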
Figure 5-8: Plot of Means of ROC Curve Areas: Polarimetric Features: Trained on
Open Targets: Linear Classifier
Figure 5-9: Plot of Means of ROC Curve Areas: Polarimetric Features: Trained on
Open Targets: Quadratic Classifier
Figure 5-10: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier
Figure 5-11: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier
Chapter 6
Conclusions and Recommendations
In this thesis we investigated the application of modern spectral estimation methods to
FOPEN SAR imagery with the goal of improving the ability to discriminate targets
from clutter. Using a limited database of 121 obscured targets, 224 open targets,
and 500 clutter false alarms taken from fully polarimetric FOPEN SAR imagery
gathered at Grayling, MI, we investigated various combinations of superresolution
and polarimetric processing methods and evaluated the ability to discriminate targets
from clutter. Our findings are summarized as follows:
1. Polarimetric ratio features provided better discrimination of targets from clutter
than geometric features.
2. Using only geometric features, superresolution did not greatly improve the discrimination of obscured targets from clutter. We do not know the impact of
superresolution processing on the discrimination of open targets from clutter
using geometric features, because we did not investigate it.
3. In the absence of complete polarimetric information, superresolution processing
can significantly improve the discrimination of targets (either obscured or open)
from clutter using polarimetric features.
4. The performance of the quadratic classifier was only marginally better than
that of the linear classifier.
5. MUSIC is, by far, the worst superresolution method for either geometric features
or polarimetric features.
We recommend the following suggestions for further research:
1. Continue to investigate the use of polarimetric features and polarimetric processing methods for discriminating targets from clutter, because polarimetric
features are more effective than geometric features in discrimination. Also, since
we have seen that superresolved images provide better discrimination performance than baseline images for some polarimetric candidate feature sets, the
use of other types of polarimetric features (e.g. polarimetric entropy, etc.) or
methods may improve overall performance and give superresolution additional
advantages over baseline processing.
2. Study in greater depth the results of our three experiments. We have processed
a small number of images using 48 different combinations of methods. Each processing method we investigated is, in itself, very complex, having complicated
algorithms and requiring that many parameters be selected and optimized. We
have not had the time to study the selection and optimization of these algorithm parameters. The features in the "best" feature sets and their feature
values should be analyzed and then be compared across the processing combinations. This will help us understand if we can adjust the parameters of the
superresolution methods and the polarimetric processing methods to further
improve discrimination.
3. Finally, conduct additional studies using a larger, statistically significant
database of targets and clutter in order to validate our preliminary findings.
Appendix A
Plots of Mean ROC Curve Areas
This appendix contains plots of the mean ROC curve areas listed in Appendix C.
Figures A-1, A-2, A-9, A-10, A-17, A-18, A-25, and A-26 have already been shown
in Chapter 5. The remaining figures show the data in the previous figures with less
clutter and greater clarity.
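All of the quantities plotted in this appendix are means of ROC curve areas. For reference, one common way to compute a single ROC curve area from discriminant scores is a threshold sweep followed by trapezoidal integration (a simplified sketch; tied scores are not specially grouped here):

```python
import numpy as np

def roc_curve_area(target_scores, clutter_scores):
    """Area under the ROC curve (probability of detection vs probability
    of false alarm), sweeping the threshold over all observed scores."""
    scores = np.concatenate([target_scores, clutter_scores])
    labels = np.concatenate([np.ones(len(target_scores)),
                             np.zeros(len(clutter_scores))])
    order = np.argsort(-scores)            # descending threshold sweep
    tp = np.cumsum(labels[order])          # detections at each threshold
    fp = np.cumsum(1 - labels[order])      # false alarms at each threshold
    pd = np.concatenate([[0.0], tp / len(target_scores)])
    pfa = np.concatenate([[0.0], fp / len(clutter_scores)])
    # Trapezoidal rule over the (pfa, pd) curve.
    return np.sum(np.diff(pfa) * (pd[1:] + pd[:-1]) / 2.0)

# Perfectly separated scores give area 1.0.
print(roc_curve_area(np.array([3.0, 4.0]), np.array([1.0, 2.0])))  # -> 1.0
```

An area of 1.0 indicates perfect separation of targets from clutter, while 0.5 corresponds to chance-level discrimination.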
A.1
Geometric Features
Figure A-1: Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier
Figure A-2: Plot of Means of ROC Curve Areas: Geometric Features: Quadratic
Classifier
Figure A-3: Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: Baseline and Upsampled
Figure A-4: Plot of Means of ROC Curve Areas: Geometric Features: Quadratic
Classifier: Baseline and Upsampled
Figure A-5: Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: MVM, EV, and Pisarenko
Figure A-6: Plot of Means of ROC Curve Areas: Geometric Features: Quadratic
Classifier: MVM, EV, and Pisarenko
Figure A-7: Plot of Means of ROC Curve Areas: Geometric Features: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA
Figure A-8: Plot of Means of ROC Curve Areas: Geometric Features: Quadratic
Classifier: Joint-I/Q SVA and Separate-I/Q SVA
A.2
Polarimetric Features: Tested on Obscured Targets
Figure A-9: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier for Obscured Targets
Figure A-10: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier for Obscured Targets
Figure A-11: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier for Obscured Targets:
Baseline and Upsampled
Figure A-12: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier for Obscured Targets:
Baseline and Upsampled
Figure A-13: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier for Obscured Targets:
MVM, EV, and Pisarenko
Figure A-14: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier for Obscured Targets:
MVM, EV, and Pisarenko
Figure A-15: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier for Obscured Targets: Joint-I/Q SVA and Separate-I/Q SVA
Figure A-16: Plot of Means of ROC Curve Areas for Obscured Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier for Obscured Targets:
Joint-I/Q SVA and Separate-I/Q SVA
A.3
Polarimetric Features: Tested on Open Targets
A.3.1
Trained on Open Targets
Figure A-17: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Linear Classifier
Figure A-18: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Quadratic Classifier
Figure A-19: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Linear Classifier: Baseline and Upsampled
Figure A-20: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Quadratic Classifier: Baseline and Upsampled
Figure A-21: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Linear Classifier: MVM, EV, and Pisarenko
Figure A-22: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Quadratic Classifier: MVM, EV, and Pisarenko
Figure A-23: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Open Targets: Linear Classifier: Joint-I/Q SVA and Separate-I/Q SVA
Figure A-24: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: Joint-I/Q SVA and
Separate-I/Q SVA
A.3.2
Trained on Obscured Targets
Figure A-25: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier
Figure A-26: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier
Figure A-27: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier: Baseline and Upsampled
Figure A-28: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: Baseline and Upsampled
Figure A-29: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier: MVM, EV, and Pisarenko
Figure A-30: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: MVM, EV, and Pisarenko
Figure A-31: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier: Joint-I/Q SVA and
Separate-I/Q SVA
Figure A-32: Plot of Means of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier: Joint-I/Q SVA and
Separate-I/Q SVA
Appendix B
Best Feature Sets
This appendix lists the best feature sets generated by the three experiments described
in Chapter 5. Tables B.2 and B.3 list the best feature sets of geometric features for
the linear classifier and the quadratic classifier, respectively. Each row in the two
tables corresponds to a best feature set. Each row in the tables contains the best
feature set for the processing combination of the polarimetric "channel" entry and
the superresolution method entry in that row. The features in the feature set are
marked by "x", and the names of these features are listed in Table B.1.
Tables B.5 through B.36 list the best feature sets of polarimetric features. Each
table contains the best feature set of the candidate feature set whose name appears
in the section heading immediately preceding the table. The features in the feature
set are marked by "x", and the names of these features are listed in Table B.4.
Tables B.5 through B.12 list the best feature sets of polarimetric features for linear
classifiers trained on obscured targets and clutter; Tables B.13 through B.20 list the
best feature sets of polarimetric features for quadratic classifiers trained on obscured
targets and clutter; Tables B.21 through B.28 list the best feature sets of polarimetric
features for linear classifiers trained on open targets and clutter; and Tables B.29
through B.36 list the best feature sets of polarimetric features for quadratic classifiers
trained on open targets and clutter.
B.1
Geometric Features
No. Feature
1 Number of Points
2 Major Length
3 Minor Length
4 Ratio of the Major Length to the Minor Length
5 Ratio of the Mean to the Standard Deviation
6 Perimeter Length
7 Ratio of the Perimeter Length to the Number of Points
Table B.1: List of Geometric Features
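The thesis does not restate its exact feature definitions here, but a few of the geometric features of Table B.1 can be illustrated from a binary object mask and its pixel amplitudes. The length estimates below are based on the spread of the pixel coordinates and are stand-ins for the actual definitions, which may differ:

```python
import numpy as np

def geometric_features(mask, values):
    """Illustrative versions of a few features from Table B.1, computed
    from a binary object mask and the corresponding pixel amplitudes.
    The exact definitions used in the thesis may differ."""
    rows, cols = np.nonzero(mask)
    n_points = rows.size                       # feature 1: number of points
    coords = np.stack([rows, cols], axis=1).astype(float)
    # Spread-based length estimates from the coordinate covariance.
    evals = np.linalg.eigvalsh(np.cov(coords.T))
    minor, major = 4.0 * np.sqrt(np.maximum(evals, 0.0))
    amp = values[mask]
    mean_std = amp.mean() / amp.std()          # feature 5: mean / std. dev.
    return {"points": n_points, "major": major, "minor": minor,
            "major/minor": major / minor, "mean/std": mean_std}

mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 3:5] = True                          # a 4x2 pixel blob
vals = np.ones((8, 8)); vals[3, 3] = 3.0       # one bright pixel
f = geometric_features(mask, vals)
print(f["points"])                             # -> 8
```

The perimeter-based features (6 and 7) would additionally require tracing the boundary of the mask, which is omitted from this sketch.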
[Table body not legible in the scan: rows are the 48 combinations of polarimetric channel (HH, HV, VV, SPAN, PMF, PWF) and superresolution method (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA); columns are geometric features 1-7 of Table B.1; a selected feature is marked "x".]
Table B.2: Best Feature Sets for Geometric Features: Linear Classifier
[Table body not legible in the scan: rows are the 48 combinations of polarimetric channel (HH, HV, VV, SPAN, PMF, PWF) and superresolution method (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA); columns are geometric features 1-7 of Table B.1; a selected feature is marked "x".]
Table B.3: Best Feature Sets for Geometric Features: Quadratic Classifier
B.2
Polarimetric Features: Trained on Obscured Targets
No. Feature
1 E{HH}/E{VV}
2 E{HH/VV}
3 σ{HH}/σ{VV}
4 σ{HH/VV}
5 max{HH}/max{VV}
6 max{HH/VV}
7 E{HH}/E{HV}
8 E{HH/HV}
9 σ{HH}/σ{HV}
10 σ{HH/HV}
11 max{HH}/max{HV}
12 max{HH/HV}
13 E{VV}/E{HV}
14 E{VV/HV}
15 σ{VV}/σ{HV}
16 σ{VV/HV}
17 max{VV}/max{HV}
18 max{VV/HV}
19 E{SPAN}/E{HH}
20 E{SPAN/HH}
21 σ{SPAN}/σ{HH}
22 σ{SPAN/HH}
23 max{SPAN}/max{HH}
24 max{SPAN/HH}
25 E{PMF}/E{HH}
26 E{PMF/HH}
27 σ{PMF}/σ{HH}
28 σ{PMF/HH}
29 max{PMF}/max{HH}
30 max{PMF/HH}
31 E{PWF}/E{HH}
32 E{PWF/HH}
33 σ{PWF}/σ{HH}
34 σ{PWF/HH}
35 max{PWF}/max{HH}
36 max{PWF/HH}
Table B.4: List of Polarimetric Features
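The pattern of Table B.4 is six ratio statistics (of means, standard deviations, and maxima) per channel pair. A sketch of features 1-6 for the (HH, VV) pair, using synthetic amplitudes rather than FOPEN data; the remaining rows repeat the same six ratios for the other channel pairs:

```python
import numpy as np

def polarimetric_ratio_features(hh, vv):
    """Illustrative computation of features 1-6 of Table B.4 for the
    (HH, VV) channel pair. `hh` and `vv` hold the pixel amplitudes
    inside the feature rectangle."""
    ratio = hh / vv                      # pixel-by-pixel channel ratio
    return {
        "E{HH}/E{VV}":         hh.mean() / vv.mean(),
        "E{HH/VV}":            ratio.mean(),
        "sigma{HH}/sigma{VV}": hh.std() / vv.std(),
        "sigma{HH/VV}":        ratio.std(),
        "max{HH}/max{VV}":     hh.max() / vv.max(),
        "max{HH/VV}":          ratio.max(),
    }

# Synthetic Rayleigh-distributed amplitudes (not FOPEN data).
rng = np.random.default_rng(1)
hh = rng.rayleigh(2.0, size=100)
vv = rng.rayleigh(1.0, size=100)
feats = polarimetric_ratio_features(hh, vv)
print(sorted(feats))                     # the six feature names
```

Note the distinction the table draws between a ratio of statistics (e.g. E{HH}/E{VV}) and a statistic of the pixel-wise ratio (e.g. E{HH/VV}); the two are generally not equal.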
B.2.1
Linear Classifier
Candidate Feature Set: A
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.5: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets:
Linear Classifier: A
Candidate Feature Set: B
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.6: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets:
Linear Classifier: B
Candidate Feature Set: C
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.7: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets:
Linear Classifier: C
Candidate Feature Set: D
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.8: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets:
Linear Classifier: D
Candidate Feature Set: E
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.9: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets:
Linear Classifier: E
Candidate Feature Set: F
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.10: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: F
Candidate Feature Set: G
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.11: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: G
Candidate Feature Set: H
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.12: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Linear Classifier: H
B.2.2
Quadratic Classifier
Candidate Feature Set: A
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.13: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: A
Candidate Feature Set: B
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.14: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: B
Candidate Feature Set: C
[Table body not legible in the scan: rows are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); columns are polarimetric features 1-36 of Table B.4; a selected feature is marked "x".]
Table B.15: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: C
Candidate Feature Set: D
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.16: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: D
124
Candidate Feature Set: E
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.17: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: E
Candidate Feature Set: F
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.18: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: F
125
Candidate Feature Set: G
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.19: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: G
Candidate Feature Set: H
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.20: Best Feature Sets for Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier: H
126
B.3
Polarimetric Features: Trained on Open Targets
B.3.1
Linear Classifier
Candidate Feature Set: A
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.21: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: A
127
Candidate Feature Set: B
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.22: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: B
128
Candidate Feature Set: C
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.23: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: C
Candidate Feature Set: D
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.24: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: D
129
Candidate Feature Set: E
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.25: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: E
Candidate Feature Set: F
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.26: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: F
130
Candidate Feature Set: G
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.27: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: G
Candidate Feature Set: H
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.28: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Linear Classifier: H
131
B.3.2
Quadratic Classifier
Candidate Feature Set: A
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.29: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: A
Candidate Feature Set: B
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.30: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: B
132
Candidate Feature Set: C
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.31: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: C
Candidate Feature Set: D
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.32: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: D
133
Candidate Feature Set: E
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.33: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: E
Candidate Feature Set: F
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.34: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: F
134
Candidate Feature Set: G
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.35: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: G
Candidate Feature Set: H
[Feature-selection grid: rows are candidate features 1-36, columns are the superresolution methods (Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, SVA-J, SVA-S); an x marks a feature selected for that method's best set. Individual cell assignments are not recoverable from the extracted text.]
Table B.36: Best Feature Sets for Polarimetric Features: Trained on Open Targets:
Quadratic Classifier: H
135
Appendix C
ROC Curve Areas
This Appendix shows charts of the means of the ROC curve areas and charts of the
standard deviations of the ROC curve areas for each best feature set generated in the
three experiments of Chapter 5. The charts containing the data of geometric features
are shown as functions of superresolution method and polarimetric processing method.
The charts containing the data of polarimetric features are shown as functions of
superresolution method and candidate feature set.
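The thesis does not reproduce the computation behind these statistics. As a minimal sketch (function names are illustrative, not from the thesis), the area under one ROC curve and the mean and standard deviation of the areas across repeated trials could be computed as:

```python
def roc_auc(target_scores, clutter_scores):
    """Area under the ROC curve (PD vs. PFA), swept over all thresholds."""
    thresholds = sorted(set(target_scores) | set(clutter_scores), reverse=True)
    points = [(0.0, 0.0)]
    for t in thresholds:
        pd = sum(s >= t for s in target_scores) / len(target_scores)
        pfa = sum(s >= t for s in clutter_scores) / len(clutter_scores)
        points.append((pfa, pd))
    points.append((1.0, 1.0))
    # Trapezoidal integration of PD over PFA.
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

def mean_std(areas):
    """Sample mean and standard deviation of AUC values across trials."""
    m = sum(areas) / len(areas)
    var = sum((a - m) ** 2 for a in areas) / (len(areas) - 1)
    return m, var ** 0.5

# A perfectly separating classifier has AUC 1.0.
print(roc_auc([0.9, 0.8], [0.1, 0.2]))  # -> 1.0
```

The tabulated entries below correspond to such means and standard deviations computed over the repeated train/test trials of Chapter 5.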
C.1
Geometric Features
C.1.1
Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
HH      0.7765    0.7666     0.6795  0.6658  0.6453  0.6607     0.6588         0.6017
HV      0.7636    0.7568     0.7554  0.7557  0.5889  0.7588     0.6977         0.6864
VV      0.8112    0.8144     0.8209  0.8204  0.6497  0.7687     0.8232         0.8223
SPAN    0.7891    0.7617     0.7611  0.7517  0.5181  0.7324     0.7930         0.7323
PMF     0.8758    0.8703     0.8360  0.8432  0.6436  0.8184     0.8126         0.8361
PWF     0.8146    0.7873     0.7584  0.7785  0.5295  0.7430     0.7550         0.7299
Table C.1: Mean of ROC Curve Areas: Geometric Features: Linear Classifier
136
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
HH      0.0296    0.0346     0.0370  0.0300  0.0281  0.0330     0.0378         0.0247
HV      0.0301    0.0335     0.0304  0.0232  0.0321  0.0263     0.0330         0.0346
VV      0.0262    0.0278     0.0180  0.0217  0.0288  0.0249     0.0212         0.0240
SPAN    0.0270    0.0327     0.0271  0.0257  0.0345  0.0295     0.0291         0.0455
PMF     0.0216    0.0192     0.0209  0.0188  0.0325  0.0229     0.0256         0.0224
PWF     0.0256    0.0275     0.0258  0.0253  0.0328  0.0313     0.0272         0.0296
Table C.2: Standard Deviation of ROC Curve Areas: Geometric Features: Linear
Classifier
137
C.1.2
Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
HH      0.8007    0.8153     0.6533  0.6655  0.6551  0.6318     0.6787         0.6477
HV      0.7378    0.7679     0.7301  0.7349  0.5649  0.7285     0.7251         0.6752
VV      0.8079    0.8188     0.7922  0.8020  0.6594  0.7697     0.8070         0.8010
SPAN    0.8209    0.8250     0.7107  0.7205  0.6075  0.7131     0.7829         0.7466
PMF     0.8690    0.8545     0.8222  0.8247  0.6407  0.8016     0.8116         0.8285
PWF     0.8138    0.7817     0.7324  0.7487  0.5922  0.7253     0.7444         0.7232
Table C.3: Mean of ROC Curve Areas: Geometric Features: Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
HH      0.0210    0.0251     0.0439  0.0296  0.0422  0.0375     0.0306         0.0520
HV      0.0265    0.0459     0.0333  0.0258  0.0379  0.0307     0.0302         0.0250
VV      0.0244    0.0286     0.0303  0.0221  0.0302  0.0275     0.0212         0.0318
SPAN    0.0252    0.0263     0.0302  0.0344  0.0259  0.0331     0.0303         0.0352
PMF     0.0220    0.0241     0.0273  0.0221  0.0298  0.0259     0.0227         0.0290
PWF     0.0277    0.0260     0.0296  0.0260  0.0280  0.0347     0.0198         0.0271
Table C.4: Standard Deviation of ROC Curve Areas: Geometric Features: Quadratic
Classifier
138
C.2
Polarimetric Features: Trained on Obscured
Targets
C.2.1
Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.7821    0.7755     0.8527  0.8335  0.7936  0.8043     0.8233         0.8001
B       0.7088    0.8130     0.8073  0.8337  0.7914  0.8325     0.8284         0.8503
C       0.8706    0.8819     0.8844  0.8966  0.7767  0.8664     0.8849         0.8800
D       0.8252    0.8570     0.8935  0.8995  0.7925  0.9195     0.8881         0.9054
E       0.8772    0.8825     0.8844  0.8917  0.8208  0.8647     0.8737         0.8716
F       0.8224    0.8523     0.8990  0.9301  0.8519  0.9264     0.8888         0.9082
G       0.9001    0.9128     0.9273  0.9459  0.8496  0.9432     0.9273         0.9354
H       0.8691    0.9179     0.9261  0.9468  0.8361  0.9473     0.9263         0.9238
Table C.5: Mean of ROC Curve Areas for Obscured Targets: Polarimetric Features:
Trained on Obscured Targets: Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.0304    0.0268     0.0229  0.0243  0.0246  0.0291     0.0252         0.0318
B       0.0557    0.0266     0.0397  0.0274  0.0223  0.0242     0.0248         0.0205
C       0.0233    0.0225     0.0218  0.0187  0.0284  0.0199     0.0184         0.0188
D       0.0286    0.0207     0.0187  0.0184  0.0273  0.0156     0.0200         0.0173
E       0.0217    0.0201     0.0196  0.0206  0.0241  0.0222     0.0200         0.0203
F       0.0293    0.0228     0.0176  0.0133  0.0252  0.0149     0.0203         0.0217
G       0.0259    0.0167     0.0154  0.0147  0.0215  0.0120     0.0147         0.0150
H       0.0568    0.0142     0.0154  0.0122  0.0217  0.0118     0.0149         0.0188
Table C.6: Standard Deviation of the ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
139
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.9888    0.9983     0.5689  0.9173  0.9995  0.7557     0.9504         0.9900
B       0.9217    1.0000     0.9878  0.9098  0.9993  0.6475     0.9900         0.9096
C       0.8443    0.9975     0.9667  0.9373  0.9194  0.8558     0.9700         0.8200
D       0.9653    0.9805     1.0000  1.0000  0.9568  1.0000     1.0000         1.0000
E       0.9909    0.9984     0.9736  0.9807  0.9582  0.9532     0.9987         0.9946
F       0.8057    0.9706     1.0000  1.0000  0.3268  0.9999     0.9929         1.0000
G       0.9997    0.9834     1.0000  1.0000  0.2851  1.0000     0.9977         1.0000
H       0.9587    0.9774     1.0000  1.0000  0.4893  1.0000     1.0000         0.9800
Table C.7: Mean of ROC Curve Areas for Open Targets: Polarimetric Features:
Trained on Obscured Targets: Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.1006    0.0025     0.0166  0.0126  0.0010  0.2777     0.2174         0.0996
B       0.1224    0.0000     0.0158  0.0317  0.0015  0.0125     0.1000         0.2875
C       0.0734    0.0127     0.0478  0.0178  0.1120  0.1198     0.1714         0.3861
D       0.1107    0.1373     0.0000  0.0000  0.1285  0.0000     0.0000         0.0000
E       0.0862    0.0022     0.0329  0.0469  0.0708  0.0401     0.0072         0.0294
F       0.2452    0.1135     0.0003  0.0000  0.2913  0.0005     0.0033         0.0000
G       0.0015    0.1174     0.0000  0.0000  0.3194  0.0000     0.0032         0.0000
H       0.1645    0.1232     0.0000  0.0000  0.3091  0.0000     0.0000         0.1404
Table C.8: Standard Deviation of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       1.7709    1.7737     1.4216  1.7508  1.7931  1.5600     1.7736         1.7902
B       1.6305    1.8130     1.7952  1.7434  1.7907  1.4800     1.8184         1.7599
C       1.7149    1.8793     1.8511  1.8339  1.6961  1.7222     1.8549         1.7000
D       1.7905    1.8374     1.8935  1.8995  1.7493  1.9195     1.8881         1.9054
E       1.8681    1.8809     1.8580  1.8724  1.7790  1.8179     1.8724         1.8662
F       1.6280    1.8229     1.8989  1.9301  1.1787  1.9263     1.8817         1.9082
G       1.8998    1.8963     1.9273  1.9459  1.1347  1.9432     1.9250         1.9354
H       1.8279    1.8953     1.9261  1.9468  1.3254  1.9473     1.9263         1.9039
Table C.9: Sum of Mean of ROC Curve Areas for Both Obscured Targets and Open
Targets: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
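Each entry of Table C.9 is, up to rounding in the fourth decimal place, the sum of the corresponding entries of Tables C.5 (obscured targets) and C.7 (open targets). A quick sanity check on the feature-set-A row, with values copied from those tables:

```python
# Verify that the C.9 "sum" entries equal the C.5 (obscured) plus C.7 (open)
# means, up to rounding in the fourth decimal place. Row A, all 8 methods,
# in column order Baseline ... Separate-I/Q SVA.
obscured = [0.7821, 0.7755, 0.8527, 0.8335, 0.7936, 0.8043, 0.8233, 0.8001]  # C.5, row A
open_tgt = [0.9888, 0.9983, 0.5689, 0.9173, 0.9995, 0.7557, 0.9504, 0.9900]  # C.7, row A
summed   = [1.7709, 1.7737, 1.4216, 1.7508, 1.7931, 1.5600, 1.7736, 1.7902]  # C.9, row A
for a, b, s in zip(obscured, open_tgt, summed):
    assert abs((a + b) - s) <= 2e-4
```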
140
C.2.2
Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.7925    0.7840     0.8066  0.8362  0.7811  0.8378     0.8384         0.7875
B       0.8032    0.8081     0.8352  0.8425  0.7670  0.8154     0.8348         0.8448
C       0.8379    0.8440     0.8634  0.8827  0.8137  0.8419     0.8659         0.8818
D       0.8751    0.8737     0.8867  0.9033  0.8428  0.9106     0.8930         0.9088
E       0.8901    0.8480     0.8872  0.8905  0.8042  0.8598     0.8838         0.8828
F       0.9318    0.8509     0.8838  0.9205  0.8431  0.8955     0.9039         0.9141
G       0.9312    0.9070     0.8900  0.9386  0.8520  0.9309     0.9210         0.9282
H       0.9420    0.9120     0.8987  0.9446  0.8209  0.9293     0.9230         0.9339
Table C.10: Mean of ROC Curve Areas for Obscured Targets: Polarimetric Features:
Trained on Obscured Targets: Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.0259    0.0238     0.0250  0.0265  0.0302  0.0209     0.0200         0.0319
B       0.0309    0.0267     0.0194  0.0233  0.0263  0.0264     0.0209         0.0201
C       0.0216    0.0257     0.0231  0.0186  0.0211  0.0283     0.0207         0.0167
D       0.0241    0.0201     0.0182  0.0159  0.0234  0.0177     0.0200         0.0194
E       0.0172    0.0331     0.0189  0.0193  0.0222  0.0207     0.0194         0.0187
F       0.0187    0.0242     0.0204  0.0145  0.0271  0.0174     0.0180         0.0155
G       0.0138    0.0152     0.0215  0.0123  0.0218  0.0171     0.0149         0.0162
H       0.0169    0.0134     0.0198  0.0119  0.0271  0.0153     0.0172         0.0120
Table C.11: Standard Deviation of ROC Curve Areas for Obscured Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
141
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.9202    0.9559     0.8935  0.8292  0.9952  0.6470     0.6442         0.9400
B       0.9860    0.9896     0.9330  0.9541  0.8263  0.7538     0.9974         1.0000
C       0.8841    0.8854     0.9343  0.9184  0.9916  0.9525     0.8597         1.0000
D       1.0000    0.9962     1.0000  0.9997  0.9781  0.9978     1.0000         1.0000
E       0.9921    0.9801     0.9984  0.9979  0.9974  0.9984     1.0000         1.0000
F       0.9902    1.0000     0.9918  1.0000  0.9695  1.0000     1.0000         1.0000
G       1.0000    0.9999     0.9999  1.0000  0.9915  1.0000     1.0000         1.0000
H       0.9867    0.9998     1.0000  1.0000  0.7907  1.0000     1.0000         1.0000
Table C.12: Mean of ROC Curve Areas for Open Targets: Polarimetric Features:
Trained on Obscured Targets: Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.0061    0.0319     0.1376  0.0576  0.0153  0.0117     0.0108         0.2387
B       0.0334    0.0073     0.0278  0.0171  0.3394  0.0616     0.0014         0.0000
C       0.0532    0.2295     0.2382  0.0084  0.0190  0.1716     0.0309         0.0000
D       0.0000    0.0063     0.0000  0.0010  0.1078  0.0040     0.0000         0.0000
E       0.0038    0.0186     0.0017  0.0026  0.0029  0.0017     0.0000         0.0000
F       0.0076    0.0000     0.0028  0.0000  0.1386  0.0000     0.0000         0.0000
G       0.0000    0.0009     0.0006  0.0000  0.0060  0.0000     0.0000         0.0000
H       0.0083    0.0015     0.0000  0.0000  0.2919  0.0000     0.0000         0.0000
Table C.13: Standard Deviation of ROC Curve Areas for Open Targets: Polarimetric
Features: Trained on Obscured Targets: Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       1.7127    1.7399     1.7001  1.6654  1.7763  1.4848     1.4826         1.7275
B       1.7893    1.7977     1.7683  1.7965  1.5933  1.5692     1.8322         1.8448
C       1.7220    1.7294     1.7977  1.8011  1.8053  1.7944     1.7256         1.8818
D       1.8751    1.8699     1.8867  1.9031  1.8208  1.9084     1.8930         1.9088
E       1.8822    1.8281     1.8856  1.8884  1.8016  1.8581     1.8838         1.8828
F       1.9220    1.8509     1.8757  1.9205  1.8127  1.8955     1.9039         1.9141
G       1.9312    1.9069     1.8899  1.9386  1.8435  1.9309     1.9210         1.9282
H       1.9288    1.9117     1.8987  1.9446  1.6115  1.9293     1.9230         1.9339
Table C.14: Sum of Mean of ROC Curve Areas for Both Obscured Targets and Open
Targets: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
142
C.3
Polarimetric Features: Trained on Open Targets
C.3.1
Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.9301    0.9368     0.9234  0.9590  0.7909  0.9269     0.9126         0.9369
B       0.9670    0.9717     0.9757  0.9860  0.7870  0.9885     0.9693         0.9728
C       0.9804    0.9841     0.9863  0.9910  0.8659  0.9864     0.9816         0.9893
D       0.9622    0.9730     0.9838  0.9873  0.8418  0.9902     0.9577         0.9666
E       0.9877    0.9853     0.9925  0.9945  0.8650  0.9955     0.9796         0.9882
F       0.9688    0.9892     0.9922  0.9895  0.8447  0.9950     0.9812         0.9848
G       0.9862    0.9922     0.9944  0.9950  0.8880  0.9959     0.9884         0.9911
H       0.9877    0.9951     0.9953  0.9962  0.8875  0.9971     0.9875         0.9926
Table C.15: Mean of ROC Curve Areas: Polarimetric Features: Trained on Open
Targets: Linear Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.0220    0.0129     0.0183  0.0083  0.0185  0.0110     0.0129         0.0093
B       0.0062    0.0081     0.0054  0.0035  0.0183  0.0036     0.0075         0.0090
C       0.0083    0.0053     0.0038  0.0022  0.0178  0.0037     0.0046         0.0031
D       0.0080    0.0064     0.0043  0.0042  0.0190  0.0037     0.0085         0.0075
E       0.0059    0.0043     0.0031  0.0020  0.0150  0.0026     0.0062         0.0075
F       0.0092    0.0048     0.0033  0.0036  0.0208  0.0022     0.0048         0.0051
G       0.0061    0.0039     0.0027  0.0015  0.0154  0.0018     0.0035         0.0032
H       0.0050    0.0024     0.0028  0.0019  0.0180  0.0015     0.0064         0.0054
Table C.16: Standard Deviation of ROC Curve Areas: Polarimetric Features: Trained
on Open Targets: Linear Classifier
143
C.3.2
Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.9581    0.9606     0.9753  0.9682  0.8540  0.9866     0.9594         0.9666
B       0.9642    0.9719     0.9830  0.9777  0.8742  0.9913     0.9682         0.9675
C       0.9833    0.9811     0.9869  0.9824  0.8828  0.9914     0.9786         0.9844
D       0.9851    0.9869     0.9884  0.9824  0.8804  0.9947     0.9768         0.9790
E       0.9847    0.9844     0.9886  0.9901  0.8964  0.9937     0.9819         0.9865
F       0.9964    0.9921     0.9885  0.9823  0.8805  0.9958     0.9815         0.9798
G       0.9960    0.9937     0.9929  0.9893  0.9040  0.9963     0.9878         0.9849
H       0.9975    0.9956     0.9941  0.9923  0.8922  0.9971     0.9880         0.9863
Table C.17: Mean of ROC Curve Areas: Polarimetric Features: Trained on Open
Targets: Quadratic Classifier
        Baseline  Upsampled  MVM     EV      MUSIC   Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       0.0102    0.0092     0.0052  0.0061  0.0150  0.0037     0.0068         0.0075
B       0.0090    0.0061     0.0056  0.0048  0.0150  0.0024     0.0057         0.0070
C       0.0032    0.0047     0.0056  0.0037  0.0181  0.0027     0.0064         0.0035
D       0.0046    0.0032     0.0032  0.0048  0.0162  0.0024     0.0042         0.0050
E       0.0037    0.0078     0.0031  0.0024  0.0194  0.0023     0.0046         0.0032
F       0.0012    0.0025     0.0030  0.0056  0.0168  0.0022     0.0041         0.0041
G       0.0014    0.0026     0.0024  0.0050  0.0161  0.0016     0.0028         0.0033
H       0.0013    0.0015     0.0020  0.0022  0.0182  0.0026     0.0028         0.0033
Table C.18: Standard Deviation of ROC Curve Areas: Polarimetric Features: Trained
on Open Targets: Quadratic Classifier
144
Appendix D
ROC Curves
The following ROC curves were generated from classifiers using their respective best
feature sets. Each ROC curve was trained and tested on the training set and the
testing set originally used to generate that feature set. The ROC curve comparisons
therefore do not show ROC curves for identical training and testing sets, because
different feature sets were generated from different training (and hence different testing) sets.
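The thesis does not reproduce the curve-generation code. As a hedged sketch (names illustrative, not from the thesis), the (PFA, PD) operating points behind plots like these can be produced by sweeping a detection threshold over the classifier scores:

```python
def roc_points(target_scores, clutter_scores):
    """(PFA, PD) operating points from a threshold sweep, highest threshold first."""
    thresholds = sorted(set(target_scores) | set(clutter_scores), reverse=True)
    pts = []
    for t in thresholds:
        # Declare "target" whenever the score meets the threshold.
        pd = sum(s >= t for s in target_scores) / len(target_scores)
        pfa = sum(s >= t for s in clutter_scores) / len(clutter_scores)
        pts.append((pfa, pd))
    return pts

# Lowering the threshold moves up and to the right along the curve.
pts = roc_points([0.9, 0.7, 0.4], [0.6, 0.3, 0.2])
```

Plotting PD against PFA for these points gives curves of the shape shown in the figures below.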
145
D.1
Geometric Features
D.1.1
Linear Classifier
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-1: Sample ROC Curves: Geometric Features: Linear Classifier: HH
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-2: Sample ROC Curves: Geometric Features: Linear Classifier: HV
146
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-3: Sample ROC Curves: Geometric Features: Linear Classifier: VV
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-4: Sample ROC Curves: Geometric Features: Linear Classifier: SPAN
147
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-5: Sample ROC Curves: Geometric Features: Linear Classifier: PMF
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-6: Sample ROC Curves: Geometric Features: Linear Classifier: PWF
148
D.1.2
Quadratic Classifier
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-7: Sample ROC Curves: Geometric Features: Quadratic Classifier: HH
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-8: Sample ROC Curves: Geometric Features: Quadratic Classifier: HV
149
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-9: Sample ROC Curves: Geometric Features: Quadratic Classifier: VV
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-10: Sample ROC Curves: Geometric Features: Quadratic Classifier: SPAN
150
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-11: Sample ROC Curves: Geometric Features: Quadratic Classifier: PMF
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-12: Sample ROC Curves: Geometric Features: Quadratic Classifier: PWF
151
D.2
Polarimetric Features: Tested on Obscured Targets
D.2.1
Linear Classifier
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-13: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: A
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-14: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: B
152
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-15: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: C
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-16: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: D
153
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-17: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: E
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-18: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: F
154
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-19: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: G
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-20: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Linear Classifier: H
155
D.2.2
Quadratic Classifier
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-21: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: A
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-22: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: B
156
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-23: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: C
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-24: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: D
157
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-25: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: E
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-26: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: F
158
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-27: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: G
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-28: Sample ROC Curves: Polarimetric Features: Trained on Obscured
Targets: Tested on Obscured Targets: Quadratic Classifier: H
159
D.3
Polarimetric Features: Tested on Open Targets
D.3.1
Trained on Open Targets
Linear Classifier
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-29: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: A
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-30: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: B
160
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-31: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: C
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-32: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: D
161
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-33: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: E
[ROC curve plot (detection probability vs. PFA); one curve per method: Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, Separate-I/Q SVA. Curve data not recoverable from the extracted text.]
Figure D-34: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: F
162
I
0.9
0.8
0.7
F
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-/O SVA
Separate-1/Q SVA
-
-
-
0.61
0.5
0.4
0.3
0.2
0.1
0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
1
PFA
Figure D-35: Sample ROC Curves: Pol arimetric Features: Trained on Open Targets:
Linear Classifier: G
-
I
-
-
0.9
0.8
0.7
-
-
-
r
-
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-/O SVA
Separate-1/O SVA
0.6
Q.0.5
0.4
0.3
0.2
0.1
0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
1
PFA
Figure D-36: Sample ROC Curves: Polarimetric Features: Trained on Open Targets:
Linear Classifier: H
163
Quadratic Classifier
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-37: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: A
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-38: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: B
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-39: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: C
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-40: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: D
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-41: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: E
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-42: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: F
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-43: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: G
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-44: Sample ROC Curves: Polarimetric Features: Trained on Open Targets: Quadratic Classifier: H
D.3.2
Trained on Obscured Targets
Linear Classifier
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-45: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: A
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-46: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: B
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-47: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: C
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-48: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: D
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-49: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: E
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-50: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: F
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-51: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: G
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-52: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Linear Classifier: H
Quadratic Classifier
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-53: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: A
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-54: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: B
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-55: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: C
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-56: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: D
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-57: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: E
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-58: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: F
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-59: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: G
[ROC curves: PD vs. PFA for Baseline, Upsampled, MVM, EV, MUSIC, Pisarenko, Joint-I/Q SVA, and Separate-I/Q SVA]
Figure D-60: Sample ROC Curves: Polarimetric Features: Trained on Obscured Targets: Tested on Open Targets: Quadratic Classifier: H
Appendix E
Performance of the Modified
Feature Selection Algorithm
As mentioned in Chapter 5, the modified feature selection algorithm picks the feature set with the fewest features from among the five feature sets with the highest mean ROC curve areas (or the highest sum of mean ROC curve areas). We chose this method because we found that the mean ROC curve areas of these five feature sets differ from each other only slightly. To demonstrate the performance of the modified feature selection algorithm, this appendix shows: 1) tables of the ratio of the ROC curve area of the selected feature set to that of the feature set with the highest ROC curve area, and 2) tables of the size of the selected feature set versus the size of the smallest of the 30 candidate feature sets. The tables display their values as functions of superresolution method and polarimetric "channel" or as functions of superresolution method and candidate feature set.
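The selection rule described above (keep the five candidate feature sets with the highest mean ROC curve area, then take the smallest of them) can be sketched in Python. The feature names and areas below are hypothetical stand-ins; the thesis's actual candidates come from its classifier runs:

```python
def select_feature_set(candidates):
    """candidates: list of (feature_tuple, mean_roc_area) pairs.
    Keep the five sets with the highest mean ROC curve area, then
    return the one with the fewest features (ties broken by area)."""
    top_five = sorted(candidates, key=lambda c: c[1], reverse=True)[:5]
    return min(top_five, key=lambda c: (len(c[0]), -c[1]))

# Hypothetical candidate feature sets with near-identical mean ROC areas.
candidates = [
    (("mass", "diameter"), 0.981),
    (("mass",), 0.979),                                   # nearly as good, but smaller
    (("mass", "diameter", "rotational inertia"), 0.983),  # best area, most features
    (("diameter",), 0.903),                               # small, but falls out of the top five
    (("mass", "perimeter"), 0.978),
    (("perimeter",), 0.977),
]
best = select_feature_set(candidates)
print(best)  # (('mass',), 0.979)
```

Note that ("diameter",) is the smallest candidate overall, but it is excluded before the size comparison because its area is not among the top five; this is what keeps the rule from sacrificing more than a sliver of ROC area.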
E.1
Geometric Features
E.1.1
Linear Classifier
                    HH      HV      VV      SPAN    PMF     PWF
Baseline            1.0000  1.0000  0.9991  1.0000  0.9973  0.9998
Upsampled           1.0000  1.0000  0.9999  0.9864  1.0000  1.0000
MVM                 1.0000  0.9973  1.0000  1.0000  1.0000  1.0000
EV                  1.0000  0.9994  1.0000  1.0000  1.0000  1.0000
MUSIC               0.9879  1.0000  1.0000  1.0000  0.9972  0.9931
Pisarenko           1.0000  0.9999  1.0000  1.0000  1.0000  1.0000
Joint-I/Q SVA       1.0000  1.0000  1.0000  1.0000  1.0000  0.9975
Separate-I/Q SVA    1.0000  1.0000  1.0000  0.9992  1.0000  1.0000
Table E.1: Ratio of the Mean ROC Curve Area of the Best Feature Set to the Highest Mean ROC Curve Area: Geometric Features: Linear Classifier
POL
HH
HH
HH
HH
HH
HH
HH
HH
HV
HV
HV
HV
HV
HV
HV
HV
VV
VV
VV
VV
VV
VV
VV
VV
SPAN
SPAN
SPAN
SPAN
SPAN
SPAN
SPAN
SPAN
PMF
PMF
PMF
PMF
PMF
PMF
PMF
PMF
PWF
PWF
PWF
PWF
PWF
PWF
PWF
PWF
SUPER
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Baseline
Upsampled
MVM
EV
MUSIC
Pisarenko
Joint-I/Q SVA
Separate-I/Q SVA
Min. Size of
Feat. Set
2
1
1
1
1
2
1
1
2
3
2
1
1
2
2
2
2
2
2
1
1
1
2
2
2
2
1
2
1
1
2
3
2
2
1
Size of
Feat. Set
2
2
3
3
1
2
2
3
4
3
2
2
3
2
3
4
2
2
2
2
2
1
3
3
2
2
2
2
3
2
2
3
2
2
2
1
1
1
2
2
2
2
2
2
2
1
1
2
2
1
2
2
3
3
2
2
2
1
2
2
3
Table E.2: Feature Set Size: Geometric Features: Linear Classifier
E.1.2
Quadratic Classifier
                    HH      HV      VV      SPAN    PMF     PWF
Baseline            1.0000  0.9886  0.9926  1.0000  0.9993  1.0000
Upsampled           1.0000  1.0000  1.0000  1.0000  0.9972  1.0000
MVM                 0.9910  1.0000  0.9961  1.0000  1.0000  1.0000
EV                  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000
MUSIC               0.9933  0.9968  1.0000  1.0000  1.0000  1.0000
Pisarenko           1.0000  0.9981  1.0000  1.0000  1.0000  1.0000
Joint-I/Q SVA       1.0000  1.0000  1.0000  0.9991  0.9970  1.0000
Separate-I/Q SVA    0.9933  0.9974  0.9965  1.0000  1.0000  1.0000
Table E.3: Ratio of the Mean ROC Curve Area of the Best Feature Set to the Highest Mean ROC Curve Area: Geometric Features: Quadratic Classifier
        Baseline  Upsampled  MVM    EV     MUSIC  Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
HH      3/4       2/2        1/2    3/3    1/1    1/1        1/2            2/3
HV      3/3       2/4        1/1    1/1    1/2    1/2        1/2            1/1
VV      2/2       1/3        1/1    1/1    1/1    1/1        1/2            1/1
SPAN    2/4       2/2        1/1    1/2    1/2    1/1        2/2            1/2
PMF     2/3       1/1        1/2    1/2    1/2    1/1        1/2            1/2
PWF     2/6       2/2        1/2    1/2    1/1    1/2        1/1            2/3
(each entry: minimum feature-set size / size of the selected feature set)
Table E.4: Feature Set Size: Geometric Features: Quadratic Classifier
E.2
Polarimetric Features: Trained on Obscured Targets
E.2.1
Linear Classifier
                    A       B       C       D       E       F       G       H
Baseline            1.0000  0.8835  0.9914  1.0000  0.9964  1.0000  1.0000  0.9662
Upsampled           0.9780  0.9901  0.9957  0.9976  0.9967  0.9711  0.9886  1.0000
MVM                 0.9930  0.9403  0.9900  0.9983  0.9877  0.9985  0.9982  0.9994
EV                  0.9793  0.9555  0.9907  0.9856  0.9888  0.9976  0.9946  0.9976
MUSIC               0.9871  0.9815  0.9441  0.9452  0.9807  0.9983  0.9735  0.9574
Pisarenko           0.9562  0.9971  0.9930  0.9953  0.9894  0.9984  0.9974  1.0000
Separate-I/Q SVA    0.9782  0.9886  1.0000  0.9919  0.9821  0.9749  0.9926  0.9951
Joint-I/Q SVA       0.9331  0.9947  0.9872  0.9950  0.9826  0.9878  0.9980  0.9902
Table E.5: Ratio of the Obscured Target Mean ROC Curve Area of the Best Feature Set to the Highest Obscured Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
                    A       B       C       D       E       F       G       H
Baseline            0.9988  1.0000  0.8500  0.9653  1.0000  0.8801  1.0000  1.0000
Upsampled           1.0000  1.0000  0.9995  1.0000  1.0000  1.0000  0.9897  0.9875
MVM                 0.6995  1.0000  1.0000  1.0000  0.9956  1.0000  1.0000  1.0000
EV                  0.9418  0.9140  0.9691  1.0000  0.9972  1.0000  1.0000  1.0000
MUSIC               1.0000  1.0000  1.0000  1.0000  1.0000  0.5679  0.6070  0.6393
Pisarenko           1.0000  0.7089  0.9254  1.0000  1.0000  0.9999  1.0000  1.0000
Joint-I/Q SVA       0.9504  0.9947  1.0000  1.0000  0.9988  0.9929  1.0000  1.0000
Separate-I/Q SVA    1.0000  1.0000  0.8292  1.0000  1.0000  1.0000  1.0000  0.9989
Table E.6: Ratio of the Open Target Mean ROC Curve Area of the Best Feature Set to the Highest Open Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
                    A       B       C       D       E       F       G       H
Baseline            1.0000  1.0000  0.9192  0.9964  1.0000  0.9978  1.0000  1.0000
Upsampled           1.0000  1.0000  0.9999  1.0000  1.0000  0.9987  1.0000  1.0000
MVM                 0.8675  1.0000  1.0000  0.9992  1.0000  0.9996  0.9993  0.9998
EV                  0.9788  0.9475  0.9844  0.9964  1.0000  1.0000  0.9988  0.9991
MUSIC               0.9967  0.9972  1.0000  0.9929  1.0000  0.8301  0.8789  0.8198
Pisarenko           1.0000  0.8516  0.9609  0.9977  1.0000  0.9992  0.9987  1.0000
Joint-I/Q SVA       0.9689  1.0000  1.0000  0.9969  0.9998  0.9974  1.0000  1.0000
Separate-I/Q SVA    1.0000  1.0000  0.9092  1.0000  1.0000  1.0000  1.0000  0.9987
Table E.7: Ratio of the Combined Obscured Target and Open Target Mean ROC Curve Areas of the Best Feature Set to the Highest Combined Obscured Target and Open Target Mean ROC Curve Areas: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
        Baseline  Upsampled  MVM    EV     MUSIC  Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       1/5       4/5        1/2    1/3    4/4    1/5        1/3            3/3
B       6/6       3/9        1/2    3/4    5/5    1/1        2/4            3/3
C       3/4       5/10       3/5    4/4    6/9    2/4        2/7            3/3
D       3/5       9/10       2/8    6/6    4/5    7/8        4/5            8/9
E       6/8       6/10       5/8    6/9    6/7    3/3        6/7            4/4
F       8/9       9/13       5/6    6/9    9/12   7/8        7/7            6/13
G       5/5       5/10       5/7    8/9    8/10   5/7        6/9            7/7
H       8/9       7/9        8/10   10/11  6/12   7/9        8/10           8/8
(each entry: minimum feature-set size / size of the selected feature set)
Table E.8: Feature Set Size: Polarimetric Features: Trained on Obscured Targets: Linear Classifier
E.2.2
Quadratic Classifier
                    A       B       C       D       E       F       G       H
Baseline            0.9828  0.9813  0.9698  0.9928  1.0000  0.9928  0.9960  0.9911
Upsampled           0.9662  0.9895  0.9839  1.0000  0.9916  0.9682  0.9945  0.9962
MVM                 0.9463  0.9814  0.9710  0.9965  1.0000  0.9966  0.9899  0.9941
EV                  0.9864  0.9724  0.9913  0.9967  0.9918  0.9938  0.9982  1.0000
MUSIC               0.9891  0.9595  1.0000  0.9948  0.9874  1.0000  1.0000  0.9605
Pisarenko           1.0000  0.9804  0.9620  1.0000  0.9897  0.9929  0.9956  0.9978
Joint-I/Q SVA       1.0000  0.9933  0.9854  0.9977  0.9993  0.9923  1.0000  1.0000
Separate-I/Q SVA    0.9250  0.9880  1.0000  0.9985  0.9933  0.9942  0.9996  0.9964
Table E.9: Ratio of the Obscured Target Mean ROC Curve Area of the Best Feature Set to the Highest Obscured Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
                    A       B       C       D       E       F       G       H
Baseline            1.0000  0.9886  0.9780  1.0000  1.0000  1.0000  1.0000  0.9998
Upsampled           1.0000  1.0000  0.9139  0.9962  0.9921  1.0000  0.9999  0.9998
MVM                 1.0000  1.0000  0.9396  1.0000  1.0000  0.9918  1.0000  1.0000
EV                  1.0000  1.0000  0.9926  0.9997  1.0000  1.0000  1.0000  1.0000
MUSIC               1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000
Pisarenko           0.9662  0.7712  0.9902  0.9978  0.9990  1.0000  1.0000  1.0000
Joint-I/Q SVA       0.8991  1.0000  0.8862  1.0000  1.0000  1.0000  1.0000  1.0000
Separate-I/Q SVA    0.9400  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000
Table E.10: Ratio of the Open Target Mean ROC Curve Area of the Best Feature Set to the Highest Open Target Mean ROC Curve Area: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
                    A       B       C       D       E       F       G       H
Baseline            1.0000  0.9897  0.9865  1.0000  1.0000  0.9966  1.0000  0.9963
Upsampled           1.0000  1.0000  0.9611  1.0000  0.9919  0.9866  0.9980  0.9980
MVM                 1.0000  1.0000  0.9677  1.0000  1.0000  0.9999  0.9963  1.0000
EV                  1.0000  1.0000  1.0000  0.9984  1.0000  0.9970  0.9991  1.0000
MUSIC               1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000  1.0000
Pisarenko           0.9976  0.8834  0.9920  1.0000  0.9947  0.9982  1.0000  1.0000
Joint-I/Q SVA       0.9705  1.0000  0.9483  0.9989  1.0000  0.9963  1.0000  1.0000
Separate-I/Q SVA    0.9411  0.9963  1.0000  0.9993  0.9969  0.9981  0.9998  0.9983
Table E.11: Ratio of the Combined Obscured Target and Open Target Mean ROC Curve Areas of the Best Feature Set to the Highest Combined Obscured Target and Open Target Mean ROC Curve Areas: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
        Baseline  Upsampled  MVM    EV     MUSIC  Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       2/2       2/2        1/5    2/3    2/2    1/1        1/1            1/6
B       4/4       2/4        1/4    4/5    5/5    1/2        2/2            4/4
C       1/3       1/2        2/2    2/2    2/4    1/3        1/4            2/4
D       4/4       3/5        3/3    4/4    3/3    4/4        3/4            6/7
E       3/3       2/3        2/2    3/5    4/4    2/3        2/2            3/3
F       12/12     3/3        4/7    5/5    5/5    4/4        2/2            5/5
G       4/5       3/4        4/4    4/4    3/3    3/3        4/4            5/5
H       7/12      5/5        4/6    6/7    6/10   5/5        4/4            4/5
(each entry: minimum feature-set size / size of the selected feature set)
Table E.12: Feature Set Size: Polarimetric Features: Trained on Obscured Targets: Quadratic Classifier
E.3
Polarimetric Features: Trained on Open Targets
E.3.1
Linear Classifier
                    A       B       C       D       E       F       G       H
Baseline            0.9999  0.9991  0.9988  1.0000  1.0000  0.9993  0.9981  1.0000
Upsampled           0.9994  0.9989  0.9990  0.9991  0.9985  0.9991  0.9992  0.9996
MVM                 0.9985  0.9983  1.0000  0.9998  0.9996  1.0000  1.0000  1.0000
EV                  0.9992  0.9992  1.0000  1.0000  0.9997  0.9993  1.0000  1.0000
MUSIC               0.9975  0.9932  0.9990  0.9986  0.9935  1.0000  1.0000  0.9984
Pisarenko           0.9995  1.0000  1.0000  1.0000  1.0000  1.0000  0.9998  0.9997
Joint-I/Q SVA       1.0000  1.0000  1.0000  1.0000  0.9989  0.9998  1.0000  0.9986
Separate-I/Q SVA    1.0000  1.0000  1.0000  0.9988  1.0000  0.9987  0.9997  0.9994
Table E.13: Ratio of the Mean ROC Curve Area of the Best Feature Set to the Highest Mean ROC Curve Area: Polarimetric Features: Trained on Open Targets: Linear Classifier
        Baseline  Upsampled  MVM    EV     MUSIC  Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       2/3       1/1        3/3    7/7    3/6    4/6        1/1            2/2
B       7/7       8/8        4/4    5/5    4/6    5/5        4/4            6/8
C       6/6       7/7        6/7    7/7    7/7    6/8        5/8            6/8
D       4/6       9/11       5/5    9/12   7/7    6/6        3/4            5/7
E       5/11      10/10      8/8    7/7    7/7    6/6        7/7            7/10
F       8/10      11/14      8/8    11/11  8/8    10/10      7/9            9/10
G       9/9       9/13       8/13   9/12   7/10   8/12       7/9            9/9
H       7/7       12/12      11/11  8/12   12/15  8/8        9/15           11/14
(each entry: minimum feature-set size / size of the selected feature set)
Table E.14: Feature Set Size: Polarimetric Features: Trained on Open Targets: Linear Classifier
E.3.2
Quadratic Classifier
                    A       B       C       D       E       F       G       H
Baseline            1.0000  0.9992  0.9994  0.9993  0.9993  1.0000  0.9997  1.0000
Upsampled           1.0000  1.0000  0.9958  0.9967  0.9999  1.0000  1.0000  1.0000
MVM                 0.9990  0.9993  0.9994  0.9999  0.9994  0.9997  0.9995  0.9993
EV                  1.0000  0.9960  0.9979  0.9991  1.0000  0.9966  0.9998  0.9997
MUSIC               0.9986  1.0000  1.0000  0.9989  1.0000  1.0000  1.0000  0.9956
Pisarenko           0.9998  0.9999  0.9996  0.9995  0.9996  0.9997  0.9990  1.0000
Joint-I/Q SVA       1.0000  1.0000  1.0000  1.0000  0.9986  0.9999  1.0000  0.9993
Separate-I/Q SVA    1.0000  0.9974  1.0000  0.9995  0.9991  0.9985  0.9996  0.9993
Table E.15: Ratio of the Mean ROC Curve Area of the Best Feature Set to the Highest Mean ROC Curve Area: Polarimetric Features: Trained on Open Targets: Quadratic Classifier
        Baseline  Upsampled  MVM    EV     MUSIC  Pisarenko  Joint-I/Q SVA  Separate-I/Q SVA
A       2/4       3/6        6/12   4/5    3/3    11/14      3/3            5/5
B       3/3       3/6        11/11  4/4    6/7    11/18      4/4            7/9
C       2/2       3/4        8/13   5/6    6/9    7/13       3/5            5/5
D       4/5       3/3        8/10   5/5    5/5    10/17      4/4            7/9
E       2/2       2/5        6/7    5/5    8/8    4/4        4/4            3/3
F       7/9       6/6        9/13   8/9    7/7    11/17      7/7            5/10
G       3/3       6/6        5/5    8/8    7/8    12/18      5/5            6/8
H       8/11      5/5        7/9    8/8    10/11  11/20      7/7            5/5
(each entry: minimum feature-set size / size of the selected feature set)
Table E.16: Feature Set Size: Polarimetric Features: Trained on Open Targets: Quadratic Classifier
Bibliography
[1] J. Curlander and R. McDonough, Synthetic Aperture Radar: Systems and Signal Processing. New York: Wiley, 1991.
[2] S. DeGraaf, "SAR imaging via modern 2-D spectral estimation methods," IEEE Transactions on Image Processing, vol. 7, pp. 729-761, May 1998.
[3] D. Dudgeon and R. Lacoss, "An overview of automatic target recognition," Lincoln Laboratory Journal, vol. 6, pp. 3-10, Spring 1993.
[4] R. Goodman, S. Tummala, and W. Carrara, "Issues in ultra-wideband, widebeam SAR image formation," Record of the 1995 IEEE Radar Conference, pp. 479-485, 1995.
[5] M. Davis, P. Tomlinson, and R. Maloney, "Technical challenges in ultra-wideband radar development for target detection and terrain mapping," Record of the 1999 IEEE Radar Conference, pp. 1-6, 1999.
[6] L. Novak, G. Owirka, W. Brower, and A. Weaver, "The automatic target-recognition system in SAIP," Lincoln Laboratory Journal, vol. 10, no. 2, pp. 187-202, 1997.
[7] A. Freeman and S. Durden, "A three-component scattering model for polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing, vol. 36, pp. 963-973, May 1998.
[8] L. Bessette, S. Crooks, and S. Ayasli, "P-3 ultra-wideband SAR, Grayling, Michigan, target and clutter phenomenology," Record of the 1999 IEEE Radar Conference, pp. 125-129, 1999.
[9] D. Sheen and T. Lewis, "The P-3 ultra-wideband SAR," Proceedings of SPIE - the International Society for Optical Engineering, vol. 2747, pp. 20-24, 1996.
[10] L. Novak and S. Hesse, "Optimal polarizations for radar detection and recognition of targets in clutter," Record of the 1993 IEEE Radar Conference, pp. 79-83, 1993.
[11] L. Novak, M. Burl, R. Chaney, and G. Owirka, "Optimal processing of polarimetric synthetic-aperture radar imagery," Lincoln Laboratory Journal, vol. 3, pp. 273-290, Summer 1990.
[12] G. Owirka, S. Verbout, and L. Novak, "Template-based SAR ATR performance using different image enhancement techniques," Proceedings of SPIE - the International Society for Optical Engineering, vol. 3721, pp. 302-319, April 1999.
[13] S. Kay, Modern Spectral Estimation: Theory and Application. Englewood Cliffs, NJ: Prentice Hall, 1988.
[14] V. Pisarenko, "On the estimation of spectra by means of non-linear functions of the covariance matrix," Geophysical Journal of the Royal Astronomical Society, vol. 28, pp. 511-531, August 1971.
[15] H. Stankwitz, R. Dallaire, and J. Fienup, "Nonlinear apodization for sidelobe control in SAR imagery," IEEE Transactions on Aerospace and Electronic Systems, vol. 31, pp. 267-279, January 1995.
[16] L. Novak, G. Owirka, and C. Netishen, "Performance of a high-resolution polarimetric SAR automatic target recognition system," Lincoln Laboratory Journal, vol. 6, pp. 11-24, Spring 1993.
[17] K. Fukunaga, Introduction to Statistical Pattern Recognition. Boston: Academic Press, 2nd ed., 1990.
[18] D. Kreithen, S. Halversen, and G. Owirka, "Discriminating targets from clutter," Lincoln Laboratory Journal, vol. 6, pp. 25-52, Spring 1993.