Pansharpening Quality Assessment Using the
Modulation Transfer Functions of Instruments
Muhammad Murtaza Khan, Student Member, IEEE, Luciano Alparone, and
Jocelyn Chanussot, Senior Member, IEEE
Abstract—Quality assessment of pansharpening methods is not
an easy task. Quality-assessment indexes, like Q4, spectral angle
mapper, and relative global synthesis error, require a reference
image at the same resolution as the fused image. In the absence of
such a reference image, the quality of pansharpening is assessed at
a degraded resolution only. The recently proposed index of Quality
Not requiring a Reference (QNR) is one among very few tools
available for assessing the quality of pansharpened images at the
desired high resolution. However, it would be desirable to cross-check
the outcomes of several independent quality-assessment indexes,
in order to better determine the quality of pansharpened images.
In this paper, we propose a method to assess fusion quality at the
highest resolution, without requiring a high-resolution reference
image. The novel method makes use of digital filters matching the modulation transfer functions (MTFs) of the imaging-instrument channels. Spectral quality is evaluated according to
Wald’s spectral consistency property. Spatial quality measures
interscale changes by matching spatial details, extracted from
the multispectral bands and from the panchromatic image by
means of the high-pass complements of the MTF filters. Eventually, we
highlight the necessity and sufficiency requirements of quality-assessment indexes by developing a pansharpening method
optimizing the QNR spatial index and by assessing the quality of
the fused images using the proposed protocol.
Index Terms—Fusion, image quality, modulation transfer function (MTF), pansharpening, spatial distortion, spectral distortion.
I. INTRODUCTION

THE panchromatic (Pan) and the multispectral (MS)
images provided by satellite instruments are not at the
same resolution. The MS images have several spectral bands but
a spatial resolution lower than that of Pan. The latter has a high
spatial resolution but no spectral diversity. Satellite images with
both high spectral and spatial resolutions are desirable both for
photoanalysis and for improving the results of automated tasks
like classification, segmentation, and object detection [1]–[3].
The process of pansharpening produces MS images that
appear to have both high spatial and high spectral resolution.
For instance, the QuickBird satellite provides a Pan image at
0.7-m spatial resolution, while the MS image has a 2.8-m
spatial resolution. The pansharpening process provides an MS
image at 0.7-m spatial resolution while preserving the spectral
characteristics of the original MS image.
Pansharpening methods employ different strategies to provide MS images having high resolution. Generally, these methods can be divided into three categories.
1) The first category consists of the multiresolution-analysis
(MRA)-based methods. They employ spatial filters to
extract the high-frequency information from the Pan image. This high-frequency information is added into the
upscaled MS images, possibly weighted by means of a
suitable injection model [4]. Such a model can be based
upon calculation of local interband and intraband statistics (correlation coefficient (CC), mean, variance) [5].
Wavelets [6]–[10], Laplacian pyramids [11], box filters
used by smoothing filter-based intensity modulation [12]
and by high-pass filtering [13], and curvelets [14] are
examples of MRA and related fusion methods.
2) The second category consists of methods based on component substitution (CS). These methods do not involve
any spatial filtering process. They make use of a spectral
transformation to obtain a new projection of the MS and
Pan images in which fusion occurs as substitution of one
component with the Pan image. The inverse transformation produces MS images at the desired high resolution.
Examples of CS methods are intensity-hue-saturation-based methods [15], [16], principal-component-analysis
(PCA)-based methods [13], and Gram–Schmidt (GS)-based fusion methods [17], [18].
3) The third type of fusion methods makes use of both CS
and MRA. The methods presented in [19] and [20] are
examples of this type of hybrid fusion. Another example
of hybrid-fusion method incorporating both PCA (CS)
and contourlets (MRA) is presented in [21]. Despite the
hybrid nature of such methods, their behavior is more
similar to MRA methods than to CS methods. In many
cases, they are equivalent to an MRA fusion method with
a specific injection model. As an example, the extension
of the additive-wavelet-luminance (AWL) method [19]
to more than three bands constitutes the AWL proportional (AWLP) method [8], in which details extracted
from Pan through MRA are injected proportionally to the
original spectral vector, in order to preserve the spectral angle between the resampled original data and the
fused data.
A problem associated with pansharpening is how to quantify the quality of pansharpened images. Traditional quality
indexes, e.g., universal image quality index (UIQI) [22] and its
extension Q4 [23], spectral angle mapper (SAM), and relative
global synthesis error (ERGAS) [24], all require a reference image at the same resolution as the pansharpened image. Hence,
for quality assessment, the pansharpening algorithms are tested
on spatially degraded images. For instance, the Pan image of
QuickBird is degraded from 0.7- to 2.8-m spatial resolution,
and the MS image is degraded from 2.8- to 11.2-m spatial
resolution. Then, the pansharpened MS image produced is at
2.8-m spatial resolution and can be compared to the reference
2.8-m MS image. It is assumed that, at the desired resolution,
the fusion algorithm would produce a pansharpened image with
the same quality as when it was tested at the lower resolution.
Such an assumption is strengthened if modulation-transfer-function (MTF)-like spatial filters are used to downscale the
data sets [25].
Recently, some quality-assessment methods that do not require a high-resolution reference MS image have been proposed. Some of the examples are the Quality Not requiring a
Reference (QNR) indexes proposed by Alparone et al. [26] and
the quality indexes of Zhou et al. [27]. However, only QNR
indexes provide consistent results. Zhou’s quality-assessment
protocol calculates spectral and spatial qualities separately,
but the calculation of the spatial-quality index is incorrect,
as demonstrated in [28]. Comparisons with Q4, ERGAS, and
SAM in [26] show that Zhou’s spatial quality index exhibits
trends opposite to those of indexes calculated with a reference.
As a limit case, the reference image may appear to have the
poorest spatial quality among the fusion methods compared.
Since QNR is the only tool available to assess the quality of
pansharpened images at the desired high resolution, there is
a need for other blind quality-assessment methods that could
be used as an alternative or, better, in parallel to the QNR indexes.
The recent development of new indexes not requiring
a reference [29], [30] further testifies to the need for measuring
quality at full resolution.
In this paper, we present a method for quality assessment,
taking explicitly into account the equivalent spatial filter response of the sensor. The proposed method, analogously to
QNR, does not require a high-resolution reference MS image
and provides two separate indexes, for the spectral and spatial
qualities, respectively. The spectral quality is measured analogously to Wald’s consistency property [24], while a modified
version of Zhou’s spatial index [27] is used to define the new
spatial quality index. The novelty of the method lies in the
fact that filters matching the MTFs of the different channels
of the imaging instruments are used to extract the spectral
(low pass) and spatial (high pass) information from the fused
images.
In addition, we develop an MRA-based pansharpening method
which makes use of MTF filters to extract the details from
the high-resolution Pan image, as suggested in [25]. For the
integration of such details in the upscaled MS images, an
injection model based upon the optimization of the spatial QNR
index has been devised. The development of a pansharpening
method based upon optimization of a quality index provides a
deeper insight into the necessity and sufficiency requirements
for pansharpening quality-assessment indexes.
In Section II, the novel quality-assessment method is presented. In Section III, the QNR optimization-based pansharpening, used to validate the proposed quality-assessment method,
is developed. Section IV presents the results obtained on
simulated Pléiades, IKONOS, and QuickBird images in a
comparison of the QNR-optimizing fusion with three other
methods recently established in the literature. The final section presents a discussion on the obtained results and draws
conclusions.
II. PROPOSED QUALITY-ASSESSMENT METHOD
Generally, the sharpness of an image and the consistency of
colors are the two traits that are important in various applications, either automated or not. The sharpness of the image
pertains to its spatial characteristics, whereas colors pertain to
the spectral characteristics. Since both spectral and spatial characteristics are relevant in pansharpened images, the proposed
quality-assessment method assesses both of them separately.
The details regarding the theory, development, and implementation of the proposed quality index are given as follows.
A. Spectral Quality Assessment
The spectral quality of pansharpened images can be determined based upon the change in colors of the fused images
as compared to the high-resolution MS reference images. In
order to determine the spectral similarity of the pansharpened
images with the reference MS image, the spectral angle mapper
(SAM) might be used. However, at the desired high resolution,
the reference image is not available; hence, a comparison using
objective indexes is not feasible. A solution to this problem is
to make use of Wald’s consistency property [24].
The consistency property states that the fused image, once
degraded to its original resolution, should be similar to the
original MS image. However, the type of spatial filter to be
used for degradation is left as an open concern. A subsequent
study has pointed out that the MS images should be degraded
using MTF-shaped filters [25]. Hence, we propose to use the
MTFs of each spectral channel to obtain a low-pass-filtered
(LPF) image. Such an image, once it has been decimated, shall
give the low-resolution MS image. For comparing the degraded
MS image with the original MS image, we use the Q4 index
[23]. Hence, the procedure for assessing spectral quality is
as follows.
1) Apply the corresponding MTF filter to each fused MS
band, i.e., the response of the red band to filter the fused
red MS band, the response of the green band to filter the
fused green MS band, and so on.
2) Decimate the filtered bands.
3) Calculate the Q4 index between the ensemble of the decimated filtered fused MS bands and the original low-resolution MS bands.
4) Subtract the Q4 index from one to yield a spectral-distortion index.
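A minimal Python/NumPy sketch of these four steps follows. The function and argument names are our own illustrative assumptions; `mtf_kernels` holds one 2-D low-pass kernel per band (e.g., built as described next), and `q4_index` stands for an available implementation of the quaternion-based Q4 index [23], which is not reproduced here.

```python
import numpy as np
from scipy.ndimage import convolve

def spectral_distortion(fused, ms_low, mtf_kernels, ratio, q4_index):
    """Proposed spectral-distortion index, steps 1)-4) above (a sketch)."""
    degraded = []
    for band, kernel in zip(fused, mtf_kernels):
        lpf = convolve(band, kernel, mode="nearest")  # 1) MTF low-pass filter
        degraded.append(lpf[::ratio, ::ratio])        # 2) decimate by `ratio`
    q4 = q4_index(np.stack(degraded), np.stack(ms_low))  # 3) Q4 vs. original
    return 1.0 - q4                                   # 4) distortion = 1 - Q4
```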
Fig. 1. Flowchart of the spectral quality-assessment procedure.

Fig. 2. Gaussian-like modeled frequency responses for the MTF of QuickBird (bands NIR, Red, Green, and Blue), together with equivalent 1:4 filters derived from the S&M filter and an almost ideal 23-tap filter. (Solid lines) Low-pass filters and (dashed lines) complementary high-pass filters.

TABLE I
MTF GAINS AT NYQUIST CUTOFF FREQUENCY

The block-diagram representation of the spectral quality assessment is shown in Fig. 1. When using the proposed protocol for assessing spectral quality, it should be noted that the MTF filters differ from sensor to sensor. The difficulty is that the exact filter response is not provided by the instrument manufacturers. However, the filter gain at the Nyquist cutoff frequency may be derived from on-orbit measurements. Using this information and assuming that the frequency response of each filter is approximately Gaussian shaped, MTF filters for each channel of each satellite can be estimated. We have used the Starck and Murtagh filter (SMF) [31], also known as the "à trous" filter, as the MTF filter for the simulated Pléiades image. The same filter was used for determining the spectral quality of all four bands. However, unlike Pléiades, the MTF filters for each MS band of the QuickBird and IKONOS satellites have slightly different gains at the Nyquist cutoff frequency. Hence, the MTF filter frequency response for each MS band of QuickBird and IKONOS was used to generate four different filters. Fig. 2 shows the frequency responses of the QuickBird MTF Gaussian-like 1:4 filters, together with the responses of the equivalent 1:4 filters (convolution of the original 1:2 filter with its version upsampled by two) derived from the SMF and from an almost ideal 23-tap filter [11]. The gains of the QuickBird and IKONOS sensors at the Nyquist cutoff frequency, as provided in [32] and [33], were used to develop the approximate Gaussian filters and are reported in Table I.
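As a sketch of this Gaussian approximation (the function name is ours, and the example gain is a placeholder, not a value from Table I), a separable kernel can be synthesized so that its frequency response attains the measured gain at the Nyquist frequency of the reduced scale:

```python
import numpy as np

def gaussian_mtf_kernel(gain_at_nyquist, ratio=4, size=41):
    """Gaussian MTF model: a spatial Gaussian of std `sigma` has frequency
    response exp(-2 * pi**2 * sigma**2 * f**2); choose sigma so that the
    response equals `gain_at_nyquist` at f = 1 / (2 * ratio), the Nyquist
    frequency of the 1:ratio scale (f in cycles per high-resolution pixel)."""
    f_nyq = 1.0 / (2.0 * ratio)
    sigma = np.sqrt(-np.log(gain_at_nyquist) / (2.0 * np.pi**2 * f_nyq**2))
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2.0 * sigma**2))
    g /= g.sum()
    return np.outer(g, g)  # separable, isotropic 2-D low-pass kernel

# Placeholder gain for illustration only; actual values are given in Table I.
kernel_red = gaussian_mtf_kernel(0.30, ratio=4)
```

Note that this sketch assumes an isotropic response, whereas laboratory MTFs may differ along and across track, as discussed in Section IV.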
B. Spatial Quality Assessment
For assessing the spatial quality of the fused images, we
propose to use a modified version of Zhou’s spatial index. The
spatial quality index proposed by Zhou et al. [27] extracts the
high-frequency information from both the Pan and the fused
MS image using a Laplacian filter. CC is calculated between
the details extracted from the Pan image and each pansharpened MS band. Zhou’s index assumes that the ideal value of
correlation between the details of the Pan and MS images is
one. However, it has been noted that the CC between the details
of the Pan image and of the high-resolution MS images may not
be equal to one [28]. At times, there are details that are present
in the MS band which are absent in the Pan image, and vice
versa [26], [34].
To exploit the relationship between the details of the Pan and
the details of the MS images, we propose to use the high-pass
complements of the MTF filters to extract the high-frequency
information from the MS images at both high (fused) and low
(original) resolutions. In addition, the Pan image is downscaled
to the resolution of the original MS image. The high-frequency
information, consisting of spatial details (edges, textures, etc.),
is extracted from high- and low-resolution Pan images. The
UIQI is calculated between the details of the MS and the details
of the Pan image at the two resolutions. It is assumed that this
relationship does not significantly change across scale.
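Formally (the paper states the index procedurally in the steps below; this compact expression is our own restatement), the proposed spatial-distortion index can be written as

$$D_S^{\mathrm{prop}} = \frac{1}{N} \sum_{l=1}^{N} \left| Q\!\left(\hat{M}_l^{H}, P^{H}\right) - Q\!\left(\dot{M}_l^{H}, \dot{P}^{H}\right) \right|$$

where the superscript $H$ denotes the high-pass detail image extracted with the complement of the band's MTF filter (for the MS bands) or of an almost ideal low-pass filter (for Pan), at the corresponding scale.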
Hence, the proposed procedure for the assessment of spatial
quality becomes the following steps.
1) Apply the corresponding MTF filter to each fused MS
band.
2) Subtract the LPF fused MS bands from the corresponding
fused MS bands. This provides the details of the MS
bands.
3) Apply a low-pass filter to the high-resolution Pan image.
4) Subtract the LPF Pan image from the original Pan image.
This provides the details of the Pan image.
5) Calculate UIQI between the details of each MS image and
details obtained from the high-resolution Pan image.
6) Downscale the Pan image to get a low-resolution Pan
image.
7) Repeat steps 1)–5) for the low-resolution Pan and the
original MS images.
8) Calculate the absolute difference in the UIQI values across scale for each band.
9) Average the four absolute differences to yield the spatial-distortion index.
Fig. 3. Spatial quality-assessment procedure. The block marked with Q calculates all UIQIs in parallel and averages them.

The block-diagram representation of the spatial quality assessment is shown in Fig. 3. Note that the same MTF filters are used for assessing both the spectral and the spatial qualities of the images. The two spatial filters used to downscale the Pan image and to extract the details of Pan at both scales will be discussed in Section IV. For the latter task, an approximately ideal filter will be used instead of an MTF-shaped filter, because most Pan images that are commercially distributed are postprocessed for MTF restoration, in order to increase sharpness, thereby resulting in an almost ideal response of the equivalent filter (acquisition plus restoration).

Analogously to QNR, UIQI is calculated on image blocks including the same portion of the scene across scales, before being averaged to yield a unique value. Hence, if the scale ratio is four, then the block size is also scaled by four. This means that if the block size used at the higher resolution is 64 × 64, then the block size at the lower resolution would be 16 × 16. This ensures that the same spatial area is analyzed statistically while making calculations across scale.
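A minimal Python/NumPy sketch of this spatial-distortion procedure is given below. For brevity it computes a single global UIQI per band instead of the block-wise, scale-proportional computation just described; all function and argument names (e.g., `mtf_kernels`, `pan_kernel`) are our own illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def uiqi(x, y):
    """Universal image quality index [22], computed globally here."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return 4.0 * cxy * mx * my / ((vx + vy) * (mx**2 + my**2))

def details(img, lowpass_kernel):
    """High-pass details: the image minus its low-pass-filtered version."""
    return img - convolve(img, lowpass_kernel, mode="nearest")

def spatial_distortion(fused, ms_low, pan, pan_low, mtf_kernels, pan_kernel):
    """Proposed D_S, steps 1)-9): compare band-detail/Pan-detail UIQI at
    high scale (fused vs. Pan) and at low scale (original MS vs. the
    downscaled Pan), then average the absolute differences."""
    q_high = [uiqi(details(b, k), details(pan, pan_kernel))
              for b, k in zip(fused, mtf_kernels)]       # steps 1)-5)
    q_low = [uiqi(details(b, k), details(pan_low, pan_kernel))
             for b, k in zip(ms_low, mtf_kernels)]       # steps 6)-7)
    return float(np.mean([abs(h - l)
                          for h, l in zip(q_high, q_low)]))  # steps 8)-9)
```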
III. VALIDATION OF PROPOSED QUALITY ASSESSMENT

The proposed quality-assessment protocol can be used alongside the QNR protocol, which was recently proposed by Alparone et al. [26]. The need for using two separate protocols arises from the fact that a single protocol may not be sufficient to assess the quality of pansharpened images at full scale: a fusion process can be defined based on certain criteria so that it can be optimized to produce the best results for that particular protocol without producing the best fusion quality. This indicates that fusion quality-assessment protocols may be necessary but are seldom sufficient, i.e., a fusion method may result in an image that statistically provides the desired characteristics but, in reality, is not close to the reference high-resolution MS image. To verify the sufficiency property, a fusion method is developed based on the optimization of the QNR spatial index. The goal is to develop a pansharpening method which will render the best QNR values. Once such a method is developed, it can be shown that a pansharpened image obtained by optimization of the QNR index does not necessarily provide the best values for the proposed quality-assessment protocol.

A. Review of QNR

The QNR protocol calculates the quality of the pansharpened images without requiring a high-resolution reference MS image. QNR comprises two indexes, one pertaining to spectral distortion and the other to spatial distortion. As proposed in [26], the two distortions may be combined to yield a unique quality index. However, keeping the two indexes separate is essential for comparisons with the proposed protocol.

The spectral quality of the fused image is determined by calculating the QNR spectral distortion $D_\lambda$ between the low-resolution MS images and the fused MS images. Hence, for determining the spectral distortion, two sets of interband UIQI values are calculated separately at low and high resolutions. The differences of corresponding UIQI values at the two scales yield the spectral distortion introduced by the pansharpening process.
Thus, spectral distortion can be represented mathematically as
$$D_\lambda = \frac{1}{N(N-1)} \sum_{l=1}^{N} \sum_{\substack{r=1 \\ r \neq l}}^{N} \left| Q\left(\dot{M}_l, \dot{M}_r\right) - Q\left(\hat{M}_l, \hat{M}_r\right) \right| \tag{1}$$
where Ṁ represents the low-resolution MS band, M̂ is the
pansharpened MS band, Q represents UIQI calculation, and N
is equal to the number of MS bands. The QNR spatial distortion
$D_S$ is determined by calculating the UIQI between each
MS band and the Pan image at low resolution and, again, at
the high resolution. The difference between the two values
yields the spatial distortion. As defined in [26], $D_S$ is
represented as
$$D_S = \frac{1}{N} \sum_{l=1}^{N} \left| Q\left(\dot{M}_l, \dot{P}\right) - Q\left(\hat{M}_l, P\right) \right| \tag{2}$$
where Ṁ represents the low-resolution MS band, Ṗ is the
low-resolution Pan image, M̂ is the pansharpened MS image,
P is the high-resolution Pan image, Q denotes UIQI, and N
represents the number of MS bands.
Thus, the QNR protocol is based on the following criteria.
1) The UIQI interrelationship between the low-resolution
MS bands does not change with resolution. This means
the calculation of six relationships between Red and
Green, Red and Blue, Red and Near-Infrared (NIR),
Green and Blue, Green and NIR, and Blue and NIR
images should provide the same results when computed
at the high or at the low resolution.
2) The UIQI relationship between the low-resolution MS
bands and the low-resolution Pan image does not change
when it is recalculated between the high-resolution MS
bands and the high-resolution Pan image. This results in
calculation of four relationships between Red and Pan,
Green and Pan, Blue and Pan, and NIR and Pan images at
the high resolution and at the low resolution, respectively.
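For illustration, a minimal sketch of the two QNR distortions (1) and (2) follows. The helper names are ours; it reuses the global `uiqi` function from the spatial-distortion sketch in Section II-B and omits the block-wise computation actually used in [26].

```python
import numpy as np
from itertools import combinations

def qnr_d_lambda(ms_low, fused):
    """Spectral distortion (1). Since UIQI is symmetric, averaging over
    unordered band pairs equals the double sum with its 1/(N(N-1)) factor."""
    pairs = list(combinations(range(len(ms_low)), 2))
    return float(np.mean([abs(uiqi(ms_low[l], ms_low[r])
                              - uiqi(fused[l], fused[r]))
                          for l, r in pairs]))

def qnr_d_s(ms_low, pan_low, fused, pan):
    """Spatial distortion (2): change of band-to-Pan UIQI across scales."""
    return float(np.mean([abs(uiqi(ms_low[l], pan_low)
                              - uiqi(fused[l], pan))
                          for l in range(len(ms_low))]))
```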
B. QNR-Optimizing Injection Model
QNR determines both spectral and spatial qualities without
requiring a reference image. Hence, it is assumed that if a detail
injection model is based upon QNR optimization, it should
ensure both minimum spectral and spatial distortions. Thus,
it is expected that the fused image having the best QNR should
be the best pansharpened image. However, this would require
that QNR is a sufficient criterion and not only necessary, as
demonstrated in [26]. We wish to recall that necessary means
that the true high-resolution image, whenever available, would
yield the best possible quality among all fusion methods. This
property can be easily checked either on degraded data or
by means of simulated MS data, e.g., Pléiades. Conversely,
sufficient means that if distortion indexes are the lowest, the
fused image is closest to the ideal reference. The property of
sufficiency can be checked, e.g., by trying to jointly optimize the spectral
and spatial distortions of a fusion method: sufficiency fails if the two distortions
cannot both be forced close to zero, as instead happens
with the high-resolution reference image.
In order to develop a fusion method based upon QNR optimization, we begin with the simplest representation of the fusion process. The pansharpening process can be mathematically
defined as
$$\hat{M} = \tilde{M} + \alpha \cdot \ddot{P} \tag{3}$$
where M̂ represents the fused MS image, M̃ represents the
upscaled MS image, and P̈ represents the details of the Pan
image. The upscaled MS image is obtained by using the 23-tap
filter presented in [11]. Since all the details of the Pan image should
be added into the upscaled MS image with a proper weight, we
require a detail injection model to obtain pansharpened images
which satisfy a certain criterion. The goal of the detail injection
model could be to find a suitable value of α so that the fused
image has the best QNR. Since QNR is based upon spatial- and spectral-distortion measurements, calculated by means of
UIQI, the optimization process requires optimization of ten
simultaneous equations: four of them representing the spatial
distortion and six representing the spectral distortion. Since no
direct relationship exists between the six spectral-distortion and
four spatial-distortion equations, it is difficult to optimize all
ten of them simultaneously. We decided to focus on the four
spatial-distortion equations because optimization of spectral
quality alone may lead to detail injection gains that are all
identically zero. This means that a simple upscaling of the
MS images would also result in the fulfillment of the desired zero-spectral-distortion criterion.

TABLE II
SPATIAL DISTORTION $D_S$ WITH RESPECT TO THE AMPLITUDE AT NYQUIST OF THE PAN FILTER

Hence, we concentrate on the
fulfillment of the following four conditions, in which LR and
HR represent low resolution and high resolution, respectively,
and Q(·,·) denotes UIQI between two single-band images:
$$\begin{aligned}
Q^{LR}_{\mathrm{Red},\mathrm{Pan}} &\cong Q^{HR}_{\mathrm{Red},\mathrm{Pan}} \\
Q^{LR}_{\mathrm{Green},\mathrm{Pan}} &\cong Q^{HR}_{\mathrm{Green},\mathrm{Pan}} \\
Q^{LR}_{\mathrm{Blue},\mathrm{Pan}} &\cong Q^{HR}_{\mathrm{Blue},\mathrm{Pan}} \\
Q^{LR}_{\mathrm{NIR},\mathrm{Pan}} &\cong Q^{HR}_{\mathrm{NIR},\mathrm{Pan}}
\end{aligned} \tag{4}$$
The six equations, representing the MS-band interrelationships,
i.e., the QNR spectral-distortion criteria, which have not been
considered are as follows:
$$\begin{aligned}
Q^{LR}_{\mathrm{Red},\mathrm{Green}} &\cong Q^{HR}_{\mathrm{Red},\mathrm{Green}} \\
Q^{LR}_{\mathrm{Red},\mathrm{Blue}} &\cong Q^{HR}_{\mathrm{Red},\mathrm{Blue}} \\
Q^{LR}_{\mathrm{Red},\mathrm{NIR}} &\cong Q^{HR}_{\mathrm{Red},\mathrm{NIR}} \\
Q^{LR}_{\mathrm{Green},\mathrm{Blue}} &\cong Q^{HR}_{\mathrm{Green},\mathrm{Blue}} \\
Q^{LR}_{\mathrm{Green},\mathrm{NIR}} &\cong Q^{HR}_{\mathrm{Green},\mathrm{NIR}} \\
Q^{LR}_{\mathrm{Blue},\mathrm{NIR}} &\cong Q^{HR}_{\mathrm{Blue},\mathrm{NIR}}
\end{aligned} \tag{5}$$
Returning to the set (4), we can demonstrate that each of the
four equations can be solved separately, thereby reducing the
complexity of the optimization algorithm. To satisfy each of
the four QNR $D_S$ constraints, $Q(\hat{M}, P)$ should be made equal
to Q(Ṁ , Ṗ ) by varying α in (3), which controls the amount
of details that are being injected into the upscaled MS band.
This is achieved by introducing the expression for the fused
MS band (3) into the mathematical expression of UIQI [22],
thereby obtaining a fourth-order equation. By calculating the
roots of such an equation, we get four candidate values of $\alpha$ for each of the
four MS bands, for the minimization of the QNR spatial-distortion
index only. This method will be referred to as QNROpt$_{D_S}$. We
have limited the range of $\alpha$ between zero and one. This is an
arbitrary choice to add details with a weight from zero (none of
the details) to one (all of the details) in the upscaled MS image.
A single root is selected from the four possible solutions: the one
lying between zero and one that is closest to one. The detail injection
weight depends upon the upscaled low-resolution MS band, the
high-resolution Pan image, and the details extracted from the
Pan image. Thus, the detail injection gain calculation for each
MS band is independent of the other bands. It should be noted
that the optimization is done locally, i.e., for each individual
block, rather than globally (the reader will find the detailed
development of the solution in Appendix I).
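As a sketch of this selection rule (the helper name and the numerical tolerance are ours), given the five coefficients of a band's fourth-order equation derived in Appendix I, ordered from the $\alpha^4$ term down to the constant term:

```python
import numpy as np

def select_alpha(quartic_coeffs, tol=1e-8):
    """Solve the quartic and pick the injection gain: keep (numerically)
    real roots, restrict them to [0, 1], and return the one closest to
    one; fall back to alpha = 0 when no admissible root exists, which
    reproduces the behavior described in Section IV."""
    roots = np.roots(quartic_coeffs)
    real = roots[np.abs(roots.imag) < tol].real
    admissible = real[(real >= 0.0) & (real <= 1.0)]
    return float(admissible.max()) if admissible.size else 0.0
```

In the QNROpt$_{D_S}$ method, this selection is performed independently for each band and, since the optimization is local, for each block.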
IV. EXPERIMENTS AND RESULTS
The proposed fusion and quality-assessment method was
tested on simulated Pléiades, IKONOS, and QuickBird data.
The MS Pléiades images provided by CNES are at both 0.8- and
3.2-m resolutions while the Pan image is at 0.8-m resolution.
Fig. 4. Comparison of different fusion techniques for a simulated Pléiades image as true-color combination of the Blue, Green, and Red bands. The scene presented is of size 512 × 512 pixels at a 0.8-m spatial scale. (a) Reference MS image at 0.8 m. (b) 3.2-m MS image interpolated at 0.8 m. (c) eFIHS-SA fused image. (d) GS fused image. (e) AWLP fused image. (f) Fused image using the proposed QNR-optimization method.
The size of the Pléiades image at 0.8-m resolution is 1024 × 1024 pixels. This data set simplifies the fusion experiment: since the low-resolution MS image is already available, it can be used alongside the high-resolution Pan image to obtain the high-resolution pansharpened MS image. Since the high-resolution MS image is also available, it can be used to calculate all three Q4, SAM, and ERGAS indexes. Assuming that we do not have the high-resolution MS image, the QNR, Zhou's, and the proposed
quality-assessment indexes can be used to assess the fusion
quality at full scale, without using the high-resolution reference
MS image. Finally, the results obtained by the two different
types of quality-assessment methods, i.e., those requiring a
reference and those not requiring a reference, can be verified
against each other to check their consistency.
Since the low-resolution Pan image is not available, we tested
different filters to obtain the low-resolution Pan image and
calculated the spatial-quality index on the reference Pléiades
MS image. The distortion indexes obtained are presented in
Table II. These results show that the spatial distortion of the
high-resolution Pléiades image is closer to zero when the Pan
filter is closer to a likely model of its spectral channel. Using
the Starck and Murtagh à trous (S&M) filter to obtain the low-resolution Pan image at 3.2 m and at 12.8 m and the same filter
to obtain the low-resolution MS image at 12.8 m, we assessed
the results of the proposed spatial distortion at two couples
of scales. When it was calculated between the images at 0.8
and 3.2 m, the value of the spatial distortion index was 0.035,
while when calculated between the images at 3.2 and 12.8 m,
the value was 0.045. Thus, we can conclude that the index is
consistent across scale.
For the QuickBird and IKONOS instruments, we only have
the low-resolution MS image and the high-resolution Pan image. Hence, the conventional quality-assessment algorithms,
i.e., Q4, SAM, and ERGAS cannot be used to assess the quality
of the fused images at high resolution. The QNR and proposed
quality-assessment protocols can be used. However, for the
purpose of comparing all of the indexes under test, we degraded
the QuickBird MS and Pan images. The fused MS image can
thus be compared to a reference MS image. Analogous results
can be found with IKONOS. The size of the IKONOS image
used is 2048 × 2048 pixels, and the resolution of the Pan image
is 1 m while the resolution of the MS image is 4 m. The size of
the QuickBird image is 2048 × 2048 pixels, with the resolution
of the Pan image being 0.7 m and of the MS image 2.8 m.
The conventional quality-assessment indexes, namely, Q4,
SAM, and ERGAS, have ideal values of one, zero, and zero, respectively. The QNR index is a quality index in itself. However,
it is based upon spectral- and spatial-distortion information Dλ
Fig. 5. Comparison of different fusion techniques for QuickBird image as true-color combination of the Blue, Green, and Red bands. The scene presented is of
size 512 × 512 pixels, at a 0.7-m spatial scale. (a) Pan image at 0.7 m. (b) 2.8-m MS image interpolated at 0.7 m. (c) eFIHS-SA fused image. (d) GS fused image.
(e) AWLP fused image. (f) Fused image using the proposed QNR-optimization method.
and Ds , respectively, having ideal values equal to zero [26].
Zhou’s protocol has the ideal spatial-quality value equal to one
and an ideal spectral-distortion value equal to zero [27]. However,
for consistency, Zhou's spatial-quality results are
presented as spatial-distortion results, and hence, the ideal
spatial-distortion value is zero. The proposed method has ideal
spectral- and spatial-distortion indexes tending to zero.
For determining the efficiency of the proposed quality-assessment method, we have compared the results obtained
with other quality-assessment methods. For this purpose, we
have used the QNR optimization-based fusion algorithm, the
à trous AWLP [8], GS spectral sharpening [17], [18], and the enhanced Fast Intensity Hue Saturation with Spectral Adjustment
(eFIHS-SA) [16] as pansharpening methods for validation. The
reason for this choice is that, among the most widely used
CS methods, GS is considered to produce the best results,
and eFIHS-SA is the fastest. The proposed QNR-optimizing
method is MRA based, like AWLP, which attained the
second best performance and was rated joint winner of the Data
Fusion Contest held by IEEE GRS-S Data Fusion Committee
in 2006 [35].
Looking at the images shown in Fig. 4, it is clear that the
image obtained by eFIHS-SA fusion is the sharpest. However,
it suffers from spectral distortion, noticeable as color changes.
The image obtained by QNR-optimized fusion is spectrally
similar to AWLP fused image but less sharp. The image
obtained by GS fusion is sharper than both AWLP- and QNRoptimization-based fused images. However, like its counterpart CS method eFIHS-SA, it suffers from a slight spectral
distortion.
For the QuickBird image shown in Fig. 5, no reference MS
image is available at high resolution. Hence, with reference
to the upscaled low-resolution MS image, it is clear that the
image obtained by eFIHS-SA method is spectrally distorted,
yet visually, it is the sharpest of all the fused images. Again, the
QNR-optimized image is spectrally similar to the AWLP fused
image. However, in certain regions, it is less sharp than the
AWLP fused image. Since the AWLP method is not region based,
the sharpness of the fused image is uniform throughout the
image. On the contrary, the proposed QNR optimization fusion
method works on local regions. In some regions, more details
can be added while keeping the UIQI constant across scale,
while in certain regions, addition of details does not satisfy the
desired minimized spatial-distortion constraint, and hence, α is
set equal to zero.
In addition, for the IKONOS image shown in Fig. 6, no
reference MS image is available at high resolution. Thus, the
reference of colors (spectral reference) is still the expanded
Fig. 6. Comparison of different fusion techniques for IKONOS image as true-color combination of the Blue, Green, and Red bands. The scene presented is
of size 512 × 512 pixels, at a 1-m spatial scale. (a) Pan image at 1 m. (b) 4-m MS image interpolated at 1 m. (c) eFIHS-SA fused image. (d) GS fused image.
(e) AWLP fused image. (f) Fused image using the proposed QNR-optimization method.
low-resolution MS image. It is evident from the noticeable
change in background color that the image obtained by the eFIHS-SA method is the most spectrally distorted, even though, visually, it is the sharpest. In addition, GS slightly distorts colors
but less than eFIHS-SA. Again, the QNR-optimized image
is spectrally similar to the AWLP fused image as well as to
the expanded original. However, in certain regions, it is less
sharp than the AWLP fused image and more similar to the
resampled original. The mathematical solution chosen for the
spatial QNR optimizing injection model was aimed at favoring
the quantitative evaluation but has the drawback of penalizing
the visual appearance.
For the purpose of a quantitative analysis, the spatial- and
spectral-quality indexes for Pléiades, QuickBird, and IKONOS
data have been presented in Tables III–VI. Tables III and
IV compare with-reference and without-reference quality-assessment indexes at full scale (Pléiades) and at degraded scale
(QuickBird). It is noteworthy that the proposed no-reference
spectral-distortion index (proposed $D_\lambda$), unlike the QNR $D_\lambda$, is
perfectly in trend with SAM, which is reference based. From
Tables III and IV, it can also be seen that the proposed indexes,
analogously to QNR indexes, rate the reference as the best fused
image. Conversely, Zhou’s indexes state that the reference is
both spatially and spectrally distorted much more than most
of the other fusion methods. This means that Zhou’s protocol
is not even necessary. This incongruence has been recently
remarked by several authors and has motivated the development
of the new indexes.
The proposed spectral-quality index rates the AWLP fused
image as the least spectrally distorted among the pansharpening
methods tested, except on full-scale QuickBird, where AWLP
is second. It can be observed that, for the reference high-resolution Pléiades image, the proposed quality-assessment
method shows a higher spectral distortion, while the spectral distortion for the degraded-resolution QuickBird reference image
is approximately zero. The reason for this difference may lie
in the use of approximate MTF filters. In the case of Pléiades,
we have used the SMF as the MTF filter. The low-resolution
MS image provided by CNES has been obtained by filtering
the high-resolution image by means of filters exactly matching
the laboratory MTFs of the spectral channels, which are not
isotropic but different along and across track. Hence, when we
filter the high-resolution reference MS image using the SMF
filter, the approximation results in a value larger than zero. For
the degraded QuickBird image, the approximate MTF was used
to generate the initial low-resolution image at 11.2-m resolution
from the reference 2.8-m image. When the reference MS image
is degraded using the MTF filter, again, we get the same
TABLE III
COMPARISON OF THE PROPOSED SPECTRAL AND SPATIAL QUALITY-ASSESSMENT METHOD WITH OTHER QUALITY-ASSESSMENT
METHODS USING SIMULATED PLÉIADES IMAGES. RANKING OF METHODS IN PARENTHESES
TABLE IV
COMPARISON OF THE PROPOSED SPECTRAL AND SPATIAL QUALITY-ASSESSMENT METHOD WITH OTHER QUALITY-ASSESSMENT
METHODS USING DEGRADED QUICKBIRD IMAGES. RANKING OF METHODS IN PARENTHESES
TABLE V
COMPARISON OF THE PROPOSED SPECTRAL AND SPATIAL QUALITY-ASSESSMENT METHOD WITH OTHER QUALITY-ASSESSMENT
METHODS USING QUICKBIRD IMAGES AT FULL SCALE. RANKING OF METHODS IN PARENTHESES
TABLE VI
COMPARISON OF THE PROPOSED SPECTRAL AND SPATIAL QUALITY-ASSESSMENT METHOD WITH OTHER QUALITY-ASSESSMENT
METHODS USING IKONOS IMAGES AT FULL SCALE. RANKING OF METHODS IN PARENTHESES
low-resolution image, and hence, the error obtained is zero.
If the original low-resolution reference was available, the error
would have been higher, as in the case of the Pléiades data, because
of the approximation in the instrument MTFs. However, even
with approximations in the MTF filters, the results are in trend,
i.e., the reference is the least spectrally distorted, and that, at
low resolution, the ranking trend is in agreement with the SAM
ranking. This fact suggests that the proposed spectral-quality
index could be used to reliably assess the spectral quality
without either degrading the scale of the data or requiring a
reference high-resolution MS image.
As pointed out in Table II, the proposed DS , analogously
to QNR DS , is moderately sensitive to the spatial filter used
to downscale the Pan image, which is generally unknown.
Analyzing Tables III–VI, one notes that the proposed DS rates
the fusion methods depending upon the relationship between
the details of the Pan and MS images across scale, in strict
analogy with QNR, which however does not use MTF filters
to separate spectral and spatial information. From the tests
conducted, it can be seen that the proposed DS rates the
methods in the same order as ERGAS. Thus, for the Pléiades data
set, the QNR-optimized method is the least spatially distorted,
while the eFIHS-SA method is the most spatially distorted. This
is also true for the test conducted on the degraded QuickBird data set.
ERGAS rates GS better than AWLP; this is the same ranking
done by the proposed DS . In summary, the ranking attained
by the proposed spectral-quality index is in trend with that of
SAM, and the ranking of the proposed spatial-quality index is
in trend with the ranking done by ERGAS.
V. CONCLUDING REMARKS
This paper introduced a protocol for quality assessment of
pansharpened imagery at full spatial scale, i.e., without
requiring a high-resolution reference MS image. The novelty
lies in the use of digital filters matching the shapes of the
MTFs of the spectral channels of the instrument to separate
the low- and high-pass components of the fused MS bands. The
former yields the spectral quality as a similarity measure with
the original MS data. The latter originates the spatial-distortion
index from the difference across two scales in the similarity
with the high-pass component of the Pan image.
The protocol was tested on three very high-resolution MS plus Pan data sets (simulated Pléiades, IKONOS, and QuickBird) in a comparison among four fusion methods of different performance levels. The proposed method is substantially consistent with the accepted fusion quality-assessment methods,
requiring a reference MS image, i.e., SAM, ERGAS, and Q4.
Although in agreement with QNR on average, the proposed protocol
complements it. Despite the requirement of
the instrument MTF filters, which are not required by QNR, the
spectral distortion of the proposed protocol follows the objective
measurements obtained with a reference more closely than the
spectral distortion of QNR does.
As verified by means of a purposely devised QNR optimizing
fusion method, the spatial distortion of the proposed protocol is
not always in trend with that of QNR. This interesting behavior
suggests that the proposed protocol might be used alongside
the QNR protocol by combining the two couples of spectral
and spatial indexes in order to provide a quality assessment that
has the favorable property of being not only necessary but also
sufficient.
APPENDIX I
PANSHARPENING WITH QNR SPATIAL OPTIMIZATION CONSTRAINT
As aforementioned in Section III, the QNR protocol comprises spectral- and spatial-quality indexes. These quality indexes are derived from distortion indexes by taking their complements to one (i.e., quality = 1 − distortion). The spectral distortion $D_\lambda$ depends upon the interrelationships between the MS bands measured with UIQI. This relationship does not change when the MS bands are upscaled. Hence, optimizing the spectral-distortion index alone does not provide the conditions needed for obtaining the desired high-resolution MS image. On the other hand, the UIQI relationship between the MS bands and the Pan image at low resolution is not equal to the relationship between the upscaled MS bands and the high-resolution Pan image. Thus, it is important to satisfy the QNR spatial-distortion constraint. Focusing on the spatial-distortion index $D_S$, the conditions to reduce it can be calculated. This can be done by setting each of the terms $Q(\dot{M}_l, \dot{P}) - Q(\hat{M}_l, P)$ in (2) equal to zero. For the low-resolution images, the UIQI between each low-resolution MS band and the low-resolution Pan image can be calculated as

$$\mathrm{UIQI}(\dot{M}_l, \dot{P}) = c_{\dot{M}_l,\dot{P}} = \frac{4\,\sigma_{(\dot{M}_l,\dot{P})}}{\sigma^2_{\dot{M}_l} + \sigma^2_{\dot{P}}} \cdot \frac{\overline{\dot{M}}_l\,\overline{\dot{P}}}{\overline{\dot{M}}_l^2 + \overline{\dot{P}}^2} \tag{6}$$

where overbars denote mean values. Since both the low-resolution MS band $\dot{M}_l$ and the low-resolution Pan image $\dot{P}$ are available, $c_{\dot{M}_l,\dot{P}}$ can be easily calculated using (6). To reduce the spatial distortion to zero, the UIQI relationship between the high-resolution Pan image and the fused MS image should be equal to $c_{\dot{M}_l,\dot{P}}$. This implies that

$$c_{\dot{M}_l,\dot{P}} = c_{\hat{M}_l,P} = \frac{4\,\sigma_{(\hat{M}_l,P)}}{\sigma^2_{\hat{M}_l} + \sigma^2_{P}} \cdot \frac{\overline{\hat{M}}_l\,\overline{P}}{\overline{\hat{M}}_l^2 + \overline{P}^2} \tag{7}$$

where $\hat{M}$ and $\tilde{M}$ represent the fused and upscaled MS images, respectively, and $\ddot{P}$ represents the details of the high-resolution Pan image. The fused MS image can be represented as

$$\hat{M} = \tilde{M} + \alpha \cdot \ddot{P}. \tag{8}$$

Substituting (8) in (7), we get

$$c_{\dot{M}_l,\dot{P}} = \frac{4\,\sigma_{(\tilde{M}_l + \alpha_l \ddot{P},\, P)}}{\sigma^2_{\tilde{M}_l + \alpha_l \ddot{P}} + \sigma^2_{P}} \cdot \frac{\overline{(\tilde{M}_l + \alpha_l \ddot{P})}\,\overline{P}}{\overline{(\tilde{M}_l + \alpha_l \ddot{P})}^2 + \overline{P}^2}. \tag{9}$$

Substituting each band into (9), $\alpha$ can be calculated for each band with respect to the Pan image, separately and independently from the others. For the Pléiades, IKONOS, and QuickBird instruments, there are four MS bands, and the solution of (9) will provide four values of $\alpha$, one for each MS band, that are independent of each other. Hence, substituting the Red MS band in (9) yields

$$c_{\dot{M}_R,\dot{P}} = \frac{4\,\sigma_{(\tilde{M}_R + \alpha_R \ddot{P},\, P)}}{\sigma^2_{\tilde{M}_R + \alpha_R \ddot{P}} + \sigma^2_{P}} \cdot \frac{\overline{(\tilde{M}_R + \alpha_R \ddot{P})}\,\overline{P}}{\overline{(\tilde{M}_R + \alpha_R \ddot{P})}^2 + \overline{P}^2} \tag{10}$$

and solving (10) for $\alpha_R$ yields

$$\begin{aligned}
&\alpha_R^4\, c\,\sigma^2_{\ddot{P}}\,\overline{\ddot{P}}^2 + \alpha_R^3\, 2c\left(\sigma^2_{\ddot{P}}\,\overline{\ddot{P}}\,\overline{\tilde{M}}_R + \sigma_{(\tilde{M}_R,\ddot{P})}\,\overline{\ddot{P}}^2\right) \\
&\quad + \alpha_R^2 \left[ c\,\sigma^2_{\ddot{P}}\left(\overline{\tilde{M}}_R^2 + \overline{P}^2\right) + c\,\overline{\ddot{P}}^2\left(\sigma^2_{\tilde{M}_R} + \sigma^2_{P}\right) + 4c\,\overline{\tilde{M}}_R\,\overline{\ddot{P}}\,\sigma_{(\tilde{M}_R,\ddot{P})} - 4\,\overline{\ddot{P}}\,\overline{P}\,\sigma_{(\ddot{P},P)} \right] \\
&\quad + \alpha_R \left[ 2c\left(\overline{\tilde{M}}_R^2 + \overline{P}^2\right)\sigma_{(\tilde{M}_R,\ddot{P})} + 2c\,\overline{\tilde{M}}_R\,\overline{\ddot{P}}\left(\sigma^2_{\tilde{M}_R} + \sigma^2_{P}\right) - 4\,\overline{\tilde{M}}_R\,\overline{P}\,\sigma_{(\ddot{P},P)} - 4\,\overline{\ddot{P}}\,\overline{P}\,\sigma_{(\tilde{M}_R,P)} \right] \\
&\quad + c\left(\overline{\tilde{M}}_R^2 + \overline{P}^2\right)\left(\sigma^2_{\tilde{M}_R} + \sigma^2_{P}\right) - 4\,\overline{\tilde{M}}_R\,\overline{P}\,\sigma_{(\tilde{M}_R,P)} = 0
\end{aligned} \tag{11}$$

in which $c$ denotes $c_{\dot{M}_R,\dot{P}}$. By substituting any of the other three MS bands, Blue, Green, or NIR, in (9), we get analogous equations, whose separate solutions yield the values of $\alpha_B$, $\alpha_G$, and $\alpha_{NIR}$, respectively. By observing the structure of (11) and of the analogous equations for the other bands, we conclude that they depend on the Pan image but are independent of each other.
R EFERENCES
[1] R. Colditz, T. Wehrmann, M. Bachmann, K. Steinnocher, M. Schmidt,
G. Strunz, and S. Dech, “Influence of image fusion approaches on classification accuracy,” Int. J. Remote Sens., vol. 27, no. 15, pp. 3311–3335,
Aug. 2006.
[2] P. Sirguey, R. Mathieu, Y. Arnaud, M. M. Khan, and J. Chanussot, “Improving MODIS spatial resolution for snow mapping using wavelet fusion
and ARSIS concept,” IEEE Geosci. Remote Sens. Lett., vol. 5, no. 1,
pp. 78–82, Jan. 2008.
[3] Q. Du, N. H. Younan, R. King, and V. P. Shah, “On the performance
evaluation of Pan-sharpening techniques,” IEEE Geosci. Remote Sens.
Lett., vol. 4, no. 4, pp. 518–522, Oct. 2007.
[4] B. Aiazzi, S. Baronti, F. Lotti, and M. Selva, “A comparison between
global and context-adaptive pansharpening of multispectral images,”
IEEE Geosci. Remote Sens. Lett., vol. 6, no. 2, pp. 302–306, Apr. 2009.
[5] A. Garzelli and F. Nencini, “Interband structure modeling for Pansharpening of very high resolution multispectral images,” Inf. Fusion,
vol. 6, no. 3, pp. 213–224, Sep. 2005.
[6] L. Wald, T. Ranchin, and M. Mangolini, “Fusion of satellite images
of different spatial resolutions: Assessing the quality of resulting images,” Photogramm. Eng. Remote Sens., vol. 63, no. 6, pp. 691–699,
Jun. 1997.
[7] T. Ranchin and L. Wald, “Fusion of high spatial and spectral resolution
images: The ARSIS concept and its implementation,” Photogramm. Eng.
Remote Sens., vol. 66, no. 1, pp. 49–61, Jan. 2000.
[8] X. Otazu, M. González-Audícana, O. Fors, and J. Núnez, “Introduction
of sensor spectral response into image fusion methods. Application to
wavelet-based methods,” IEEE Trans. Geosci. Remote Sens., vol. 43,
no. 10, pp. 2376–2385, Oct. 2005.
[9] A. Garzelli and F. Nencini, “PAN-sharpening of very high resolution multispectral images using genetic algorithm,” Int. J. Remote Sens., vol. 27,
no. 15, pp. 3273–3292, Aug. 2006.
[10] M. M. Khan, J. Chanussot, L. Condat, and A. Montanvert, “Indusion:
Fusion of multispectral and panchromatic images using induction scaling
technique,” IEEE Geosci. Remote Sens. Lett., vol. 5, no. 1, pp. 98–102,
Jan. 2008.
[11] B. Aiazzi, L. Alparone, S. Baronti, and A. Garzelli, “Context-driven fusion of high spatial and spectral resolution images based on oversampled
multiresolution analysis,” IEEE Trans. Geosci. Remote Sens., vol. 40,
no. 10, pp. 2300–2312, Oct. 2002.
[12] J. G. Liu, “Smoothing filter based intensity modulation: A spectral preserve image fusion technique for improving spatial details,” Int. J. Remote
Sens., vol. 21, no. 18, pp. 3461–3472, Dec. 2000.
[13] P. S. Chavez, J. Stuart, C. Sides, and J. A. Anderson, “Comparison of
three different methods to merge multiresolution and multispectral data:
Landsat TM and SPOT panchromatic,” Photogramm. Eng. Remote Sens.,
vol. 57, no. 3, pp. 295–303, Mar. 1991.
[14] F. Nencini, A. Garzelli, S. Baronti, and L. Alparone, “Remote sensing
image fusion using the curvelet transform,” Inf. Fusion, vol. 8, no. 2,
pp. 143–156, Apr. 2007.
[15] T. M. Tu, S. C. Su, H. C. Shyu, and P. S. Huang, “A new look at IHS-like
image fusion methods,” Inf. Fusion, vol. 2, no. 3, pp. 177–186, Sep. 2001.
[16] T. M. Tu, P. S. Huang, C. L. Hung, and C. P. Chang, “A fast intensity-hue-saturation fusion technique with spectral adjustment for IKONOS
imagery,” IEEE Geosci. Remote Sens. Lett., vol. 1, no. 4, pp. 309–312,
Oct. 2004.
[17] C. A. Laben and B. V. Brower, “Process for enhancing spatial resolution
of multispectral imagery using Pan-sharpening,” U.S. Patent 6 011 875,
Jan. 4, 2000.
[18] B. Aiazzi, S. Baronti, and M. Selva, “Improving component substitution pansharpening through multivariate regression of MS+Pan data,”
IEEE Trans. Geosci. Remote Sens., vol. 45, no. 10, pp. 3230–3239,
Oct. 2007.
[19] J. Núnez, X. Otazu, O. Fors, A. Prades, V. Palà, and R. Arbiol,
“Multiresolution-based image fusion with additive wavelet decomposition,” IEEE Trans. Geosci. Remote Sens., vol. 37, no. 3, pp. 1204–1211,
May 1999.
[20] M. González-Audícana, J. L. Saleta, R. G. Catalán, and R. García, “Fusion
of multispectral and panchromatic images using improved IHS and PCA
mergers based on wavelet decomposition,” IEEE Trans. Geosci. Remote
Sens., vol. 42, no. 6, pp. 1291–1299, Jun. 2004.
[21] V. Shah, N. Younan, and R. King, “An efficient Pan-sharpening method
via a combined adaptive-PCA approach and contourlets,” IEEE Trans.
Geosci. Remote Sens., vol. 46, no. 5, pp. 1323–1335, May 2008.
[22] Z. Wang and A. C. Bovik, “A universal image quality index,” IEEE Signal
Process. Lett., vol. 9, no. 3, pp. 81–84, Mar. 2002.
[23] L. Alparone, S. Baronti, A. Garzelli, and F. Nencini, “A global quality
measurement of Pan-sharpened multispectral imagery,” IEEE Geosci. Remote Sens. Lett., vol. 1, no. 4, pp. 313–317, Oct. 2004.
[24] L. Wald, Data Fusion: Definitions and Architectures—Fusion of Images
of Different Spatial Resolutions. Paris, France: ENSMP, 2002.
[25] B. Aiazzi, L. Alparone, S. Baronti, A. Garzelli, and M. Selva,
“MTF-tailored multiscale fusion of high-resolution MS and Pan imagery,” Photogramm. Eng. Remote Sens., vol. 72, no. 5, pp. 591–596,
May 2006.
[26] L. Alparone, B. Aiazzi, S. Baronti, A. Garzelli, F. Nencini, and
M. Selva, “Multispectral and panchromatic data fusion assessment without reference,” Photogramm. Eng. Remote Sens., vol. 74, no. 2, pp. 193–
200, Feb. 2008.
[27] J. Zhou, D. L. Civco, and J. A. Silander, “A wavelet transform method to
merge Landsat TM and SPOT panchromatic data,” Int. J. Remote Sens.,
vol. 19, no. 4, pp. 743–757, Mar. 1998.
[28] C. Thomas and L. Wald, “Comparing distances for quality assessment of
fused images,” in Proc. 26th Symp. Eur. Assoc. Remote Sens. Lab.—New
Developments and Challenges in Remote Sensing, 2007, pp. 101–111.
[29] M. M. Khan, L. Alparone, and J. Chanussot, “Pansharpening quality
assessment using modulation transfer function filters,” in Proc. IEEE Int.
Geosci. Remote Sens. Symp., Jul. 2008, pp. V.61–V.64.
[30] V. Shah, N. Younan, and R. King, “The joint spectral and spatial quality
evaluation of Pan-sharpening algorithms,” J. Appl. Remote Sens., vol. 2,
no. 1, p. 023 531, 2008.
[31] J. L. Starck and F. Murtagh, “Image restoration with noise suppression
using the wavelet transform,” Astron. Astrophys., vol. 288, no. 1, pp. 342–
348, 1994.
[32] M. K. Rangaswamy, “Quickbird II two-dimensional on-orbit modulation
transfer function analysis using convex mirror array,” M.S. thesis, South
Dakota State Univ., Brookings, SD, 2003.
[33] M. K. Cook, B. A. Peterson, G. Dial, L. Gibson, F. W. Gerlach,
K. S. Hutchins, R. Kudola, and H. S. Bowen, “IKONOS technical performance assessment,” Proc. SPIE, vol. 4381, pp. 94–108, 2001.
[34] C. Thomas, T. Ranchin, L. Wald, and J. Chanussot, “Synthesis of multispectral images to high spatial resolution: A critical review of fusion
methods based on remote sensing physics,” IEEE Trans. Geosci. Remote
Sens., vol. 46, no. 5, pp. 1301–1312, May 2008.
[35] L. Alparone, L. Wald, J. Chanussot, C. Thomas, P. Gamba, and
L. M. Bruce, “Comparison of pansharpening algorithms: Outcome of the
2006 GRS-S Data Fusion Contest,” IEEE Trans. Geosci. Remote Sens.,
vol. 45, no. 10, pp. 3012–3021, Oct. 2007.
Muhammad Murtaza Khan (S’07) received the
B.S. degree in electrical engineering from the University of Engineering and Technology, Taxila,
Taxila, Pakistan, in 2000, the M.S. degree in computer software engineering from National University
of Sciences and Technology, Rawalpindi, Pakistan,
in 2005, the M.S. degree in signal, image, audio, and
telecommunication from the Grenoble Institute of
Technology, Grenoble, France, in 2006, where he is
currently working toward the Ph.D. degree in image
processing in the Grenoble Images Speech Signals
and Automatics Laboratory (GIPSA-Lab), Department of Images and Signals.
From November 2000 to July 2004, he was a Software Engineer and, then,
a Senior Software Engineer with Streaming Networks Pvt. Ltd., Islamabad,
Pakistan. His responsibilities included development of pre- and postprocessing
techniques for video processing and optimization of core MPEG 1, 2, and 4
video-encoding algorithms for the Philips TriMedia processor. His research
interests include image and video processing, remote sensing, pansharpening,
genetic algorithms, and embedded systems.
Mr. Khan is a Reviewer of the IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING and the IEEE GEOSCIENCE AND REMOTE SENSING LETTERS.
Luciano Alparone received the Laurea degree (cum
laude) in electronic engineering and the Ph.D. degree from the University of Florence, Florence, Italy,
in 1985 and 1990, respectively.
During the spring of 2000 and summer of 2001,
he was a Visiting Researcher with the Tampere International Centre for Signal Processing, Tampere,
Finland. Since 2002, he has been an Associate Professor of electrical communications with the Images and Communications Laboratory, Department
of Electronics and Telecommunications, University
of Florence. He has authored or coauthored over 60 papers in peer-reviewed
journals and a total of 300 publications. His research interests are data compression for remote sensing and medical applications, multiresolution image
analysis and processing, multisensor data fusion, and processing and analysis
of SAR images.
Dr. Alparone was the recipient of the 2004 Geoscience and Remote Sensing
Letters Prize Paper Award for his study on “A global quality measurement of
Pan-sharpened multispectral imagery.”
Jocelyn Chanussot (M’04–SM’04) received the
M.Sc. degree in electrical engineering from the
Grenoble Institute of Technology (INPG), Grenoble,
France, in 1995 and the Ph.D. degree from the University of Savoie, Annecy, France, in 1998.
In 1999, he was with the Geography Imagery
Perception Laboratory for the Délégation Générale
de l'Armement (French National Defense Department). Since 1999, he has been with INPG, where
he was an Assistant Professor from 1999 to 2005,
an Associate Professor from 2005 to 2007, and is
currently a Professor of signal and image processing and conducting his
research with the Grenoble Images Speech Signals and Automatics Laboratory
(GIPSA-Lab), Department of Images and Signals. His research interests include
image analysis, multicomponent and hyperspectral image processing, nonlinear
filtering, and data fusion in remote sensing.
Dr. Chanussot is an Associate Editor for the IEEE TRANSACTIONS ON
GEOSCIENCE AND REMOTE SENSING. He was an Associate Editor for the
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS (2005–2007) and for
Pattern Recognition (2006–2008). He is the Chair (2009–2011) and Cochair
(2005–2008) of the GRS Data Fusion Technical Committee and a member
of the Machine Learning for Signal Processing Technical Committee of the
IEEE Signal Processing Society (2006–2008). He is the founding President
of IEEE Geoscience and Remote Sensing French chapter (2007) and a member of the IEEE Geoscience and Remote Sensing Administrative Committee
(2009–2011). He is the General Chair of the first IEEE GRSS Workshop on
Hyperspectral Image and Signal Processing, Evolution in Remote Sensing and
the Program Cochair of the 2009 IEEE International Workshop on Machine
Learning for Signal Processing.