Chapter 2.8 Detecting and Counteracting Atmospheric Effects
Lynne L. Grewe
Computer Science
CSUH
25800 Carlos Bee Blvd
Hayward, CA 94542
grewe@csuhayward.edu
Abstract
Many, if not most, distributed sensor networks operate in outdoor environments where
atmospheric effects can impede and seriously challenge their successful operation. Unfortunately,
most systems are designed to perform only in clear conditions, yet any outdoor system
will encounter adverse atmospheric conditions such as fog, haze, rain, smog, or snow. Even indoor
systems that are meant to operate under critical conditions, such as surveillance, may encounter their
own atmospheric problems like smoke. To succeed, these systems must consider and respond to
these situations. This chapter discusses work in this area.
Atmosphere detection and correction algorithms can be classified into physics-based modeling
approaches and heuristic, non-physics-based approaches. We will discuss these approaches, citing
examples. First, we will discuss the cause of the problem and how different sensors respond to
the presence of atmosphere.
2.8.1 Motivation: the Problem.
With the increasing use of vision systems in uncontrolled environments, reducing the effect of
atmospheric conditions such as fog in images has become a significant problem. Many
applications (e.g., surveillance) rely on accurate images of the objects under scrutiny. Poor
visibility is an issue in aircraft navigation [Huxtable 97, Sweet 96, Oakley 96, Moller 94],
highway monitoring [Arya 96], and commercial and military vehicles [Barducci 95, Pencikowski
96]. Figure 1 shows an example scene from a visible spectrum camera where the presence of fog
severely impedes the ability to recognize objects in the scene.
Before discussing how particular sensors respond to atmospheric conditions and algorithms to
improve the resulting images, let's discuss what is meant by atmosphere.
Figure 1 Building in foggy conditions; the visibility of objects is impaired.
2.8.1.1 What is Atmosphere?
Atmosphere, whether caused by fog, rain, or even smoke, involves the presence of particles in the
air. On a clear day, the particles that make up "air," such as oxygen molecules, are so small that
sensors are not impaired in their capture of the scene elements. However, this is not true in the
presence of atmospheric conditions like fog, because these particles are larger in size and impede
the transmission of light (electromagnetic radiation) from the object to the sensor through
scattering and absorption.
There have been many models proposed to understand how these particles interact with
electromagnetic radiation. These models are referred to as scattering theories. Models consider
parameters such as particle radius, wavelength of light, density of the particle material, shape of
the particles, etc. Selection of the appropriate model is a function of the ratio of the particle radius to
the wavelength of light being considered, as well as what parameters are known.
When this ratio of particle size to wavelength is near one, most theories used are derived from
the "Mie Scattering Theory" [Mie 08]. This theory is the result of solving the Maxwell equations
for the interaction of an electromagnetic wave with a spherical particle. This theory takes into
account absorption and the refractive index (related to the angle by which light is bent as it passes through
the particle).
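
To make the model-selection criterion concrete, the sketch below (not from any cited source) computes the dimensionless size parameter 2πr/λ from an assumed particle radius and wavelength and reports the commonly used regime labels; the numeric cut-off values are illustrative only.

import math

def scattering_regime(particle_radius_m, wavelength_m):
    """Classify the scattering regime from the size parameter x = 2*pi*r/lambda.

    Values of x near one correspond to the Mie regime discussed in the text;
    the exact cut-off values below are illustrative, not definitive.
    """
    x = 2.0 * math.pi * particle_radius_m / wavelength_m
    if x < 0.1:
        return x, "Rayleigh regime (particle much smaller than wavelength)"
    if x < 50.0:
        return x, "Mie regime (particle size comparable to wavelength)"
    return x, "Geometric-optics regime (particle much larger than wavelength)"

# Example: a 1 micrometer fog droplet viewed in green light (~550 nm)
print(scattering_regime(1e-6, 550e-9))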
Other theories exist that do not assume the particle is spherical and that alter the
relationship between size, distribution, reflection, and refraction. The size and shape of these
particles vary. For example, water-based particles range from nearly perfect spheres a few microns
across, in the case of liquid cloud droplets, to large raindrops (up to 9 mm in diameter), which are known to be
highly distorted. Ice particles have a variety of non-spherical shapes. Their size can be a few
millimeters in the case of hail and snow, but also extends down to the regime of cirrus particles, which form
needles or plates of tens to hundreds of micrometers.
Most scattering theories use the following basic formula that relates the incident (incoming)
spectral irradiance, E(), to the outgoing radiance in the direction of the viewer, I(, ) , and the
angular scattering function, (, ).
Figure 2 Demonstration of light attenuating as it travels through the atmosphere (from
[Narasimhan 01]).
I(, ) = (, ) E()
(Eq. 1)
where  is the wavelength of the light and  is the angle the viewer is at with regards to the
normal to the surface of the atmospheric patch the incident light is hitting. What is different
among the various models is (, ).
An observed phenomenon captured by physics-based models is that light energy at most
wavelengths (λ) is attenuated as it travels through the atmosphere. In general, the farther it travels
through the atmosphere, the more the signal is diminished, as shown in Figure 2. This is why
humans can only see objects close to us in heavy fog. The change in
the incoming light is described as follows (from [McCartney 75]):
d E(x,)  E(x,) = -() dx
(Eq. 2)
where x is the travel direction and dx is the distance travelled along that direction. Not that ()
called the total scattering function is simply (, ) integrated over all angles. If we travel from
the starting point of x=0 to a distance d, the following will be the irradiance at d :
E(d,) = Eo() exp(-() d)
where Eo() is the initial irradiance at x=0.
(Eq. 3)
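
As a quick illustration of Eq. 3, the short Python sketch below computes the attenuated irradiance for an assumed scattering coefficient and path length; the numeric values are placeholders, not measured quantities.

import numpy as np

def attenuated_irradiance(E0, beta, d):
    """Eq. 3: irradiance after the light travels a distance d through the atmosphere.

    E0   -- initial spectral irradiance at x = 0 (scalar or per-wavelength array)
    beta -- total scattering function beta(lambda), same shape as E0
    d    -- path length through the atmosphere
    """
    return np.asarray(E0) * np.exp(-np.asarray(beta) * d)

# Illustrative values only: with beta = 0.02 per meter, irradiance falls to about half after ~35 m
print(attenuated_irradiance(1.0, 0.02, 35.0))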
Figure 3 The cone of atmosphere between an observer and an object scatters environmental
illumination in the direction of the observer, thus acting like a light source, called "airlight," whose
brightness increases with path length d. From [Narasimhan 01].
Many of the scattering theories treat each particle as independent of other particles because their
separation distance is many times greater than their own diameter. However, multiple
scatterings of the original irradiance, from one particle to the next, do occur. This phenomenon, when
produced by environmental illumination such as direct sunlight, was coined "airlight" by
[Koschmieder 24]. In this case, unlike the previously described attenuation phenomenon, there
is an increase in the radiation as the particles inter-scatter the energy. Humans experience this
as foggy areas appearing "white" or "bright". Figure 3 from [Narasimhan 01] illustrates this process.
In [Narasimhan 01], the authors develop a model that integrates the scattering equation of Eq. 1 with
the "airlight" phenomenon using an overcast environmental illumination model, which is appropriate for most
foggy days.
The reader is referred to [Curry 02], [Day 98], [Kyle 91] and [Bohren 89] for details on specific
atmospheric scattering theories and their equations.
2.8.2 Sensor Specific Issues
Distributed sensor networks employ many kinds of sensors. Each kind of sensor responds
differently to the presence of atmosphere. While some sensors operate much better than others
Figure 4 Different sensor images of the same scene, an airport runway. (a) Visible spectrum.
(b) Infrared. (c) Millimeter wave (MMW).
in such conditions, in most cases there will still be some degradation to the image. We
will highlight a few types of sensors and discuss how they are affected.
The sensors used in a sensor network should be selected according to the system objectives,
including the environmental operating conditions. An interesting work comparing the response of visible,
near-infrared, and thermal-infrared sensors in terms of target detection is discussed in [Sadot 95].
2.8.2.1 Visible Spectrum Cameras
Visual-spectrum, photometric images are very susceptible to atmospheric effects. As shown in
Figure 1 the presence of atmosphere causes the image quality to degrade. Specifically, images
can appear fuzzy, out of focus, objects may be obscured behind the visual blanket of the
atmosphere, and other artifacts may occur. As discussed in the previous section, in fog and
haze, images appear brighter, the atmosphere itself often being near-white, and farther scene
objects are not visible or appear fuzzy.
Figure 4 shows a visible spectrum image and the corresponding Infrared and Millimeter wave
images of the same scene. Figure 5 shows another visible-spectrum image in foggy conditions.
2.8.2.2 Infrared Sensors
Infrared (IR) light can be absorbed by water-based molecules in the atmosphere, but imaging
using infrared sensors yields better results than visible spectrum cameras. This is demonstrated
in Figure 5. As a consequence, IR sensors have been used for imaging in smoke and adverse
atmospheric conditions. For example, infrared cameras are used by firefighters to find people
and animals in smoke-filled buildings. IR, as discussed in the Image Processing chapter, is often
associated with thermal imaging and is used in night vision systems. Of course, some "cold"
objects, which do not emit IR energy, cannot be sensed by an IR sensor; hence, in
comparison to a visible spectrum image of the scene, an IR image may lack information or desired detail as
well as introduce unwanted detail. This lack of detail can be observed by comparing Figure 5(a)
and (c) and noting that the IR image would not change much in clear conditions. Figure 11 below
also shows another pair of IR and visible spectrum images.
Figure 5 Infrared is better at imaging through fog than a visible spectrum camera ((a)-(c) from
[Hosgood 03], (d) from [Nasa 03]). (a) Visible spectrum image (color) on a clear day. (b)
Grayscale visible spectrum image on a foggy day. (c) Infrared image on the same foggy day. (d)
Visible and IR images of a runway.
2.8.2.3 MMW Radar Sensors
Millimeter wave (MMW) radar is another good sensor for imaging in atmospheric conditions. One
reason is that, compared to many sensors, it can sense at a relatively long range. MMW radar
works by emitting a beam of electromagnetic waves. The beam is scanned over the scene and
the reflected intensity of radiation is recorded as a function of return time. The return time is
correlated with range, and this information is used to create a range image.
In MMW radar, the frequency of the beam is such that atmospheric particles are too small to affect it,
allowing the beam to pass by them in a way many other sensors cannot. MMW radar can thus
"penetrate" atmospheric layers like smoke, fog, etc. However, because of this frequency, the
resolution of the image produced is poorer than that of many other sensors, as evidenced by
Figure 4(c), which shows the MMW radar image alongside the photometric and IR
images of the same scene. In addition, MMW radar measures range, not reflected color or other parts
of the spectrum that may be required for the task at hand.
2.8.2.4 LADAR Sensors
Another kind of radar system is laser radar, or LADAR. LADAR sensors use shorter
wavelengths than other radar systems (e.g., MMW) and can thus achieve better resolution.
Depending on the application, this resolution may be a requirement. A LADAR sensor sends out
a laser beam, and the time of return of the reflection measures the distance to the scene point.
By scanning the scene, a range image is created. As the laser beam propagates through
the atmosphere, fog droplets and raindrops cause image degradation, which is manifested as
either dropouts (meaning not enough reflection is returned to be registered) or false returns (the
beam is returned by the atmospheric particle itself). [Campbell 98] studies this phenomenon
for various weather conditions, yielding a number of performance plots. One conclusion was
that for false returns to occur, the atmospheric moisture (rain) droplets had to be at least 3 mm in
diameter. This work was done with a 1.06 micrometer wavelength LADAR sensor. Results
would change with changing wavelength. The produced performance plots can be used as
thresholds in determining the predicted performance of the system given current atmospheric
conditions and could potentially be used to select image processing algorithms to improve image
quality.
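
As a sketch of how such performance plots could drive a simple run-time check, the code below flags predicted false returns using the 3 mm droplet-diameter figure reported by [Campbell 98] for the 1.06 micrometer sensor; the function name and interface are hypothetical, and other wavelengths would need their own thresholds.

def ladar_false_return_risk(droplet_diameter_mm, wavelength_um=1.06):
    """Flag whether rain droplets are large enough to cause false LADAR returns.

    The 3 mm threshold is the value reported by [Campbell 98] for a 1.06
    micrometer LADAR; it is not valid for other wavelengths.
    """
    if abs(wavelength_um - 1.06) > 1e-6:
        raise ValueError("threshold only known for the 1.06 micrometer sensor studied")
    return droplet_diameter_mm >= 3.0

print(ladar_false_return_risk(2.0))   # False: droplets too small to produce false returns
print(ladar_false_return_risk(4.5))   # True: false returns predicted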
2.8.2.5 Multispectral Sensors
Satellite imagery systems often encounter problems with analysis and clarity of original images
due to atmospheric conditions. Satellites must penetrate through many layers of the Earth's
atmosphere in order to view ground or near-ground scenes. Hence, this problem has been given
great attention in research. Most of this work involves various remote sensing applications like
terrain mapping, vegetation monitoring, and weather prediction and monitoring systems.
However, with high spatial resolution satellites, other less "remote" applications, like surveillance
and detailed mapping, can also suffer from atmospheric effects.
2.8.3 Physics-based Solutions
As discussed earlier, physics-based algorithms form one of the two classifications into which atmosphere
detection and correction systems can be grouped. In this section, we describe a few of the
systems that use physics-based models. Most of these systems consider only one kind of
sensor; hence, in heterogeneous distributed sensor networks one would have to apply the
appropriate technique for each different kind of sensor used. The reader should read section
2.8.1.1 for a discussion of atmosphere and physics-based scattering before reading this section.
Narasimhan and Nayar [Narasimhan 01] discuss in detail the issue of Atmosphere and visible
spectrum images. In this work, they develop a Dichromatic Atmosphere Scattering model that
tracks how images are affected by atmosphere particles as a function of color, distance, and
environmental lighting interactions ("airlight" phenomenon, see section 2.8.1.1). In general,
when particle sizes are comparable to the wavelength of the reflected object light, the light transmitted
through the particle will have its spectral composition altered. They argue that for fog and
dense haze the shift in spectral composition across the visible spectrum is minimal, and hence
they assume the hue of the transmitted light to be independent of the depth of the
atmosphere. They postulate that with an overcast sky the hue of "airlight" depends on the
particle size distribution and tends to be gray or light blue in the case of haze and fog. Recall
"airlight" is the transmitted light that originated directly from environmental light like direct
sunlight. Their model for the spectral distribution of light received by the observer is the sum of
the distribution of the scene object's reflected light, taking into account the attenuation
phenomenon (see section 2.8.1.1), and the airlight. It is similar to the dichromatic reflectance model
in [Shafer 85] that describes the spectral effects of diffuse and specular surface reflections.
[Narasimhan 01] reduce their wavelength-based equations to the red, green, and blue (RGB) color
space and, in doing so, show that the equation relating the received color remains a linear
combination of the scene object's transmitted light color and the environmental "airlight" color.
[Narasimhan 01] hypothesize that a simple way of reducing the effect of atmosphere on an image
would be to subtract the "airlight" color component from the image.
Figure 6 [Narasimhan 01] fog removal system. (a) and (b) Foggy images taken on an overcast day. (c)
Defogged image. (d) Image taken on clear day under partly cloudy sky.
Hence, [Narasimhan 01] discuss how to measure the "airlight" color component. A color
component (like "airlight") is described by a color unit vector and its magnitude. The color unit
vector for "airlight" can be estimated as the unit vector of the average color in an area of the
image that should be registered as black. However, this kind of calibration may not be possible,
and they discuss a method of computing it using all of the color pixel values in an image,
formulated as an optimization problem.
Using the estimate of the "airlight" color component's unit vector, and given the magnitude of
this component at just one point in an image, as well as two perfectly registered images of the
scene, [Narasimhan 01] are able to calculate the "airlight" component at each pixel. Subtracting
the "airlight" color component at each pixel value yields an atmosphere corrected image. It is
also sufficient to know the true transmission color component at one point to perform this
process. Figure 6 shows the results of this process. This system requires multiple registered
images of the scene as well as true color information at one pixel in the image. If this is not
possible, a different technique should be applied. Obviously this technique works for visible-spectrum
images and under the assumption that the atmospheric condition does not (significantly)
alter the color of the received light.
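
A minimal sketch of the final correction step is given below, assuming the airlight color unit vector and a per-pixel magnitude map have already been estimated (the estimation itself, via registered image pairs or the optimization in [Narasimhan 01], is the harder part and is not reproduced here); the names and example values are illustrative.

import numpy as np

def remove_airlight(image_rgb, airlight_unit, airlight_magnitude):
    """Subtract the per-pixel airlight color component from an RGB image.

    image_rgb          -- H x W x 3 float array (linear RGB)
    airlight_unit      -- length-3 unit vector giving the airlight color direction
    airlight_magnitude -- H x W array of airlight magnitudes (assumed already estimated)
    """
    airlight = airlight_magnitude[..., None] * np.asarray(airlight_unit)[None, None, :]
    return np.clip(image_rgb - airlight, 0.0, None)

# Hypothetical usage with a grayish airlight direction (unit-length vector):
# corrected = remove_airlight(foggy_image, np.array([0.577, 0.577, 0.577]), magnitude_map)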
In [Richter 98] a method for correcting satellite imagery taken over mountainous terrain has been
developed to remove atmospheric and topographic effects. The algorithm accounts for
horizontally varying atmospheric conditions and also includes the height dependence of the
atmospheric radiance and transmittance functions to simulate the simplified properties of a three-dimensional
atmosphere. A database was compiled that contains the results of radiative transfer
calculations for a wide range of weather conditions. A digital elevation model is used to obtain
information about the surface elevation, slope and orientation. Based on Lambertian
assumptions the surface reflectance in rugged terrain is calculated for the specified atmospheric
conditions. Regions with extreme illumination geometries sensitive to BRDF effects can be
processed separately. This method works for high spatial resolution satellite sensor data with
small swath angles.
Figure 7 shows the results of the method in [Richter 98] on satellite imagery.
Figure 7 Images from [Richter 98], a system to model and remove atmospheric and
topographic effects. (a) DEM (digital elevation model). (b) Sky view factor. (c) Illumination
image. (d) Original TM band 4 image. (e) Reflectance image without processing of low
illumination areas. (f) Reflectance image with processing of low illumination areas.
2.8.4 Heuristic and Non-Physics-based Solutions
Heuristic and non-physics-based approaches represent the other paradigm of atmosphere
detection and correction algorithms. These algorithms do not attempt to directly model the
physics behind the atmosphere. There is a wide range of algorithms: some based on empirical
data, others on observations, and others that alter the sensing system itself.
Work has been done to specifically detect image areas where various atmospheric conditions are
present with the idea that further processing to enhance the image in these areas could then be
done. In particular, there is a body of research on cloud detection. After detection, some of
these systems attempt to eliminate the clouds while most use this information for weather
analysis.
Table 1: MTI bands
In [Rohde 01] a system is discussed to detect dense clouds in daytime Multispectral Thermal
Imager (MTI) Satellite images. The [Rohde 01] system uses 15 spectral bands (images) shown
in Table 1 that range from visible wavelengths to longwave infrared. Rohde, et al. hypothesize
that clouds in the visible wavelength spectrum are bright and evenly reflect much of the
wavelengths from visible to near-infrared. Also, clouds appear higher in the atmosphere than
most image features and are, hence, colder and drier than other features. Recall that the IR
spectrum is a measure of temperature. Also, for the spatial resolution of their images, clouds
cover large areas of the image. Given these characteristics, they have come up with a number of
parameters that can be used to classify pixels as belonging to a cloud or not.
First, the pixels are subjected to a thresholding technique in both the visible and IR ranges that
thresholds on the "clouds are bright" and "even-reflection" properties. A minimum brightness
value is selected, and in the visible range (band C) any pixel above this brightness value is
retained as a potential cloud pixel; all others are rejected. In the infrared band N, an upper limit
on temperature is given and used to reject pixels as potential cloud pixels.
Next, the "whiteness" property is tested by using a ratio of the difference of the E and C bands to
their sum. Evenly reflected or "white" pixels will have a ratio around zero. Only pixels with
ratios near zero will be retained as potential cloud pixels. The last thresholding operation is done
using CIBR, which stands for Continuum Interpolated Band Ratio (see [Gao 90] for details).
CIBR can be used as a measure of the "wetness" (and thus "dryness") in the Infrared spectrum.
CIBR is a ratio of band F to a linear combination of bands E and G.
At this point the system has a set of pixels that conform to being "bright", "white", "cold", and
"dry". What is left is to group nearby pixels and test whether they form regions large enough to
indicate a cloud. This is accomplished through the removal of blobs too small to indicate clouds
via a morphological opening operator (see the chapter on Image Processing Background). The final
result is a binary "cloud map" where the non-zero pixels represent cloud pixels.
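
The sketch below strings together the thresholding steps just described, for illustration only; the band names follow the text, but all threshold values, the CIBR weights, and the sense of the "dryness" test are placeholders rather than the tuned MTI parameters of [Rohde 01].

import numpy as np
from scipy.ndimage import binary_opening

def cloud_map(band_c, band_e, band_f, band_g, band_n,
              min_bright=0.3, max_temp=280.0, white_tol=0.05,
              dry_ratio=0.9, opening_size=5):
    """Illustrative dense-cloud test chain in the spirit of [Rohde 01]."""
    bright = band_c > min_bright                              # clouds are bright
    cold = band_n < max_temp                                  # clouds are cold (high in the atmosphere)
    whiteness = (band_e - band_c) / (band_e + band_c + 1e-9)  # "even reflection" ratio
    white = np.abs(whiteness) < white_tol
    cibr = band_f / (0.5 * band_e + 0.5 * band_g + 1e-9)      # illustrative CIBR weights
    dry = cibr > dry_ratio                                    # direction of the "dryness" test is illustrative
    candidate = bright & cold & white & dry
    # Remove blobs too small to indicate clouds via morphological opening
    return binary_opening(candidate, structure=np.ones((opening_size, opening_size), dtype=bool))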
Figure 8 Results of the fusion system for atmosphere correction [Grewe 98]. (a) One of two
original foggy images, visible spectrum. (b) Fused image from two foggy images. (c) Blow-up
of a portion of an original foggy image. (d) Corresponding region of the fused image for image (c).
The system seems to work well on the presented images, but its performance is tuned to the spatial resolution
of the images and is not extensible to non-MTI satellite imagery. However, combinations of
photometric images with IR camera images could use these techniques, and there do exist a
number of sensor networks with these two kinds of imaging devices.
[Grewe 01] and [Grewe 98] describe the creation of a system for correction of images in
atmospheric conditions. [Grewe 98] describes a system that uses Multi-Image Fusion to reduce
atmospheric attenuation in visible spectrum images. The premise is that atmospheric conditions
like fog, rain, and smoke are transient. Airflow moves the atmosphere particles in time such that
at one moment certain areas of a scene are clearly imaged while at other moments other areas are
visible. By fusing multiple images taken at different times, we may be able to improve the
quality. Figure 8 shows typical results of the system when only two images are fused. It was
discovered that the fusion engine parameters are a function of the type and level of atmosphere in
the scene. Hence, this system first detects the level and type of atmospheric conditions in the
image ([Grewe 01]) and this is used to select which fusion engine to apply to the image set. The
system uses a Wavelet Transform to both detect atmospheric conditions as well as fuse multiple
images of the same scene to reduce these effects. A neural network is trained to detect the level
and type of atmosphere using a combination of wavelet and spatial features like brightness, focus
and scale-changes. Typical results are shown in Figure 9.
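
A generic sketch of wavelet-domain fusion of two registered foggy images is shown below; it is not the published fusion engine (which is selected according to the detected atmosphere type and level), and the wavelet choice and fusion rule are assumptions.

import numpy as np
import pywt  # PyWavelets

def fuse_two_images(img_a, img_b, wavelet="db2", level=3):
    """Fuse two registered grayscale images in the wavelet domain.

    The coarse approximation is averaged and, at each detail position, the
    coefficient with the larger magnitude is kept, on the assumption that it
    corresponds to the more clearly imaged view of that region.
    """
    coeffs_a = pywt.wavedec2(img_a, wavelet, level=level)
    coeffs_b = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [(coeffs_a[0] + coeffs_b[0]) / 2.0]
    for details_a, details_b in zip(coeffs_a[1:], coeffs_b[1:]):
        fused.append(tuple(np.where(np.abs(da) >= np.abs(db), da, db)
                           for da, db in zip(details_a, details_b)))
    return pywt.waverec2(fused, wavelet)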
In [Honda 02] a system is developed to detect moving objects in a time-series sequence of
images, which is applied to clouds. Again, the fact that clouds move through time is taken advantage
of. Here a motion detection algorithm is applied. This work only looks at extracting the
clouds for further processing for weather applications; no correction phase takes place.
There are also systems that use sensor selection to avoid atmospheric distortions. Sweet and
Tiana [Sweet 96] discuss a system in which the sensors are selected to penetrate obscuring visual
phenomena such as fog, snow, and smoke for the application of enabling aircraft landings in low-visibility
conditions. In particular, the use of infrared (IR) and millimeter wave (MMW) imaging
radar is investigated.
Figure 10 shows the components of the [Sweet 96] system and Figure 11 shows some images
from that system. Both the IR and MMW radar images go through a pre-processing stage
Figure 9 Detection of level and type of atmospheric conditions [Grewe 01]. (a) Original image.
(b) Application showing pseudo-colored detection image: blue = no fog, green = light fog, pink =
fog. (c) Original image with cloud cover. (d) Superimposed pseudo-color detection map where
green = heavy atmosphere (cloud), grey = no atmosphere.
followed by registration and fusion. The pilot, through a Head-Up Display, is able to select the
IR, MMW, or fused images to view for assistance in landing.
[Burt 93] discusses a system similar to that of [Sweet 96], but MMW images are not used in the fusion
process.
Another possible solution to the atmospheric correction problem is to treat it more generically as
an image restoration problem. In this case, there is some similarity with blurred images and
noisy images. There are a number of techniques, some discussed in the Chapter on Image
Processing, like de-blurring or sharpening filters that could improve image quality. See also
[Banham 96] and [Kundur 96] for algorithms for de-blurring images.
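
As one generic example of such restoration, the sketch below applies a standard unsharp mask; it is not the wavelet restoration of [Banham 96] or the blind deconvolution of [Kundur 96], only an illustration that ordinary sharpening tools can be tried on atmosphere-degraded images.

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Sharpen an image by adding back a scaled difference from its blurred version."""
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    return np.clip(image.astype(float) + amount * (image.astype(float) - blurred), 0.0, 255.0)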
2.8.5 Conclusion
Unless you can effectively model the atmosphere, a very challenging problem, heuristic and
non-physics-based solutions may be the only viable option for a distributed sensor network.
Even in the case of accurate modeling, the model would need to be dynamic and adapt to
temporal changes in the atmosphere.
Very little work has been done in comparing techniques. However, in [Nikolakopoulos 02] the
authors compare two algorithms, one that explicitly models the atmosphere using a number of
environmental parameters and another that uses a heuristic approach. The tests were done on
multispectral data, and they found superior results for the model-based technique. Unfortunately,
this work only compares two very specific algorithms, and the heuristic approach is simplistic, as
it involves only histogram shifting. So, this should not be taken as an indication that model-based
techniques are superior. If the assumptions and measurements made in the model are
accurate, it is reasonable to assume that model-based techniques will give good results. The
information you have, the environments in which you wish your system to operate, and finally empirical
testing are what is required to select the best atmospheric correction technique for your system.
Figure 10 Components of a sensor system using IR and MMW radar to aid in landing aircraft
in low visibility conditions like fog. From [Sweet 96].
Figure 11 Images from [Sweet 96] (upper left) Visible Spectrum Image. (upper right) Infrared
image. (lower) Fused Infrared and visible spectrum images.
References
[Arya 96] V. Arya, P. Duncan, M. Devries, R. Claus, "Optical Fiber Sensors for Monitoring Visibility", SPIE
Transportation Sensors and Controls: Collision Avoidance, Traffic Management, and ITS, pp. 212-218, Nov. 1996.
[Banham 96] M. Banham, A. Katasaggelos, "Spatially Adaptive Wavelet-Based Multiscale Image Restoration",
IEEE Transactions on Image Processing, Vol 5, No. 4, pp. 619-634, 1996.
[Barducci 95] A. Barducci, I. Pippi, “Retrieval of Atmospheric Parameters from Hyperspectral Image Data”,
International Geoscience and Remote Sensing Symposium, pp. 138-140, July 1995.
[Bohren 89] C. Bohren, Selected Papers on Scattering in the Atmosphere, SPIE, 1989.
[Brooks 01] R. Brooks, L. Grewe and S. Iyengar, "Recognition in the Wavelet Domain", Journal of Electronic
Imaging, July 2001.
[Brooks 98] R. R. Brooks and S. S. Iyengar, Multi-Sensor Fusion: Fundamentals and Applications with Software,
Prentice Hall PTR, Saddle River, NJ, 1998.
[Burt 93] P. Burt, R. Kolezynski, "Enhanced Image Capture through Fusion", IEEE 4th International Conference on
Computer Vision, pp. 173-182, 1993.
[Campbell 98] K. Campbell, Performance of Imaging Laser Radar in Rain and Fog, Master's Thesis, Wright-Patterson AFB, 1998.
[Curry 02] J. Curry, J. Pyle, J. Holton, Encyclopedia of Atmospheric Sciences, Academic Press, 2002.
[Day 98] J. Day, V. Schaefer, C. Day, A Field Guide to the Atmosphere, Houghton Mifflin Co, 1998.
[Gao 90] B. Gao, A. Goetz, "Column Atmospheric Water Vapor and Vegetation Liquid Water Retrievals from
Airborne Imaging Spectrometer Data," J. Geophys. Res., Vol. 95, pp. 3549-3564, 1990.
[Grewe 01] L. Grewe, "Detection of Atmospheric Conditions in Images", SPIE AeroSense: Signal Processing,
Sensor Fusion and Target Recognition, April 2001.
[Grewe 98] L. Grewe, R. Brooks, K. Scott, "Atmospheric Attenuation through Multi-Sensor Fusion", SPIE
AeroSense: Sensor Fusion: Architectures, Algorithms, and Applications II, April 1998.
[Honda 02] R. Honda, S. Wang, T. Kikuchi, O. Konishi, "Mining of Moving Objects from Time-Series Images and
its Application to Satellite Weather Imagery", Journal of Intelligent Information Systems, Vol. 19:1, pp. 79-93,
2002.
[Hosgood 03] B. Hosgood, "Some examples of Thermal Infrared Applications", IPSC website, http://humanitariansecurity.jrc.it/demining/infrared_files/IR_show4/sld006.htm, 2002.
[Huxtable 97] B. Huxtable, et. al. “A synthetic aperture radar processing system for Search and Rescue”, SPIE
Automatic Target Recognition VII, pp. 185-192, April 1997.
[Koschmieder 24] H. Koschmieder, "Theorie der horizontalen Sichtweite," Beitr. Phys. Freien Atm., Vol. 12, pp. 33-53,
171-181, 1924.
[Kundur 96] D. Kundur, D. Hatzinakos, "Blind Image Deconvolution", IEEE Signal Processing Magazine, Vol
13(3), pp. 43-64, 1996.
15
[Kyle 91] T. Kyle, Atmospheric Transmission, Emission and Scattering, Pergamon Press, 1991.
[McCartney 75] E. McCartney, Optics of the Atmosphere: Scattering by Molecules and Particles, John Wiley and
Sons, NY, 1975.
[Mie 08] G. Mie, "A contribution to the optics of turbid media, especially colloidal metallic suspensions", Ann. Of
Physics, Vol 25(4), pp. 377-445, 1908.
[Moller 94] H. Moller, G. Sachs, “Synthetic Vision for Enhancing Poor Visibility Flight Operations”, IEEE AES
Systems Magazine, pp. 27-33, March 1994.
[Oakley 96] J. Oakley, B. Satherley, C. Harrison, C. Xydeas, "Enhancement of Image Sequences from a Forward-Looking Airborne Camera", SPIE Image and Video Processing IV, pp. 266-276, Feb. 1996.
[Nasa 03] IPAC, NASA, "What is Infrared?", http://sirtf.caltech.edu/EPO/Kidszone/infrared.html, 2003.
[Narasimhan 01] S. Narasimhan, S. Nayar, "Vision and the Atmosphere", International Journal of Computer Vision,
48(3), pp. 233-254, 2002.
[Nikolakopoulos 02] K. Nikolakopoulos, D. Vaiopoulos, G. Skiani, "A Comparative Study of Different
Atmospheric Correction Algorithms over an Area with Complex Geomorphology in Western Peloponnese, Greece",
IEEE International Geoscience and Remote Sensing Symposium Proceedings, Vol 4, pp. 2492-2494, 2002.
[Pencikowski 96] P. Pencikowski, "A Low Cost Vehicle-Mounted Enhanced Vision System Comprised of a Laser
Illuminator and Range-Gated Camera”, SPIE Enhanced and Synthetic Vision, pp. 222-227, April 1996.
[Richter 98] R. Richter, "Correction of Satellite Imagery over Mountainous Terrain", Applied Optics, Vol. 37, No.
18, pp. 4004-4015, 1998.
[Rohde 01] C. Rohde, K. Hirsch, A. Davis, "Performance of the Interactive Procedures for Daytime Detection of
Dense Clouds in the MTI pipeline", Algorithms for Multispectral, Hyperspectral, and Ultraspectral Imagery VII,
SPIE Vol. 4381, pp. 204-212, 2001.
[Sadot 95] D. Sadot, N. Kopeika, S. Rotman, "Target acquisition modeling for contrast-limited imaging: effects of
atmospheric Blur and Image Restoration", J. Opt. Soc. Am. Vol 12, No.11, pp.2401-2414, 1995.
[Shafer 85] S. Shafer, "Using color to separate reflection components", Color Research and Applications, pp. 210-218, 1985.
[Sweet 96] B. Sweet, C. Tiana, "Image Processing and Fusion for Landing Guidance", SPIE Vol. 2736, pp. 84-95,
1996.