
002. FUNDAMENTALS OF REMOTE SENSING (505)

CHAPTER – 1
BASICS OF RS
(Dr. Mohd. Shamsul Alam)
* Wavelength and frequency are crucial subjects in RS.
* Scattering: aerosols, dust, diatoms - everything contributes to scattering.
* Atmospheric window: solar energy can reach the surface and return to the sensor through these windows.
* Only 4% to 6% of solar energy (owing to atmospheric absorption of incoming and outgoing radiation) is available for
satellite remote sensing, and only via atmospheric windows.
** Please refer: http://www.fis.uni-bonn.de/en/recherchetools/infobox/professionals/what-remote-sensing
1. What is remote sensing?
Remote sensing can be defined as the acquisition and recording of information about an object
without being in direct contact with that object. Thus, our eyes and ears are remote sensors, and the
same is true for cameras and microphones and for many instruments used for all kinds of
applications.
According to Canada Centre for Remote Sensing (CCRS): "Remote sensing is the science of
acquiring information about the Earth's surface without actually being in contact with it. This is
done by sensing and recording reflected or emitted energy and processing, analyzing, and applying
that information."
Remote sensing images are obtained at distances that fall within three broad ranges:
1. Sensors carried by aircraft generally obtain images at heights of 500 m to 20 km.
2. Sensors carried by spacecraft and satellites operate at distances of 250 km to 1,000 km.
3. Very high-altitude satellites operate 36,000 km above the Earth. These are geostationary satellites.
2. Draw a graphic sketch to show the essential components of remote sensing. Or:
Explain the remote sensing process from beginning to end. Or: Explain the
principles and processes included in remote sensing.
Remote Sensing process is exemplified by the use of imaging systems where the following seven
elements are involved from beginning to end.
i) Energy Source or Illumination (A)
ii) Radiation and the Atmosphere (B)
iii) Interaction with the Target (C)
iv) Recording of Energy by the Sensor (D)
v) Transmission, Reception, and Processing (E)
vi) Interpretation and Analysis (F)
vii) Application (G)
Engr. Md. Shahidul Islam, +8801713062224, shd@globalsources.com.bd, shd2051@gmail.com , www.globalsources.com.bd
Principles and Processes included in Remote Sensing
Remote Sensing Process is exemplified by the use of imaging systems where the following seven
elements are involved.
1. Energy Source or Illumination (A) - the
first requirement for remote sensing is to have an energy source which illuminates or provides
electromagnetic energy to the target of interest.
2. Radiation and the Atmosphere (B) - as
the energy travels from its source to the target, it will come in contact with and interact with the
atmosphere it passes through. This interaction may take place a second time as the energy travels
from the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target through the
atmosphere, it interacts with the target depending on the properties of both the target and the
radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted
from the target, we require a sensor (remote - not in contact with the target) to collect and record
the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be
transmitted, often in electronic form, to a receiving and processing station where the data are
processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or
electronically, to extract information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when we apply the
information we have been able to extract from the imagery about the target in order to better
understand it, reveal some new information, or assist in solving a particular problem. These seven
elements comprise the remote sensing process from beginning to end.
3. What are the major advantages of remote sensing in environmental
monitoring?
Following are the major advantages of remote sensing in environmental monitoring:
i) Environmental and Nature analysis and assessments, inventories, monitoring and research,
ii) Landscape architecture analysis, planning and design,
iii) Environmental modeling (climatological, hydrological and hydro dynamical), geo-statistics,
spatio-temporal modeling and GIS (including analysis of data gathered by remote sensing),
iv) Natural resource management (including spatial planning, agriculture, forestry and game
management).
v) Remote Sensing applications to the:
- Mining environment,
- Urban environment management,
- Coastal and marine environment,
- Wasteland environment, etc.
- Environmental impact assessment,
- Impacts caused by urban development,
- Mining, and changes that appeared due to human or natural factors,
- RS technology can help support the management of water, land, aquatic ecosystems and biodiversity.
Table 1: Application of Remote Sensing in Environmental Studies
Atmospheric Parameters: Aerosol; Fog; Black Carbon; Dust Storm; Clouds Optical Properties; Ozone and other trace gases
Hydrological Parameters: Water Quality; Soil Moisture; Sea Surface Temperature; Snow Cover
Natural and Man-made Hazards: Floods, Tsunami, Earthquakes, Landslides (Mapping and Risk Assessment); Droughts; Epidemic Mapping; Forest Fires
Land Use Planning: Land use/Land cover changes; Urban Planning; Urban Heat Islands; Agriculture; Forests (land & coastal); Coastal zone monitoring
Environmental Protection: Environmental Impact Assessment
4. Explain two utility point of view of RS application.
- Highway departments produce topographic maps using satellite imagery and terrain analysis
for trafficability of the highways.
- Light detection and ranging (LiDAR) is well known from examples such as weapon ranging and laser-illuminated
homing of projectiles. LiDAR is used to detect and measure the concentration of
various chemicals in the atmosphere, while airborne LiDAR can be used to measure heights
of objects and features on the ground more accurately than with radar technology. Vegetation
remote sensing is a principal application of LiDAR.
- Hyperspectral imaging produces an image where each pixel has full spectral information,
imaging narrow spectral bands over a contiguous spectral range. Hyperspectral
imagers are used in various applications including mineralogy, biology, defence, and
environmental measurements.
5. Describe the chronological development of remote sensing technology. (Fall
2016)
Brief History of Remote Sensing i.e. chronological development of remote sensing technology:
1826 The invention of photography
1850’s Photography from balloons
1873 Theory of electromagnetic energy by J. C. Maxwell
1909 Photography from airplanes
1910’s World War I: aerial reconnaissance
1920’s Development and applications of aerial photography and photogrammetry
1930’s Development of radar in Germany, USA, and UK
1940’s World War II: application of Infrared and microwave regions
1950’s Military Research and Development
1960’s The satellite era: space race between the USA and USSR.
1960 The first meteorological satellite (TIROS-1)
1960’s First use of term “remote sensing”
1972 Launch of the first earth resource satellite (Landsat-1)
1973 Skylab remote sensing observations from space
1970’s Rapid advances in digital image processing
1980’s Landsat-4: new generation of Landsat sensors
1986 Launch of French earth observation satellite (SPOT-1)
1980’s Development of hyperspectral sensors
1990’s Launch of earth resource satellites by national space agencies and commercial companies
6. What is EMR? (Spring 2018)
EMR (Electromagnetic Radiation):
The first requirement for remote sensing is to have an energy source to illuminate the target
(unless the sensed energy is being emitted by the target). This energy is in the form of
electromagnetic radiation.
Electromagnetic radiation is a form of energy emitted by all matter above absolute zero
temperature (0 Kelvin or -273° Celsius). X-rays, ultraviolet rays, visible light, infrared light, heat,
microwaves, and radio and television waves are all examples of electromagnetic energy.
7. How EM waves propagate? (Spring 2018)
EM waves propagate:
Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a
direction perpendicular to the direction in which the radiation is traveling, and a magnetic field
(M) oriented at right angles to the electrical field. Both these fields travel at the speed of light (c).
Two characteristics of electromagnetic radiation are particularly important for understanding
remote sensing. These are the wave length and frequency.
Fig.: Electromagnetic waves - electric field (E), magnetic field (M), travelling at the speed of light (c).
The wavelength is the length of one wave cycle, which can be measured as the distance
between successive wave crests. Wavelength is usually represented by the Greek letter
lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as
nanometres (nm, 10^-9 metres), micrometres (µm, 10^-6 metres) or centimetres (cm, 10^-2
metres). Frequency (f) refers to the number of cycles of a wave passing a fixed point per unit
of time. Frequency is normally measured in hertz (Hz), equivalent to one cycle per second,
and various multiples of hertz.
Wavelength and frequency are related by the following formula:
c = λ × f
Where:
λ = Wavelength (m)
f = Frequency (cycles per second, Hz)
c = Speed of light (3 × 10^8 m/s)
Therefore, the two are inversely related to each other. The shorter the wavelength, the higher
the frequency. The longer the wavelength, the lower the frequency. Understanding the
characteristics of electromagnetic radiation in terms of their wavelength and frequency is
crucial to understanding the information to be extracted from remote sensing data.
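The relation c = λ × f above can be sketched in code (a minimal illustration; the function names are my own):

```python
# Sketch: relating wavelength and frequency via c = lambda * f.

SPEED_OF_LIGHT = 3.0e8  # m/s (approximate value used in the notes)

def frequency_from_wavelength(wavelength_m: float) -> float:
    """Return frequency in Hz for a wavelength in metres."""
    return SPEED_OF_LIGHT / wavelength_m

def wavelength_from_frequency(frequency_hz: float) -> float:
    """Return wavelength in metres for a frequency in Hz."""
    return SPEED_OF_LIGHT / frequency_hz

# Green light at 0.55 micrometres:
f_green = frequency_from_wavelength(0.55e-6)
print(f"{f_green:.3e} Hz")  # roughly 5.5e14 Hz
```

The inverse relationship is visible directly: halving the wavelength doubles the frequency.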
8. What is Electromagnetic spectrum (EMS)?
The Electromagnetic Spectrum and Bands: The electromagnetic spectrum is the range of all wavelengths of
electromagnetic radiation. It is a continuum of energy from short-wavelength, high-frequency cosmic
waves to long-wavelength, low-frequency radio waves.
Our eyes are sensitive to the visible part of the electromagnetic spectrum, 0.4 µm to 0.7 µm. Within
the visible spectrum our eyes can see different colours, which are variations in wavelength.
The EM spectrum can be divided into seven different regions --- gamma rays, X-rays,
ultraviolet, visible light, infrared, microwaves and radio waves.
9. Describe absorption in the atmosphere and atmospheric windows
Electromagnetic radiation is reflected or absorbed mainly by several gases in the Earth's atmosphere,
among the most important being water vapor, carbon dioxide, and ozone. Some radiation, such as
visible light, largely passes (is transmitted) through the atmosphere. These regions of the spectrum
with wavelengths that can pass through the atmosphere are referred to as "atmospheric windows."
Some microwaves can even pass through clouds, which makes them the best wavelengths for
transmitting satellite communication signals.
(a) Absorption characteristics of gaseous components of atmosphere.
(b) Transmission through the atmosphere as a function of wavelength and the location of atmospheric windows.
Absorption in the Atmosphere
Although electromagnetic radiation of all wavelengths emitted by the Sun reaches the top of the
atmosphere, only radiation within specific wavebands can pass through the atmosphere to reach the
surface of the Earth. This is because the gaseous components of the atmosphere act as selective
absorbers. Different molecules absorb different wavelengths.
- Nitrogen, the commonest gaseous component of the atmosphere, has no prominent absorption
features apart from an absorption band at wavelengths less than 0.1 µm.
- Oxygen absorbs in the ultraviolet and also has an absorption band centred on 6.3 µm.
- Carbon dioxide prevents a number of wavelengths reaching the surface. A broad absorption band
exists between 14 and 17 µm and narrower ones occur at 2.7 µm and 4.5 µm (Figure a).
- Water vapour is an extremely important absorber of electromagnetic radiation within the infrared
part of the spectrum. Absorption bands also exist at 1.4 µm, 2.7 µm and 6.3 µm.
- Ozone (O3) has an important absorption band within the ultraviolet, hence the concern about the
depletion of this gas in the atmosphere.
The combined effects of the absorption characteristics of the atmospheric gases mean that:
1. electromagnetic radiation at particular wavelengths is totally absorbed and does not reach the
Earth’s surface;
2. electromagnetic radiation at particular wavelengths is partially absorbed and only a proportion
that reaches the Earth's outer atmosphere passes through the atmosphere to reach the ground;
3. electromagnetic radiation at particular wavelengths is unaffected by atmospheric absorption and
virtually all that reaches the Earth’s outer atmosphere reaches the surface.
Atmospheric Windows
The wavelengths at which electromagnetic radiation is partially or wholly transmitted through the
atmosphere are known as atmospheric windows (Table and Figure b).
The major atmospheric windows occur:
1. within the visible and photographic infrared range where there is approximately 95 per cent
transmission of the radiation across a broad atmospheric window. The window extends into
the higher wavelength sections of the ultraviolet and thus encompasses a total waveband of
0.3—1.0 µm;
Table below: Wavebands of atmospheric windows

Atmospheric window                          Waveband (µm)
Upper ultraviolet - photographic infrared   0.3-1.0 (approx.)
Reflected infrared                          1.3, 1.6, 2.2
Thermal infrared                            3.0-5.0
Thermal infrared                            8.0-14.0
Microwave                                   > 5,000

2. within the reflected infrared part of the electromagnetic spectrum centred on specific narrow
wavebands, 1.3 µm, 1.6 µm and 2.2 µm;
3. within two broad bands in the thermal infrared at 3-5 µm and 8-14 µm;
4. at wavelengths greater than about 0.5 cm, because the atmosphere is transparent to
electromagnetic radiation at microwave wavelengths. To obtain information in this part of the
spectrum, either very small signals must be measured or an artificial source of microwaves is
required.
The sensors on remote sensing systems must be designed in such a way as to obtain their data within
these well-defined atmospheric windows.
More about Atmospheric Windows
Some wavelengths cannot be used in remote sensing because our atmosphere absorbs essentially all
the photons at these wavelengths that are produced by the sun. In particular, the molecules of water,
carbon dioxide, oxygen, and ozone in our atmosphere block solar radiation. The wavelength ranges
in which the atmosphere is transparent are called atmospheric windows. Remote sensing projects
must be conducted in wavelengths that occur within atmospheric windows. Outside of these
windows, there is simply no radiation from the sun to detect--the atmosphere has blocked it.
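The window logic above can be sketched as a simple lookup. The band limits come from the table of atmospheric windows; the widths of the three narrow reflected-infrared windows are illustrative assumptions, since the table lists only their centres (1.3, 1.6 and 2.2 µm):

```python
# Sketch: checking whether a wavelength lies inside a major atmospheric window.
# Window edges are gradual in reality; sharp limits here are a simplification.

ATMOSPHERIC_WINDOWS_UM = [
    (0.3, 1.0),             # upper ultraviolet - visible - photographic infrared
    (1.25, 1.35),           # reflected infrared (centred on 1.3 um; width assumed)
    (1.55, 1.65),           # reflected infrared (centred on 1.6 um; width assumed)
    (2.15, 2.25),           # reflected infrared (centred on 2.2 um; width assumed)
    (3.0, 5.0),             # thermal infrared
    (8.0, 14.0),            # thermal infrared
    (5000.0, float("inf")), # microwave (> 5,000 um, i.e. > 0.5 cm)
]

def in_atmospheric_window(wavelength_um: float) -> bool:
    """True if a sensor band at this wavelength can see through the atmosphere."""
    return any(lo <= wavelength_um <= hi for lo, hi in ATMOSPHERIC_WINDOWS_UM)

print(in_atmospheric_window(0.55))  # visible green -> True
print(in_atmospheric_window(6.3))   # water vapour absorption band -> False
```

Sensor bands on real systems are chosen so that every band passes this kind of check.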
10. What is absorption?
Absorption is the other main mechanism at work when electromagnetic radiation interacts with the
atmosphere. In contrast to scattering, this phenomenon causes molecules in the atmosphere to
absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main
atmospheric constituents which absorb radiation.
Interactions with the Atmosphere
Before radiation used for remote sensing reaches the Earth's surface it has to travel through some
distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming
light and radiation. These effects are caused by the mechanisms of Scattering and Absorption.
11. What is scattering? Explain different types of scattering.
Scattering occurs when particles or large gas molecules present in the atmosphere interact with and
cause the electromagnetic radiation to be redirected from its original path. How much scattering takes
place depends on several factors including the wavelength of the radiation, the abundance
of particles or gases, and the distance the radiation travels through the atmosphere.
- Scattering of electromagnetic radiation by aerosols in the atmosphere is a phenomenon that we
experience in our everyday lives.
- When you walk into the shadow of a building out of direct sunlight, although you are now in the
shade, you are able to see because of scattering in the atmosphere.
- This is because incoming radiation is scattered by the aerosols and indirectly provides illumination
(Figure a). The scattering mechanisms can be selective or non-selective.
- In selective scattering, the relative size of the particles in the atmosphere and the wavelength of
the electromagnetic radiation are important,
- whereas in non-selective scattering the dimensions of the particles and the wavelengths are not
relevant.
Why does scattering occur? Due to diffusion of incident radiation by atmospheric particles, e.g. by haze
and water droplets as well as by molecules. Such diffused radiation is often refracted many times in
passing through the atmosphere.
There are three (3) types of scattering which take place.
Selective Scattering
i) Rayleigh Scattering: Rayleigh scattering mainly consists of scattering from
atmospheric gases. Rayleigh scattering (also termed molecular scattering) occurs when
the dimensions of the scatterers are small (less than one-tenth the size) compared with
the wavelengths of the electromagnetic radiation. Molecules of oxygen and nitrogen
(the commonest constituents of the atmosphere) fulfil this role for visible radiation. The
amount of scattering is inversely proportional to the fourth power of the wavelength.
Within the visible range of the electromagnetic spectrum, blue light is scattered to a
much greater degree than green or red. This is the principal reason why the sky appears
blue - this waveband is scattered in all directions including towards the surface. As one
moves vertically up through the atmosphere, it becomes darker because there are fewer
scattering aerosols present at higher altitudes. At sunset we often see spectacular reds
and yellows in the sky. This is because the radiation has had to travel through a greater
thickness of atmosphere, causing a greater degree of scattering, and thus most of the
shorter wavelength radiation has been scattered away and the longer wavelengths’
colours reach our eyes (Figure b).
ii) Mie Scattering: Mie scattering or non-molecular scattering occurs when the particles
are just about the same size as the wavelength of the radiation. Dust, pollen, smoke and
water vapor are common causes of Mie scattering which tends to affect longer
wavelengths than those affected by Rayleigh scattering. Mie scattering occurs mostly
in the lower portions of the atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast.
Mie scattering is also wavelength dependent and varies approximately as the inverse of
the wavelength.
iii) Nonselective Scattering: This occurs when the particles are much larger than the wavelength
of the radiation. Water droplets and large dust particles can cause this type of scattering.
This type of scattering causes fog and clouds to appear white to our eyes because blue,
green, and red light are all scattered in approximately equal quantities (blue + green +
red light = white light).
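The inverse fourth-power rule for Rayleigh scattering can be illustrated numerically (band-centre wavelengths and the function name are illustrative choices, not from the notes):

```python
# Sketch: relative strength of Rayleigh scattering, which varies as 1/lambda^4.

def rayleigh_relative(wavelength_um: float, reference_um: float = 0.45) -> float:
    """Scattering strength relative to a reference wavelength (blue by default)."""
    return (reference_um / wavelength_um) ** 4

for name, wl in [("blue", 0.45), ("green", 0.55), ("red", 0.65)]:
    print(f"{name}: {rayleigh_relative(wl):.2f}")
```

Running this shows blue light scattered roughly four times more strongly than red, which is the quantitative reason behind the blue sky and red sunsets described above.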
12. Write down the interaction of Electromagnetic Wave with Earth surface.
When electromagnetic radiation strikes a surface, it may be reflected, scattered, absorbed or
transmitted (Figure).
These processes are not mutually exclusive:
- a beam of light may be partially reflected and partially absorbed. Which processes actually
occur depends on the wavelength of the radiation, the angle at which the radiation intersects
the surface and the roughness of the surface. Reflected radiation is returned from a surface at
the same angle as it approached; the angle of incidence thus equals the angle of reflectance.
- Scattered radiation, however, leaves the surface in all directions. The concept of scattering is
often subsumed within reflection and it is termed 'diffuse reflection'. Whether or not incident
energy is reflected or scattered is partly a function of the roughness variations of the surface
compared to the wavelength of the incident radiation. If the ratio of roughness to wavelength
is low (less than one), the radiation is reflected whereas, if the ratio is greater than one, the
radiation is scattered.
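The roughness-to-wavelength rule above can be sketched as a small classifier (an illustrative simplification; the function name and the sharp threshold at a ratio of exactly 1 are assumptions):

```python
# Sketch of the rule described above: if surface roughness / wavelength < 1
# the surface acts as a (specular) reflector; if >= 1 it scatters diffusely.

def surface_behaviour(roughness_m: float, wavelength_m: float) -> str:
    ratio = roughness_m / wavelength_m
    return "reflected (specular)" if ratio < 1 else "scattered (diffuse)"

# A surface with 1 mm roughness, viewed at visible (0.5 um) vs microwave (5 cm):
print(surface_behaviour(1e-3, 0.5e-6))  # scattered (diffuse)
print(surface_behaviour(1e-3, 5e-2))    # reflected (specular)
```

The same surface thus appears rough to short wavelengths and smooth to long ones.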
13. What is Spectral Signature?
Spectral signature is the variation of reflectance or emittance of a material with respect to
wavelengths (i.e., reflectance/emittance as a function of wavelength).
14. Write down about Spectral Signature of different Landscape Features or Elaborate the
spectral signatures of Vegetation, Soil, Water, Rocks and Cultural Features.
(Spring 2018) Or Write down about the Factors that affect Remote Sensing Signatures:
Our experiences in everyday life tell us that different features are different colours: grass is green,
sky is blue and so on. Grass is green to our eyes because it has a higher reflectance in green than
in blue or red. The reflectance of a body is wavelength dependent and it may change markedly
over a range of few micrometers (Figure, for example).
Remote sensing systems often operate at wavelengths which cannot be detected visually, and in
order to understand the signatures obtained by their sensors we require a knowledge of the
reflectance and absorption properties of different features that make up a landscape. A graphical
representation of the reflectance variations as a function of wavelength is known as a spectral
reflectivity curve.
The characteristics of a scene, which may be imaged by remote sensing sensors and which are
discussed here, are the spectral signatures of:
• vegetation and soil;
• water (liquid and solid phase);
• rocks;
• cultural features
For any given material, the amount of solar radiation that is reflected (absorbed, transmitted)
will vary with wavelength. This important property of matter allows us to separate distinct
cover types based on their response values for a given wavelength. When we plot the response
characteristics of a certain cover type against wavelength, we define what is termed the
spectral signature of that cover. The diagram below illustrates the spectral signatures for some
common cover types.
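The idea of separating cover types by their response at a given wavelength can be sketched as follows. The reflectance values below are purely illustrative (not measured data), chosen only to mimic the typical curve shapes: water drops off beyond about 0.7 µm, vegetation jumps in the near infrared:

```python
# Sketch: spectral signatures as reflectance-vs-wavelength lookups.

SIGNATURES = {  # band centre (um) -> reflectance fraction (illustrative values)
    "water":      {0.45: 0.08, 0.55: 0.06, 0.65: 0.04, 0.85: 0.01},
    "vegetation": {0.45: 0.05, 0.55: 0.12, 0.65: 0.06, 0.85: 0.50},
    "dry soil":   {0.45: 0.15, 0.55: 0.20, 0.65: 0.25, 0.85: 0.30},
}

def best_separating_band(cover_a: str, cover_b: str) -> float:
    """Band where the two cover types' reflectances differ the most."""
    sig_a, sig_b = SIGNATURES[cover_a], SIGNATURES[cover_b]
    return max(sig_a, key=lambda band: abs(sig_a[band] - sig_b[band]))

print(best_separating_band("water", "vegetation"))  # 0.85 (near infrared)
```

This is why infrared bands are so useful for vegetation and water mapping: the signatures diverge most strongly there.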
15. Write down the reflectance properties of water body.
- The spectral response from a water body is complex, as water in any quantity is a medium
that is semi-transparent to electromagnetic radiation.
- The spectral response also varies according to wavelength, the nature of the water surface,
the optical properties of the water and the angles of illumination and observation of reflected
radiation from the surface and the bottom of shallow water bodies.
- Pure clear water has a relatively high reflectance in shorter wavelengths between 0.4 and 0.6
µm and virtually no reflectance at wavelengths greater than 0.7 µm.
- Thus, clear water appears black on an infrared image.
- However, shorter wavelength reflectance is further complicated by the fact that the maximum
transmittance of visible wavelengths occurs between 0.44 and 0.54 µm, also within the
blue/green part of the spectrum. Thus, both reflection and transmittance take place at these
wavelengths.
- Shorter wavelengths can penetrate deeper into a water body than longer wavelengths (Figure
a).
- Once the radiation at these short wavelengths has penetrated the water body, it meets
component particles in the water, some of which are of similar size to the wavelength and
therefore scattering occurs in much the same way as in the atmosphere (Figure b).
- The colour or response of a water body is determined by the radiation which is scattered and
reflected within the body itself, not from its surface.
- This is termed ‘volume reflection' as it occurs over a range of depths. In shallow, calm, clear
water bodies up to 30 m deep, electromagnetic radiation can penetrate to the bed of the water
body. In this case, the colour and nature of the bed material also influence the response for
the water.
- Water containing a heavy load of sediment, for example estuarine water, is termed 'turbid'.
The sediment is suspended within the body of the water and
16. What are the primary colors? What are the spectrum range of visible colors?
Answer:
Blue, green, and red are the primary colors or wavelengths of the visible spectrum. They are
defined as such because no single primary color can be created from the other two, but all
other colors can be formed by combining blue, green, and red in various proportions. Although
we see sunlight as a uniform or homogeneous color, it is actually composed of various
wavelengths of radiation in primarily the ultraviolet, visible and infrared portions of the
spectrum.
The visible portion of this radiation can be divided into the following colour ranges:
Violet: 0.4 - 0.446 μm
Blue: 0.446 - 0.500 μm
Green: 0.500 - 0.578 μm
Yellow: 0.578 - 0.592 μm
Orange: 0.592 - 0.620 μm
Red: 0.620 - 0.7 μm
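The band limits above can be turned into a small lookup (a minimal sketch; the function and list names are my own):

```python
# Sketch: mapping a visible wavelength (in micrometres) to its colour name
# using the band limits listed above.

VISIBLE_BANDS_UM = [
    ("Violet", 0.400, 0.446),
    ("Blue",   0.446, 0.500),
    ("Green",  0.500, 0.578),
    ("Yellow", 0.578, 0.592),
    ("Orange", 0.592, 0.620),
    ("Red",    0.620, 0.700),
]

def colour_name(wavelength_um: float) -> str:
    for name, lo, hi in VISIBLE_BANDS_UM:
        if lo <= wavelength_um < hi:
            return name
    return "outside visible range"

print(colour_name(0.55))  # Green
print(colour_name(0.90))  # outside visible range
```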
17. Distinguish between passive and active sensors, with examples.
Sensor: A sensor is a device which comprises an optical component or system and a detector
with electronic circuits, used to record reflected and/or emitted energy from various objects.
Active sensors require an external power source to operate, and they transmit and detect
energy at the same time.
Passive Sensor:
Passive sensors do not require any external power source to produce an output signal; they
do not transmit energy themselves but only detect energy emitted or reflected from an external energy source.
Remote sensing systems which measure energy that is naturally available are called passive
sensors.
Passive sensors can only be used to detect energy when the naturally occurring energy is
available. For all reflected energy, this can only take place during the time when the sun is
illuminating the Earth.
Examples of passive sensor-based technologies include: Photographic Sensor.
Active Sensor:
Active sensors, provide their own energy source for illumination. The sensor emits radiation
which is directed toward the target to be investigated. The radiation reflected from that target
is detected and measured by the sensor. Advantages for active sensors include the ability to
obtain measurements anytime, regardless of the time of day or season.
Examples of other active sensor-based technologies include: scanning electron
microscopes, LiDAR.
Passive System
- A passive sensor receives naturally emitted EM energy within its field of view (FOV) and
performs measurements using it.
- Examples: remote sensing satellites such as SPOT-1, LANDSAT-1, sensors such as LISS-1, etc.
- Passive sensors rely on other sources, such as the sun, for their operation.
- Passive sensors obtain measurements only in the daytime.

Active System
- An active sensor emits its own EM (electromagnetic) energy, which is transmitted towards
the earth, and receives the energy reflected from the earth. The received EM energy is used
for measurement purposes.
- Examples: communication satellites, radar earth observation satellites (e.g. RADARSAT-1), etc.
- Active sensors use their own source of energy for operation.
- Active sensors can obtain measurements anytime (day & night).
18. What is the difference between "image" and "photograph" in remote sensing?
Photographs are taken in the visible portion (0.3 μm to 0.9 μm) of the electromagnetic spectrum
(EMS) and are chemically registered on paper. A photograph refers specifically to images that
have been detected as well as recorded on photographic film.
Images are taken by sensors; sensors measure data in certain segments of
the EMS which are digitally recorded, and users convert them to a colour image.
Based on these definitions, we can say that all photographs are images, but not all images are
photographs.
19. Define photographic remote sensing. (Spring 2018). What are the main types of image?
Elaborate different types of remote sensing photographs and refer their advantages and
disadvantages. (Spring 2018)
In photographic remote sensing, the representation of a scene is recorded by a camera onto
photographic film. Electromagnetic radiation passes through the lens at the front of the
camera and is focused on the recording medium (film). The characteristics of the film (and
also the lens and the nature of any filters that are being used) determine the signal that is
recorded.
Different types of remote sensing photographs may be produced, each of which has advantages and disadvantages.
The main types of images are:
1. Panchromatic
The simplest film is one that records variations in electromagnetic radiation within the visible
range of the spectrum (0.4—0.7 μm) in black and white and shades of grey. The resultant
image is a panchromatic photograph but is often referred to as a black and white photograph.
The acquisition of a panchromatic photograph is a multi-stage process.
2. Photographic Infrared
A typical black and white infrared film that is used in aerial photography is the Kodak Infrared
Aerographic Film 2424. The sensitivity of this film, unlike a panchromatic one, extends into
the near infrared. However, this infrared film has a greater sensitivity in the ultraviolet range
than in the infrared part of the spectrum. Infrared images are ideal for vegetation surveys.
3. Multispectral
Multispectral photography involves simultaneously obtaining images of the same scene at
different wavelengths. The most common arrangement for multi- spectral imaging is the
acquisition of four images in the blue, green, red and photographic infrared parts of the
spectrum.
A major advantage of multispectral imaging is that a degree of flexibility is introduced to the
data. The scene may initially be examined separately in the blue, green, red and infrared parts
of the spectrum and as a natural or false colour composite in which information in parts of
the electromagnetic spectrum which is invisible to the human eye can be displayed on an
image.
A disadvantage is that the camera system is much more complex than that for obtaining
panchromatic or natural colour photographs and exposure times for the various films have to
be matched. The production of colour images from the individual black and white ones is also
more complicated than if a single colour film is used.
4. Natural Color
Although black and white images can often provide information about a particular area, in
order to differentiate surfaces it is often necessary to employ colour images. Different colours
can be produced by an additive combination of red, green and blue light which are known as
Engr. Md. Shahidul Islam, +8801713062224, shd@globalsources.com.bd, shd2051@gmail.com , www.globalsources.com.bd
the primary additive colours. If the primary colours are superimposed in combinations of two,
three other colours can be produced, which are known as complementary colours. Thus, the
superimposition of red and green light of equal intensity produces yellow; green and blue
produce cyan; and red and blue yield magenta.
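The additive mixing described above is easy to verify numerically. The sketch below is a hypothetical illustration (not part of the original text) that combines full-intensity primary patches with NumPy:

```python
import numpy as np

# Additive mixing of the primary colours described above.
# Each "image" is a 2x2 patch with one 8-bit channel at full intensity.
h, w = 2, 2
red   = np.zeros((h, w, 3), dtype=np.uint8); red[..., 0]   = 255
green = np.zeros((h, w, 3), dtype=np.uint8); green[..., 1] = 255
blue  = np.zeros((h, w, 3), dtype=np.uint8); blue[..., 2]  = 255

# Superimposing two primaries at equal intensity gives a complementary colour.
yellow  = red | green   # (255, 255, 0)
cyan    = green | blue  # (0, 255, 255)
magenta = red | blue    # (255, 0, 255)

print(yellow[0, 0], cyan[0, 0], magenta[0, 0])
```

Bitwise OR is used here simply because the channels do not overlap; with real imagery the bands would be scaled and summed instead.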
The advantages of normal colour photographs are:
- Surfaces that are indistinguishable on a black and white image can often be easily
differentiated on a colour image, because the human visual system can differentiate hundreds
of thousands of colours but relatively few grey levels.
- The colours produced accord with our perception of everyday living. They are thus relatively
easy to interpret.
The disadvantages of colour photography are:
- More complex and expensive processing of the film is required.
- Colour images can have less definition than black and white images.
- The colours may disrupt the continuity of linear features which cross differently coloured
surfaces. The colour photograph may contain too much 'distracting' information.
5. False Color
A false colour composite, which is formed by projecting green data in blue, red in green and
infrared in red, is known as a standard false colour composite. Many remote sensing systems
obtain data in a number of bands at longer wavelengths than visible light.
20. (a) What is digital image? What is their significance in remote sensing? (Spring 2018)
(b) Draw a hypothetical graphic form of digital image and explain it. (Spring 2018)
Answer (a): Digital Image: A digital image is a regular grid array of squares where each square is
assigned a number related to some parameter (such as reflectance or emittance) which is
being measured by a remote sensing system's sensor. It records a scene electronically and forms an
image of it. A digital image is a grid of picture elements, called pixels.
A digital image is a numeric representation, normally binary, of a two-dimensional image.
Depending on whether the image resolution is fixed, it may be of vector or raster type. By itself, the
term "digital image" usually refers to raster images or bitmapped images (as opposed to
vector images).
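As a minimal illustration of the definition above, a single-band digital image can be represented as a small NumPy array of DN values; the numbers here are invented for illustration:

```python
import numpy as np

# A hypothetical 4x4 single-band digital image: each cell is a pixel
# whose digital number (DN) encodes the measured reflectance/emittance.
image = np.array([
    [ 12,  40,  41,  38],
    [ 15, 200, 210,  45],
    [ 14, 198, 205,  44],
    [ 13,  42,  40,  39],
], dtype=np.uint8)

rows, cols = image.shape     # the raster grid: rows and columns
dn = image[1, 2]             # DN of the pixel at row 1, column 2
print(rows, cols, dn)        # 4 4 210
```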
Answer (b): Hypothetical graphic form of digital image:
21. What is Pixel and DN Value?
Pixel and DN values
A photograph could also be represented and displayed in a digital format by subdividing the image
into small equal-sized and shaped areas, called picture elements or pixels, and representing the
brightness of each area with a numeric value or digital number.
22. What are the data characteristics in RS?
The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal
resolutions.
Spatial resolution: Related to pixel size
- The size of a pixel that is recorded in a raster image – typically pixels may correspond to square
areas ranging in side length from 1 to 1,000 meters (3.3 to 3,280.8 ft).
Spectral resolution: Related to Number of Band
- The wavelength of the different frequency bands recorded – usually, this is related to the
number of frequency bands recorded by the platform. The current Landsat collection comprises
seven bands, including several in the infrared spectrum, ranging from a spectral resolution of
0.7 to 2.1 μm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5
μm, with a spectral resolution of 0.10 to 0.11 μm per band.
Radiometric resolution: Related to intensity level of reflection
- The number of different intensities of radiation the sensor is able to distinguish. Typically, this
ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384
intensities or "shades" of colour, in each band. It also depends on the instrument noise.
Temporal resolution: Related to revisit time.
- The frequency of flyovers by the satellite or plane; it is only relevant in time-series studies or
those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used
by the intelligence community, where repeated coverage revealed changes in infrastructure, the
deployment of units, or the modification/introduction of equipment. Cloud cover over a given
area or object makes it necessary to repeat the collection for that location.
23. (a) What is image resolution? (Spring 2018)
(b) Elaborate the different aspects of image resolution with example. (Spring 2018)
Or: What is Resolution? Discuss various types of resolutions.
Answer:
Resolution is a broad term commonly used to describe:
 The number of pixels you can display on a display device, or
 The area on the ground that a pixel represents in an image file.
These broad definitions are inadequate when describing remotely sensed data.
There are four types of resolutions in Remote Sensing:
 Spatial resolution – ‘Area’ aspect
 Spectral resolution – ‘Band’ aspect
 Radiometric resolution – ‘Radiance’ aspect
 Temporal resolution – ‘Frequency’ aspect
i) Spatial Resolution:
The earth surface area covered by a pixel of an image is known as spatial resolution.
Large area covered by a pixel means low spatial resolution and vice versa.
 ‘Area’ aspects.
 How small an object do you need to see (pixel size) and how large an area do you
need to cover (swath width)?
 The area on the ground represented by each pixel.
 Spatial resolution
 Pixel: smallest unit of an image
 Pixel size
 Spatial coverage
 Field of view (FOV), or
 Area of coverage, such as the 2300 km MODIS swath
[Figure: the same scene shown at 30 meter and at 1 meter spatial resolution.]
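The link between pixel size and spatial coverage can be sketched as follows; the pixel counts below are illustrative only, not the specification of any particular sensor:

```python
# Ground coverage follows directly from pixel size and pixel count:
# swath width = pixels per line x pixel size.
def swath_km(pixels_per_line: int, pixel_size_m: float) -> float:
    """Width of the ground strip imaged in one scan line, in km."""
    return pixels_per_line * pixel_size_m / 1000.0

# The same number of pixels per line covers far less ground at finer
# spatial resolution: coverage trades off against detail.
print(swath_km(6000, 30))   # 180.0 (km, at 30 m pixels)
print(swath_km(6000, 1))    # 6.0 (km, at 1 m pixels)
```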
ii) Spectral Resolution:
 ‘Band’ aspects.
 What part of the spectrum do you want to measure?
 The specific wavelength intervals that a sensor can record.
Spectral resolution is the ability to resolve spectral features and bands into their separate
components. More bands within a specified bandwidth means higher spectral resolution, and vice versa.
 Spectral resolution describes the ability of a sensor to define fine wavelength intervals.
 The finer the spectral resolution, the narrower the wavelength range for a particular channel or band.
iii) Radiometric Resolution:
 ‘Radiance’ aspects.
 How finely do you need to quantify data?
 The number of possible data file values in each band (indicated by the number of
bits into which the recorded energy is divided).
The radiometric resolution of an imaging system describes its ability to discriminate very
slight differences in energy. The finer the radiometric resolution of a sensor, the more
sensitive it is to detecting small differences in reflected or emitted energy. The maximum
number of brightness levels available depends on the number of bits used in representing the
energy recorded. Thus, if a sensor uses 8 bits to record the data, there are 2^8 = 256 digital
values available, ranging from 0 to 255.
[Figure: the same scene displayed as a 2-bit image and as an 8-bit image.]
Image resolution ranges:
2-bit range: 00 to 11 (0 to 3 in decimal)
4-bit range: 0000 to 1111 (0 to 15 in decimal)
6-bit range: 0000 00 to 1111 11 (0 to 63 in decimal)
8-bit range: 0000 0000 to 1111 1111 (0 to 255 in decimal)
10-bit range: 0000 0000 00 to 1111 1111 11 (0 to 1023 in decimal)
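These ranges all follow one rule: an n-bit sensor records 2^n brightness levels, with DNs from 0 to 2^n - 1. A short sketch:

```python
# Number of brightness levels for a given radiometric resolution:
# an n-bit sensor records 2**n levels, DNs from 0 to 2**n - 1.
def dn_range(bits: int) -> tuple:
    """Return (minimum DN, maximum DN) for an n-bit sensor."""
    return 0, 2 ** bits - 1

for bits in (2, 4, 6, 8, 10):
    lo, hi = dn_range(bits)
    print(f"{bits:2d}-bit: {2 ** bits:5d} levels, DN {lo}..{hi}")
# e.g. 2-bit -> 4 levels (0..3); 8-bit -> 256 levels (0..255)
```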
iv) Temporal Resolution:
Temporal resolution is the revisit period, and is the length of time for a satellite to complete
one entire orbit cycle, i.e. start and back to the exact same area at the same viewing angle.
For example,
 Landsat needs 16 days,
 MODIS needs one day,
 NEXRAD needs 6 minutes for rain mode and 10 minutes for clear sky mode.
 ‘Frequency’ aspects.
 How often do you need to look?
 How often a sensor obtains imagery of a particular area.
These four domains contain separate information that can be extracted from the raw data.
The following figure illustrates all four types of resolution in a single frame.
24. What is the satellite system based on spatial resolution?
Answer:
Based on the spatial resolution, satellite systems can be classified into four types:
i) Low resolution systems,
ii) Medium resolution systems,
iii) High resolution systems, and
iv) Very high resolution systems.
25. What are the differences between Aerial Photographs and Satellite Imageries?
Answer:
Differences between Aerial Photographs and Satellite Imageries are:
1. Aerial photograph: covers a small area, normally a few tens of square kilometers to a few hundred square kilometers.
   Satellite imagery: covers a very large area, which ranges from 3,500 to above 30,000 sq. km.
2. Aerial photograph: taken from an altitude of a few hundred meters to a few thousand meters.
   Satellite imagery: taken from 600-900 km.
3. Aerial photograph: snapshots are taken by cameras on photographic film.
   Satellite imagery: radiance values are reconstructed over a region by a series of detectors, each gathering data over small pockets of the entire region.
4. Aerial photograph: an analogue record, so no further improvement is possible after the photograph is obtained.
   Satellite imagery: images are taken digitally and can be further improved or enhanced.
5. Aerial photograph: a very high degree of detail regarding the terrain may be obtained.
   Satellite imagery: the degree of detail is restricted to the pixel resolution of the sensors.
6. Aerial photograph: provides a stereo view of the terrain.
   Satellite imagery: not capable of providing stereo.
7. Aerial photograph: aerial surveys lack fixed repetitivity.
   Satellite imagery: satellite surveys are highly repetitive.
8. Aerial photograph: surveys are highly expensive.
   Satellite imagery: surveys are much less expensive than aerial surveys.
9. Aerial photograph: surveys are adversely affected by bad weather.
   Satellite imagery: surveys are not constrained by weather.
26. What is image interpretation?
Image interpretation is the process of examining an aerial photo or digital remote sensing image and
manually identifying the features in that image.
These image characteristics (also called image attributes) comprise seven elements that we
use to derive information about objects in an image.
This method can be highly reliable, and a wide variety of features can be identified, such as riparian
vegetation type and condition, and anthropogenic features such as roads and mineral extraction
activity. However, the process is time consuming and requires a skilled analyst who has a ground-level familiarity with the study area.
Key points in image interpretation are:
- Differentiation / recognition of objects
- Identification
27. Explain Prime Elements of Visual Image Interpretation.
Principles of image interpretation have been developed empirically for more than 150 years.
The most basic of these principles are the elements of image interpretation. They are: location,
size, shape, shadow, tone/color, texture, pattern, height/depth and site/situation/association.
Template of Image Interpretation
Features
Pattern
Tone /
Color
Size
Shape
Indicator
Texture Orientation
Association
Time
Location
Settlement
Agriculture
Waterbodies
Bare soil
a)
b)
c)
d)
e)
f)
g)
Shape:
i)
Many natural and human-made features have unique shapes.
ii)
Often used are adjectives like linear, curvilinear, circular, elliptical, radial, square,
rectangular, triangular, hexagonal, star, elongated, and amorphous.
b) Shadow:
i) Shadow reduction is of concern in remote sensing because shadows tend to
obscure objects that might otherwise be detected.
ii) However, the shadow cast by an object may be the only real clue to its identity.
iii) Shadows can also provide information on the height of an object, either
qualitatively or quantitatively.
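Point iii) can be made quantitative: on flat terrain, object height follows from shadow length and solar elevation angle by simple trigonometry. A minimal sketch, with illustrative numbers:

```python
import math

# Estimating object height from its shadow, assuming flat terrain and a
# known solar elevation angle: height = shadow length x tan(elevation).
def height_from_shadow(shadow_len_m: float, sun_elev_deg: float) -> float:
    """Object height (m) from shadow length (m) and sun elevation (deg)."""
    return shadow_len_m * math.tan(math.radians(sun_elev_deg))

# With the sun at 45 degrees, a 20 m shadow implies a ~20 m tall object.
print(round(height_from_shadow(20.0, 45.0), 1))   # 20.0
```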
c) Tone and Color:
i) A band of EMR recorded by a remote sensing instrument can be displayed on an
image in shades of gray ranging from black to white.
ii) These shades are called “tones”, and can be qualitatively referred to as dark, light,
or intermediate (humans can see 40-50 tones).
iii) Tone is related to the amount of light reflected from the scene in a specific
wavelength interval (band).
d) Texture:
i) Texture refers to the arrangement of tone or color in an image.
ii) Useful because Earth features that exhibit similar tones often exhibit different
textures.
iii) Adjectives include smooth (uniform, homogeneous), intermediate, and rough
(coarse, heterogeneous).
e) Pattern:
i) Pattern is the spatial arrangement of objects on the landscape.
ii) General descriptions include random and systematic; natural and human-made.
iii) More specific descriptions include circular, oval, curvilinear, linear, radiating,
rectangular, etc.
f) Height and Depth (Size):
i) As discussed, shadows can often offer clues to the height of objects.
ii) In turn, relative heights can be used to interpret objects.
iii) In a similar fashion, relative depths can often be interpreted.
iv) Descriptions include tall, intermediate, and short; deep, intermediate, and shallow.
g) Association:
i) This is very important when trying to interpret an object or activity.
ii) Association refers to the fact that certain features and activities are almost always
related to the presence of certain other features and activities.
28. Explain Satellite Image Interpretation or How to Interpret a Satellite Image?
Five Tips and Strategies
Satellite images are like maps: they are full of useful and interesting information, provided you
have a key. They can show us how much a city has changed, how well our crops are growing,
where a fire is burning, or when a storm is coming. To unlock the rich information in a satellite
image, you need to:
1. Look for a scale
2. Look for patterns, shapes, and textures
3. Define the colors (including shadows)
4. Find north
5. Consider your prior knowledge
Interpretation is the process of detection, identification, description and assessment of the
significance of an object and the pattern imaged. The method of interpretation may be visual or
digital, or a combination of both. Both interpretation techniques have merits and demerits, and
even after digital analysis the outputs are also visually analysed.
Figure 1: Combination of 3 bands – green band (1), red band (2) and near-IR band (3) –
generates a false colour composite (123).
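The band-to-colour assignment of a standard false colour composite can be sketched with NumPy; the three 2x2 input "bands" below are synthetic stand-ins for real imagery:

```python
import numpy as np

# Standard false colour composite: near-IR band is displayed as red,
# red band as green, green band as blue.
green_band = np.array([[10,  20], [30,  40]], dtype=np.uint8)
red_band   = np.array([[50,  60], [70,  80]], dtype=np.uint8)
nir_band   = np.array([[200, 180], [160, 140]], dtype=np.uint8)

# Display assignment: R <- NIR, G <- red, B <- green.
fcc = np.dstack([nir_band, red_band, green_band])
print(fcc.shape)    # (2, 2, 3)
print(fcc[0, 0])    # [200  50  10]: strong NIR shows as red (vegetation)
```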
Visual interpretation relies on the ability of humans to identify an object through the data content
of an image/photo by combining several elements of interpretation. There are two types of
extraction of information from the images/photographs, namely:
1. Interpretation of data by visual analysis;
2. Semi-automatic processing by computer followed by visual analysis, such as generation of a
vector layer from a raster image through onscreen digitisation and DTM/DEM generation, or
interpretation of aerial photographs through 3D viewing.
In general, remote sensing data in analogue format is used for visual interpretation. This involves
the systematic examination of the data, study of existing maps, collection of field information, and
work at various levels of complexity. The analysis depends upon the individual perception and
experience of the interpreter, the nature of the object, the quality of the data, the scale, the
combination of spectral bands, etc.
The entire process of visual interpretation can be divided into the following steps:
detection of an object, interpretation, recognition and identification, analysis, classification,
deduction and idealisation, and, based on these, a conclusion identifying the object. Hence
interpretation is the combined result of identifying features through photo-recognition
elements, field verification and preparation of final thematic maps. It also requires the process
of observation coupled with imagination and a great deal of patience.
Basic elements of interpretation
The interpretation of satellite imagery and aerial photographs involves the study of various basic
characteristics of an object with reference to the spectral bands, which is useful in visual analysis. The basic
elements are shape, size, pattern, tone, texture, shadows, location, association and resolution.
Shape: The external form, outline or configuration of the object. This includes natural features
(example: the Yamuna River in Delhi) and man-made features (example: Nehru Stadium, Delhi).
Size: This property depends on the scale and resolution of the image/photo. Smaller features will be
more easily identified in a large-scale image/photo.
Pattern: Spatial arrangement of an object into distinctive recurring forms. This can be easily
explained through the pattern of a road and a railway line: even though both look linear, major roads
are associated with sharp curves and many intersections with minor roads.
Shadow: Indicates the outline of an object and its length, which is useful in measuring the height of
an object. The shadow effect in radar images is due to the look angle and the slope of the terrain. Taller
features cast larger shadows than shorter features.
Tone: Refers to the colour or relative brightness of an object. The tonal variation is due to the
reflection, emittance, transmission or absorption character of an object. It may vary from one
object to another and also changes between different bands. In general, a smooth surface tends
to have high reflectance and a rougher surface less reflectance. This phenomenon can be easily
explained through infrared and radar imagery.
Infrared imagery: Healthy vegetation reflects infrared radiation much more strongly than green
energy and appears very bright in the image. A simple example is the light tone of vegetation
species and the dark tone of water. In thermal infrared images in particular, bright tones
represent the warmest temperatures and dark tones the coolest. The image (Fig. 2)
illustrates daytime and night-time thermal data; the changes in kinetic water temperature cause
the tonal changes. Hence, time must also be taken into consideration before interpretation.
Radar imagery: Smooth surfaces and areas blocked from the radar signal appear dark.
Bridges and cities show a very bright tone; on the contrary, calm water, pavement and dry lake beds
appear in a very dark tone.
Texture: The frequency of tonal change. It creates a visual impression of the surface roughness or
smoothness of objects. This property depends upon the size, shape, pattern and shadow.
Location/Site: The relationship of a feature to the surrounding features provides clues towards its
identity. Example: certain tree species are associated with high-altitude areas.
Resolution: It depends upon the photographic/imaging device, namely the camera or sensor. This
includes spectral and spatial resolution. The spectral resolution helps in identifying features in
specific spectral bands. High spatial resolution imagery/photographs are useful in identifying
small objects.
Association: Occurrence of features in relation to others.
Hence, careful examination has to be done to identify the features in the imagery combined with field
information.
Advantages of digital image processing:
 Cost-effective for large geographic areas
 Cost-effective for repetitive interpretations
 Cost-effective for standard image formats
 Consistent results
 Simultaneous interpretations of several channels
 Complex interpretation algorithms possible
 Speed may be an advantage
 Explore alternatives
 Compatible with other digital data
Disadvantages in digital processing:
 Expensive for small areas
 Expensive for one-time interpretations
 Start-up costs may be high
 Requires elaborate, single-purpose equipment
 Accuracy may be difficult to evaluate
 Requires standard image formats
 Data may be expensive, or not available
 Preprocessing may be required
 May require large support staff
29. What is image classification?
The intent of the classification process is to categorize all pixels in a digital image into one of several
land cover classes, or "themes". This categorized data may then be used to produce thematic maps of
the land cover present in an image. Normally, multispectral data are used to perform the classification
and, indeed, the spectral pattern present within the data for each pixel is used as the numerical basis
for categorization. The objective of image classification is to identify and portray, as a unique gray
level (or color), the features occurring in an image in terms of the object or type of land cover these
features actually represent on the ground.
Image classification refers to the task of extracting information classes from a multiband raster image.
The resulting raster from image classification can be used to create thematic maps. Depending on the
interaction between the analyst and the computer during classification, there are two types of
classification: supervised and unsupervised.
With the ArcGIS Spatial Analyst extension, there is a full suite of tools in the Multivariate toolset to
perform supervised and unsupervised classification (see An overview of the Multivariate toolset).
The classification process is a multi-step workflow; therefore, the Image Classification toolbar has
been developed to provide an integrated environment to perform classifications with the tools. Not
only does the toolbar help with the workflow for performing unsupervised and supervised
classification, it also contains additional functionality for analyzing input data, creating training
samples and signature files, and determining the quality of the training samples and signature files.
The recommended way to perform classification and multivariate analysis is through the Image
Classification toolbar.
Supervised classification
Supervised classification uses the spectral signatures obtained from training samples to classify an
image. With the assistance of the Image Classification toolbar, you can easily create training samples
to represent the classes you want to extract. You can also easily create a signature file from the
training samples, which is then used by the multivariate classification tools to classify the image.
Unsupervised classification
Unsupervised classification finds spectral classes (or clusters) in a multiband image without the
analyst’s intervention. The Image Classification toolbar aids in unsupervised classification by
providing access to the tools to create the clusters, capability to analyze the quality of the clusters,
and access to classification tools.
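The idea behind unsupervised classification can be sketched with a tiny k-means loop over synthetic two-band pixel spectra. This is only an illustration of spectral clustering, not the implementation used by the ArcGIS tools:

```python
import numpy as np

# Synthetic pixel spectra from two spectral classes in a 2-band image:
# "water" pixels are dark in both bands, "vegetation" pixels are bright
# in the near-IR band. Values and cluster centres are invented.
rng = np.random.default_rng(0)
water = rng.normal([20, 10], 2, size=(50, 2))
veg   = rng.normal([40, 120], 2, size=(50, 2))
pixels = np.vstack([water, veg])          # (100, 2) spectra

# Minimal k-means: start one centre inside each group (deterministic
# initialisation keeps this sketch reproducible).
k = 2
centers = pixels[[0, 50]].copy()
for _ in range(10):
    # assign each pixel to its nearest cluster centre
    d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # move each centre to the mean of its assigned pixels
    centers = np.array([pixels[labels == j].mean(axis=0) for j in range(k)])

# The two spectral classes should now separate water from vegetation.
print(np.unique(labels, return_counts=True))
```

In a supervised workflow the centres (or richer class signatures) would instead come from analyst-drawn training samples.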
30. What is a Panchromatic Image?
A panchromatic image consists of only one band. It is usually displayed as a grey scale image, i.e.
the displayed brightness of a particular pixel is proportional to the pixel digital number which is
related to the intensity of solar radiation reflected by the targets in the pixel and detected by the
detector. Thus, a panchromatic image may be similarly interpreted as a black-and-white aerial
photograph of the area. The Radiometric Information is the main information type utilized in the
interpretation.
A panchromatic image extracted from a SPOT panchromatic scene at a ground resolution of 10 m.
The ground coverage is about 6.5 km (width) by 5.5 km (height). The urban area at the bottom left
and a clearing near the top of the image have high reflected intensity, while the vegetated areas on
the right part of the image are generally dark. Roads and blocks of buildings in the urban area are
visible. A river flowing through the vegetated area, cutting across the top right corner of the image
can be seen. The river appears bright due to sediments while the sea at the bottom edge of the image
appears dark.
31. What is a Multi-spectral Image? (Spring 2018)
A multi-spectral image is a collection of several monochrome images of the same scene, each of them
taken with a different sensor. Each image is referred to as a band. A well-known multi-spectral (or
multi-band image) is a RGB color image, consisting of a red, a green and a blue image, each of them
taken with a sensor sensitive to a different wavelength. In image processing, multi-spectral images
are most commonly used for Remote Sensing applications. Satellites usually take several images
from frequency bands in the visual and non-visual range. Landsat 5, for example, produces 7 band
images with the wavelength of the bands being between 450 and 1250 nm.
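A multi-spectral image is thus naturally modelled as a stack of monochrome bands. A minimal sketch, with an invented band count and invented values:

```python
import numpy as np

# A multi-spectral image as a stack of monochrome bands, with shape
# (bands, rows, cols). Seven bands echoes the Landsat 5 example above,
# but the pixel values here are arbitrary placeholders.
n_bands, rows, cols = 7, 4, 4
cube = np.arange(n_bands * rows * cols, dtype=np.uint8)
cube = cube.reshape(n_bands, rows, cols)

band_3 = cube[2]                 # one band = one monochrome image
pixel_spectrum = cube[:, 1, 1]   # all band values for a single pixel
print(band_3.shape, pixel_spectrum.shape)   # (4, 4) (7,)
```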
The disadvantage of multi-spectral images is that, since we have to process additional data, the
required computation time and memory increase significantly. However, since the speed of the
hardware will increase and the costs for memory will decrease in the future, it can be expected that
multi-spectral images will become more important in many fields of computer vision.
32. Write about Anderson Image Classification
A commonly used classification scheme has been the Anderson Classification System – a land
cover/land use classification system developed for use with remote sensing systems in the 1970s,
adopted for the USGS-NPS Vegetation Mapping Program to map cultural land cover (Anderson et
al. 1976).
Land Cover Classification (Level I and Level II)
1. Urban or Built-Up Land
   11 Residential
   12 Commercial Services
   13 Industrial
   14 Transportation, Communications
   15 Industrial and Commercial
   16 Mixed Urban or Built-Up Land
   17 Other Urban or Built-Up Land
2. Agricultural Land
   21 Cropland and Pasture
   22 Orchards, Groves, Vineyards, Nurseries
   23 Confined Feeding Operations
   24 Other Agricultural Land
3. Rangeland
   31 Herbaceous Rangeland
   32 Shrub and Brush Rangeland
   33 Mixed Rangeland
4. Forest Land
   41 Deciduous Forest Land
   42 Evergreen Forest Land
   43 Mixed Forest Land
5. Water
   51 Streams and Canals
   52 Lakes
   53 Reservoirs
   54 Bays and Estuaries
6. Wetland
   61 Forested Wetlands
   62 Nonforested Wetlands
7. Barren Land
   71 Dry Salt Flats
   72 Beaches
   73 Sandy Areas Other than Beaches
   74 Bare Exposed Rock
   75 Strip Mines, Quarries, and Gravel Pits
   76 Transitional Areas
   77 Mixed Barren Land
8. Tundra
   81 Shrub and Brush Tundra
   82 Herbaceous Tundra
   83 Bare Ground
   84 Wet Tundra
   85 Mixed Tundra
9. Perennial Snow and Ice
   91 Perennial Snowfields
   92 Glaciers
Example of Sub-categorization of Residential Land (Level III)
Level I: 1. Urban or Built-up
Level II: 1.1. Residential
Level III:
1.1.1. Single-family Units
1.1.2. Multi-family Units
1.1.3. Group Quarters
1.1.4. Residential Hotels
1.1.5. Mobile Home Parks
1.1.6. Transient Lodgings
33. Write about Remote sensing application for environmental monitoring. (Spring 2018)
The utilization of remotely sensed data for environmental monitoring has various advantages over
traditional approaches. Remote sensing provides continuous monitoring and mapping, both spatial
and temporal, as opposed to limited-frequency point measurements. Therefore, the process of
environmental decision-making, where environmental changes and impacts are monitored on a
regular basis, can be greatly enhanced using RS data and techniques. If RS data are combined with
information from other sources and ground observations, the environmental monitoring techniques
may be improved further. Remotely sensed information can be used in many environmental
applications (see Table 1).
Table 1: Application of Remote Sensing in Environmental Studies
– Atmospheric: Aerosol; Fog; Black Carbon; Dust Storm; Ozone and other trace gases
– Water Quality; Soil Moisture; Sea Surface Temperature; Cloud Optical Properties; Snow Cover
– Floods, Tsunami, Earthquakes and Landslides Mapping and Risk Assessment; Droughts; Epidemic Mapping; Forest Fires
– Land use/Land cover changes; Urban Planning; Urban Heat Islands
– Agriculture; Forests (land & coastal); Coastal zone monitoring; Environmental Impact Assessment
34. Write about Pixel & Digital Number (DN),
A photograph could also be represented and displayed in a digital format by subdividing the image
into small equal-sized and shaped areas, called picture elements or pixels, and representing the
brightness of each area with a numeric value or digital number.
The generic term for pixel values is Digital Number or DN. It is commonly used to describe pixel
values that have not yet been calibrated into physically meaningful units.
If you just want to look at an image, and don't intend to interpret the pixel values in terms of some
physically meaningful, quantitative value like radiance or reflectance (or any value derived from
radiance or reflectance values, such as abundance), then it may be just fine to keep your image in
its original DN values.
Pixel: A digital image comprises a two-dimensional array of individual picture elements called
pixels, arranged in columns and rows. Each pixel represents an area on the Earth's surface. A
pixel has an intensity value and a location address in the two-dimensional image.
Digital Number (DN): In remote sensing, the numerical value of a specific pixel is called its DN. It is
commonly used to describe pixel values that have not yet been calibrated into physically meaningful
units.
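Calibration of DNs into physical units is commonly modelled as a linear transform per band; the gain and offset values below are hypothetical, not coefficients for any real sensor:

```python
# Converting a raw DN to at-sensor radiance with a per-band linear
# calibration model: radiance = gain * DN + offset. The coefficients
# here are invented purely for illustration.
def dn_to_radiance(dn: int, gain: float, offset: float) -> float:
    """Convert a raw digital number to at-sensor radiance."""
    return gain * dn + offset

# Hypothetical gain/offset for one band:
print(round(dn_to_radiance(128, gain=0.05, offset=1.2), 2))   # 7.6
```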
35. Write about Platform
Answer:
The vehicle or carrier for a remote sensor to collect and record energy reflected or emitted from a
target or surface is called a platform. The sensor must reside on a stable platform removed from the
target or surface being observed. Platforms for remote sensors may be situated on the ground, on an
aircraft or balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or
satellite outside of the Earth's atmosphere. There are three broad categories of remote sensing
platforms.
i) Ground based
ii) Airborne based
iii) Spaceborne based
Aircraft and satellites are the common platforms for remote sensing of the earth and its natural
resources. Aerial photography in the visible portion of the electromagnetic spectrum was the
original form of remote sensing, but technological developments have enabled the acquisition of
information at other wavelengths, including near infrared, thermal infrared and microwave.
Collection of information over a large number of wavelength bands is referred to as multispectral or
hyperspectral data. The development and deployment of manned and unmanned satellites has
enhanced the collection of remotely sensed data and offers an inexpensive way to obtain information
over large areas. The capacity of remote sensing to identify and monitor land surfaces and
environmental conditions has expanded greatly over the last few years, and remotely sensed data will
be an essential tool in natural resource management.
36. Write about Aerial photography
Aerial photography has been used in agricultural and natural resource management for many years.
These photographs can be black and white, color, or color infrared. Depending on the camera, lens,
and flying height, these images can have a variety of scales. Photographs can be used to determine
spatial arrangement of fields, irrigation ditches, roads, and other features or they can be used to view
individual features within a field.
37. Write about Infrared images
Infrared images can detect stress in crops before it is visible with the naked eye. Healthy canopies
reflect strongly in the infrared spectral range, whereas stressed plants reflect weakly and appear
dull in the imagery. These images can tell a farmer that there is a problem but do not tell him what
is causing it. The stress might be from lack of water, insect damage, improper nutrition, or soil
problems such as compaction, salinity, or inefficient drainage. The farmer must assess the cause of
the stress from other information. If the dull areas disappear on subsequent pictures, the stress was
probably a lack of water that was eased by irrigation.
If the stress continues, it could be a sign of insect infestation. The farmer still has to conduct an
in-field assessment to identify the cause of the problem. The development of cameras that measure
reflectance in a wider range of wavelengths may make it possible to quantify plant stress more
precisely. The use of these multi-spectral cameras is increasing, and they will become an important
tool in precision agriculture.
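One standard way multi-spectral measurements are turned into a stress indicator is a vegetation index such as NDVI (not named above, but the most common choice, exploiting the strong near-infrared reflectance of healthy canopies). A minimal sketch; the reflectance values are illustrative, not measured data:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for a single pixel:
    (NIR - Red) / (NIR + Red), ranging from -1 to +1."""
    return (nir - red) / (nir + red)

# Healthy canopy: strong near-infrared reflectance -> high NDVI.
healthy = ndvi(nir=0.50, red=0.08)

# Stressed canopy: duller near-infrared response -> lower NDVI.
stressed = ndvi(nir=0.25, red=0.15)
```

A drop in NDVI between subsequent images flags the "dull" areas described above, but, as the text notes, the cause of the stress still has to be established in the field.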
38. Write about Application of Remote Sensing
General Remote Sensing Applications:
Each application itself has specific demands for spectral resolution, spatial resolution, and temporal
resolution of the satellite sensor. There can be many applications for remote sensing in different
fields. Some of them are described below.
 Agriculture:
Agriculture plays a dominant role in the economies of both developed and undeveloped countries.
Satellite and airborne images are used as mapping tools to classify crops, examine their health,
examine their viability, and monitor farming practices. Agricultural applications of remote
sensing include crop type classification, crop condition assessment, crop yield estimation,
mapping of soil characteristics, mapping of soil management practices, and compliance
monitoring (farming practices).
 Forestry:
Forests are a valuable resource for providing food, shelter, wildlife habitat, fuel, and daily
supplies (such as medicinal ingredients and paper). Forests play an important role in balancing
the earth’s CO2 supply and exchange, acting as a key link between the atmosphere, geosphere,
and hydrosphere. Forestry applications of remote sensing include the following:
Reconnaissance mapping: Objectives to be met by national environment agencies include forest
cover updating, depletion monitoring, and measuring biophysical properties of forest stands.
Commercial forestry: Of importance to commercial forestry companies and to resource
management agencies are inventory and mapping applications. These include collecting harvest
information, updating inventory information for timber supply, broad forest type, vegetation
density, and biomass measurements.
Environmental monitoring: Conservation authorities are concerned with monitoring the quantity,
health, and diversity of the earth’s forests.
 Geology:
Geology involves the study of landforms, structures, and the subsurface to understand physical
processes that create and modify the earth’s crust. It is most commonly understood as the
exploration and exploitation of mineral/hydrocarbon resources to improve the standard of living
in society.
Geological applications of remote sensing include the following: bedrock mapping, lithological
mapping, structural mapping, sand and gravel (aggregate) exploration/exploitation, mineral
exploration, hydrocarbon exploration, environmental geology, geobotany, baseline infrastructure,
sedimentation mapping and monitoring, event mapping and monitoring, geo-hazard mapping, and
planetary mapping.
 Hydrology:
Hydrology is the study of water on the earth’s surface, whether flowing above ground, frozen in
ice or snow, or retained by soil. Examples of hydrological applications include wetlands
monitoring, soil moisture estimation, snow pack monitoring, measuring snow thickness,
determining the snow-water equivalent, ice monitoring, flood monitoring, glacier dynamics
monitoring (surges, ablation), river/delta change detection, drainage basin mapping, watershed
modelling, irrigation canal leakage detection, and irrigation scheduling.
 Sea Ice:
Ice covers a substantial part of the earth’s surface and is a major factor in commercial
fishing/shipping industries, Coast Guard operations, and global climate change studies. Examples
of sea ice information and applications include ice concentration, ice type/age/motion, iceberg
detection, surface topography, tactical identification of leads, navigation and safe shipping routes,
ice condition, historical ice and iceberg conditions and dynamics for planning purposes, wildlife habitat,
pollution monitoring, and meteorological change research.
 Land Cover and Land Use:
Although the terms ‘land cover’ and ‘land use’ are often used interchangeably, their actual
meanings are quite distinct. Land cover refers to the surface cover on the ground, while land use
refers to the purpose the land serves. The properties measured with remote sensing techniques
relate to land cover from which land use can be inferred, particularly with ancillary data or a
priori knowledge.
Land use applications of remote sensing include natural resource management, wildlife habitat
protection, baseline mapping for GIS input, urban expansion, logistics planning for
seismic/exploration/resource extraction activities, damage delineation (tornadoes, flooding,
volcanic, seismic, fire), legal boundaries for tax/property evaluation, target detection, and
identification of landing strips, roads, clearings, bridges, and land/water interface.
 Mapping:
Mapping constitutes an integral component of the process of managing land resources, with
mapped information the common product of the analysis of remotely sensed data.
Mapping applications of remote sensing include the following:
- Planimetry: Land surveying techniques accompanied by the use of a GPS can be used to meet
high accuracy requirements, but limitations include cost effectiveness and difficulties in
attempting to map large or remote areas. Remote sensing provides a means of identifying
planimetric data in an efficient manner, and imagery is available in varying scales to meet the
requirements of many different users. Defense applications typify the scope of planimetry
applications, such as extracting transportation route information, building/facilities locations,
urban infrastructure, and general land cover.
- Digital elevation models (DEMs): Generating DEMs from remotely sensed data can be cost
effective and efficient. A variety of sensors and methodologies to generate such models are
available for mapping applications. Two primary methods of generating elevation data are
(1) stereogrammetry techniques using airphotos (photogrammetry), VIR imagery, or radar data
(radargrammetry), and (2) radar interferometry.
- Baseline topographic mapping: As a base map, imagery provides ancillary information to the
extracted planimetric detail. Sensitivity to surface expression makes radar a useful tool for
creating base maps and providing reconnaissance abilities for hydrocarbon/mineralogical
companies involved in exploration activities. This is particularly true in remote northern
regions where vegetation cover does not mask the microtopography and where information
may be sparse.
 Oceans & Coastal Monitoring:
The oceans provide valuable food and biophysical resources, serve as transportation routes, are
crucially important in weather system formation and CO2 storage, and are an important link in
the earth’s hydrological balance. Coastlines are environmentally sensitive interfaces between the
ocean and land, and they respond to changes brought about by economic development and
changing land-use patterns. Often coastlines are also biologically diverse inter-tidal zones and
can be highly urbanized. Ocean applications of remote sensing include the following:
- Ocean pattern identification: currents, regional circulation patterns, shears, frontal zones,
internal waves, gravity waves, eddies, upwelling zones, and shallow water bathymetry.
- Storm forecasting: Wind and wave retrieval.
- Fish stock and marine mammal assessment: Water temperature monitoring, water quality,
ocean productivity, phytoplankton concentration and drift, and aquaculture inventory and
monitoring.
- Oil spill: Predicting the oil spill extent and drift, strategic support for oil spill emergency
response decisions, and identification of natural oil seepage areas for exploration.
- Shipping: Navigation routing, traffic density studies, operational fisheries surveillance, and
near-shore bathymetry mapping.
 General Observations on Remote Sensing in Geography
Higgitt & Warburton (1999) have argued that remote sensing techniques provide fresh insights
in geography in four main ways:
- They provide new applications for geography.
- They provide new and improved accuracy of measurement.
- They provide new data that allow the investigation of ideas that were previously untestable.
- They involve the development of data processing capability.
 Application of Remote Sensing in Geography
Geographic applications of remotely sensed data typically take one of four explanatory forms:
Remote sensing images have specific uses within various fields of geographical study.
Remote sensing data possess advantages over conventional data and can provide multispectral,
multitemporal, and multisensor information. These data are very useful in agriculture for crop
type classification, crop condition assessment, crop yield estimation, and soil mapping.
In geology, remote sensing can be applied to analyze large, remote areas. Remote sensing
interpretation also makes it easy for geologists to identify an area’s rock types, geomorphology,
and changes from natural events such as a flood, erosion, or landslide.
The interpretation of remote sensing images allows physical- and biogeographers, ecologists,
agricultural researchers, and foresters to easily detect what vegetation is present in certain areas,
its growth potential, and sometimes what conditions are conducive to its being there.
Additionally, those studying urban land use applications are also concerned with remote sensing
because it allows them to easily pick out which land uses are present in an area. This can then be
used as data in city planning applications and in the study of species habitat.
39. Write about Synergistic use of passive and active remotely sensed imagery: selected
examples from Bangladesh
Sir, Print Copy.
40. Write Short Notes on Photographic Map and Digital Map
A photographic map is: (1) a photographic copy of an assembly of individual aerial photographs
arranged along the flight line in their proper relative positions; or (2) an overlay containing the
delineated boundary of each photograph, keyed to a base map, and depicting the location and area
of coverage of each photograph and/or flight strip of photographs.
A digital map is an electronic map whose operation is based on a combination of graphic elements
assigned to it in the form of electronic information. It is built from field-collected source data
processed into digital cartographic form.