
INTRODUCTION TO REMOTE SENSING
Prepared by:
Samera Samsuddin Sah
Biosystems Engineering Programme
School of Bioprocess Engineering
Universiti Malaysia Perlis (UniMAP)
DEFINITION
Remote sensing (RS) is the science and technique of deriving information
about the Earth's land and water areas from images acquired at a distance.
It relies upon measurement of electromagnetic (EM) energy reflected or
emitted from the object of interest at the surface of the Earth.
In other words, one is looking at the physical nature of spatially
distributed features.
Remote Sensing Process Components
Energy Source or Illumination (A)
Radiation and the Atmosphere (B)
Interaction with the Target (C)
Recording of Energy by the Sensor (D)
Transmission, Reception, and Processing (E)
Interpretation and Analysis (F)
Application (G)
Source: Canadian Centre for Remote Sensing
FIELD OF APPLICATION
Meteorology
• weather forecast, climate studies, global change
Hydrology
• water balance, energy balance, agro-hydrology
Soil science
• land evaluation, soil mapping
Biology/Nature Conservation
• vegetation mapping/monitoring, vegetation condition assessment
Forestry
• forest inventory/mapping, deforestation/reforestation, forest fire detection
Environmental Studies
• pollution sources and effects, groundwater quality, climate change
Agricultural Studies
• land use development, erosion assessment, water management
Physical Planning
• physical planning, scenario studies
Land Surveying
• topography (DTM), spatial data models, GIS
SATELLITE
A satellite is a man-made object launched into space to orbit the Earth,
the Moon, the Sun or another celestial body.
The Moon is a natural satellite.
Satellite communication has two parts:
 Ground segment
 Space segment
Satellite platforms are launched for remote sensing, communication, and
telemetry (location and navigation) purposes.
Satellite: Power sources
 Solar cells
 Nickel-cadmium batteries
Technology has not yet progressed sufficiently for nuclear power sources
to be used.
Satellite: Orbits & Swaths
The path followed by a satellite is referred to as its orbit.
Satellite orbits are matched to the capability and objective of the sensor(s)
they carry.
Orbit selection can vary in terms of altitude (their height above the Earth's
surface) and their orientation and rotation relative to the Earth.
Satellites at very high altitudes (approximately 36,000 kilometers) revolve
at speeds that match the rotation of the Earth, so they remain over the same
point on the surface; these are geostationary orbits.
Weather and communications satellites commonly have these types of orbits.
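The 36,000 km figure can be checked with Kepler's third law: a satellite whose orbital period equals one sidereal day must sit at a particular radius. A minimal Python sketch of that calculation (constants are standard values, not taken from these slides):

```python
import math

# Geostationary altitude: find the orbital radius whose period equals one
# sidereal day, then subtract the Earth's radius.
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_378_137.0       # Earth's equatorial radius, m
T_SIDEREAL = 86_164.1       # one sidereal day, s

# Kepler's third law: T^2 = 4*pi^2 * r^3 / mu  ->  r = (mu * T^2 / (4*pi^2))^(1/3)
r = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"altitude = {(r - R_EARTH) / 1000:,.0f} km")  # about 35,786 km
```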
Many remote sensing platforms are designed to follow a near-polar orbit
(basically north-south), which, together with the Earth's west-east rotation,
allows them to cover most of the Earth's surface over a certain period of
time.
Many of these satellite orbits are also sun-synchronous such that they
cover each area of the world at a constant local time of day called local
sun time.
At any given latitude, the position of the sun in the sky as the satellite
passes overhead will be the same within the same season.
This ensures consistent illumination conditions when acquiring images
in a specific season over successive years, or over a particular area over a
series of days.
SENSOR
The term “sensor” is preferred because it covers a broader range of ways of
gathering information than a camera, which records only what can be seen by
the eye.
A sensor is a device used to acquire data, i.e. to measure the radiation
arriving at the satellite instruments.
Types of Sensor
 There are two types of sensor:
 Passive sensors and
 Active sensors
Passive Sensor:
 Passive sensors detect electromagnetic radiation reflected or emitted
naturally from an object.
 They record incoming radiation that has been scattered, absorbed and
transmitted by the Earth and the atmosphere in transit from its original
source, the Sun.
Electromagnetic Radiation
• Day – reflected and emitted radiation
• Night – emitted radiation only
• Contrast with other geophysical techniques
Electromagnetic Spectrum
 Some sensors are designed to receive all ‘green’ wavelengths, while others
are targeted more toward infrared wavelengths.
 An infrared viewer, for example, is specially made to ‘see’ objects
emitting infrared radiation (even in the dark).
 In general terms, sensors that rely on external energy sources to “observe”
an object are called passive sensors.
 The Sun is the main source.
Types of Passive Sensor:
 There are five types of passive sensor:
1) Gamma-ray spectrometer
 A passive sensor that detects gamma rays.
 The sources of the radiation are generally the upper soil layers as well
as rock layers.
 The radiation is caused by radioactive decay.
 Used to explore mineral deposits.
Passive Sensor:
2) Aerial cameras
 Used in aerial photography.
 Aircraft serve as the platform, and many low-Earth-orbiting satellites
also deploy aerial cameras.
 Used for topographic mapping.
Passive Sensor:
3) Thermal infrared video cameras
 Equipped to detect radiation in the thermal infrared range.
 Sometimes combined with active sensors, such as radar, to provide
additional information.
 Aircraft as well as satellites can serve as platforms.
Passive Sensor:
4) Multispectral scanner
 Records information in the visible and infrared parts of the spectrum.
 Scans the Earth's surface in various wavelength bands.
 Satellites act as platforms for such passive sensors.
 Used for geological purposes.
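As an illustration of how visible and infrared bands from a multispectral scanner are combined in practice (not covered on this slide), the widely used Normalized Difference Vegetation Index (NDVI) contrasts a red band with a near-infrared band; a minimal NumPy sketch with invented reflectance values:

```python
import numpy as np

# Toy red and near-infrared reflectance bands (invented values, 2x2 pixels).
red = np.array([[0.10, 0.25],
                [0.08, 0.30]])
nir = np.array([[0.60, 0.30],
                [0.55, 0.32]])

# NDVI = (NIR - red) / (NIR + red); values near +1 suggest dense vegetation.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))
```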
Passive Sensor:
5) Imaging Spectrometer
 Similar to the multispectral scanner.
 Scans very narrow wavelength bands of the
spectrum.
 Satellites are used as platforms.
 Used for determining the mineral composition of the Earth's surface and
the concentration of suspended matter in surface water.
Passive sensor
•Disadvantage – if the sky is covered with clouds, passive sensors cannot be
used to observe the Earth's surface (or the oceans).
Active sensor:
 A sensor that is able to direct energy at an object in the form of
electromagnetic radiation (EMR).
 The object is scanned and the sensor detects any radiation reflected back
from it.
 Types of active remote sensing:
 Active Optical Remote Sensing
 Active Thermal Remote Sensing
 Active Microwave Remote Sensing
Active Optical Remote Sensing
 Active optical remote sensing involves directing a laser beam at a remote
target to illuminate it and analyzing the reflected or backscattered
radiation in order to determine certain properties of the target.
 The velocity, location, temperature and material composition of a distant
target can be determined using this method.
 Example:
 LIDAR (Light Detection and Ranging)
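To make the "ranging" part concrete, the distance to a target follows from the two-way travel time of the laser pulse. A minimal sketch, assuming a single timed return (the numbers are illustrative, not from the slides):

```python
# LIDAR ranging: the pulse travels to the target and back, so the one-way
# range is (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_time_s: float) -> float:
    """Target range in meters for a measured round-trip time."""
    return C * round_trip_time_s / 2.0

# A return detected 6.67 microseconds after the pulse corresponds to ~1 km.
print(f"{lidar_range(6.67e-6):.1f} m")
```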
Active Thermal Remote Sensing
 Thermal remote sensing deals with
information acquired primarily in the thermal
infrared range.
 The majority of thermal remote sensing is done using passive sensors.
Active Microwave Remote Sensing
 Active microwave remote sensing uses
sensors that operate in the microwave region
of the electromagnetic spectrum.
 Example:
 RADAR (Radio Detection and Ranging)
Passive sensing relies on reflected sunlight and emission from hot
objects (top). Active sensing illuminates the object with its own light
source; a laser in this example (bottom).
PLATFORM
 A satellite platform is the service module section of a satellite,
or, more generally, the vehicle or carrier for remote sensors.
 There are 3 types of platform:
 Ground Based Platforms
 Airborne Platforms
 Space-borne Platforms
1) Ground Based Platform:
 A remote sensing platform that positions the sensor at the Earth's surface.
 Used for close-range, high-accuracy applications such as architectural
restoration, crime and accident scene analysis, and landslide and erosion
mapping.
 It is either static (tripod or mast) or dynamic (moving vehicle).
 These systems are fixed to the Earth.
 Ground-based sensors are often used to record detailed information about
the surface or to measure environmental conditions such as air temperature,
wind characteristics, water salinity, earthquake intensity and so on.
 Example:
-DOE ARM (Atmospheric Radiation Measurement program)
-NASA AERONET (Aerosol Robotic
NETwork).
2) Airborne platforms:
 These are primarily fixed-wing aircraft, although helicopters are
occasionally used.
 Used to collect very detailed images and to facilitate the collection of
data.
 Operate at altitudes of up to 50 km above the Earth.
 Examples:
 NCAR, NOAA and NASA research aircraft.
3) Space-borne platforms:
 Platforms located from about 100 km to 36,000 km above the Earth.
 Examples:
-rockets, satellites, shuttles
 Types of spaceborne platforms:
-Space shuttle: 250-300 km
-Space station: 300-400 km
-Low-level satellites: 700-1500 km
-High-level satellites: about 36,000 km
AIRBORNE IMAGERY
Aerial Camera Systems
The recent introduction of digital cameras has
revolutionized photography.
Digital and film-based cameras both use optical lenses.
Film-based cameras: use photographic film to record an image.
Digital cameras: record image data with electronic sensors.
Advantage of digital cameras: image data can be stored, transmitted and
analyzed directly.
Types of camera systems: small-format and large-format.
 Small-format system
 Consists of one or more cameras.
 Uses a smaller photographic format (35 mm negative size).
 Does not have a lens of high enough quality to meet normal measurement
accuracies.
 Very useful and inexpensive for updating land-use changes.
 Large-format system
 Consists of only a single camera.
 Uses a fixed focal length and a large-format negative (230 mm by 230 mm).
 Strictly used for aerial photography.
 Equipped with a highly corrected lens and vacuum pressure to minimize
distortion.
Flight Lines and Photograph Overlap
 Factors to consider:
 Using suitable aircraft and technical staff.
 The study area must be outlined carefully – using GPS
to maintain flight line alignment.
 Photographs must be taken under cloudless skies.
Figure 1: Flight line and photograph overlap
Figure 2: Photograph overlap along flight line
 The number of air photos required to cover the study area is very
important.
 For a photographic scale of 1:10,000:
 1 photograph covers 1 km²
 if the area to be covered is 500 km², the number of air photos is
500/1 = 500.
 But if the scale is increased to 1:5,000, the number of air photos becomes
500 x (10,000/5,000)² = 2,000.
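The same arithmetic can be generalised: the ground area covered by one photo shrinks with the square of the scale change. A small Python sketch (the 1 km² per photo at 1:10,000 figure is from the slide; the function name is illustrative):

```python
import math

# Photos needed to cover an area, given that one photo covers 1 km^2 at a
# reference scale of 1:10,000. Coverage per photo scales with the square of
# (scale denominator / reference denominator).
def photos_required(area_km2, scale_denominator,
                    ref_denominator=10_000, ref_coverage_km2=1.0):
    coverage = ref_coverage_km2 * (scale_denominator / ref_denominator) ** 2
    return math.ceil(area_km2 / coverage)

print(photos_required(500, 10_000))  # 500 photos at 1:10,000
print(photos_required(500, 5_000))   # 2000 photos at 1:5,000
```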
Ground Control for Mapping
 An aerial photograph is not a perfect map of the ground: the camera is
rarely perfectly vertical at the instant of exposure, and the terrain is
rarely a flat surface.
 Ground control points are the best way to overcome this problem.
 Control points can be established either:
 on existing photography already used for mapping, or
 prior to the acquisition of the air photos.
 Ground control is required to position each data point.
 The accuracy required depends on the intended application:
 Measurements of distances and elevations
 Preparation of topographic maps
 Construction of controlled mosaics
 Construction of ortho-photos and rectified
photographs
 Selection of ground control points for existing photographs is based on
the following criteria:
 Points must be well separated within the overlap area, which makes the
model more stable and the result more accurate.
 Points must be easily identifiable on both air photos; otherwise they are
useless.
 Points should be selected on the assumption that nothing has changed since
the existing photographs were taken.
 The surveyor should consider ease of access to all points to minimize
open-ended traverse lines.
 For new photography, ground control is typically needed for:
 areas containing few identifiable ground control points
 legal surveys of densely developed areas
 municipal surveys of roads and services
Mosaics
 An assembly of two or more air photos to
form one continuous picture of the terrain.
 Extremely useful for the following applications:
 Plotting of ground control points at the optimum
locations to ensure the required distribution and
strength of figure.
 A map substitute for field checkpoint locations and
approximate locations of natural and cultural
features.
 A medium for presenting ground data.
 Advantages:
 Can be produced more rapidly.
 Less expensive.
 Show more terrain detail.
 Allow interpretation of subtle terrain characteristics (tone, texture, and
vegetation).
 Disadvantages:
 Horizontal scale measurements are limited due to relief displacement (see
the sketch after this list).
 Not topographic maps (do not show elevations)
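The relief displacement noted above can be quantified with the standard photogrammetric approximation d = r * h / H, where r is the radial distance of the object from the nadir point on the photo, h is the object height, and H is the flying height above ground. A hedged Python sketch with invented numbers:

```python
# Relief displacement on a vertical air photo: tall features lean away from
# the photo centre (nadir) by d = r * h / H.
def relief_displacement(r_mm: float, object_height_m: float,
                        flying_height_m: float) -> float:
    """Displacement on the photo, in the same units as r_mm."""
    return r_mm * object_height_m / flying_height_m

# A 30 m tall building imaged 80 mm from the nadir point, flown at 1,500 m:
print(f"{relief_displacement(80.0, 30.0, 1500.0):.1f} mm")  # 1.6 mm
```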
Aerial Surveying and Photogrammetric Mapping
 Advantages of using AS and PM over traditional ground surveying methods:
 Low cost.
 Reduced field work.
 Faster in compilation (time saving).
 Easy to record inaccessible terrain conditions.
 Provide an accurate record of the terrain features.
 Flexibility in terms of scale.
 More relevant (new technology)
 Disadvantages:
 Cannot capture the true ground surface in densely vegetated areas.
 Cannot show contour lines.
 A site visit is still needed to confirm details such as road type,
surfacing, etc.
Aerial Photography Interpretation
 Image interpretation is achieved by a combination of direct human analysis
and automated soft-copy processes.
 Image interpretation techniques are based on 3
fundamental assumptions:
 The remotely sensed images are records of the results of long- and
short-term natural and human processes.
 The surface features can be grouped together to form
patterns that are characteristic of particular
environmental conditions.
 The environmental conditions and reflected image
patterns are repeated within major climatic zones.
Applications of Air Photo Interpretation for the Engineer and the Surveyor
 Can identify landforms and site conditions (soil type, soil depth, average
topographic slopes, etc.).
 Can examine topographic slopes, areas of unstable ground, and the density
and type of vegetation cover.
 Air photos provide an excellent overview of the site and surrounding area.
 Soil test holes should be used to verify the results
of the air photo interpretation.
Elements of Image Interpretation
1. Shape:
– Many natural and human-made features have
unique shapes.
– Often used are adjectives like linear, curvilinear,
circular, elliptical, radial, square, rectangular,
triangular, hexagonal, star, elongated, and
amorphous.
[Figure: Shape examples (Jensen, 2000)]
Elements of Image Interpretation
2. Shadow:
– Shadow reduction is of concern in remote sensing
because shadows tend to obscure objects that might
otherwise be detected.
– However, the shadow cast by an object may be the
only real clue to its identity.
– Shadows can also provide information on the height of
an object either qualitatively or quantitatively.
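For the quantitative case, object height can be estimated from the shadow length and the sun's elevation angle at the time of imaging. A minimal sketch, assuming flat ground (both input values are invented):

```python
import math

# Height from shadow: height = shadow length * tan(sun elevation angle).
def height_from_shadow(shadow_length_m: float, sun_elevation_deg: float) -> float:
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# A 20 m shadow with the sun 35 degrees above the horizon -> ~14 m object.
print(f"{height_from_shadow(20.0, 35.0):.1f} m")
```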
[Figure: Shadow examples (Jensen, 2000)]
Elements of Image Interpretation
3. Tone and Color:
– A band of EMR recorded by a remote sensing
instrument can be displayed on an image in shades of
gray ranging from black to white.
– These shades are called “tones”, and can be
qualitatively referred to as dark, light, or intermediate
(humans can see 40-50 tones).
– Tone is related to the amount of light reflected from
the scene in a specific wavelength interval (band).
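To illustrate the link between recorded brightness and displayed tone, a minimal NumPy sketch (not from the slides; the pixel values are invented) that linearly stretches a single band to 8-bit grey tones:

```python
import numpy as np

# A toy single-band image: digital numbers proportional to reflected light.
band = np.array([[120, 340, 560],
                 [ 90, 400, 610],
                 [150, 380, 720]], dtype=float)

# Linear stretch: darkest pixel -> 0 (black), brightest -> 255 (white).
tones = (255 * (band - band.min()) / (band.max() - band.min())).astype(np.uint8)
print(tones)  # darker tones correspond to less reflected energy in this band
```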
[Figure: Tone and color examples (Jensen, 2000)]
Elements of Image Interpretation
4. Texture/pattern:
– Texture refers to the arrangement of tone or color in an
image.
– Useful because Earth features that exhibit similar tones
often exhibit different textures.
– Adjectives include smooth (uniform, homogeneous),
intermediate, and rough (coarse, heterogeneous).
[Figure: Texture/pattern examples (Jensen, 2000)]
Elements of Image Interpretation
5. Height and Depth:
– As discussed, shadows can often offer clues to the
height of objects.
– In turn, relative heights can be used to interpret
objects.
– In a similar fashion, relative depths can often be
interpreted.
– Descriptions include tall, intermediate, and short;
deep, intermediate, and shallow.
[Figure: Height and depth examples]
Elements of Image Interpretation
6. Association:
– This is very important when trying to interpret
an object or activity.
– Association refers to the fact that certain features and activities are
almost always related to the presence of certain other features and
activities.
[Figure: Association examples (Jensen, 2000)]
Imaging Tools and Data
• Google Earth
• ERDAS Imagine
• Digital Northern
Great Plains
Assignment: Applications of Remote Sensing
 Form groups of 5 persons each.
 Each group needs to submit a report (max. 25 pages) about the applications
of remote sensing according to these categories:
 Atmosphere
 Geosphere
 Biosphere
 Hydrosphere
 Cryosphere
 Format:
 Font Times New Roman
 Size 12
 Spacing 1.5
 Content/Outline
 Introduction
 Applications of remote sensing based on category
(Atmosphere, Geosphere, Biosphere,
Hydrosphere, Cryosphere)
 Conclusion
 Submit on 2nd Dec 2013, before 5 pm.