Lesson 4 - Satellite Based Remote Sensing

SATELLITE BASED REMOTE SENSING

Satellite-based remote sensing systems, also referred to as space-borne remote sensing
systems, for earth resource monitoring and inventory are designed and operated for long-term,
consistent image acquisition over the entire globe. Unlike airborne remote sensing systems, the
choice of wavelength bands, flight path and other parameters that influence the type and
characteristics of the data acquired is not made on a mission-by-mission basis but remains
constant over the lifespan of the mission. Replacement satellites are launched periodically to
replace aging systems, upgrade sensors or add new sensors.

Sensors may be categorized or classified based on:

 Type of orbit
 Spatial Resolution
 Spectral Resolution

Orbit

It is a circular or elliptical path described by a satellite in its movement around the earth. The
time taken to complete one revolution of the orbit is called the orbital period. The satellite
traces out a path on the earth’s surface, called its ground track, as it moves across the sky. As
the earth below is rotating, the satellite traces out a different path on the ground in each
subsequent cycle. Remote sensing satellites are often launched into special orbits such that the
satellite repeats its path after a fixed time interval. This time interval is called the repeat cycle of
the satellite and defines the temporal resolution of the satellite. Orbital characteristics that are
relevant to earth observation applications include:

 Orbital altitude: It is the distance in meters from the satellite to the earth’s surface. It
influences the spatial resolution and spatial coverage. In general, the higher the orbital
altitude, the higher the spatial coverage but the lower the spatial resolution.
 Orbital inclination: It is the angle in degrees between the orbital plane and the equator. It
determines, together with the field of view (FOV) of the sensor, the latitudes which the
sensor can observe.
 Orbital period: the time required to complete one full orbit.
 Repeat cycle: it is the time in days between two successive identical orbits. The revisit
time, which is the time between two successive images of the same area, is determined by
the repeat cycle and the pointing capability of the sensor. A simple calculation relating
orbital altitude to orbital period is sketched below.
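
For a circular orbit, the orbital period follows from the altitude via Kepler's third law. The sketch below is a minimal illustration; the gravitational parameter, earth radius and example altitudes are standard values assumed here, not taken from the lesson.

```python
# Minimal sketch: orbital period of a circular orbit from its altitude,
# using Kepler's third law, T = 2*pi*sqrt(a^3 / mu).
import math

MU_EARTH = 398_600.4418   # earth's gravitational parameter, km^3/s^2 (assumed constant)
R_EARTH = 6_371.0         # mean earth radius, km (assumed constant)

def orbital_period_minutes(altitude_km: float) -> float:
    """Return the period in minutes of a circular orbit at the given altitude."""
    a = R_EARTH + altitude_km                              # semi-major axis, km
    period_s = 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH)  # period, seconds
    return period_s / 60.0

print(round(orbital_period_minutes(705), 1))  # ~98.7 min, close to the 99 min listed for LANDSAT below
print(round(orbital_period_minutes(832), 1))  # ~101.4 min, close to the 101 min listed for SPOT below
```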

There are three types of orbit:

i. Geostationary Orbit

Satellites that follow an orbital path in the plane of the equator, in the same direction as the
rotation of the earth and with the same period, are said to have a geostationary or
geosynchronous orbit. They are located at a high altitude, about 36,000 km above the earth, and
must remain over the equator: in an orbit inclined to the equator the satellite would appear to
drift north and south during each revolution rather than stay fixed over one point.
Geostationary satellites enable continuous viewing of the same area of the earth and can view
large areas. They are commonly used for meteorological and communication satellites.
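
The "about 36,000 km" figure follows directly from requiring the orbital period to equal one rotation of the earth. A minimal sketch, reusing the assumed constants from the earlier example:

```python
# Minimal sketch: solve Kepler's third law for the semi-major axis whose period
# equals one sidereal day, then subtract the earth's radius to get the altitude.
import math

MU_EARTH = 398_600.4418    # km^3/s^2 (assumed constant)
R_EARTH = 6_371.0          # km (assumed mean radius)
SIDEREAL_DAY_S = 86_164.1  # one rotation of the earth, seconds

a = (MU_EARTH * SIDEREAL_DAY_S ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
print(round(a - R_EARTH))  # ~35,800 km above the surface, i.e. "about 36,000 km"
```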

ii. Near-polar orbit

A satellite that has a near-polar orbit is one whose orbital plane has an inclination of near 90
degrees or is inclined at a small angle with respect to the earth’s rotation axis. It therefore passes
near the poles and is able to cover virtually every part of the earth as the earth rotates underneath
it in one repeat cycle. It takes approximately 90 minutes for the satellite to complete one orbit.
iii. Sun synchronous orbit

This is an orbit in which the satellite passes over a given location on the earth's surface at the
same local solar time, providing consistent solar illumination conditions from one pass to the
next. Because the earth is not perfectly round, the bulge around the equator exerts an additional
gravitational pull on the satellite, causing the satellite's orbital plane to precess. For the orbit to
remain sun-synchronous, this precession must amount to approximately one degree per day,
since the orbital plane has to turn through 360 degrees over the roughly 365 days of a year.
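
The required precession rate can be checked with a one-line calculation (the 365.25-day year length is an assumed standard value, not stated in the lesson):

```python
# Minimal sketch: the nodal precession rate a sun-synchronous orbit must maintain
# so that its orbital plane keeps pace with the earth's motion around the sun.
DAYS_PER_YEAR = 365.25
print(360.0 / DAYS_PER_YEAR)  # ~0.986 degrees per day, i.e. "approximately one degree per day"
```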
The orbital characteristics and sensors of selected remote sensing satellites are summarized below.

Name        | Type            | Altitude (km) | Inclination (°) | Period (min) | Repeat cycle (days) | Sensors
SPOT        | Sun-synchronous | 832           | 98.7            | 101          | 26                  | HRV
LANDSAT     | Sun-synchronous | 705           | 98.2            | 99           | 16                  | MSS, TM, ETM+
MOS         | Sun-synchronous | 908           |                 | 103          | 17                  | MESSR, VTIR, MSR
EO-1        | Sun-synchronous | 705           | 98.2            | 99           | 16                  | Hyperion
IRS         | Sun-synchronous | 817           | 98.69           | 101          | 24                  | LISS, PAN, WIFS
RESURS      | Sun-synchronous | 678           | 98.04           | 98           | 21                  | MSU-SK
NOAA        | Sun-synchronous | 870-833       | 98.7-98.99      | 101.4        |                     | AVHRR/2, TOVS, SSU, MSU
EROS-A1     | Sun-synchronous | 475-491       | 97.3            | 94           |                     | CCD
IKONOS      | Sun-synchronous | 681           | 98.1            | 98           |                     | PAN
ADEOS2-GLI  | Sun-synchronous | 803           | 98.6            | 101          | 4                   | GLI
SPOT-5      | Sun-synchronous | 832           | 98.7            | 101          | 26                  | HRV
Quickbird-2 | Sun-synchronous | 450           | 98              | 93.4         |                     | PAN
ADEOS       | Sun-synchronous | 800           | 98.6            | 101          | 41                  | OCTS
RADARSAT    | Sun-synchronous | 798           | 98.6            | 100.7        | 24                  | SAR

Spatial Resolution

Sensors may be classified based on their spatial resolution into four categories. They are:

i. Low Resolution Sensors

These include sensors aboard geostationary satellites, which image a portion of the earth
repeatedly, and those in sun-synchronous orbits, which progressively build near-complete
coverage of the earth over a period of one or two days. The resolution of low spatial
resolution sensors ranges from approximately 30 meters to 1 kilometer or more. Examples of low
spatial resolution satellites and sensors are discussed below.

a. GOES

The Geostationary Operational Environmental Satellite (GOES) programme is operated jointly
by NASA and NOAA: NASA is responsible for launch and NOAA for operation. The
satellites orbit the earth at an altitude of about 36,000 kilometers in the same direction as the
rotation of the earth, thus maintaining a stationary position relative to the earth. The GOES
series constantly views an entire hemispherical disk. The full earth disk and selected smaller
areas are imaged at spatial resolutions of 1, 4 or 8 km Ground Sampling Distance (GSD),
depending on the band. Repeat coverage is every 30 minutes, with smaller-area or lower-
resolution repeat coverage available in as little as four minutes. The GOES satellites provide
continuous monitoring of temperature, humidity, cloud cover and other data for weather
forecasting, and form part of a global network of meteorological satellites spaced at intervals
around the world.

b. TERRA

TERRA is the flagship satellite of the Earth Observing System (EOS), a series of earth
observation satellites operated by NASA. It has a sun-synchronous orbit and carries five
instruments: ASTER, CERES, MISR, MODIS and MOPITT. MODIS, the Moderate
Resolution Imaging Spectroradiometer, is a key instrument aboard TERRA. It is an optical
imaging system with multispectral capabilities. Some of its specifications are listed below.

Data Type: Optical
Sensor Type: Multispectral
Spatial Resolution [m]: 250.0, 500.0, 1000.0
No. of Spectral / Frequency Bands: 36
Revisit Time [day]: 1
Programmable: YES
Stereo: NO
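
The three spatial resolutions correspond to different groups of MODIS bands. The sketch below records the commonly cited grouping; the band numbers are an assumption drawn from general MODIS documentation rather than from this lesson.

```python
# Assumed grouping of MODIS's 36 bands by ground resolution (commonly cited
# figures, not stated in this lesson): bands 1-2 at 250 m, 3-7 at 500 m, 8-36 at 1 km.
MODIS_RESOLUTION_M = {
    **{band: 250 for band in range(1, 3)},
    **{band: 500 for band in range(3, 8)},
    **{band: 1000 for band in range(8, 37)},
}
print(sorted(set(MODIS_RESOLUTION_M.values())))  # [250, 500, 1000]
print(len(MODIS_RESOLUTION_M))                   # 36 bands in total
```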

c. AVHRR

The Advanced Very High Resolution Radiometer (AVHRR) is a six-channel scanning radiometer
with 1.1 km resolution that is sensitive in the visible, near-infrared and infrared 'window'
regions. AVHRR data are broadcast for reception by ground stations and are also tape-recorded
onboard the spacecraft for readout at the Fairbanks and Wallops Command Data Acquisition
stations. These data can be recorded in 1.1-km resolution (the basic resolution of the AVHRR
instrument) or at 4 km resolution; the swath width is >2600 km. The stored high resolution (1.1-
km) imagery is known as Local Area Coverage (LAC). Owing to the large data volume, only
about 11 minutes of LAC can be accommodated on a single recorder. In contrast, 115
minutes of the lower resolution (4-km) imagery, called Global Area Coverage (GAC), can be
stored on a recorder, enough to cover an entire 102 minute orbit of data.
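
A quick back-of-envelope check of the recorder figures quoted above:

```python
# Back-of-envelope check of the recorder capacities quoted in the text above.
lac_minutes = 11     # full-resolution (1.1 km) Local Area Coverage
gac_minutes = 115    # reduced-resolution (4 km) Global Area Coverage
orbit_minutes = 102
print(round(gac_minutes / lac_minutes, 1))  # ~10.5x more GAC than LAC fits on one recorder
print(gac_minutes >= orbit_minutes)         # True: a full orbit of GAC fits on a single recorder
```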

The AVHRR has flown on the following U.S. civilian meteorological satellites: TIROS-N;
NOAA-6 through NOAA-14, inclusive.

NOAA-K, -L and -M (NOAA-15 onwards) carry an enhanced version of the AVHRR scanner. It
has six channels (three visible/near-infrared and three infrared) but, for compatibility with
existing receiving stations, only five are transmitted at any one time: channel 3A (near-infrared)
is transmitted during the daytime and channel 3B (infrared) at night, although 3B is sometimes
transmitted during the day as well for fire detection. Additionally, the visible channels have been
modified with a dual-slope calibration to give greater sensitivity at low light levels.

Channel   | Wavelength (microns) | Primary Use
1         | 0.58 - 0.68          | Daytime cloud/surface mapping
2         | 0.725 - 1.10         | Surface water delineation, ice and snow melt
3A        | 1.58 - 1.64          | Snow/ice discrimination (NOAA-K, -L, -M)
3 (or 3B) | 3.55 - 3.93          | Sea surface temperature, nighttime cloud mapping
4         | 10.30 - 11.30        | Sea surface temperature, day and night cloud mapping
5         | 11.50 - 12.50        | Sea surface temperature, day and night cloud mapping

ASSIGNMENT 2

Highlight the various medium and high resolution sensors citing their specifications
including but not limited to altitude, spatial resolution, swath width, orbital period, bands
or channels and their applications.
MULTISPECTRAL SCANNERS

Multiband imaging systems using film-based, digital or video camera systems generally detect
only three or four relatively wide spectral bands ranging from 0.3 to 0.9 μm. A multispectral
scanner operates on the same principle of selective sensing in multiple spectral bands with the
exception that such instruments can sense in many spectral bands over a greater range of the EM
spectrum.

Airborne multispectral scanners build up two-dimensional images of the terrain for a swath
beneath the sensor platform in two different ways – across-track (whiskbroom) scanning or
along-track (pushbroom) scanning.

a) Across-track scanning: The cross-track mode normally uses a rotating (spinning) or
oscillating mirror (making the sensor an optical-mechanical device) to sweep the scene
along a ground line that is very long (kilometers) but also very narrow (meters), or more
commonly a series of adjacent lines. This is sometimes referred to as the whiskbroom mode,
from the image of sweeping a table from side to side with a small handheld broom; a simple
simulation of this mode is sketched after the component list below. The essential components
of this instrument (most are shared with along-track systems) are:

1) A light-gathering telescope that defines the scene dimensions at any moment

2) Appropriate optics (e.g., lens) within the light path train

3) A mirror (on aircraft scanners this may completely rotate; on spacecraft scanners this
usually oscillates over small angles)

4) A device (spectroscope; spectral diffraction grating; band filters) to break the incoming
radiation into spectral intervals

5) A means to direct the light so dispersed onto a battery or bank of detectors


6) An electronic means to sample the photoelectric effect at each detector and to then
reset to a base state to receive the next incoming light packet, resulting in a signal that
relates to changes in light values coming from the ground or target

7) A recording component that either records the signal as an analog, intensity-varying plot
(curve) over time or converts the signal to digital numbers. A scanner can also have a
chopper, a moving slit or opening that, as it rotates, alternately allows the signal to pass to
the detectors or interrupts it (area of no opening) and commonly redirects it to a reference
detector for calibration of the instrument response.
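
As referenced above, the following is a minimal, illustrative sketch (a toy model with assumed names, not an actual sensor model) of how a whiskbroom scanner assembles an image one pixel at a time:

```python
# Minimal illustrative sketch: a cross-track (whiskbroom) scanner builds the image
# one pixel at a time, sweeping the mirror across-track for each line while the
# platform moves forward between lines. (Toy model, not a real sensor simulation.)
import numpy as np

def whiskbroom_scan(scene: np.ndarray) -> np.ndarray:
    """Record a 2-D array of ground radiances pixel by pixel, line by line."""
    n_lines, n_samples = scene.shape
    image = np.zeros_like(scene)
    for line in range(n_lines):           # platform advances to the next ground line
        for sample in range(n_samples):   # mirror sweeps across-track, one dwell per pixel
            image[line, sample] = scene[line, sample]
    return image

scene = np.random.rand(4, 6)              # toy ground radiance field
assert np.array_equal(whiskbroom_scan(scene), scene)
```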

b) Along-track scanning: The along-track scanning mode does not have a mirror
looking off at varying angles. Instead, there is a line of small, sensitive detectors stacked
side by side, each having some tiny dimension on its plate surface; these may number
several thousand. Each detector is a charge-coupled device (CCD). In this mode, the
pixels that will eventually make up the image correspond to these individual detectors in
the line array. As the platform advances along the track, radiation from each ground cell
along the current ground line is received simultaneously at the sensor, and the photons from
each cell fall, in the proper geometric relation to their ground position, on the individual
detector in the linear array corresponding to that position. The signal is read out from each
detector in succession in a very
short time (milliseconds), the detectors are reset to a null state, and are then exposed to
new radiation from the next line on the ground that has been reached by the sensor's
forward motion. This type of scanning is also referred to as pushbroom scanning (from
the mental image of cleaning a floor with a wide broom through successive forward
sweeps). As signal sampling improves, sets of linear arrays, and eventually area arrays,
sampled all at once will increase the equivalent area of ground coverage.
Figure 1: Along-track scanner
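
For contrast with the whiskbroom sketch above, the following minimal, illustrative sketch (again a toy model with assumed names) records one whole image line per readout, which is why pushbroom systems allow a longer dwell time per ground cell:

```python
# Minimal illustrative sketch: an along-track (pushbroom) scanner with a linear
# detector array records one complete image line per readout as the platform's
# forward motion brings each new ground line into view. (Toy model only.)
import numpy as np

def pushbroom_scan(scene: np.ndarray) -> np.ndarray:
    """Record a 2-D array of ground radiances one full line at a time."""
    n_lines, n_detectors = scene.shape    # one detector per across-track pixel
    image = np.zeros_like(scene)
    for line in range(n_lines):           # forward motion advances the ground line
        image[line, :] = scene[line, :]   # the whole line is read out simultaneously
    return image

scene = np.random.rand(4, 6)
assert np.array_equal(pushbroom_scan(scene), scene)
```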
Radiometer - It is a general term for any instrument that quantitatively measures the EM
radiation in some interval of the EM spectrum. When the radiation is light from the narrow
spectral band including the visible, the term photometer may be used. If the sensor includes a
component, such as a prism or diffraction grating, that can break radiation extending over a part
of the spectrum into discrete wavelengths and disperse (or separate) them at different angles to
detectors, it is called a spectrometer. The term Spectroradiometer tends to imply that the
dispersed radiation is in bands rather than discrete wavelengths. Most air/space sensors are
spectroradiometers.

Field of View (FOV) – This is the largest angular extent of the scene that the imager can see at a
set distance, measured in degrees, and is typically described in horizontal degrees (HFOV) by
vertical degrees (VFOV), for example 23° × 17°.

Instantaneous Field of View (IFOV) – This is the smallest detail within the FOV that can be
detected or seen at a set distance, often measured in milliradians. It is normally expressed as the
cone angle or solid angle within which incident radiation is focused on the detector and is
determined by the instrument's optical system and the size of its detectors.

Ground Resolution cell – This is the ground segment sensed at any moment, also referred to as
ground element.

Ground Sampling Distance (GSD) – It is the ground pixel size and is the ground distance
between adjacent sampling points in a digital scanner image or distance between pixel centers
measured on the ground.
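
For a nadir-viewing sensor, the ground resolution cell is approximately the orbital altitude multiplied by the IFOV expressed in radians. A minimal sketch follows; the 705 km altitude and 0.043 mrad IFOV are illustrative assumptions that happen to give roughly a 30 m cell.

```python
# Minimal sketch: approximate nadir ground resolution cell size from orbital
# altitude and IFOV. The example altitude and IFOV are illustrative assumptions.
def ground_cell_m(altitude_km: float, ifov_mrad: float) -> float:
    """Ground resolution cell (m) at nadir = altitude (m) * IFOV (radians)."""
    return altitude_km * 1000.0 * (ifov_mrad * 1e-3)

print(round(ground_cell_m(705.0, 0.043), 1))  # ~30.3 m ground cell
```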

SENSOR TYPES

There are many remote sensing satellites, and their sensors may be classified or categorized in
two ways:

1) Functional - a treatment of several classes of sensors, plotted as a triangle diagram in which
the corner members are determined by the principal parameter measured: spectral, spatial or
intensity.
2) A wider array of sensor types based on scanning and imaging modes.
VISUAL IMAGE INTERPRETATION

Image interpretation is the process of identifying features or objects in aerial and space images
and communicating this information to other people. It is driven by individual perceptions,
training and experience. Aerial and space images contain a detailed record of features on the
ground at the time of data acquisition, and an image interpreter, with the aid of reference
materials such as maps and field observations, systematically examines the images in order to
identify objects and features based on the physical nature of the objects and phenomena
appearing in them.
Successful image interpretation varies with the training and experience of the interpreter, the
nature of the objects or phenomena being interpreted and the quality of images being utilized. It
is important for the interpreter to have a thorough understanding about the phenomena under
study and knowledge of the geographical area or region being studied.

Aerial and space image interpretation departs from everyday photograph interpretation in
three ways:

1) The perspective is an overhead or bird's-eye view
2) There is frequent use of wavelengths outside the visible portion of the spectrum
3) Earth's features are depicted at unfamiliar scales or resolutions

For most applications, the elements of image interpretation are:

a) Shape: This refers to the general form, configuration or outline of individual objects. In
stereoscopic images, the height of the object also defines its shape.
b) Size: Size of objects on images must be considered in the context of the image scale.
Relative sizes of objects on images of the same scale must also be considered.
c) Pattern: Relates to the spatial arrangement of objects. The repetition of certain general
forms or relationships is characteristic of many objects, both natural and man-made, and
gives objects a pattern that aids the image interpreter in recognizing them.
d) Tone (or Hue): It refers to the relative brightness (for black and white images) or color
of objects on an image. Without tonal differences, the shapes, patterns and textures of
objects would not be discerned.
e) Texture: This is the frequency of tonal change on an image. Texture is produced by an
aggregation of unit features that may be too small to be discerned individually on the
image e.g. tree leaves. It determines the overall visual “smoothness” or “coarseness” of
image features. As the scale of the image is reduced, the texture of any given objects or
area becomes progressively finer and ultimately disappears. An interpreter can
distinguish between features with similar reflectances based on their texture differences.
f) Shadows: Shadows matter to an interpreter in two opposing ways: the shape or outline of a
shadow gives an impression of the profile view of an object, which aids interpretation, while
objects lying within shadows reflect little light and are difficult to discern, which impedes
interpretation.
g) Location (site): This refers to the topographic or geographic location. It is particularly
important for identification of vegetation types. For example, certain vegetation types
occur only in certain areas and not in others.
h) Association: This refers to the occurrence of certain features in relation to others. For
example, a train station is to be expected at the point of convergence of many railway
lines.
i) Resolution: It depends on many factors but it places a practical limit on interpretation
because some features are too small or have too little contrast with their surroundings to
be clearly seen on an image.
