
GEOINFORMATICS

A TECHNOLOGY TO ENHANCE QUALITY OF LIFE
Quality of Life

 Information
 Transparency
 Concern
 Solutions
 Value for Money
Enhancing Quality of Life

• Identification of the problem
• Extent of the problem
• Existing status of the problem
• Parameters affecting the problem

Sincerity & Will Power to solve the problem


Policy Maker
GEO(infor)MATICS
“GEOMATICS IS A FIELD OF ACTIVITY WHICH, USING A SYSTEMATIC APPROACH, INTEGRATES ALL THE MEANS USED TO ACQUIRE AND MANAGE SPATIAL DATA REQUIRED AS PART OF SCIENTIFIC, ADMINISTRATIVE, LEGAL AND TECHNICAL OPERATIONS INVOLVED IN THE PROCESS OF PRODUCTION AND MANAGEMENT OF SPATIAL INFORMATION”
GEOMATICS - PROCESS
USER

• NEED A COMPUTER (HIGH-END?)


• NEED A SOFTWARE (POTENTIAL?)
• NEED AN OPERATOR (EXPERIENCE?)
• NEED TIME

• NEED DATA
[Diagram: GEOMATICS draws on spatial components (GIS, analysis, conventional surveying, RS, GPS) and aspatial components (MIS, FIN) to serve the USER.]
GEOINFORMATICS

 Remote Sensing
 Geographical Information System
 Global Positioning System
 Applications in Resources Management
Remote Sensing

Do we have to be in contact with an object to characterize it? Sensing involves:

1. Scanning
2. Characterizing
3. Classification
4. Identification / Quantification
5. Analysis
REMOTE SENSING
• "Remote sensing is the science of acquiring
information about the Earth's surface without actually
being in contact with it. This is done by sensing and
recording reflected or emitted energy and processing,
analyzing, and applying that information."
Three Essential Things for Remote
Sensing

1. Object to be sensed
2. Electromagnetic radiation (energy source, e.g. solar energy)
3. Sensor mounted on a platform (space borne, air borne or ground borne)

[Process diagram: incident solar radiation undergoes absorption, scattering and transmission in the atmosphere; reflected energy and thermal emission from the object reach the platform-mounted sensors and antenna; the recorded data are processed into hard-copy and soft-copy data products, interpreted visually or digitally, and the hard-copy/soft-copy outputs support decision making.]
Electromagnetic Radiation
AN ELECTROMAGNETIC WAVE

Electromagnetic radiation consists of an electric field (E) which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electric field. Both these fields travel at the speed of light (c).
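As a quick worked example (not in the original slides), wavelength and frequency are related through the speed of light, c = λν. A minimal Python sketch with an illustrative green wavelength of 0.55 µm:

# Wavelength and frequency of an EM wave are related by c = wavelength * frequency.
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency (Hz) of an electromagnetic wave with the given wavelength (m)."""
    return C / wavelength_m

print(frequency_hz(0.55e-6))  # green light (~0.55 micrometre) -> about 5.5e14 Hz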
THE ELECTROMAGNETIC
SPECTRUM
Electromagnetic Spectrum
INFRARED (IR) REGION

• Covers the wavelength range from approximately 0.7 µm to 100 µm

• Divided into two categories based on their radiation properties: the reflected IR (0.7-3.0 µm), which includes the NIR (0.7-1.1 µm) and SWIR (1.55-1.7 µm), and the thermal IR (3-14 µm).
REFLECTED IR

• Radiation in the reflected IR region is used for remote sensing purposes in ways very similar to radiation in the visible portion. The reflected IR covers wavelengths from approximately 0.7 µm to 3.0 µm.

• Photographic IR ranges from 0.7 to 0.9 µm


THERMAL IR
• The thermal IR region is quite different from the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat. The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm, but the heat energy is sensed in windows at 3 to 5.5 µm and 8 to 14 µm.

MICROWAVE REGION
• The microwave region covers wavelengths from about 1 mm to 1 m.
Thermal Infrared (TIR) Image
Ultraviolet

X-ray
PLATFORM
CHARACTERISTICS
USUAL PLATFORMS

• Aircraft
– Helicopters
– Microlites
– Low altitude aircrafts
– High altitude aircraft
• Satellites
– Orbiting satellites
– Geostationary satellites
CHARACTERISTICS OF
PLATFORMS
• Aircraft
– Defence permission needed
– Imagery can be obtained at the time and place of
our choice
– Expensive
– Usually used for cameras
– Narrow limited view
– Platform less stable
– Large scales (1:1000 to 1:30000)
– Flexible repeat coverage
– High spatial resolution
– Less cost effective
• Satellite
– Global coverage
– No fuel needed (for 3 years of operation)
– Defence permission not needed
– Usually used for scanners and radars which transmit information in electronic format
– Wide, synoptic view
– Very stable platform
– Small scale (>1:50000)
– Limited repeat coverage (3 to 26 days)
– Low spatial resolution
– Highly cost effective
ORBIT
The path followed by a satellite

Two types of Orbits are


1) Geostationary Orbits
2) Near Polar Orbits
GEOSTATIONARY ORBITS
• At altitudes of approximately 36,000 kilometres
• Revolve at speeds which match the rotation of the Earth, so that they appear stationary relative to the Earth's surface
• This allows the satellites to observe and collect information continuously over specific areas
NEAR-POLAR ORBITS

• The orbit is inclined relative to the line between the North and South poles (the polar axis).

 Some of these satellites' orbits are also sun-synchronous. This means that they cover each area of the world at a constant local time of day, called local sun time.
SWATH

• As the satellite revolves around the Earth, the sensor


"sees" a certain portion of the Earth's surface.

• The width of the strip imaged is referred to as the


swath width.
SCANNERS
ACROSS-TRACK SCANNERS
• Scan the Earth in a series of lines.

• The lines are oriented perpendicular to the


direction of motion of the sensor platform (i.e.
across the swath).

• Each line is scanned from one side of the sensor to


the other, using a rotating mirror (A).
ALONG-TRACK SCANNERS

• Along-track scanners also use the forward motion of the


platform to record successive scan lines and build up a
two-dimensional image, perpendicular to the flight
direction.
• Instead of a scanning mirror, they use a linear array of
detectors (A) located at the focal plane of the image (B)
formed by lens systems (C), which are "pushed" along in
the flight track direction (i.e. along track).
• Also referred to as pushbroom scanners
• Principle of MSS.
PRINCIPLE OF MULTI SPECTRAL
SCANNER
CHARACTERISTICS OF SCANNER

• Output in digital form


• Highly amenable for computer processing
• No consumables
• Flexible w.r.t. radiometric and spectral
resolutions
PIXELS (Picture + elements)

[Grey scale: black = 0, white = 255, with intermediate grey values in between; each pixel carries one such value.]
Energy Interaction
ATMOSPHERIC WINDOWS

• Because certain gases absorb electromagnetic energy in very specific regions of the spectrum, they influence the wavelengths that reach the Earth and are available for remote sensing.

• Those areas of the spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows.

The transmission and absorption phenomena vary with wavelength.
SCATTERING

• Scattering occurs when particles or large gas molecules present in the atmosphere interact with the electromagnetic radiation and cause it to be redirected from its original path.
RAYLEIGH SCATTERING

• Rayleigh scattering occurs when particles are very small compared to the wavelength of the radiation

 Small specks of dust or nitrogen and oxygen molecules

 The fact that the sky appears "blue" during the day is because of this phenomenon.

 Rayleigh scattering ∝ 1/λ⁴
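A minimal Python sketch of this 1/λ⁴ dependence, using illustrative wavelengths of 0.4 µm (blue) and 0.7 µm (red):

# Relative Rayleigh scattering ~ 1 / wavelength**4 (proportionality only).
def rayleigh_relative(wavelength_um: float) -> float:
    return 1.0 / wavelength_um ** 4

blue, red = 0.4, 0.7  # micrometres, illustrative band centres
ratio = rayleigh_relative(blue) / rayleigh_relative(red)
print(f"Blue light is scattered ~{ratio:.1f}x more strongly than red")  # ~9.4x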
MIE SCATTERING
• Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation, e.g. dust, pollen, smoke and water vapour.

• Mie scattering ∝ 1/λ to 1/λ²
NONSELECTIVE SCATTERING

• This occurs when the particles are much larger than the wavelength of the radiation. Water droplets and large dust particles can cause this type of scattering.

• Nonselective scattering gets its name from the fact that all wavelengths are scattered about equally, in all directions.
RADIATION - TARGET INTERACTIONS

• Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface.

Three forms of interaction of the incident energy (I) from the source are:

Absorption (A) occurs when radiation (energy) is absorbed by the target.
Transmission (T) occurs when radiation passes through a target.
Reflection (R) occurs when radiation "bounces" off the target and is redirected.
SPECTRAL REFLECTANCE OF
VEGETATION,SOIL AND WATER
SPECTRAL REFLECTANCE OF VEGETATION

Strong absorption in
blue and red bands.
Reflection depends on
the amount of
chlorophyll in the leaf.
SPECTRAL REFLECTANCE OF VEGETATION

Reflectance peaks in
green, corresponds
with solar maximum
SPECTRAL REFLECTANCE OF VEGETATION

Major reflectance
peaks in NIR,
provides energy
balance for
vegetation
SPECTRAL REFLECTANCE OF VEGETATION

Water absorption at
1.4 and 1.9 microns
due to leaf moisture
SPECTRAL REFLECTANCE OF SOIL
Soil reflectance generally increases gradually from visible to infrared.
SPECTRAL REFLECTANCE OF SOIL

Molecular water absorptions at 1.4


and 1.9 microns
Spectral reflectance curve
Spectral Signature Curves

[Plots of DN values versus spectral bands (B1-B7) for two classes - Water (sample sites: Ambazari, Phutala, Gorewada, Gandhi Sagar, Lendi Talav) and Thick Vegetation (sample sites: VNIT, NEERI, Koradi Road, Seminary Hills).]

SENSORS

Sensors: camera and scanner types.

[IRS 1C sensors overview: PAN, LISS III and WiFS.]
Types of RS system

• Active RS system - artificial energy source, e.g. radar systems (SLAR, SAR)
• Passive RS system - natural energy source, e.g. sensors on satellites such as Landsat and SPOT
PASSIVE SENSORS
 Remote sensing systems which measure energy that
is naturally available are called passive sensors.
 Passive sensors can only be used to detect energy
when the naturally occurring energy is available.
ACTIVE SENSORS
• Active sensors, on the other hand, provide their own energy source
for illumination.

• The sensor emits radiation which is directed toward the target to be


investigated.

• Eg: RADAR
Sensor Detection
1.Passive Detection
• sensors measure levels of energy that are
naturally emitted, reflected, or transmitted by the
target object.
• Passive sensors are those which detect naturally occurring energy. Most often, the source of radiant energy is the sun.
• Detection of reflected solar energy, for example, can only
proceed when the target is illuminated by the sun, thus
limiting visible light sensors on satellites from being used
during a nighttime pass.
• The Thematic Mapper, the primary sensor on the Landsat
satellites, is a good example of a passive sensor.
Active detection

1. Active Sensors provide their own energy source for


illumination of the target by directing a burst of radiation at
the target and use sensors to measure how the target
interacts with the energy.
2. Most often the sensor detects the reflection of the energy,
measuring the angle of reflection or the amount of time it
took for the energy to return.
3. Active sensors provide the capability to obtain
measurements anytime, regardless of the time of day or
season.
4. They can be used for examining energy types that are not
sufficiently provided by the sun, such as microwaves, or to
better control the way a target is illuminated. However, active
systems require the generation of a fairly large amount of
energy to adequately illuminate targets.
Doppler radar is an example of an active remote sensing technology.
Resolution of a Sensor

• Spatial resolution – ‘Area’ aspect
• Spectral resolution – ‘Band’ aspect
• Radiometric resolution – ‘Radiance’ aspect
• Temporal resolution – ‘Frequency’ aspect
FOUR TYPES OF RESOLUTION ARE:

Spatial Resolution
Spectral Resolution
Radiometric Resolution
Temporal Resolution
Spatial Resolution

• refers to the size of the smallest possible feature that


can be sensed

Examples:
• IRS 1C/1D PAN – 5.8 m
• IKONOS PAN – 1 m
• RESOURCESAT multispectral – 5.8 m

Spatial Resolution comparison (sample images): LANDSAT 30 m, LISS III 23.5 m, PAN 5.8 m, IKONOS 1 m, and 60 cm spatial resolution.
Spectral Resolution

• Ability of a sensor to define fine wavelength intervals.

• The finer the spectral resolution, the narrower is the


wavelength range for a particular channel or band.
IRS 1C PAN IMAGE OF VIZAG STEEL PLANT, 1996
IRS 1C LISS III IMAGE OF VIZAG STEEL PLANT, 1996
IRS 1C PAN, LISS III MERGED IMAGE OF VIZAG STEEL PLANT, 1996
True Colour / Infrared Composites – Red, Green, Blue Display

R,G,B = bands 3, 2, 1 – composite of Landsat 7
R,G,B = bands 4, 3, 2 – composite of Landsat 7
R,G,B = bands 7, 4, 2 – composite of Landsat 7
Radiometric Resolution
• Ability of the sensor to discriminate very slight
differences in energy.
• The finer the radiometric resolution of a
sensor, the more sensitive it is to detecting
small differences in reflected or emitted
energy.
[Grey scales: 0-255 (256 levels), 0-127 (128 levels), 0-63 (64 levels).]
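A small illustration (not from the slides) of how radiometric resolution in bits translates into the number of grey levels, 2**bits:

# Grey levels available at a given radiometric resolution: 2 ** bits.
for bits in (6, 7, 8, 10):
    print(f"{bits:>2} bits -> {2 ** bits} grey levels")
# -> 64, 128, 256 and 1024 grey levels respectively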
Temporal Resolution

• Time interval between two successive visits of


the satellite for the same place.

• Spectral characteristics of features may change


over time and these changes can be detected
by collecting and comparing multi-temporal
imagery.
RESOURCESAT

SPECIFICATIONS
IRS-P6 (RESOURCESAT-1) is the most advanced remote sensing satellite built by ISRO.

The tenth satellite of ISRO in the IRS series, IRS-P6 was launched on Oct. 17, 2003 by PSLV-C5.

Sun Synchronous Orbit

Three Sensors

RESOURCESAT (IRS-P6) sensors viz., LISS-3, LISS-4 and AWiFS are designed
to provide monoscopic and stereoscopic data of varying resolutions. The
geometric and spectral characteristics of the sensors are given.

The payloads will greatly aid crop/vegetation and


integrated land and water resources related applications.
Sensor          Spatial Resn (m)   Steerability   Swath (km)   Spectral Bands        Radiometric Resn
LISS-3          23.5               No             140          B2, B3, B4 & B5       7 bits
LISS-4 (Mono)   5.8                Yes            70           B3 (Monochromatic)    7 bits
LISS-4 (Mx)     5.8                Yes            23           B2, B3 & B4           7 bits
AWiFS-A         70                 No             370          B2, B3, B4 & B5       10 bits
AWiFS-B         70                 No             370          B2, B3, B4 & B5       10 bits
Salient Features :

• Orbit : Circular Polar Sun Synchronous

• Orbit height : 817 km

• Orbit inclination : 98.7 deg

• Orbit period : 101.35 min

• Number of Orbits per day : 14

• Local Time of Equator crossing : 10.30 a.m.

• Repetivity (LISS-3) : 24 days

• Revisit (LISS-4) : 5 days

• Lift-off Mass : 1360 kg

• Attitude and Orbit Control : 3-axis body stabilised using Reaction Wheels, Magnetic Torquers and Hydrazine Thrusters

• Power : Solar Array generating 1250 W, Two 24 Ah Ni-Cd batteries
• Mission Life : 5 years
IRS 1C/1D
• LISS III – 4 spectral bands: 0.52-0.59, 0.62-0.68, 0.77-0.86, 1.55-1.70 µm; spatial resolution 23.5 m (B2-B4), 70.5 m (B5); swath 142 km (B2-B4), 148 km (B5); 128 radiometric levels
• PAN – 1 band: 0.50-0.75 µm; spatial resolution better than 10 m; swath 70 km at nadir, steering range ±26°; 64 radiometric levels
• WiFS – 2 bands: 0.62-0.68, 0.77-0.86 µm; spatial resolution 188 m; swath 774 km; 128 radiometric levels

RESOURCESAT
• LISS III – 4 bands: 0.52-0.59, 0.62-0.68, 0.77-0.86, 1.55-1.70 µm; spatial resolution 23.5 m; swath 141 km; 7 bits
• LISS IV – 3 bands: 0.52-0.59, 0.62-0.68, 0.77-0.86 µm; spatial resolution 5.8 m; swath 23.9 km (Mx mode), 70.3 km (PAN mode); 7 bits
• AWiFS – 4 bands: 0.52-0.59, 0.62-0.68, 0.77-0.86, 1.55-1.70 µm; spatial resolution 56 m; swath 740 km; 10 bits

OCEANSAT
• OCM (Ocean Colour Monitor) – 8 bands: 0.402-0.422, 0.433-0.453, 0.480-0.500, 0.500-0.520, 0.545-0.565, 0.660-0.680, 0.745-0.785, 0.845-0.885 µm; spatial resolution 360 m; swath 1420 km
Platform / Sensor / Launch Year – Image Cell Size; Image Size (Cross x Along-Track); Spectral Bands; Visible Bands (µm); Near IR Bands (µm)

• Ikonos-2 VNIR, 1999 – 4 m; 11 x 11 km; 4 bands; B 0.45-0.52, G 0.52-0.60, R 0.63-0.69; NIR 0.76-0.90
• Terra (EOS-AM-1) ASTER, 1999 – 15 m (Vis, NIR), 30 m (MIR), 90 m (TIR); 60 x 60 km; 14 bands; G 0.52-0.60, R 0.63-0.69; NIR 0.76-0.86
• SPOT 4 HRVIR (XS), 1999 – 20 m; 60 x 60 km; 4 bands; G 0.50-0.59, R 0.61-0.68; NIR 0.79-0.89
• SPOT 1, 2, 3 HRV (XS), 1986 – 20 m; 60 x 60 km; 3 bands; G 0.50-0.59, R 0.61-0.68; NIR 0.79-0.89
• IRS-1C, 1D LISS III, 1995 – 23.6 m (70.8 m MIR); 142 x 142 km (70 x 70 km MIR); 3 bands + MIR; G 0.52-0.59, R 0.62-0.68; NIR 0.77-0.86
• Landsat 7 ETM+, 1999 – 30 m; 185 x 170 km; 7 bands; B 0.45-0.515, G 0.525-0.605; NIR 0.75-0.90
• Landsat 4, 5 TM, 1982 – 30 m; 185 x 170 km; 7 bands; B 0.45-0.52, G 0.52-0.60, R 0.63-0.69; NIR 0.76-0.90
• IRS-1A, 1B LISS I, II, 1988 – 36.25 m (LISS II), 72.5 m (LISS I); 148 x 148 km; 4 bands; B 0.45-0.52, G 0.52-0.60, R 0.63-0.69; NIR 0.77-0.86
• Landsat 4, 5 MSS, 1982 – 79 m; 185 x 185 km; 4 bands; G 0.5-0.6, R 0.6-0.7; NIR 0.7-0.8, 0.8-0.9
• IRS-1C, 1D WiFS, 1995 – 189 m; 810 x 810 km; 2 bands; R 0.62-0.68; NIR 0.77-0.86
Image Pre - Processing / Restoration

Corrected for Distortions, Degradations and Noise


introduced during Imaging Process

Radiometric and Geometric Errors

Sources of Errors - Internal & External


•Internal - Due to Sensor effects and determined from Prelaunch
measurements - Predictable & Unpredictable
•External - Due to Platform Perturbations, Atmosphere ; variable
in nature and determined from GCPs and Tracking data

1. Radiometric Corrections
•Restoring Periodic line dropouts
•Restoring Periodic line striping
•Filtering of Random Noise
•Correcting for Atmospheric Scattering - Haze Removal
•Correcting for Topographic Slope and Aspect
2. Geometric Corrections
Refers to the modification of input geometry to achieve desired geometry
Corrected for Systematic and Non - Systematic Errors
a) Systematic Errors
•Scan skew, Panoramic distortion, Platform Velocity,
Earth Rotation, Perspective
b) Non - Systematic Errors
•Altitude, Attitude - Pitch, Yaw and Roll
Image - to - Map Rectification
is the process by which image is made planimetric, required when accurate
area, direction and distance measurements are made - For thematic overlay
Image - to - Image Registration
Positions two images of like geometry and same geographic area coincident
with respect to one another - For Change detection

Resampling Techniques - Nearest Neighbor, Bilinear & Cubic Convolution


PART II Image
Processing
•Data Manipulation and Management
•Image Display and Scientific
Visualization
•Image Pre - Processing / Restoration
•Image Enhancement
•Spatial Filtering
•Thematic Information Extraction
•Utilities
•Special Transformations
Data Display and Scientific Visualization
Structure of the Image - BVs, Pixels; Rows, Columns

1. Scatter Diagrams

2. Black & White Display

3. False Color Composites

4. I H S Transformation

5. Natural Color Display


Image Processing

Pixels and Digital Numbers

[Sources of geometric distortion: Earth rotation, spacecraft velocity, altitude variation, attitude variations (pitch, roll and yaw) and scan skew.]
Rectification

It is the process of transforming the data from one grid system to another grid system using a geometric transformation.

• Resampling is the process of interpolating data values for the pixels on the new grid from the values of the source pixels.
• Rearranging the pixels of the image onto a new grid, which conforms to a plane in the new map projection and coordinate system.
Steps in Image Registration
• Selection of Standard Projection System
• Selection of Ground Control Points
• Selection of transformation system
• Calculation of RMS Error
• Validation
Steps in Resampling
Resampling Techniques

• Nearest Neighbour - uses the value of closest pixel to assign to


the output pixel
• Bilinear Interpolation - uses the data file value of the four
pixels in a 2 x 2 window to calculate an output value with a
bilinear function
• Cubic Convolution - uses the data file values of sixteen pixels
in a 4 x 4 window to calculate the output value with a cubic
function

Nearest Neighbor Bilinear Interpolation Cubic Convolution
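A minimal Python sketch of the first two techniques, assuming NumPy is available; the 4 x 4 array, function names and sample coordinates are illustrative only, not part of any particular software package:

import numpy as np

def nearest_neighbour(img, row, col):
    """Assign the value of the closest source pixel to the output pixel."""
    return img[int(round(row)), int(round(col))]

def bilinear(img, row, col):
    """Weight the four pixels of the surrounding 2 x 2 window by distance."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    dr, dc = row - r0, col - c0
    window = img[r0:r0 + 2, c0:c0 + 2].astype(float)
    return ((1 - dr) * (1 - dc) * window[0, 0] + (1 - dr) * dc * window[0, 1]
            + dr * (1 - dc) * window[1, 0] + dr * dc * window[1, 1])

src = np.arange(16).reshape(4, 4)          # toy 4 x 4 image
print(nearest_neighbour(src, 1.4, 2.6))    # -> value of pixel (1, 3)
print(bilinear(src, 1.4, 2.6))             # -> distance-weighted average (8.2)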


RMS Error
• RMS Error is the distance between the
input (source) location of GCP and the
retransformed location for the same
Ground Control Point.

RMS error = √[ (Xr – Xi)² + (Yr – Yi)² ]

Where Xi and Yi are the input source coordinates and Xr and Yr are the retransformed coordinates.
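A small Python sketch of this formula for a single GCP; the coordinate values are hypothetical:

import math

def gcp_rms_error(x_input, y_input, x_retrans, y_retrans):
    """Distance between the input GCP location and its retransformed location."""
    return math.sqrt((x_retrans - x_input) ** 2 + (y_retrans - y_input) ** 2)

# Hypothetical GCP: input (column, row) vs retransformed position
print(gcp_rms_error(120.0, 340.0, 121.2, 339.1))  # -> 1.5 pixels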
Need for Image Correction

To Compensate for Sensor errors and to achieve proper


conditions to support image-data analysis and interpretation

1. Ensuring High Geometric accuracy / Fidelity for


Cartographic applications
2. Registering multiple Images of like geometry and same geographic area coincident with respect to one another
3. For Mosaicking multiple scenes covering adjacent and
overlapping spatial regions
4. For removing the errors caused by Sensor defects
and Improper Calibration data
5. For determining Calibration parameters to establish
relation between Detector output and Scene Radiance
Contrast Enhancement
Image Reduction & Magnification
Transects
Contrast - I max / I min

Causes for low Contrast in Images


1. Low Contrast ratio in the Scene itself
2. Atmospheric Scattering

3. Poor Sensitivity of the Detector to detect and record the contrast


Need for Contrast Enhancement
•To improve the quality of the image
•To enable discrimination of different objects
•To emphasize areas of interest
•To utilize the entire brightness range of the display
Data Manipulation and Enhancement
Cause for Low Contrast in Images
Need for Enhancement
1. Point Operations - Takes individual Pixels into account and
Inherent relations will not change
•Brightness / Contrast Manipulation
•Linear Contrast Stretch
•Non - Linear Contrast Stretch

2. Neighborhood Operations - Takes surrounding Pixels also into account


and Inherent relations will change
•Spatial Filtering - Edge Enhancement and Edge Detection, Fourier Transform

3. Global Operations - Takes total image as a whole into account and


Inherent relations will change
•Histogram Equalization & Gaussian Stretch
4. Structural Operations - Takes the structural characteristics in the Image
into account and Inherent relations will change
•Structural and Textural Transformations
Linear Contrast Stretch
•Min - Max Contrast Stretch
•Percentage Contrast Stretch

Non - Linear Contrast Stretch

•Logarithmic Stretch
•Arc tan
•Inverse Logarithmic stretch
•Square, Cube, Cube root, Square root
•Exponential etc.
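A minimal Python sketch of the min-max and percentage linear stretches listed above, assuming NumPy and an 8-bit band; the function names and toy DN array are illustrative:

import numpy as np

def minmax_stretch(band, out_min=0, out_max=255):
    """Linearly map the band's min..max range onto the full output range."""
    b = band.astype(float)
    stretched = (b - b.min()) / (b.max() - b.min()) * (out_max - out_min) + out_min
    return stretched.astype(np.uint8)

def percentage_stretch(band, lower=2, upper=98):
    """Clip to the given percentiles before stretching (percentage stretch)."""
    lo, hi = np.percentile(band, [lower, upper])
    clipped = np.clip(band.astype(float), lo, hi)
    return ((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

dn = np.array([[30, 60], [90, 120]], dtype=np.uint8)  # low-contrast toy band
print(minmax_stretch(dn))  # values spread over the full 0..255 range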

Spatial Filtering

•Low pass - Mean, Median, Mode, Min / Max, Olympic etc


•Band pass
•High pass - Edge enhancement & Edge detection
•Mathematical Transformations - Fourier, Hilbert, Laplace etc.
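A short Python sketch of low pass (mean) and high pass (edge enhancement) filtering by convolution, assuming NumPy and SciPy's ndimage are available; the kernels shown are common textbook examples, not prescribed by these slides:

import numpy as np
from scipy.ndimage import convolve  # assumes SciPy is available

low_pass = np.ones((3, 3)) / 9.0            # 3 x 3 mean filter (smoothing)
high_pass = np.array([[-1, -1, -1],         # Laplacian-style edge enhancement
                      [-1,  8, -1],
                      [-1, -1, -1]])

image = np.random.randint(0, 256, (5, 5)).astype(float)  # toy band
smoothed = convolve(image, low_pass, mode="nearest")
edges = convolve(image, high_pass, mode="nearest")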
DATA SCALING

[Output brightness values scaled over the range 0 – 255.]
Histogram Equalization
*The process of redistributing pixel values so that there are approximately the same number of pixels with each value within a range. The result is a flat histogram.

[Original histogram vs after equalization: pixels at the tail of the histogram are grouped together, so contrast there is lost, while pixels at the peak are spread out, giving a contrast gain.]
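A minimal Python sketch of histogram equalization via the cumulative histogram, assuming NumPy and an 8-bit band; the low-contrast toy band is illustrative:

import numpy as np

def equalize(band, levels=256):
    """Redistribute pixel values via the cumulative histogram (CDF mapping)."""
    hist, _ = np.histogram(band.flatten(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())     # normalise to 0..1
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)   # look-up table
    return lut[band]

dn = np.random.randint(80, 120, (100, 100)).astype(np.uint8)  # low-contrast band
print(equalize(dn).min(), equalize(dn).max())  # values spread toward 0 and 255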
CONTRAST STRETCHING

• CONTRAST OF DATA FILE VALUES IN THE MIDDLE OF THE RANGE IS STRETCHED OVER A WIDER RANGE IN THE OUTPUT BRIGHTNESS VALUES FOR THE SAME PIXELS

[Graphs: look-up table mapping input data file values (0-255) to output brightness values (0-255); enhancement with look-up table; enhancement curves.]
Data compression
1. Image Algebra and Band Ratioing -
To remove the additive and multiplicative influences
For Image Enhancement by Individual band substitution
For Thematic Information Extraction
For discriminating Various objects
•Brightness Index
•Shadow Index
•Vegetation Indices

•PVI - Perpendicular Vegetation Index


•NDVI - Normalised Difference Vegetation Index
•TVI - Transformed NDVI
•Advanced Vegetation Index
•MIR Index
•Kauth - Thomas Transformation
•Tasseled Cap Transformation
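A minimal Python sketch of one of these ratios, NDVI = (NIR − Red) / (NIR + Red), assuming NumPy; the reflectance values are hypothetical:

import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); values range from -1 to +1."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # small term avoids divide-by-zero

# Toy example with hypothetical reflectance values
red_band = np.array([[0.05, 0.30], [0.08, 0.25]])
nir_band = np.array([[0.50, 0.35], [0.45, 0.28]])
print(ndvi(nir_band, red_band))  # high values indicate dense, healthy vegetation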
2. Principal Component Analysis

3. Canonical Component Analysis


Principal Component Analysis

Original PCA
Utilities
1. Data Mosaicking

2. Resolution Merging
3. Density Slicing
4. Data Subsetting
5. Image, Map and Thematic layer Overlay & Cartographic Composition
6. Digital Elevation Models and Image Draping
7. Layer Stacking

8. Raster to Vector Conversion and Vice-versa


9. Data scaling

10. Masking
NDVI
Global Sea Surface Temperature and Vegetation Distribution
Image Interpretation

The act of examining images to identify objects, areas or phenomena and judge their significance. It categorizes the remotely sensed data into different classes of interest. It is an information extraction process.

Interpretation may be qualitative or quantitative, objective or subjective, and visual or digital.
Aspects of Image Interpretation

•Detection
•Recognition & Identification
•Analysis / Grouping
•Significance and level
•Classification
•Standardization/Quantization
Elements of Interpretation
 Tone
 Size
 Shape
 Pattern
 Shadow
 Texture
 Location / Site
 Association
 Resolution
 Slope & Aspect
1.Tone
•Relative Brightness or Color of the Objects

•On Panchromatic imagery - Shades of Gray


•On True / False Color Imagery - RGB/IHS

•Depends upon Properties of the objects

•For Vegetation Depends upon Aspect & Slope,


soil changes, Season/Time etc.
2.Size •Spatial dimension of the object

3.Shape •General form, Configuration of the object

4.Pattern •Spatial arrangement of the surface


features
•Linear, Regular or Irregular – Manmade
/Natural forest
5.Shadows •Due to Sun’s illumination angle, size and
shape of the object

6.Texture •Frequency of tonal changes

•Product of size, shape, tone, pattern, shadow
•Fine, Medium, Coarse, Rough

•Grasslands - fine; homogeneous young forest - medium to coarse; degraded forest - coarse
TEXTURE

FINE

MEDIUM

COARSE
7. Association •Occurrence of certain features in
relation to other
• Building and shadow

8.Location/Site •Geographic or Topographic location


•Altitudinal variation

9.Resolution •Spatial, Spectral, Radiometric, Temporal

10.Aspect •Direction in which slope is facing


1.Data Selection & Screening

•Quality & Contrast


•Cloud, Fog etc free or minimal %

2.Mapping Scale

•For Country - 1:1Million


•For State - 1:250,000
•For District / Division - 1:50000
•For Watersheds - 1:25000
•For Micro-plans - 1:12500 or larger
Advantages of visual interpretation

1. Very useful for small areas

2. Physical themes like roads, rivers, clouds, shadows etc. can be delineated perfectly.

3. This method can be adopted when the


budget is very low.

4. Generalization is possible for


Quantification
Advantages of digital classification

1. Each and every pixel in the image can be classified.
2. Any number of band combinations can be used for delineation.
3. Work can be done faster.
4. Classification can be done a number of times using different combinations of algorithms.
5. Computation of results is easy.
6. Conversion of data into vector form after generalization is easy.
Classification approaches

1.Hard - Supervised & Unsupervised

2. Fuzzy logic & Neural Networks

3. Hybrid Classification approach


Unsupervised Classification
In this classification the computer plays the major role; it is useful in the case of complex terrain.
Basic steps involved


•Specify the no. of classes
•Specify the convergence threshold
•Specify the accuracy level
•Specify the algorithm
•Specify the no. of iterations
•Regroup the nearly similar classes
•Assign colors
•Ground Truthing
•Refinement
•Ground Checking & Accuracy assessment
•Generation of Statistics, Maps & Reports
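A minimal unsupervised-classification sketch in Python, using scikit-learn's KMeans as a stand-in for the clustering algorithm (the slides do not name a specific one); the random image, number of classes and other parameters are illustrative:

import numpy as np
from sklearn.cluster import KMeans  # assumes scikit-learn is available

# Stack of co-registered bands, shape (rows, cols, n_bands); random toy data here
rows, cols, n_bands = 100, 100, 4
image = np.random.randint(0, 256, (rows, cols, n_bands))

pixels = image.reshape(-1, n_bands)              # one row per pixel
kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(pixels)
classified = kmeans.labels_.reshape(rows, cols)  # cluster id per pixel

# Clusters are then regrouped and assigned to land-cover classes after ground truthing
print(np.bincount(classified.ravel()))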
Supervised classification
The image analyst supervises the pixel categorisation
Useful in case of simple terrain

Basic steps involved in this classification

Selection of appropriate classification scheme


Selection of appropriate bands

Identification of Training sets

Selection of appropriate classification algorithm

Performing the classification

Ground checking and accuracy assessment


Ground Truthing
•Process of establishing correlation between spectral
signatures and ground information
•Scene dependent:
 High contrast images - less ground truthing needed
 If tonal & textural variations are high - more ground truthing needed
•Utilized for calibrating the radiometers, Spectrometers etc.
•Probability Proportionate Stratified Random sampling
•Field visit and collection of ground information, ancillary
information and reference material for inaccessible areas
•Collection of spectral signatures for various representative
objects
•Feeding the details to computer and Refinement of classification
Classified Landsat Image(Supervised)
N

Dense urban
Sparse urban
Thick vegetation
Sparse vegetation
Open space
Pavement
Waste land
Water
Accuracy assessment
Issues to be addressed
•Use of Training versus test reference information
•Sampling intensity and sample size
•Appropriate descriptive and multivariate statistics
to be applied

•Sampling Strategy - Probability Proportionate


Stratified Random Sampling

•Producer’s accuracy - Measure of omission error

•User’s accuracy - Measure of commission error

•Kappa Coefficient
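A small Python sketch of the Kappa coefficient computed from an error (confusion) matrix, assuming NumPy; the 3-class matrix is hypothetical:

import numpy as np

def kappa_coefficient(confusion):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    confusion = confusion.astype(float)
    total = confusion.sum()
    observed = np.trace(confusion) / total
    chance = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total ** 2
    return (observed - chance) / (1 - chance)

# Hypothetical 3-class error matrix (rows = reference, columns = classified)
cm = np.array([[50,  3,  2],
               [ 4, 40,  6],
               [ 1,  5, 39]])
print(round(kappa_coefficient(cm), 3))  # ~0.79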
CHANGE DETECTION ANALYSIS
General requirements
Data should be obtained from the same satellite and sensor, of the same geographic area, of the same season, and should be georeferenced to 1/4th pixel accuracy

General Problems
•Generally for Andhra Pradesh October - December data is required
•Difficulty in procurement of same season data without cloud & fog
•Inaccuracies in georeferencing
•Inaccuracies in Interpretation

Remedies
•Proper georeferencing
•Obtain data without clouds & fog or with min %
•Intensive ground truthing for different season data
•Interpretation of images by same person
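A minimal post-classification change detection sketch in Python, comparing per-class areas between two co-registered classified images, assuming NumPy; the class codes, image sizes and pixel size are illustrative only:

import numpy as np

def class_area_change(class_t1, class_t2, pixel_area_sqkm):
    """Compare per-class areas between two co-registered classified images."""
    classes = np.union1d(np.unique(class_t1), np.unique(class_t2))
    for c in classes:
        a1 = (class_t1 == c).sum() * pixel_area_sqkm
        a2 = (class_t2 == c).sum() * pixel_area_sqkm
        print(f"class {c}: {a1:.2f} sq.km -> {a2:.2f} sq.km ({a2 - a1:+.2f})")

# Toy classified rasters (e.g. 0 = blank, 1 = scrub, 2 = open, 3 = dense forest)
img96 = np.random.randint(0, 4, (200, 200))
img98 = np.random.randint(0, 4, (200, 200))
class_area_change(img96, img98, pixel_area_sqkm=(23.5 * 23.5) / 1e6)  # LISS-III pixel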
LISS-III IMAGERY OF MUTHUKUNTA, 1996 AND 1998

CHANGE DETECTION ANALYSIS OF NIZAMABAD DISTRICT
(FOREST BLOCKS ONLY)

FOREST COVER IN 1996 AND IN 1998 (in sq. km)

Class      1996       1998
DENSE      124.54     204.02
OPEN       231.38     355.85
SCRUB      799.86     855.12
BLANKS     545.30     279.82
WATER        5.37       9.47
TOTAL     1706.45    1704.28
ADVANTAGES OF REMOTE SENSING

• REAL TIME
• SPATIAL LOCATIONS AND EXTENTS OF FEATURES CAN BE COLLECTED ACCURATELY
• CHEAPER
• FASTER
• DIFFERENT SCALES
• EASY UPDATION
• MORE ANALYTICAL
THEMES
Rationale for use of Remote Sensing data

1. Readily available at range of scales

2. Cheaper and more accurate than field surveys

3. Gives Synoptic view

4. Remote sensing data is a record of the Earth's surface at one point in time


5. The radiation outside the human sensitivity range
i.e.,UV, IR, MW etc., can be sensed through remote
sensing
6. Stereoscopic view can be created and measured horizontally
and vertically
7. Inaccessible areas can be mapped easily
