

CSEN 316
Advanced Engineering
Surveying

OKUSIMBA, George
Dept. of Civil & Structural Engineering
University of Eldoret

Course Content
1. GIS
2. Remote Sensing
3. Satellite Positioning (GNSS)
4. Photogrammetry
5. Underground Surveying
6. Hydrographic Surveying


Assessment
 End of Semester Exam – 70%

 Course Work (CATs, Assignments, Practicals) – 30%

 Pass Mark – 40/100

1. GIS
 References
 Introduction to GIS
 GIS and Digital Mapping
 GIS Components
 Digital Data for GIS
 GIS Data Capture and Editing


GIS References
• Longley, P.A., M.F. Goodchild, D.J. Maguire, and D.W. Rhind, Geographic Information Systems and Science. Wiley, ISBN 0-471-89275-0, 2001.
• Burrough, P.A. and R.A. McDonnell, Principles of Geographical Information Systems. Oxford University Press, 1997.
• Heywood, I., Cornelius, S., and Carver, S., An Introduction to Geographical Information Systems. Pearson Education, 2006.
• Mulaku, G.C., Lecture notes for FSP 591 - Land Information Systems I. University of Nairobi, unpublished, 2005.
• https://mgimond.github.io/Spatial/index.html (accessed 29.07.2018)

Introduction to GIS
 Definition: computerised data input, management
(storage, retrieval, update), processing and
dissemination of spatially referenced
data/information
 History of GIS-lecture notes
 Digital Mapping: production & reproduction of
maps from digital spatial data. Enables data
capture, limited data processing, data visualisation,
data output. An automated map making system
that lacks data analysis and modelling capabilities
of GIS


Introduction to GIS…
Advantages of Digital Mapping
a) Quick data retrieval-enables map production at different
scales, on different projections, etc
b) Quick and easy data addition/deletion-speeds up map
revision
c) Ability to merge data sets
d) Data storage over time without distortion
e) Selective browsing, magnification or highlighting of
features
f) Great data compaction in digital storage
g) Ability to produce maps that are difficult to produce by hand, e.g. 3D maps

Introduction to GIS…
Why GIS?? Problems of conventional methods of handling
spatial data
a) Poor maintenance of geospatial data
b) Maps and statistics are often out of date
c) Inaccurate data/information
d) Inconsistency of geospatial data
e) Lack of standards
f) No sharing of geospatial data
g) No retrieval capacities
h) Lack of scientific decision making


Introduction to GIS…
Why GIS?? Benefits of adopting GIS in handling spatial
data:
a) Geospatial data maintained in a standard format
b) Easy to revise and update
c) Search, analysis and representation much easier
d) Possibility of value added products
e) Possible to share and exchange data
f) Improved productivity
g) Time and cost saved
h) Better decision making

GIS Components
1) Computer Hardware
 Technical equipment for running GIS-computer, input devices,
output devices (scanners, digitizers, GPS data loggers, printers,
etc)
 Purchase of H/Ware must be synchronised with that of S/Ware.
GIS S/ware packages have certain H/ware requirements
2) GIS Software
 Instructions written in formal programming language that a
computer can carry out. GIS S/ware may be more expensive than H/ware
 GIS S/ware is application S/ware designed to run in a particular
operating system
 Modules-digitizing and editing; data storage and management;
analysis; plotting/display; data exchange
 Examples: QGIS, GRASS, Global Mapper, ArcGIS, ILWIS etc


GIS Components…
3) Data
 Raw material from which information will be processed-
consists of feature positions, attributes and relationship
amongst features
 Kept in a database managed by a database management
system-dbase creation costly and time consuming
 Guidelines before acquiring geographic data: lineage;
positional accuracy; attribute accuracy; logical consistency;
completeness
4) Procedures
 In order for GIS to succeed, it needs an appropriate
organizational environment
 Procedures create awareness about GIS and fit it appropriately
in overall operations of organization
 Issues-standards, access protocols; dbase admin, QA, security

GIS Components…
5) Network
 Used for communication and digital data sharing-internet,
intranet
 connects computers hence linking together distributed users
 Allows information exchange and delivery of GIS software
products over the internet
6) People
 Make GIS work-GIS managers, dbase admins, application
specialists, system analysts, programmers
 Responsible for GIS dbase maintenance and provision of
technical support
 Categories: viewers; general users; GIS specialists


GIS Components…

Digital data for GIS


 GIS database is an abstraction of the real world
 Real world has analogue and continuous data
 These data must be discretized (digitized) before they can be handled by a computer

[Figure: analogue to digital conversion]


Digital data for GIS…


Nature of GIS data
 GIS data consists of geographic features/entities
e.g. Roads, rivers, cities etc
 All features represented by one of four
mathematical objects:
• Points;
• Lines;
• Polygons;
• Surfaces.

Digital data for GIS…


Feature definition in GIS
 Three parameters
• Position
• Attributes
• Relationships
 Position: locational data defining extent. Locational data
represented as vector or raster
 Vector formats defined by coordinates in some reference
system
 In raster format, the data space consists of pixels/cells. The row/column of a pixel defines its locational data. Accuracy is no better than the pixel size


Digital data for GIS…

Vector data model

Raster data model

Digital data for GIS…


 Vector model advantages
• Precise point location
• Fine output graphics
• Suitable for network analysis, buffer generation, etc
• Less data volume
• Fast retrieval
• Fast conversion
 Disadvantages of vector model
• Complicated structure
• Difficulty in overlay
• Difficulty in updating
• Expensive data capture


Digital data for GIS…


Feature Attributes
 Specify non-geometric characteristics of a feature. Positional data
are meaningless by themselves
 Attributes may be numeric-size, slope, height etc or semantic-name,
type, etc
 In raster format, feature types need not be explicitly stated. They
are instead encoded by pixel values e.g. 0-water; 55-forest; 255-cloud
 Generally, feature attributes stored by means of feature codes that
describe the type of feature that the given coordinates represent
Feature    Position                               Attributes (type; name)
Point      (x,y)                                  Hill; Vyumba hill
Line       (x1,y1), (x2,y2), …, (xn,yn)           Road; Nyagatare road
Polygon    (x1,y1), (x2,y2), …, (xn,yn), (x1,y1)  Land parcel; L/R 50/37

Digital data for GIS…


Feature Relationships
 Spatial relationships e.g. neighbourhood are quite obvious in analogue data but not apparent in digital data
 Digital data cannot adequately represent the real world without relationships. In real life, relationships locate features more naturally than coordinates do
 'My home is 9km down the road from Eldoret town to Iten' (relationship) rather than 'My home is at about …' (coordinates)
Kinds of Relationships
a) Proximal-nearness or farness, difficult to encode digitally
b) Hierarchical-between subclasses and superclasses. Encoded by tree
structure. e.g. Secondary road belongs to feature class road
c) Topological-describe neighbourhood, connectivity, and inclusion
properties of features. Most important in GIS


GIS data Capture and Editing


 Refers to identification, collection, digitization and error correction
for data necessary to build a GIS database
 Main sources of GIS data
 Maps and Plans
 Satellite Imagery
 Ground Surveying
 Aerial Photographs
 GPS receivers
 Tabular data e.g. Census data
 Direct input from other GIS systems
 Drones etc
 User needs assessment determines type of data to be collected. The
data is usually in analogue or digital format

GIS data Capture and Editing…


 The collected data to be converted to a format compatible with GIS
systems in use
 Digitization
 Tablet digitizer
 Scanning
 Vectorization/heads-up (on-screen) digitizing
 GIS data editing
 Detecting and correcting for errors introduced during data
capture process
 Common data editing problems-for vector as well as raster


GIS data Capture and Editing…


Common editing problems
 Vector data
 Polygon misclosures
 Overshoots/undershoots
 Polygon labels
 Knots, backtracks, wild lines
 Map sheet combination
 Line generalization
 Raster data
 Noise removal
 Line thinning
 Gap removal
 Stray pixel removal

GIS data Capture and Editing…


2. Remote Sensing
 References
 Introduction to RS
 Electromagnetic Energy
 Sensors and Platforms
 Multispectral Scanners
 Digital Image Enhancement
 Pre-processing and Restoration
 Digital Image Classification

References
 Lillesand T.M., and R.W. Kiefer, Remote sensing and image
interpretation. 4th ed, John Wiley & Sons, 2000.
 Jensen J.R., Introductory digital image processing: a remote sensing
perspective. 2nd ed Prentice Hall, 1996.
 Schowengerdt R.A., Remote sensing - models and methods for
image processing. 2nd ed, Academic Press, 1997.
 Avery T.E., and G.L. Berlin, Fundamentals of remote sensing and
air photo interpretation.
 Internet Resources
 Gomarasca, M.A., Basics of Geomatics, Springer-London New
York, 2004.
 https://rise.articulate.com/share/Cs2pB_Kx2Mdnuv4zyeakUAPMhEEdOond#/lessons/rIc-a44K62-8kfdCAnNfS1QU56nFTqZl (accessed on 6/8/2021 - fundamentals of RS)


Introduction to RS
The Concept of RS
 Two methods of geospatial data capture: ground based and RS
methods
 RS is science and art of acquiring information about earth’s
surface without being in contact with it
 Reflected and emitted energy is recorded, processed and analysed, and the information generated is applied
 Process of RS involves interaction between incident radiation
and the targets of interest. It also involves sensing of emitted
energy and use of non-imaging sensors
 Seven stages of RS: energy source/illumination; radiation and
atmosphere; interaction with target; energy recording by
sensor; transmission, reception & processing; interpretation
and analysis; application

Introduction to RS…


Introduction to RS…
seven stages of RS

Electromagnetic Energy
 Understanding EM energy, its characteristics and its
interactions is required to understand the RS sensor and RS
data interpretation
 EM energy modelled in two forms: EM waves and photons
 EM waves-energy propagates as sine waves characterised by two fields, electrical and magnetic, which are perpendicular to each other. Both fields vibrate perpendicular to the direction of travel of the wave, which moves at the speed of light, c ≈ 299,790,000 m/s
 Definitions:
Wavelength, λ: distance between successive wave crests
Frequency, ν: number of cycles of a wave passing a fixed point over a specific period of time-Hz


Electromagnetic Energy…
 Definitions:
Speed, c = λν
 Photons quantify the amount of energy measured by an RS sensor.
Energy held by a photon, Q = hν
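
To make these relations concrete, here is a minimal sketch in Python; the constants are standard values and the 0.55 μm example wavelength is an assumption for illustration:

```python
# Minimal sketch of c = lambda * nu and Q = h * nu.
C = 2.9979e8     # speed of light, m/s
H = 6.626e-34    # Planck's constant, J s

def photon_energy(wavelength_m: float) -> float:
    """Energy (J) of a photon with the given wavelength (m)."""
    nu = C / wavelength_m   # frequency, Hz, from c = lambda * nu
    return H * nu           # Q = h * nu

# Example: a green (0.55 um) photon carries ~3.6e-19 J
print(photon_energy(0.55e-6))
```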

Electromagnetic Energy…
Electromagnetic spectrum
 All matter above absolute zero radiates EM waves. EM
spectrum is the total range of wavelengths extending from
shorter wavelengths [Gamma & X-rays] to the longer
wavelengths [microwaves & radio waves]
 RS operates in several regions of spectrum.
 Optical part: optical laws applicable here-reflectance and
radiation. Extends from X-rays (0.02μm) through visible part
to and including infra red (1000μm)
 UV has shortest wavelengths which can be exploited by RS.
It is just beyond the violet portion of the visible wavelengths.
Rocks and minerals emit visible light when illuminated by UV
radiation.


Electromagnetic Energy…
Electromagnetic spectrum

 Light is part of the visible spectrum, which is very small relative to the rest of the spectrum; a lot of radiation around us is invisible to our eyes but can be detected by other RS sensors

Electromagnetic Energy…
Electromagnetic spectrum
 Visible wavelengths cover a range from 0.4 μm to 0.7 μm, the longest being red and the shortest violet
 Sunlight is seen as a uniform color but it is composed of
various wavelengths-ultraviolet, visible and infra red.
 Visible portion shown in its components when sunlight is
passed through a prism bending the light in differing amounts
according to wavelength.


Electromagnetic Energy…
Electromagnetic spectrum
 The infrared (IR) region covers the wavelength range from approximately 0.7 μm to 100 μm - more than 100 times as wide as the visible portion. It can be divided into two categories based on their radiation properties - the reflected IR, and the emitted or thermal IR
 Radiation in the reflected IR region is used for RS purposes in ways very similar to radiation in the visible portion. It covers wavelengths from approximately 0.7 μm to 3.0 μm.
 The thermal IR region is quite different from the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat. The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm and gives information about surface temperature, which in turn can be related to the mineral composition of rocks or the condition of vegetation.

Electromagnetic Energy…
Electromagnetic spectrum
 Microwaves range covers wavelengths of 1mm to 1m. They provide
information on roughness and properties of surface such as water
content


Electromagnetic Energy…
Electromagnetic spectrum

Electromagnetic Energy…
Energy interactions in the atmosphere-particles & gases
 Scattering and absorption take place
 Scattering occurs when particles or large gas molecules present in
the atmosphere interact with and cause the electromagnetic
radiation to be redirected from its original path
 Factors affecting scattering are:
– Wavelength of the radiation;
– Abundance of particles and gases;
– Distance the radiation travels through the atmosphere;
– Number of gas particles per unit volume.
 Types of Scattering
– Rayleigh scattering
– Mie scattering
– Non-selective scattering


Electromagnetic Energy…
 Rayleigh scattering:
 particles very small compared to the wavelength of the
radiation-specks of dust or nitrogen and oxygen molecules.
 causes shorter wavelengths of energy to be scattered much more
than longer wavelengths.
 Rayleigh scattering is the dominant scattering mechanism in the
upper atmosphere. In the absence of particles and scattering the
sky would appear black.
 During the day the sun’s rays travel the shortest distance through
the atmosphere, hence Rayleigh scattering causes a clear sky to
appear blue.
 At sunrise and sunset, the sun‘s rays travel a longer distance
through the earth‘s atmosphere before they reach the surface
such that all the shorter wavelengths are scattered, only the
longer wavelengths reach the earth‘s surface, hence the sky
appears orange or red.

Electromagnetic Energy…
Effects of Rayleigh scattering


Electromagnetic Energy…
 Mie scattering:
 occurs when the particles are just about the same size as the
wavelength of the radiation.
 Dust, pollen, smoke and water vapour are common causes of
Mie scattering which tends to affect longer wavelengths.
 Mie scattering occurs mostly in the lower portions of the
atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast. It influences the
entire spectral region from the near-ultraviolet up to and
including the near-infrared.

Electromagnetic Energy…
 Non-selective scattering:
 occurs when the particles are much larger than the wavelength of
the radiation.
 Water droplets and large dust particles can cause this type of
scattering. Non-selective scattering gets its name from the fact
that all wavelengths are scattered about equally.
 This type of scattering causes fog and clouds to appear white to
our eyes because blue, green, and red light are all scattered in
approximately equal quantities.
 It is important to note that optical remote sensing cannot
penetrate clouds. This type of scattering causes parts of the
earth‘s surface to have shadows.


Electromagnetic Energy…
Absorption:
 molecules in the atmosphere absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation.
 atmospheric windows: areas of the spectrum which are not
severely influenced by atmospheric absorption and thus, are
useful to remote sensors.
• A window in the visible and reflected infrared region between 0.2 and 4 μm. This is the window where optical remote sensors operate.
• Three windows in the thermal infrared region: two narrow windows around 3 and 5 μm and a third window extending from 8 to 14 μm.

Electromagnetic Energy…
atmospheric windows


Electromagnetic Energy…
Energy interaction with the earth’s surface
 incident energy hits the earth’s surface- reflection, absorption and
transmission.
 RS systems operate in the wavelength regions where reflected
energy dominates or is prevalent-equation relating the incident,
reflected, transmitted and absorbed energy written as:
E_R(λ) = E_I(λ) − [E_A(λ) + E_T(λ)]
 The reflected energy is thus equal to energy incident on a given
feature, reduced by the energy that is either absorbed or transmitted
by a target/feature.
 Reflection occurs when radiation bounces off the target and is then
redirected; absorption occurs when radiation is absorbed by the
target and transmission occurs when radiation passes through a
target

Electromagnetic Energy…
 Specular Reflection: where all (or almost all) of the energy is
directed away from the surface (smooth) in a single direction.
 Diffuse Reflection (Lambertian): when surface is rough and
the energy is reflected almost uniformly in all directions
 Most earth surface features lie somewhere between perfectly specular and perfectly diffuse reflectors.

types of reflection


Electromagnetic Energy…
Spectral Reflectance Curves
 What do we measure in RS? The energy
reaching a particular surface is called irradiance
while the energy reflected by the surface is
called radiance.
 spectral reflectance curves show the fraction of
incident radiation that is reflected as a function
of the wavelength. Reflectance measurement
can be carried out in the laboratory or in the
field using a field spectrometer.

Electromagnetic Energy…
Spectral Reflectance Curves


Sensors & Platforms


Sensors:
 Sensors measure and record EM energy.
They are divided into two:
– Active Sensors: have their own source of
energy; measurements are more controlled;
include laser altimeter and radar.
– Passive Sensors: depend on external
source of energy e.g. sun; example is the
photographic camera

Sensors & Platforms…


Platforms:
 Sensors are mounted on platforms such as aircrafts and
satellites. Static satellites are also occasionally used.
 Airborne Observations: carried out using aircraft with
specific modifications to carry sensors. Ultra light
vehicles, balloons and kites are also sometimes used.
altitudes between 100m and 30-40km. RS data affected by
the altitude and orientation of the aircraft.
 Spaceborne Observations: sensors mounted on satellites
launched in space by rockets. Satellites for earth
observation are usually positioned in orbits between 150-
36000km altitudes. The specific orbit depends on
objective of mission-meteorology, urban areas


Sensors & Platforms…


Orbit Characteristics relevant to RS
 Altitude: distance from the satellite to mean sea level of
the earth. RS satellites orbit either at 600-800km (polar
orbit) or at 36000km (geostationary orbit) from the earth.
Altitude influences which area is viewed and at which
detail (resolution).
 Inclination angle: angle in degrees between the orbit and
equator. It determines, together with the field of view of
sensor, which latitudes can be observed. If the inclination
is 60°, then the satellite flies over the earth between latitudes 60° south and 60° north. It cannot observe parts of the earth at latitudes above 60°.

Sensors & Platforms…


Orbit Characteristics relevant to RS…
 Period: it is the time required to complete one full orbit in
minutes. A polar satellite orbits at 800km altitude and has
a period of 90 minutes. The speed of platform has
implications for the type of images that can be acquired
(‘time of exposure’).
 Repeat cycle: it is the time in days between two
successive identical orbits. The revisit time is the time
between two successive images of the same area. Revisit
time is determined by repeat cycle together with pointing
capability of sensor. Pointing capability refers to the
possibility of sensor-platform to ‘look sideways’


Sensors & Platforms…


Orbit Characteristics relevant to RS…
Orbit Types
 Polar/Near polar orbit: inclination angle is between 80° and 100°.
They enable observation of the whole globe; placed at altitude of
600-800km.
 Sun-Synchronous orbit: chosen such that the satellite always
passes overhead at same local solar time. Most of these satellites
cross the equator at mid morning (1030 hrs) when the sun is low
and the resultant shadows reveal terrain relief. They allow the
satellite to record images at two fixed times during one 24 hr
period. Landsat, SPOT and IRS are examples.
 Geostationary/geo-synchronous orbit: the satellite is placed above the equator (inclination angle is 0°) at a distance of 36000km. The period of the satellite is equal to the rotation period of the earth.

Sensors & Platforms…


Sensors & Platforms…

Sensors & Platforms…


RS Images
 are a measurement of EM energy and
therefore are more than a picture.
 Image data is stored in a regular grid format
(rows and columns) with the single element
called pixel or picture element.
 Each pixel has a Digital Number (DN) value.
Typically for each wavelength band, a
separate layer is stored.


Sensors & Platforms…


RS Images

Sensors & Platforms…


RS Images
The quality of image data is determined by the sensor-
platform system. The following are sensor-platform
characteristics that determine the quality of image:
 Spectral and Radiometric resolution: spectral resolution is
the part of EM spectrum measured while radiometric
resolution refers to the differences in the energy that can
be observed.
 Spatial resolution: it is the smallest unit area measured. It
indicates the minimum size of objects that can be
detected.
 Revisit time: it is the time between two successive image
acquisitions over same location on the earth.


Sensors & Platforms…


Characteristics of Image Data
 Image Size: number of rows and columns in a scene;
 Number of bands: number of wavelengths stored e.g. 1 band for
black/white photo; 4 bands for SPOT image
 Quantization: this is the data format used to store energy
measurements. For each measurement, one byte is used to store the
energy. The byte represents the discrete values 0-255 which are
known as digital numbers (DN). The DN values can be converted
into measured energy (watt).
 Ground pixel size: this is the area coverage of a pixel on the ground,
usually a round figure, e.g. 20m or 30m. It is related to the spatial
resolution but they are not necessarily the same.
 The image size, number of bands and quantization allow the disk space required to be computed, as in the sketch below. E.g. a SPOT multispectral image requires 3000(rows)x3000(columns)x4(bands)x1(byte)=36MB
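
A minimal sketch of that disk-space calculation, using the scene dimensions of the SPOT example above:

```python
# Disk space for a raw multiband scene, assuming 8-bit (1 byte) quantization.
rows, cols, bands, bytes_per_dn = 3000, 3000, 4, 1
size_bytes = rows * cols * bands * bytes_per_dn
print(size_bytes / 1e6, "MB")   # -> 36.0 MB for the SPOT multispectral example
```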

Sensors & Platforms…


Data Selection Criteria
• Spatio-Temporal characteristics: understand
information requirements for a specific application.
analyse the spatio-temporal characteristics of the
phenomena being investigated. E.g. monitoring fast
urban growth in an area or for studying slow
desertification in an area.
• Availability of Data: refers to data that has already
been acquired and resides in archives or data needs to
be acquired at your request. If up to date data is
required, the request has to be made to aerial survey
companies or RS data distributors.


Sensors & Platforms…


Data Selection Criteria…
 Cost of Data: the costs of different types of image
data can only be compared when calculated for a
specific project with specific data requirements. Data
in archives is usually cheaper than specially ordered
data. Care must also be taken on the quality of data
i.e. processing level. Costs of optical satellite data
vary from free (public domain) to 45 euro/km2 while
that of aerial photos depend on area size, photo scale,
film type, processing used and range between 5-20
euro/km2. The prices are for European conditions.

Sensors & Platforms…


Multispectral Scanners
 MSS record EM energy in multiple spectral bands (Visible, Near
IR, Mid IR, Thermal, Microwave) at different spatial resolutions
(Low, Medium, High, Very high).
 platforms are normally airborne or spaceborne. Multiband imaging
systems using film-based, digital, or video cameras generally sense
in three or four relatively wide wavelength bands from 0.3 to 0.9 μm.
 MSS sense in very narrow spectral bands, but over a greater range of the EM spectrum. They use electronic detectors and can extend the range from 0.3 μm to about 14 μm; this covers: UV, visible,
near-IR, mid-IR, and thermal IR.
 MSS images are acquired through across-track and along-track
scanning.
 Hyperspectral sensing acquires images in many very narrow,
contiguous spectral bands throughout the visible, near-IR, and mid-
IR portions of the EM spectrum.

Multispectral Scanners…
Across Track (Whisk Broom) Scanning
• scan the Earth in a series of lines oriented perpendicular to the
direction of motion of the sensor platform. Each line is scanned
from one side of the sensor to the other, using a rotating mirror
(A). As the platform moves forward over the Earth, successive
scans build up a two-dimensional image of the Earth´s surface.
• reflected or emitted radiation is separated into several spectral
components that are detected independently. The UV, visible,
near-infrared, and thermal radiation are dispersed into their
constituent wavelengths.
• internal detectors (B), each sensitive to a specific range of wavelengths, detect and measure the energy for each spectral band; the electrical signals are then converted to digital data and recorded for subsequent computer processing.


Multispectral Scanners…
Across Track (Whisk Broom) Scanning…
• The IFOV (C) of the sensor and the altitude of the platform
determine the ground resolution cell viewed (D), and thus the
spatial resolution.
• The angular field of view (E) is the sweep of the mirror,
measured in degrees, used to record a scan line, and determines
the width of the imaged swath (F).
• Airborne scanners typically sweep large angles (between 90º and
120º), while satellites, because of their higher altitude need only
to sweep fairly small angles (10-20º) to cover a broad region.
• Because the distance from the sensor to the target increases
towards the edges of the swath, the ground resolution cells also
become larger and introduce geometric distortions to the images.

Multispectral Scanners…
Across Track (Whisk Broom) Scanning…
• Also, the length of time the IFOV "sees" a ground resolution cell as
the rotating mirror scans (called the dwell time), is generally quite
short and influences the design of the spatial, spectral, and
radiometric resolution of the sensor.
• The scanner “sees” either a cone angle or a rectangular area on the ground.
For a cone angle: D = Hβ
Where:
D = diameter of the circular ground area viewed
H = flying height above terrain
β = IFOV of the system in radians
• Landsat, NOAA/AVHRR
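
As a quick numerical check of D = Hβ, a minimal sketch; the IFOV and flying height below are illustrative assumptions:

```python
def ground_resolution(flying_height_m: float, ifov_rad: float) -> float:
    """Diameter D (m) of the circular ground area viewed, D = H * beta."""
    return flying_height_m * ifov_rad

# Example: a 2.5 mrad IFOV flown at 1000 m views a 2.5 m ground cell
print(ground_resolution(1000.0, 2.5e-3))
```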


Multispectral Scanners…
Across Track (Whisk Broom) Scanning…

Multispectral Scanners…
Along Track (Push Broom) Scanning
• These systems also use the forward motion of the platform to
record successive scan lines, perpendicular to the flight
direction that build up a two-dimensional image.
• use a linear array of detectors (A) (also Charge-Coupled Devices-CCD) located at the focal plane of the image (B)
formed by lens systems (C), which are "pushed" along in the
flight track direction.
• Each individual detector measures the energy for a single
ground resolution cell (D). A separate linear array is required
to measure each spectral band or channel. For each scan line,
the energy detected by each detector of each linear array is
sampled electronically and digitally recorded-at any instant, a
row of pixels is formed.


Multispectral Scanners…
Along Track (Push Broom) Scanning…
• have the ability of off-track viewing: the scanner points to the left
and right of the orbit track or back and forth along the track. This
has the advantage of enabling acquisition of stereo images and also
imaging areas not covered by clouds at that particular moment.
• SPOT, IKONOS, Orbview3
Advantages over across-track scanning systems:
– Each detector has a longer dwell time in which to accumulate
energy from a ground resolution cell;
– Higher signal to noise ratio (SNR or S/N);
– Greater range in signal levels that can be sensed hence better
radiometric resolution;
– Geometric integrity better due to fixed relationship among
detectors. Variations in the scan mirror velocity of across-track
scanners not present in along-track scanners.

Multispectral Scanners…
Along Track (Push Broom) Scanning…
– Along-track scanners generally smaller, lighter and require less
power.
– No moving parts mean higher reliability, and thus a longer life.
– The disadvantage of pushbroom scanners is that one has to
calibrate many detectors in the array to achieve uniform
sensitivity.


Multispectral Scanners…
Along Track (Push Broom) Scanning…

Major Earth Observing Satellites


 Operational MSS can be divided into three major groups: low
resolution satellites; high resolution and very high resolution
satellites.
 characteristics of the satellites that fall in the different resolution
groups
• Altitude
• Inclination angle
• Orbit
• Revisit time
• Scene size/Resolution
• Sensor
• Bands
• Applications
• Year of launch, Operational?, Scanner, owner etc


Major Earth Observing Satellites…


Low Resolution Satellites (50-500m)
 NOAA/AVHRR
 Meteosat
 MODIS
 Envisat
 Landsat
 SPOT etc
High Resolution Satellites (5-50m)
 Landsat
 SPOT
 IRS
 ASTER etc

Major Earth Observing Satellites…


Very High Resolution Satellites (0.5-5m)
 IKONOS
 QB
 EROS
 World View
 Geo Eye
 Sentinel etc


Major Earth Observing Satellites…


LandSat Programme-US
Year                        Programme   Sensor on board   Resolution (m)
1972                        Landsat-1   MSS               80
1975-1982                   Landsat-2   MSS               80
1978-1983                   Landsat-3   MSS               80
1982-1987                   Landsat-4   TM                30
1984 to date                Landsat-5   TM                30
1993 (failed after launch)  Landsat-6   ETM               30
1999 to date                Landsat-7   ETM+              30
2013 to date                Landsat-8   OLI; TIRS         30 (OLI); 100 (TIRS)
2021 to date                Landsat-9   OLI-2; TIRS-2     30; 100

Major Earth Observing Satellites…


Principal applications of TM bands
Band   Wavelength (μm)   Applications
1      0.45-0.52         Coastal water mapping; soil-vegetation discrimination; forest type mapping
2      0.52-0.60         Designed to measure the green reflectance peak of vegetation; used for vegetation discrimination
3      0.63-0.69         Designed to sense in the chlorophyll absorption region, aiding plant species discrimination
4      0.76-0.90         Determining vegetation types; biomass content; delineating water bodies; soil moisture discrimination
5      1.55-1.75         Indicates vegetation moisture content & soil moisture; differentiation of snow from clouds
6      10.4-12.5         Vegetation stress analysis; soil moisture discrimination; thermal mapping applications
7      2.08-2.35         Discrimination of mineral and rock types; vegetation moisture content


Major Earth Observing Satellites…

 Multispectral: 3-7 bands; 50-120nm band width; low spectral resolution; high spatial resolution
 Hyperspectral: 10 or more (often hundreds of) bands; 1-5nm band width; high spectral resolution; low spatial resolution


Digital Image Enhancement


 improvement of image quality to a better, more interpretable level for feature extraction or image interpretation.
 Objective-improve the appearance of an image (visual
interpretability) and the process is usually subjective and
interactive.
 Human mind is good at interpreting spatial attributes on
an image and can detect and identify subtle features but
the human eye is poor in discriminating small radiometric
or spectral differences.
 Digital enhancement aims at visually amplifying small
differences for visualization.


Digital Image Enhancement…


two categories of elementary image enhancement:
 Histogram Operations: aim at global contrast
enhancement. They look at pixels without
considering where they are in the image and assign
a new value to a pixel by making use of look up
tables which are set up from image statistics.
 Filtering Operations: enhance brightness and edges, and suppress unwanted image details. A filter is a local operator computing new pixel values based on the values of pixels in the local neighbourhood.

Digital Image Enhancement…


Representation of histogram data:
 Tabular form:
– DN-Digital numbers in the range [0.....255]
– Npix-the number of pixels in the image with this DN
(frequency)
– Perc-frequency as a percentage of the total number of
image pixels.
– CumNpix-cumulative number of pixels in the image
with values less than or equal to DN.
– CumPerc-Cumulative frequency as a percentage of the
total number of image pixels.


Digital Image Enhancement…


• Histogram Data can further be summarized in
some statistical characteristics e.g. Mean, Standard
deviation (statistical measure of the spread of
values around the mean), Minimum, and
Maximum.
• A narrow histogram (small standard deviation)
represents an image of low contrast because all the
DNs are similar and mapped to only a few grey
values.

Digital Image Enhancement…


 Graphical form


Digital Image Enhancement…


 Single band image display
– Histogram is used for optimum display of single band
images. The images are displayed using a grey scale. Grey
shades of the monitor range from black (value 0) to white
(value 255).
– using the original values to control the way an image is
displayed on a monitor results in an image with little
contrast since only a limited number of grey values are
used.
– In order to make use of the range of grey values, a transfer
function is used to map DN-values into grey shades on the
monitor. The transfer function can be chosen in a number
of ways: linear contrast stretch; and histogram equalization.

Digital Image Enhancement…


 Linear contrast stretch
– the lowest input DN of interest becomes zero and the highest
becomes 255. This procedure is enabled by transfer functions.
The monitor will display zero as black and 255 as white. Linear
stretch is straight forward method of contrast enhancement,
giving fair results when the input image has narrow histogram
but close to uniform distribution.
 Histogram equalization
– Here DN values are distributed as equally as possible over the
available grey levels. It aims at achieving a more uniform
histogram. It is a non-linear transformation.
 Contrast enhancement amplifies small differences in the data so that
we can easily visually differentiate features but it does not increase
the information content of the data.
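
A minimal sketch of a linear contrast stretch, assuming a single-band 8-bit image held in a NumPy array; the DN range 60-110 is an illustrative narrow histogram, not data from the text:

```python
import numpy as np

def linear_stretch(band: np.ndarray, dn_min: int, dn_max: int) -> np.ndarray:
    """Map dn_min -> 0 and dn_max -> 255 via a linear transfer function."""
    scaled = (band.astype(float) - dn_min) / (dn_max - dn_min) * 255.0
    return np.clip(scaled, 0, 255).astype(np.uint8)

img = np.random.randint(60, 111, size=(100, 100))   # narrow-histogram image
stretched = linear_stretch(img, 60, 110)
print(stretched.min(), stretched.max())             # -> 0 255 for this example
```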


Digital Image Enhancement…


 Filter Operations
– Filtering carried out on a single band. Filters can be used
for image enhancement to reduce noise (smooth an image)
or to sharpen a blurred image.
– Filter operations also used to extract features from images
e.g. edges and lines for automatically recognizing patterns
and detecting objects.
– A large group of filters are the linear filters. They calculate
the new value of a pixel as a linear combination of the
given values at the pixel under consideration and its
neighbours. Linear filters are defined through kernels. The
kernel specifies the size of the neighbourhood that is
considered i.e. 3 by 3; 5 by 5; or 3 by 1 and the coefficients
for the linear combination.

Digital Image Enhancement…


 Filter Operations

Noise Reduction
Consider kernel (a) below in which all values are 1. This
implies that values of the nine pixels in the neighbourhood are
summed up and the result divided by 9. In this case the gain is
1/9.


Digital Image Enhancement…


 Filter Operations

(a) averaging filter; for noise reduction; gain is 1/9;


smoothens/blurred image
(b) less drastic blurring; horizontal & vertical neighbours influence results more strongly than diagonal ones

Digital Image Enhancement…


 Filter Operations
Edge Enhancement
filtering emphasizes local differences in grey
values for example related to linear features such
as roads. This is done using an edge enhancing
filter which calculates the difference between
central pixel and its neighbours.
(c) above-implemented using negative values for
the non-central kernel values. The gain is
calculated as 1/(16-8)=1/8. The sharpening effect
can be made stronger by using smaller values for
the central pixel with a minimum of 9.
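
A minimal sketch of both kernel types, assuming the 3x3 averaging kernel (a) and the edge-enhancing kernel (c) described above, applied with SciPy's convolution; the test image is random:

```python
import numpy as np
from scipy.ndimage import convolve

smooth = np.ones((3, 3)) / 9.0             # kernel (a): gain 1/9, blurs / reduces noise
edge = np.array([[-1., -1., -1.],
                 [-1., 16., -1.],
                 [-1., -1., -1.]]) / 8.0   # kernel (c): gain 1/(16-8) = 1/8, sharpens

img = np.random.randint(0, 256, size=(50, 50)).astype(float)
smoothed = convolve(img, smooth, mode="nearest")
sharpened = convolve(img, edge, mode="nearest")
```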


Pre-processing & Restoration


 Rectification, visualization and interpretation
 Involves radiometric and geometric corrections

Pre-processing & Restoration…


 Radiometric Corrections
Cosmetic rectification: compensate for data loss
Atmospheric corrections: compensate for effect of
atmospheric and illumination parameters e.g. haze, sun
angle and skylight on image data

 Cosmetic corrections
– remove visible errors and noise in the image data.
– Defects in the data may be in the form of:
• Periodic or random missing lines (line dropouts),
• Line striping and
• Random or spike noise.


Pre-processing & Restoration…


Periodic line dropouts
 occur as a result of recording problems when one of the
detectors of the sensor in question gives wrong data or stops
functioning.
 Restoration involves:
• Calculate average DN-value per scan line for the entire scene;
• The average DN-value for each scan line is compared with the
scene average;
• Any scan line deviating from the average by more than a given
threshold is said to be defective.
• The next step is to replace the defective lines. For each pixel in
the defective line an average DN is calculated using DNs for the
corresponding pixel in the preceding and succeeding scan lines.
The average DN is substituted for the defective pixel.
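
A minimal sketch of this restoration procedure in NumPy; the threshold value is an assumption, and defective first/last lines are not handled:

```python
import numpy as np

def repair_dropouts(img: np.ndarray, threshold: float) -> np.ndarray:
    """Replace scan lines whose mean DN deviates from the scene mean by
    more than `threshold` with the average of the adjacent lines."""
    out = img.astype(float).copy()
    scene_mean = out.mean()
    for r in range(1, out.shape[0] - 1):
        if abs(out[r].mean() - scene_mean) > threshold:
            out[r] = (out[r - 1] + out[r + 1]) / 2.0   # interpolate dropout
    return out
```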

Pre-processing & Restoration…


Line Striping
 more common than line dropouts. It occurs due to non-
identical detector response. Note that all detectors for all
satellite sensors are carefully calibrated and matched
before the launch of the satellite. However with time the
response of some detectors may drift to either lower or
higher levels thus resulting in either brighter or darker
scan lines recorded by that detector.
 restoration involves the use of histogram matching. For
each detector unit a histogram is constructed and taking
one response as the standard or reference, all other
detector units are suitably adjusted and new DN-values
computed and assigned.


Pre-processing & Restoration…


Random Noise/Spike
 Periodic line drop-outs and line striping are examples of non-random noise which are easily recognized and restored.
 Random noise is restored by digital filtering. Causes of
random noise may be due to errors during transmission of data
or temporary disturbance. The effect is that individual pixels
acquire DN values which are much higher or lower than the
surrounding pixels.
 In the image, these pixels appear as bright and dark spots that
interfere with information extraction process.
 A spike noise can be identified by comparing with the
neighbouring pixel values. If neighbouring pixel values differ
by more than a specific threshold, it is designated as a spike
noise and the DN is replaced by an interpolated DN-value.
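
A minimal sketch of spike-noise detection and replacement; here the 3x3 neighbourhood median stands in for the "neighbouring pixel values" comparison and supplies the interpolated DN:

```python
import numpy as np
from scipy.ndimage import median_filter

def despike(img: np.ndarray, threshold: float) -> np.ndarray:
    med = median_filter(img.astype(float), size=3)   # neighbourhood estimate
    out = img.astype(float).copy()
    spikes = np.abs(out - med) > threshold           # isolated bright/dark spots
    out[spikes] = med[spikes]                        # substitute interpolated DN
    return out
```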

Pre-processing & Restoration…


Atmospheric Corrections
 The attenuation (decrease) of all reflected and emitted radiation is due to absorption and scattering by the constituents of the atmosphere.
 These distortions are wavelength dependent and
their effect on remote sensing data can be reduced
by applying atmospheric correction techniques.
 These corrections are related to the influence of
haze, sun angle and skylight.


Pre-processing & Restoration…


Haze
• has an additive effect that results in higher DN-values and a
decrease in the overall contrast in the image data.
• more pronounced in the shorter wavelength and negligible in
the IR.
Sun angle correction
• The position of the sun relative to the earth changes depending
on the time of the day and day of the year. In the northern
hemisphere, the solar elevation angle is smaller in winter than
in summer. This implies that image data of different seasons
are acquired under different solar illumination.
• Sun angle correction becomes important when one wants to
generate mosaics or perform change detection studies.

Pre-processing & Restoration…


Sky light correction
• Scattered light reaching the sensor after being reflected
from the earth’s surface constitutes the skylight or sky
irradiance. This causes a reduction in contrast of the
image.
• Correcting for this effect requires additional information
e.g. aerosol distribution, gas composition which is
difficult to obtain.


Pre-processing & Restoration…


Geometric Corrections
Taken into account when data is being used to:
 Derive 2-D (x,y) and 3-D (x,y,z) co-ordinate information.
2-D geometric descriptions of objects (points, lines, areas)
can be derived from a single image or photo. 3-D can be
derived from stereo pairs of images or photos.
 Merge different types of image data for integrated
processing and analysis e.g. land cover classification
based on Landsat and SPOT (geocoded)
 Visualize the image data in a GIS environment
(georeferenced to the co-ordinate system).

Pre-processing & Restoration…


Relief displacement
 One of the characteristics of most sensor systems is the
distortion of the geometric relationship between image
data and the terrain caused by relief differences on the
ground. This effect is more critical in aerial photographs
and airborne scanner data.
 The effect of relief displacement is that inaccurate or
wrong co-ordinates might be determined e.g. when
digitizing from image data.
 Relief displacement can be corrected for if information on
the terrain topography is available in form of a DTM.


Pre-processing & Restoration…


Relief displacement…

 two approaches in geometric corrections:


 Two Dimensional; and
 Three Dimensional.

Pre-processing & Restoration…


Two Dimensional Approaches
 Applicable to images where relief displacement can be neglected
 georeferencing
 geocoding

Three Dimensional Approaches


 The vertical extent is considered in mapping in two types of
situations:
 When we want 2D data but terrain under consideration has large
elevation differences. These cause relief displacement which result
in errors in the computed map coordinates.
 When we want 3D data-DTM & DSM (DEMs)


Digital Image Classification


Image Space
 digital image is a 2D array of pixels.
 The value of the DN of a pixel is in the case of 8 bits recording in
the range 0-255.
 The DN corresponds to the energy reflected or emitted from the
ground resolution cell.
 The spatial distribution of DNs defines the image or image space.
 A multispectral sensor records radiation from a particular ground
resolution cell in different channels according to its spectral band
separation.
 A sensor recording in four bands yields four pixels with the same
row-column tuple stemming from one and the same ground
resolution cell.

Digital Image Classification…


Image Space


Digital Image Classification…


Feature Space
• Consider a two band image. The 2 DN values for a ground
resolution cell can be considered to be components of a 2D vector
called the feature vector. A feature vector could be represented by
[13,15] which means that the conjugate pixels of band 1 and band 2
have the DN values 13 and 15 respectively.
• A feature space is a graph that shows feature vectors. The feature
space is also called a feature space plot or scatter plot.
• Plotting all feature vectors of digital image pairs yields a 2D scatter
plot of many points. This 2D scatter plot provides information about
pixel value pairs that occur within a 2 band image. Some
combinations will occur more frequently which can be visualized
by using intensity or color.

Digital Image Classification…


Feature Space


Digital Image Classification…


Distances & Clusters

The distance between [10,10] and [40,30] is given by the square root of (40-10)² + (30-10)². For three or more dimensions, the distance is calculated in a similar way.
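
A minimal sketch of this distance computation, using the feature vectors from the example:

```python
import numpy as np

a = np.array([10, 10])
b = np.array([40, 30])
# Euclidean distance: sqrt((40-10)**2 + (30-10)**2) ~ 36.06
print(np.linalg.norm(b - a))
```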

Digital Image Classification…


Image Classification
 A scatter plot shows the distribution of conjugate pixel values
of a two band image.
 Feature vectors of ground resolution cells for a particular class
say water will form a compact cluster.
 The different clusters corresponding to different classes will
occupy their own area in the feature space although there are
cases where some classes overlap say wheat and grass
 The basic assumption in image classification is that a specific
part of the feature space corresponds to a specific class.
 Once the classes have been defined in the feature space, each
feature vector of a multi band image can be plotted and
checked against these classes and assigned to the class where
it fits best.


Digital Image Classification…


Image Classification Process
 Selection and preparation of RS images: depending on land cover
types or whatever needs to be classified, the most appropriate
sensor, the most appropriate dates of acquisition and the most
appropriate wavelength bands should be selected;
• Definition of clusters in the image space: two approaches possible-
supervised and unsupervised classification;
• Selection of classification algorithm:
• Running the actual classification: based on DN values, each
multiband pixel in the image is assigned to one of the predefined
classes.
• Validation of results: once a classified image has been produced, its
quality is assessed by comparing it to reference data (ground truth).

Digital Image Classification…

Classification Algorithms

 Box Classifier

 Minimum Distance to Mean Classifier (sketched below)

 Maximum Likelihood Classifier
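
As an illustration of one of these algorithms, a minimal sketch of the minimum-distance-to-mean classifier; the class means are invented training values, not data from the text:

```python
import numpy as np

class_means = {                       # assumed means from training samples
    "water":  np.array([12.0, 10.0]),
    "forest": np.array([40.0, 60.0]),
    "soil":   np.array([70.0, 50.0]),
}

def classify(feature_vector: np.ndarray) -> str:
    """Assign the pixel to the class whose mean is nearest in feature space."""
    return min(class_means,
               key=lambda c: np.linalg.norm(feature_vector - class_means[c]))

print(classify(np.array([38.0, 55.0])))   # -> "forest"
```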


3. Satellite Positioning
Global Navigation Satellite Systems (GNSS)

 GNSS is a general term describing any satellite


constellation that provides positioning, navigation, and
timing (PNT) services on a global or regional basis.

 GPS is the most prevalent GNSS-other nations are


fielding, or have fielded, their own systems to provide
complementary, independent PNT capability.

Others:
 BeiDou Navigation Satellite System (BDS)
BeiDou, or BDS, is a global GNSS owned and operated
by the People's Republic of China. BDS was formally
commissioned in 2020. The operational system consists
of 35 satellites. BDS was previously called Compass
http://en.beidou.gov.cn/
 Galileo
Galileo is a global GNSS owned and operated by the
European Union. Initial Services started in 2016 and
plans to complete the system of 24+ satellites in 2021.
https://ec.europa.eu/defence-industry-space/eu-space-policy/galileo_en


 GLONASS
Globalnaya Navigazionnaya Sputnikovaya Sistema, or
Global Navigation Satellite System is a global GNSS
owned and operated by the Russian Federation. The fully
operational system consists of 24+ satellites
https://www.glonass-iac.ru/en/about_glonass/
 Indian Regional Navigation Satellite System (IRNSS) /
Navigation Indian Constellation (NavIC)
IRNSS is a regional GNSS by the Government of India.
IRNSS is an autonomous system designed to cover the
Indian region and 1500 km around the Indian mainland.
The system consists of 7 satellites. In 2016, India renamed
IRNSS as the Navigation Indian Constellation (NavIC,
meaning "sailor" or "navigator").
https://www.isro.gov.in/irnss-programme

 Quasi-Zenith Satellite System (QZSS)


QZSS is a regional GNSS owned by the Government of
Japan and operated by QZS System Service Inc. (QSS).
QZSS complements GPS to improve coverage in East
Asia and Oceania.
Japan declared the official start of QZSS services in 2018
with 4 operational satellites, and plans to expand the
constellation to 7 satellites by 2023 for autonomous
capability.

Web page: https://qzss.go.jp/en/


Satellite Positioning…
NAVSTAR GPS
 Navigation Satellite Timing and Ranging (NAVSTAR)
Global Positioning System (GPS)
Space Segment
 Transmit radio signals to users;
 At least 24 satellites available 95% of the time;
 31 operational satellites for over a decade;
 4 atomic clocks on each satellite;
 6 orbital planes (4 slots of baseline satellites);
 More than 24 satellites flown to maintain coverage when
baseline satellites serviced/decommissioned

Satellite Positioning…
 In 2011, the 24-slot constellation was expanded: 6 satellites were repositioned and 3 extra satellites became part of the baseline constellation (27 in total)
 55° inclination angle;
 Period-11 hours 58 minutes (orbits the earth twice a day).
 The 20,200 km altitude of the GPS satellites permits more satellites to be seen simultaneously from virtually anywhere on Earth.


Satellite Positioning…

GPS Satellite
Constellation

Satellite Positioning…
 mix of old and new
satellites.

 current and future


generations of GPS
satellites, including
Block IIA (2nd generation,
"Advanced"), Block IIR
("Replenishment"),
Block IIR-M
("Modernized"), Block IIF
("Follow-on"), GPS III, and
GPS IIIF ("Follow-on").

As of June 15, 2021, there were 31 operational satellites in the GPS constellation, not including the decommissioned, on-orbit spares


Satellite Positioning…
Control Segment

 Ground facilities;

 Track satellites;

 Monitor satellite transmissions;

 Perform analysis;

 Send commands/data to constellation

Satellite Positioning…
Consists of:

 Master control station;

 Alternate master control station;

 11 command and control antennas;

 16 monitoring sites


Satellite Positioning…


Satellite Positioning…
User Segment
 Antennas and Receivers;
 Provide-position; time; navigation
 Application requirements
accuracy;
reliability;
operational constraints;
user hardware;
data processing algorithms;
latency of GPS results etc

 Groups of applications:
 Land, Sea and Air Navigation and Tracking, including
enroute as well as precise navigation, collision avoidance,
cargo monitoring, vehicle tracking, search and rescue
operations, etc.

 Surveying and Mapping, on land, at sea and from the air.


Includes geophysical and resource surveys, GIS data capture
surveys, etc.

 Military Applications: equipment developed to "military specifications", with greater emphasis placed on system reliability.


 Recreational Uses, on land, at sea and in the air. The


primary requirement is for low cost instruments which
are very easy to use.

 Other specialised uses, such as time transfer, attitude


determination, spacecraft operations, atmospheric
studies, etc. such applications require specially
developed, high cost systems, often with additional
demanding requirements such as real-time operation,
etc.

Satellite Positioning…
User Segment


Satellite Positioning…

Positioning Principle
 Computation of receiver position is similar to distance resection.
 Pseudoranges to satellites are measured
 Satellite positions are known
 To each satellite the range (pseudorange) d is measured from the receiver position

GNSS receivers use an inexpensive crystal clock which is not precisely set to GPS
time. The distances thus measured are either shorter or longer than the true ones and
therefore they are not true ranges and are called pseudoranges


Positioning Principle…
R_P^S = c(Δt − e)

 R_P^S is the true range from satellite S to receiver P, e is the receiver clock error, and Δt is the signal flight time. In terms of satellite coordinates, we write for each satellite

[(X_1 − X_P)² + (Y_1 − Y_P)² + (Z_1 − Z_P)²]^(1/2) = c(Δt − e)

 Δt is the observable; the unknowns are X_P, Y_P, Z_P and the clock error e.

 Solution-four satellites must be in view
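
A minimal sketch of how those four unknowns can be solved by iterative least squares; the satellite ECEF coordinates and pseudoranges would come from the receiver, and the function assumes at least four satellites are in view:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s

def solve_position(sats: np.ndarray, pseudoranges: np.ndarray,
                   iters: int = 10) -> np.ndarray:
    """sats: (n,3) satellite ECEF coordinates (m); pseudoranges: (n,) metres.
    Returns [X_P, Y_P, Z_P, clock_bias] with the bias in metres (c*e)."""
    x = np.zeros(4)
    for _ in range(iters):
        rho = np.linalg.norm(sats - x[:3], axis=1)    # geometric ranges
        predicted = rho + x[3]                        # add clock bias term
        # Jacobian: unit vectors from satellites to receiver, plus a bias column
        A = np.hstack([(x[:3] - sats) / rho[:, None],
                       np.ones((len(sats), 1))])
        dx, *_ = np.linalg.lstsq(A, pseudoranges - predicted, rcond=None)
        x += dx
    return x
```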

GNSS Observation Techniques


Code Pseudorange Measurement

 The C/A code pseudorange receivers use only the C/A- code.
The C/A code receivers work by generating a replica of the
GPS signal. There is a time shift between the receiver
generated signal and the emitted signal.

 This time shift is the signal flight time Δt, that the signal
takes to reach the receiver.

 The product of this flight time and the speed of light, c, is the
distance between the receiver and the satellite (pseudorange).


GNSS Observation Techniques…


Phase Pseudorange Measurements

 The phases of the carriers L1 and L2 are the observables. The


carriers have very short wavelengths (0.19m for L1 and 0.24 m for
L2), which means it is possible to observe the ranges with
millimeter precision

 The ambiguity N, complete integer number of wavelengths for


each signal from the satellite to the receiver is determined.

 The fractional part of the wavelength that completes the whole


range is the phase. It is this phase, Δλ that is measured.

 R = Nλ + Δλ, where R is the range
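
A minimal sketch of R = Nλ + Δλ, assuming the integer ambiguity N has already been resolved; all numbers are illustrative:

```python
L1_WAVELENGTH = 0.1903   # m, approximate GPS L1 carrier wavelength

def phase_range(n_cycles: int, fractional_phase_m: float) -> float:
    """Range R = N * lambda + delta_lambda (all lengths in metres)."""
    return n_cycles * L1_WAVELENGTH + fractional_phase_m

print(phase_range(105_000_000, 0.042))   # ~2.0e7 m, illustrative only
```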

Error Sources in GNSS


The final position of the survey point is influenced by:
 the error in the range measurement

 the satellite- receiver geometry (Geometric Dilution of


Precision-GDOP)

 the accuracy of the satellite ephemeris-orbit

 the effect of atmospheric refraction

 the processing software.


Error Sources in GNSS…


Major error sources:

 Receiver Clock Error

 Satellite Clock Error

 Satellite Orbital Error

 Ionospheric Error

 Tropospheric Errors

 Multipath Errors

GNSS Positioning Methods


[a] Point Positioning

 C/A mode

 One receiver

 Stand alone positioning

 ±3-10m accuracy

 Land, marine, air navigation


[b] DGPS (C/A Code)

 C/A mode

 Reference/base + rover + comm link

 Relative positioning

 ±1-2m accuracy

 High precision applications


[c] DGPS (C/A + Phase mode)

 C/A + Phase difference mode

 Receiver + Reference/base + computer + S/W

 Relative positioning

 Kinematic/Post processed

 ±2 cm accuracy [over 80km real time]; communication


between GNSS receivers

 ±5 mm [baseline measurement]; post processed- no


communication between GNSS receivers.


DGPS Real-Time
(RTK Surveying)

DGPS Post-Processing
Survey grade accuracies


Heights from GNSS


Earth Models:
 Plane-plane surveying
 Sphere-geodetic surveys

 Oblate Spheroid/Ellipsoid of revolution-geodetic surveys

Heights:
 Ellipsoidal/Geodetic/Geometrical heights: above
reference ellipsoid
 Orthometric/Physical heights-sensitive to gravity; above
MSL/Mean Ocean Surface

Heights from GNSS…


Geoid:

 Equipotential surface that coincides with the mean ocean surface of the earth-if oceans and atmosphere were in equilibrium, at rest relative to the rotating earth, as extended through the continents

 Not as rough as planet earth geography

 Shows difference (geoidal undulation) of about +78m


and -108m compared to best fitting rotational ellipsoid
(WGS 84 for GPS)


 The current best-fitting global geoid model is EGM2008, published in 2008 by the US National Geospatial-Intelligence Agency

 Comparisons of global and local geoid models reveal accuracy discrepancies on the order of decimetres, and larger for mountainous regions - not sufficient for surveying applications

 Continental/national geoids aligned to global/continental ellipsoids are compiled using local gravity data, measurements of the deflection of the vertical, and GNSS/levelling data

 These allow accuracies of up to 1-3 cm (flat regions) and 3-5 cm (mountainous regions)

 Sometimes transformation between orthometric and local height systems is required

 GNSS delivers ellipsoidal/geodetic [φ,λ,h] and Cartesian [X,Y,Z] coordinates

 Surveying, mapping and engineering applications use physical/orthometric heights

 Heights from GNSS should therefore not be used for setting out engineering structures, but can be used for feasibility studies and during recce missions


 h: geodetic/ellipsoidal/geometric height from GNSS

 H: physical/orthometric height from differential levelling

 N: geoid undulation, N = h - H

 The geoid approximates the MSL used as reference for physical heights
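A one-line worked example of the conversion (the undulation value below is made up; in practice N is interpolated from a geoid model):

```python
# Minimal sketch: orthometric height from a GNSS ellipsoidal height, H = h - N.
h = 1538.42   # m, ellipsoidal height from GNSS
N = -14.87    # m, geoid undulation from a geoid model (illustrative)
H = h - N     # m, orthometric height (above the geoid, approx. MSL)
print(f"H = {H:.2f} m")  # H = 1553.29 m
```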

GNSS Advantages

All-weather operation: works in rain, clouds, sun, snow, etc.

Provides coordinates on a global system.

24-hour capability

World-wide availability

No subscription fee (GPS, GLONASS)

No line of sight needed between terrestrial points

Disadvantages?


C.A.T 2020/2021 Time: 30 Minutes
1. GNSS play an important role in the provision of
timing, positioning and ranging services. Discuss any
four disadvantages of these systems (6 Marks)
2. Discuss five possible limitations of remotely sensed
data (5 Marks)
3. Describe four application areas of GIS and RS in
Civil Engineering (6 Marks)
4. Define ‘GIS database’. Describe four functions of the
database (5 Marks)
5. Why are heights from GNSS not recommended for
setting out civil engineering structures? (3 Marks)

4. Photogrammetry
science and art of determining qualitative and quantitative
characteristics of objects from the images recorded on
photographic emulsions

Types of Photogrammetry:
 Terrestrial Photogrammetry: camera is supported on the
surface of the earth. Typically the camera is mounted on a
tripod and the camera axis is oriented in or near the
horizontal plane. When the distance to the object is less
than approximately 300 m, the method is often referred to
as close-range photogrammetry.
 Aerial Photogrammetry: camera mounted on an aircraft or other airborne/space-borne platform; photographs captured from above the ground


Cameras
 Metric cameras: stable and precisely known internal geometries and very low lens distortions. The principal distance is constant, which means the lens cannot be refocused when taking photographs. As a result, metric cameras are only usable within a limited range of distances to the object.
 Stereometric camera: If an object is photographed from two different positions, the line between the two projection centers is called the "base". If both photographs have viewing directions which are parallel to each other and at a right angle to the base (the so-called "normal case"), then they have similar properties to the two images of our retinas. Therefore, the overlapping area of these two photographs (which are called a "stereo-pair") can be seen in 3D, simulating man's stereoscopic vision.

Cameras…
• Non-metric/'Amateur' cameras: The photogrammetrist speaks of an "amateur camera" when the internal geometry is not stable and not precisely known, as is the case with any commercially available camera. Therefore, they can only be used for purposes where no high accuracy is demanded.


Data Acquisition
The remotely received information (sensor: camera; platform: aeroplane) can be grouped into four categories:
• Geometric-involves the spatial position and the shape
of objects.
• Physical-refers to properties of electromagnetic
radiation, e.g., radiant energy, wavelength etc
• Semantic-related to the meaning of an image. It is
usually obtained by interpreting the recorded data.
• Temporal-related to the change of an object in time,
usually obtained by comparing several images which
were recorded at different times.


History of Photogrammetry

Major Phases in Photogrammetry

C.A.T 2021/2022 Time: 30 Minutes

1. Using a clear diagram, differentiate between orthometric and ellipsoidal heights (5)

2. Explain how the position of real world features is represented in a GIS (6)

3. Describe digital image classification of remotely sensed data (5)

4. Outline four differences between an aerial photograph and a map (4)


C.A.T 2022/2023 Time: 30 Minutes

1. Describe THREE possible limitations of remotely sensed data (9 marks)

2. In terms of APPLICATIONS and with a clear diagram, differentiate between orthometric and ellipsoidal heights (6 marks)

3. Briefly describe the parameters that define features in a GIS (9 marks)

4. Outline FOUR differences between airborne and space-borne remote sensing (6 marks)

Classifications of Aerial Photos

(a) Mainly classified according to the orientation of the camera axis

• true vertical photograph: the camera axis is perfectly vertical (identical to the plumb line through the exposure center). Such photographs hardly exist in reality.

• near vertical photograph: the camera axis is nearly vertical. The deviation from the vertical is called tilt. It must not exceed the mechanical limitations of the stereoplotter to accommodate it. Gyroscopically controlled mounts provide stability of the camera so that the tilt is usually less than two to three degrees.


Classifications of Aerial Photos…

• oblique photograph: the camera axis is intentionally tilted between the vertical and horizontal. A high oblique photograph is tilted so much that the horizon is visible on the photograph. A low oblique does not show the horizon.

• The total area photographed with obliques is much larger than that of vertical photographs. The main application of oblique photographs is in reconnaissance.

Classifications of Aerial Photos…


Classifications of Aerial Photos…

(b) Angular Coverage

• The angular coverage is a function of focal length and format size. Since the format size is almost exclusively 9'' × 9'', the angular coverage depends on the focal length of the camera only; a small calculation is sketched below.
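The sketch computes the angular coverage across the format diagonal for a few typical focal lengths (the focal-length/class pairings quoted are the usual textbook values, given here for illustration):

```python
# Minimal sketch: angular coverage of a 9" x 9" (229 mm) aerial camera
# format as a function of focal length, measured across the diagonal.
import math

FORMAT_SIDE_MM = 229.0                 # 9 inch format side
diag = FORMAT_SIDE_MM * math.sqrt(2)   # format diagonal, mm

for f_mm, camera_class in [(88, "super-wide"), (152, "wide"),
                           (210, "intermediate"), (305, "normal")]:
    coverage = 2 * math.degrees(math.atan(diag / (2 * f_mm)))
    print(f"f = {f_mm:3d} mm ({camera_class}): ~{coverage:.0f} deg")
```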

Classifications of Aerial Photos…

(c) Emulsion Type

• panchromatic black and white: most widely used type of emulsion for photogrammetric mapping;

• color photography: mainly used for interpretation purposes. Recently, color is increasingly being used for mapping applications;

• infrared black and white: since infrared is less affected by haze, it is used in applications where weather conditions may not be as favorable as for mapping missions (e.g. intelligence);

• false color: particularly useful for interpretation, mainly for analyzing vegetation (e.g. crop disease) and water pollution.


Geometric Properties of Aerial Photos

We restrict the discussion of geometric properties to frame photography, that is, photographs exposed in one instant. Furthermore, we assume central projection.

Definitions:

• perspective center C: calibrated perspective center

• focal length c: calibrated focal length

• principal point PP: principal point of autocollimation

• camera axis C-PP: axis defined by the projection center C and the principal point PP. The camera axis represents the optical axis. It is perpendicular to the image plane.

Geometric Properties of Aerial Photos…

[Figure] Tilted photograph in diapositive position and ground control coordinate system.


Geometric Properties of Aerial Photos…

• nadir point N: also called photo nadir point; the intersection of the vertical (plumb line) from the perspective center with the photograph.

• ground nadir point N': intersection of the vertical from the perspective center with the earth's surface.

• tilt angle t: angle between the vertical and the camera axis.

• swing angle s: the angle at the principal point measured from the +y-axis counterclockwise to the nadir N.

Geometric Properties of Aerial Photos…

• azimuth α: the angle at the ground nadir N' measured from the +Y-axis in the ground system counterclockwise to the intersection O of the camera axis with the ground surface. It is the azimuth of the trace of the principal plane in the XY-plane of the ground system.

• principal line pl: intersection of the plane defined by the vertical through the perspective center and the camera axis with the photograph. Both the nadir N and the principal point PP are on the principal line. The principal line is oriented in the direction of steepest inclination of the tilted photograph.


Geometric Properties of Aerial Photos…

• isocenter I: the intersection of the bisector of the tilt angle t with the photograph. It is on the principal line.

• isometric parallel ip: lies in the plane of the photograph and is perpendicular to the principal line at the isocenter.

• true horizon line: intersection of a horizontal plane through the perspective center with the photograph or its extension. The horizon line falls within the extent of the photograph only for high oblique photographs.

• horizon point: intersection of the principal line with the true horizon line.

Image and Object Space

• The photograph is a perspective (central) projection.

• During the image formation process, the physical projection center on the object side is the center of the entrance pupil, while the center of the exit pupil is the projection center on the image side.

• The two projection centers are separated by the nodal separation. The two projection centers also separate the space into image space and object space, as indicated below.
