
Microwave Remote Sensing

Unit-V

Dr. Bikash Parida, DGI, CUJ


Contents
• Microwave: Passive & Active Microwave Sensors
• Side-Looking Airborne / Spaceborne RADAR
• Backscatter coefficient, phase unwrapping, coherence
• Radargrammetry
Introduction
• Radar image interpretation is more difficult than optical imaging because the technology is more complicated and the recorded image data are more varied.
• There are many concepts and techniques to be assimilated in this context.
• Optical imaging technologies operate at wavelengths of the order of 1 μm (i.e. a millionth of a metre).
• Radar imaging is based on microwaves with wavelengths of the order of 10 cm.
Introduction: Microwave RS
• Longer-wavelength microwave radiation can penetrate cloud cover, haze, dust, and all but the heaviest rainfall, because the longer wavelengths are not susceptible to the atmospheric scattering that affects shorter optical wavelengths.
• This property allows detection of microwave energy under almost all weather and environmental conditions, so data can be collected at any time.
Introduction: Microwave RS
• Due to this disparity in wavelength, features on the Earth's surface appear different at microwave wavelengths than they do optically.
• Optical and microwave data types are complementary and are therefore often used together in applications.
• Another major difference between optical and microwave imaging is the penetration capability of the waves.
• While there can be some penetration through media such as water and thin leaves at optical wavelengths, the longer wavelengths of radar can often penetrate vegetation canopies and even soils.
Introduction: Microwave RS
• The imagery recorded optically usually represents the surface elements of the landscape, whereas radar image data is more complex because it often contains volumetric and sub-surface information as well.
• With the long wavelengths used for radar imaging, surfaces appear much smoother than at visible and IR wavelengths.
Range of Applications of Microwave
• Topographic mapping
• Landscape change detection
• 3D modelling of volume
Imaging with Microwave
• To form an image with any technology, the first consideration is the energy source used to view the landscape.
• In the case of optical data, the energy source is visible and IR sunlight, or thermal energy from the Earth itself.
• Although there is a limited amount of microwave energy available from the Earth and Sun, it is so small that we generally need to provide our own source of incident radiation - Active Microwave Remote Sensing.
Imaging with Microwave
• There can be two remote sensing platforms:
  – one carrying the energy source
  – the other (there can be several) receiving the scattered energy
• Most radar remote sensing systems use the same platform for transmitting and receiving and are called Monostatic.
• When two platforms are used (one transmitting, one receiving), the radar system is called Bistatic.
Imaging with Microwave
• Microwave energy is just one form of electromagnetic (EM) radiation.
• The most significant difference in characterizing remote sensing image properties is wavelength.
• Regions in which there is little atmospheric absorption are referred to as Atmospheric Windows.
• For wavelengths beyond 3 cm the atmosphere is regarded as transparent.
Passive Microwave RS
• Passive microwave sensing is similar in concept to thermal remote sensing. All objects emit microwave energy of some magnitude, but the amounts are generally very small.
• A passive microwave sensor detects the naturally emitted microwave energy within its field of view. This emitted energy is related to the temperature and moisture properties of the emitting object or surface.
• Passive microwave sensors are typically radiometers (or scanners), and an antenna is used to detect and record the microwave energy.
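A standard relation, added here only for context and not taken from the slide: in the microwave region the emitted power scales with the physical temperature, so passive measurements are usually expressed as a brightness temperature

T_B = e \, T_{phys}

where the emissivity e (between 0 and 1) depends strongly on the moisture and dielectric properties of the surface, which is why passive microwave radiometers are sensitive to soil moisture and sea ice.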
Passive Microwave RS
• The microwave energy recorded by a passive sensor can be emitted by the atmosphere (1), reflected from the surface (2), emitted from the surface (3), or transmitted from the subsurface (4).
• Because the wavelengths are so long, the energy available is quite small compared to optical wavelengths. Thus, the FOV must be large to detect enough energy to record a signal. Most passive microwave sensors are therefore characterized by low spatial resolution (e.g. TRMM, AMSR).
Passive Microwave RS
• Applications of passive microwave remote sensing include:
  – Meteorology: determining water and ozone content in the atmosphere
  – Hydrology: measuring soil moisture
  – Oceanography: mapping sea ice, currents and surface winds, and detecting pollutants such as oil slicks
Active Microwave RS
• Active microwave sensors provide their own source of microwave (MW) radiation to illuminate the target. Active MW sensors are generally divided into two categories: imaging and non-imaging.
• The most common form of imaging active MW sensor is RADAR. RADAR is an acronym for RAdio Detection And Ranging.
• The sensor transmits a microwave (radio) signal towards the target and detects the backscattered portion of the signal.
• The strength of the backscattered signal is measured to discriminate between different targets, and the time delay between the transmitted and reflected signals determines the distance (or range) to the target.
RADAR reflection from various surfaces
• The surface roughness of a feature controls how the microwave energy interacts with the surface or target, and is generally the dominant factor in determining the tones seen on a radar image.
• Three typical reflectors of microwaves:
  1. Diffuse
  2. Specular
  3. Corner
Active Microwave RS
• Non-imaging MW sensors take measurements in 1D, as opposed to the 2D representation of imaging sensors.
• Radar altimeters transmit short MW pulses and measure the round-trip time delay to targets to determine their distance from the sensor. They are used on aircraft for altitude determination, and on aircraft and satellites for topographic mapping and sea surface height estimation.
• Scatterometers are used to make precise quantitative measurements of the amount of energy backscattered from targets, which depends on the surface properties (roughness) and the angle at which the microwave energy strikes the target.
  – For ocean surfaces: to estimate wind speeds based on the sea surface roughness.
  – From the ground: to accurately measure the backscatter from various targets in order to characterize different materials and surface types.
Active Radar system: History
• The first demonstration of the transmission and reflection of radio microwaves was achieved by Hertz in 1886.
• The first radars were developed for ship detection in the early 20th century.
• In the 1920s-30s, experimental ground-based pulsed radars were developed for detecting objects at a distance.
• The first imaging radars (in World War II) had rotating sweep displays, used for detection and positioning of aircraft and ships.
• After World War II, side-looking airborne radar (SLAR) was developed for military terrain reconnaissance and surveillance.
• In the 1950s, synthetic aperture radar (SAR) was developed for military purposes.
• In the 1960s these radars began to be used for civilian mapping.
Radar Basics
• The transmitter generates successive short pulses of microwave energy (A) at regular intervals, which are focused by the antenna into a beam (B).
• The radar beam illuminates the surface obliquely at a right angle to the motion of the platform. The antenna receives a portion of the transmitted energy reflected (or backscattered) from various objects within the illuminated beam (C).
• By measuring the time delay between the transmission of a pulse and the reception of the backscattered "echo" from different targets, their distance from the radar can be determined. As the platform moves forward, recording the backscattered signals builds up a 2D image of the surface.
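As a hedged illustration of the timing relation above (the constant and function name are assumptions, not part of the slides), the slant range follows directly from the measured round-trip delay:

C = 3.0e8  # speed of light, m/s

def slant_range(delay_s: float) -> float:
    """Return the one-way slant range (m) for a measured round-trip delay (s)."""
    return C * delay_s / 2.0

# Example: an echo received 50 microseconds after transmission lies at 7500 m
print(slant_range(50e-6))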
Side Looking Airborne Radar (SLAR)
• SLAR is an active sensor; the system provides its own source of illumination in the form of microwave energy.
• SLAR is primarily a real aperture radar (RAR), which requires a reasonably large antenna to achieve adequate azimuth resolution.
• Since the radar signal is transmitted at a depression angle below the horizontal plane in which the aircraft is flying, the signal strikes the terrain at an oblique angle, and the surficial expression of the geologic structure may thus be enhanced.
• The topographic expression of some surface features, such as subtle faults and folds, may be more clearly seen on radar imagery than on conventional aerial photographs or satellite images.

http://www.radartutorial.eu/20.airborne/ab06.en.html
Principles of SLAR
Radar Bands
• Ka, K, and Ku bands: very short wavelengths used in early airborne radar systems but uncommon today.
• X-band: used extensively on airborne systems (e.g. PI-SAR) for military reconnaissance and terrain mapping.
• C-band: common on many airborne research systems (e.g. NASA AirSAR) and spaceborne systems (ERS-1, 2 and RADARSAT).
• S-band: used on board the Russian ALMAZ satellite.
• L-band: used by SEASAT, JERS-1, PALSAR and airborne systems.
• P-band: longest radar wavelengths, used on NASA airborne systems.
Multi-Frequency Observation
The backscattering properties of microwaves depend on their wavelength.
Polarisation
• Polarization refers to the orientation of the electric field.
• Most radars are designed to transmit microwave radiation that is either horizontally polarized (H) or vertically polarized (V).
• Similarly, the antenna receives either the horizontally (H) or vertically (V) polarized backscattered energy, and some radars can receive both.
Polarisation
There are four combinations of transmit and receive polarizations:
• HH - horizontal transmit and horizontal receive
• VV - vertical transmit and vertical receive
• HV - horizontal transmit and vertical receive
• VH - vertical transmit and horizontal receive

• HH, VV: co- (like-) polarization
• HV, VH: cross-polarization
Polarimetry
• Polarimetry is the technique of distinguishing objects by differences in their polarization response.
• Example: a co-polarized (HH) image, a cross-polarized (HV) image, and a composite image (purple: HH, green: HV).
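A hedged sketch of how a two-channel composite of this kind might be assembled from co-registered HH and HV backscatter arrays with NumPy; the scaling and array names are assumptions, not the processing used for the slide's image.

import numpy as np

def polarimetric_composite(hh: np.ndarray, hv: np.ndarray) -> np.ndarray:
    """Build an RGB composite: purple (red + blue) from HH, green from HV."""
    def stretch(band: np.ndarray) -> np.ndarray:
        lo, hi = np.percentile(band, (2, 98))    # simple 2-98 percentile contrast stretch
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    r = b = stretch(hh)   # HH drives red and blue, so HH-dominated areas appear purple
    g = stretch(hv)       # HV drives green
    return np.dstack([r, g, b])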
Polarimetry
• Depending on the transmit and receive polarizations, the radiation will interact with and be backscattered differently from the surface.
• Radar imagery collected using different polarization and wavelength combinations may provide different and complementary information about the targets on the surface.
Imaging Geometry of a Radar System
• The platform travels forward in the flight direction (A) with the nadir (B) directly beneath the platform.
• The microwave beam is transmitted obliquely at right angles to the direction of flight, illuminating a swath (C) which is offset from nadir.
• Range (D) refers to the across-track dimension perpendicular to the flight direction, while azimuth (E) refers to the along-track dimension parallel to the flight direction.
• This side-looking viewing geometry is typical of imaging radar systems (airborne or spaceborne).
Imaging Geometry of a Radar System
• The portion of the image swath closest to the nadir track of the radar platform is called the near range (N), while that farthest from the nadir is called the far range (F).
• The incidence angle is the angle between the radar beam and the ground surface (A), which increases moving across the swath from near to far range.
• The look angle (B) is the angle at which the radar "looks" at the surface.
• In the near range the viewing geometry may be referred to as steep, relative to the far range, where the viewing geometry is shallow.
Imaging Geometry of a Radar System
• At all ranges the radar antenna measures the radial line-of-sight distance between the radar and each target on the surface. This is the slant range distance (C).
• The ground range distance (D) is the true horizontal distance along the ground corresponding to each point measured in slant range.
Spatial Resolution of a Radar System
• The resolution of an imaging radar system is determined by two independent system parameters:
  1. pulse length - the range (across-track) resolution
  2. beam width - the azimuth (along-track) resolution
The Range Resolution/Across-Track
• If a Real Aperture Radar (RAR) is used for image formation (as in SLAR), a single transmitted pulse and the backscattered signal are used to form the image.
• The range resolution is dependent on the length of the pulse (P).
• Two distinct targets on the surface will be resolved in the range dimension if their separation is greater than half the pulse length (P/2).
• For example, targets 1 and 2 will not be separable, while targets 3 and 4 will.
Range Resolution
• If the slant-range distance between two buildings is less than half the pulse length (PL/2), their returned signals overlap.
The Range Resolution
• Slant range resolution remains constant, independent of range. However, the resolution in ground range is dependent on the incidence angle.
• Thus, for a fixed slant range resolution, the ground range resolution cell becomes smaller (finer) with increasing range.
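A small numerical sketch of the two relations above (the pulse duration and angles are assumed example values): slant-range resolution depends only on the pulse, while ground-range resolution also depends on the incidence angle and becomes finer towards far range.

import math

C = 3.0e8  # speed of light, m/s

def slant_range_resolution(pulse_s: float) -> float:
    """Slant-range resolution = c * pulse duration / 2 (independent of range)."""
    return C * pulse_s / 2.0

def ground_range_resolution(pulse_s: float, incidence_deg: float) -> float:
    """Ground-range resolution = slant-range resolution / sin(incidence angle)."""
    return slant_range_resolution(pulse_s) / math.sin(math.radians(incidence_deg))

# A 0.1 microsecond pulse gives 15 m in slant range; on the ground that is
# ~43.9 m at 20 degrees (near range) but ~17.3 m at 60 degrees (far range).
print(ground_range_resolution(0.1e-6, 20), ground_range_resolution(0.1e-6, 60))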
The Azimuth Resolution
• The azimuth resolution is determined by the angular width of the radiated microwave beam and the slant range distance.
• This beamwidth (A) is a measure of the width of the illumination pattern. With increasing distance from the sensor, the azimuth resolution deteriorates (becomes coarser).
• Targets 1 and 2 in the near range would be separable, but targets 3 and 4 at farther range would not.
Azimuth Resolution
• Azimuth resolution Ra is given by the product of the antenna beamwidth and the slant range (see the expression below).
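The expression itself was a figure and did not survive extraction; the standard real-aperture form, reconstructed here under that assumption, is

R_a = \beta \cdot R_s = \frac{\lambda}{D} R_s

where \beta is the antenna beamwidth, R_s the slant range, \lambda the wavelength and D the antenna length.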


Azimuth Resolution & RAR
• Finer azimuth resolution can be achieved by increasing the antenna length. However, the practical length of the antenna is limited to about 1-2 m on airborne platforms and 10-15 m on spaceborne platforms.
• Systems whose azimuth resolution is controlled by the physical antenna length are called real aperture systems.
• These systems are relatively simple to design (i.e. cheap), and the data are relatively easy to process.
• However, their main drawback is their restriction to short-range, low-altitude operation, and then only with short wavelengths.
Azimuth Resolution in RAR
• In RAR imaging, the ground resolution is limited by the size of the microwave beam sent out from the antenna.
• Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna, i.e. the longer the antenna, the narrower the beam.
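A quick numerical check, with illustrative values only, of why real aperture systems are restricted to short-range, low-altitude operation:

def rar_azimuth_resolution(wavelength_m: float, antenna_m: float, slant_range_m: float) -> float:
    """Real-aperture azimuth resolution = (wavelength / antenna length) * slant range."""
    return wavelength_m / antenna_m * slant_range_m

# X-band (3 cm), 2 m airborne antenna, 10 km slant range -> 150 m
print(rar_azimuth_resolution(0.03, 2.0, 10e3))
# Same band, 10 m spaceborne antenna, 800 km slant range -> 2400 m
print(rar_azimuth_resolution(0.03, 10.0, 800e3))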
SAR
• Synthetic Aperture Radar (SAR) systems overcome some of the problems of RAR.
• To overcome the antenna-size limitation, the forward motion of the platform and special recording and processing of the backscattered echoes are used to simulate a very long antenna and thus increase the azimuth resolution.
Algorithm of SAR
• As a target (A) first enters the radar beam (1), the backscattered echoes from each transmitted pulse begin to be recorded. As the platform continues to move forward, all echoes from the target are recorded while the target is within the beam.
• The point at which the target leaves the view of the radar beam (3) determines the length of the simulated or synthesized antenna (B), the artificial (or 'virtual') antenna.
• Concept: an array of real antenna positions forming a synthetic aperture.
Algorithm of SAR
• In the diagram, target (A) remains in the radar beam for the distance (B) over which the plane travels. The length of the synthesized antenna is equivalent to the distance (B).
• Synthetic Aperture Radar allows for a resolution of about 3 meters from aircraft and 25 meters from satellites.
Algorithm of SAR
• Targets at far range are illuminated for a longer period of time than objects at near range. The expanding beamwidth, combined with the increased time a target is within the beam as ground range increases, balance each other, so that the resolution remains constant across the entire swath.
• This method of achieving uniform, fine azimuth resolution across the entire imaging swath is called synthetic aperture radar. Most airborne and spaceborne radars employ this type of radar.
Azimuth Resolution of SAR
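The body of this slide was a figure; the commonly quoted result, restated here rather than taken from the slide, is that the synthesized aperture grows in proportion to range, so the achievable azimuth resolution is roughly half the physical antenna length, independent of range and wavelength:

L_{syn} = \frac{\lambda}{D} R_s, \qquad R_a \approx \frac{D}{2}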
Components of an Imaging Radar System
• The radar system must resolve the field of interest into resolution cells, or pixels.
• Different principles are used to create resolution in the direction parallel to the motion of the platform (along-track or azimuth) and orthogonal to it (across-track or range).
• The landscape is irradiated using pulses of energy.
• The time taken for a pulse to travel to the landscape and back to the radar determines how far away that part of the landscape is.
• Innovative signal processing techniques are used to make high spatial resolution possible in this dimension.
Components of an Imaging Radar System
• The Synthetic Aperture Concept:
• In the along-track direction, the motion of the platform relative to the landscape produces a Doppler change in the frequency of the radiation used for illuminating the landscape.
• Signal processing methods achieve very high spatial resolution in the azimuth direction by keeping track of the Doppler shift as the platform passes through the regions of interest.
Components of an Imaging Radar System
• We can use radar data meaningfully only by understanding the distortions (geometric and signal noise) introduced into the recorded imagery.
• The incident radiation scattered from the landscape should be understood, since the backscattered energy contains information about the properties of the part of the Earth's surface being imaged.
• We need to understand the scattering properties of Earth surface materials and also be able to model them, which is an important step in radar image interpretation.
Components of an Imaging Radar System
• The Earth's surface responds differently to different polarizations and wavelengths of the incident energy, and also to different angles of incidence.
• The polarization characteristic of the EM wave adds a new complexity to radar imagery.
• A distinct feature of radar images is their overlying speckled appearance, a result of interference of the energy reflected from the many elemental scatterers within a resolution cell (pixel).
Airborne versus Spaceborne Radar
• The angle at which a radar beam strikes a ground object is an important variable in determining the appearance of an image (shadow, layover and foreshortening). By altering the incidence angle (higher elevation), it is possible to decrease the image distortions.
• Airborne radar must image over a wide range of incidence angles in order to cover a wide swath.
• Spaceborne radar does not require a wide range of incidence angles to cover a wide swath.
Airborne Radar
• Canadian Centre for Remote Sensing (CCRS): X-band and C-band Synthetic Aperture Radar (SAR)
• National Aeronautics & Space Administration (NASA): C-band, L-band, and P-band SAR
Airborne SAR (Synthetic Aperture Radar)
• The Sea Ice and Terrain Assessment Radar (STAR) systems operated by Intermap Technologies were among the first SAR systems used commercially around the world. Both STAR-1 and STAR-2 operate at X-band (3.2 cm) with HH polarization in two different resolution modes.
• F-SAR - the new airborne SAR system
Airborne SAR (Synthetic Aperture Radar)
• RAR is operated from aerial platforms (called Side-Looking Airborne Radar, or SLAR).
• SAR is operated from aerial as well as space platforms.
• SAR from an aerial platform is called AirSAR.
• Satellite SAR is superior to AirSAR.
Spaceborne Radar
SEASAT (NASA)
1978: First civilian spaceborne SAR
L-band; HH polarization
Incidence angle: 9-15 degrees
Swath width: 100 km
Spatial resolution: 25 meters

JERS-1 (Japan)
Launched: 1992
L-band; HH polarization
Incidence angle: 35 degrees
Swath width: 75 km
Spatial resolution: 18 meters

ERS-1 (European Space Agency)
Launched: 1991
C-band; VV polarization
Incidence angle: 20-26 degrees
Swath width: 100 km
Spatial resolution: 30 meters

RADARSAT (Canada)
Launched: 1995
C-band; HH polarization
Incidence angle: <20 to >50 degrees
Swath width: 35-500 km
Spatial resolution: 10-100 meters
Spaceborne Radar
SIR-A (Shuttle Imaging Radar)
Launched: 1981
L-band; HH polarization
Incidence angle: 37-53 degrees
Swath width:
Spatial resolution: 38 meters

• TerraSAR-X
  – High resolution: 1 m
  – Multi-polarization
  – High-precision DEM
• All-weather
• Agriculture, natural resources
• Crop growth
• Disaster
Multi-resolution - Multi-scale - Multi-polarised
Radar Imaging
An imaging radar consists of the following components:
• Transmitter
• Duplexer
• Receiver
• Radar antenna (directional / receive)
• Recording device

A duplexer is the network that permits a transmitter and receiver to use the same antenna, at or very near the same frequency.
Radar Imaging
The brightness of features in a radar image is usually a combination of several variables, which can be grouped into the following categories:

System parameters:
• Frequency/wavelength
• Polarization
• Viewing geometry
• Spatial resolution
• Speckle

Terrain parameters:
• Surface geometry
• Surface roughness
• Dielectric properties
Wavelengths Used in Radar Imaging
Different Polarized Images
Panels: HH, VV, HV, and a colour composite.
Slant-range Distortions/Scale Distortions
• In radar imaging, distortions of the target image arise from the side-looking viewing geometry and from the fact that the radar measures the distance to features in slant range rather than the true horizontal distance along the ground.
• Scale distortion occurs in the slant-range direction because the radar measures distances in the direction of the beam, not in ground distance.
• This results in a varying image scale, moving from near to far range.
• The same distances on the ground, (A1) and (B1), are seen by the radar as (A2) and (B2). The scale is compressed in the near range compared to the far range.
Ground Range versus Slant Range
• The figure shows two types of radar data:
  – a slant range image, in which distances are measured between the antenna and the target
  – a ground range image, in which distances are measured between the platform ground track and the target, and placed in the correct position on the chosen reference plane
• Slant range data is the natural result of radar range measurements.
• Transformation to ground range requires correction at each data point for local terrain slope and elevation.
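A simplified flat-terrain sketch of the slant-to-ground-range conversion (it ignores the local slope and elevation corrections mentioned above; the names and geometry are assumptions):

import math

def ground_range(slant_range_m: float, platform_height_m: float) -> float:
    """Flat-terrain ground range from slant range and platform height (Pythagoras)."""
    if slant_range_m < platform_height_m:
        raise ValueError("slant range cannot be shorter than the platform height")
    return math.sqrt(slant_range_m**2 - platform_height_m**2)

# Airborne example: 12 km slant range from 6 km altitude -> ~10.39 km on the ground
print(ground_range(12e3, 6e3))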
Ground Range versus Slant Range
Example: the slant range image and the corresponding ground range image.
Surface Geometry
• Relief displacement: as in aerial photography, radar images are subject to relief displacement, but it is one-dimensional, occurring perpendicular to the flight path.
• There are two types of relief displacement: foreshortening and layover. Together with radar shadow, these are the main geometric effects:
  – Foreshortening
  – Layover
  – Shadow
Foreshortening
• Foreshortening occurs when the radar beam reaches the base of a tall feature tilted towards the radar (e.g. a mountain) before it reaches the top.
• Because the radar measures distance in slant range, the slope (from point A to point B) will appear compressed, and the length of the slope will be represented incorrectly (A' to B') at the image plane.
• Foreshortening is a dominant effect in SAR images of mountainous areas, where the mountains seem to "lean" towards the sensor.
Layover
• Layover occurs when the radar beam reaches the top of a tall feature (B) before it reaches the base (A). The return signal from the top of the feature is received before the signal from the bottom.
• As a result, the top of the feature is displaced towards the radar from its true position on the ground, and "lays over" the base of the feature (B' to A').
Shadow
• Radar shadow occurs when the radar beam is not able to illuminate the ground surface.
• The shadowing effect increases with greater incidence angle θ, just as our shadows lengthen as the sun sets.
Surface Roughness
• Surface roughness is the average height variation in the surface cover (measured on the order of centimetres). A surface is considered "smooth" if the height variations are much smaller than the radar wavelength.
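A common quantitative form of this threshold is the Rayleigh criterion, stated here for reference (the slide gives only the qualitative definition): a surface behaves as smooth when

h < \frac{\lambda}{8 \cos\theta}

where h is the RMS height variation, \lambda the radar wavelength and \theta the incidence angle.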

• Specular reflection (A) is caused by a smooth surface, where the incident energy is reflected away and not backscattered. This results in smooth surfaces appearing as darker-toned areas on an image.
• Diffuse reflection (B) is caused by a rough surface, which scatters the energy equally in all directions. A significant portion of the energy is backscattered to the radar, so a rough surface appears lighter in tone on an image.

http://hosting.soonet.ca/eliris/remotesensing/bl130lec13.html
Surface Roughness
• Corner reflection (C) occurs when the target object reflects most of the energy directly back to the antenna, resulting in a very bright appearance of the object. This occurs where there are buildings and metallic structures (urban environments) or cliff faces and folded rock (natural environments).
Backscattered Radar Intensity
• A single radar image is usually displayed as a grey-scale image. The intensity of each pixel represents the proportion of microwave energy backscattered from that area on the ground.
• The pixel intensity values are often converted to a physical quantity called the backscattering coefficient, or normalised radar cross-section, measured in decibel (dB) units, with values ranging from about +5 dB for very bright objects to -40 dB for very dark surfaces.
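A minimal sketch of the linear-to-decibel conversion implied above (the function name is illustrative):

import numpy as np

def to_decibels(sigma0_linear: np.ndarray) -> np.ndarray:
    """Convert a linear backscattering coefficient (sigma-nought) to decibels."""
    return 10.0 * np.log10(sigma0_linear)

# A very bright target (~ +5 dB) versus a very dark surface (-40 dB)
print(to_decibels(np.array([3.16, 1e-4])))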
Interpreting SAR Images
• The higher the backscattered intensity, the rougher the surface being imaged.
• Trees and other vegetation are usually moderately rough on the wavelength scale.
• Hence, they appear as moderately bright features in the image. Tropical rain forests have a characteristic backscatter coefficient of between -6 and -7 dB, which is spatially homogeneous and remains stable in time.
Interaction between Microwaves and Earth's Surface
• When microwaves strike a surface, the proportion of energy scattered back to the sensor depends on many factors:
  – Physical factors, such as the dielectric constant of the surface materials, which also depends strongly on the moisture content;
  – Geometric factors, such as surface roughness, slopes, and the orientation of objects relative to the radar beam direction;
  – The type of landcover (soil, vegetation or man-made objects);
  – Microwave frequency, polarisation and incidence angle.
Dielectric Properties
(Panels: dry soil, wet soil, flooded soil)
• Dry soil: some of the incident radar energy is able to penetrate into the soil surface, resulting in less backscattered intensity.
• Wet soil: the large difference in electrical properties between water and air results in higher backscattered radar intensity.
Dielectric Properties
• Flooded soil: radar is specularly reflected off the water surface, resulting in low backscattered intensity. The flooded area appears dark in the SAR image.
Speckle
Speckle is suppressed by:
(1) Multilook processing
(2) Spatial filtering
(3) Temporal averaging
Speckle
• The Lee, Enhanced Lee, Kuan, Frost and Enhanced Frost filters have all been used.
• The Lee filter, derived for both additive and multiplicative noise, is a very popular statistical filter designed to suppress speckle noise while preserving edges and point features in SAR imagery.
• Based on the Minimum Mean Square Error (MMSE) criterion, a smoothed pixel under the multiplicative noise model can be written as

  \hat{I}_s = \bar{I}_s + k_s (I_s - \bar{I}_s)

  where \bar{I}_s is the mean value of the intensity within the filter window n_s, and k_s is the adaptive filter coefficient determined from the local statistics within the window.

https://asp-eurasipjournals.springeropen.com/articles/10.1155/2010/745129
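A hedged sketch of the MMSE form above, using local window statistics; the window size and the noise-variance estimate are assumptions and not the exact formulation of the cited paper.

import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(intensity: np.ndarray, window: int = 7, looks: int = 1) -> np.ndarray:
    """Basic Lee filter: I_hat = mean + k * (I - mean), with k from local statistics."""
    intensity = intensity.astype(float)
    mean = uniform_filter(intensity, size=window)             # local mean within the window
    mean_sq = uniform_filter(intensity**2, size=window)
    var = mean_sq - mean**2                                    # local variance
    noise_var = mean**2 / looks                                # multiplicative speckle model
    k = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0.0, 1.0)  # adaptive coefficient
    return mean + k * (intensity - mean)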
Multilook SAR Processing
• Multilook processing is a widely used speckle reduction method in conventional SAR signal processing.
• A SAR SLC image is characterized by speckle, which can be modelled as multiplicative noise. Speckle is the result of constructive and destructive interference of all the reflected waves within a resolution cell, and is the cause of the 'salt and pepper' appearance of SAR images.
• One way to reduce the effect of speckle is to perform multi-looking, trading off image resolution.
• An image with reduced resolution but a less grainy appearance is formed by multi-looking.
Multilook SAR Processing
• Multi-looking improves SAR image quality by reducing speckle and allows us to obtain a square pixel in the output image.
• Subsequent lines are averaged, either in the range or azimuth direction or in both directions, to obtain a better image.
• Averaging the different looks of a single image results in a loss of resolution.
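A minimal multilooking sketch: non-overlapping block averaging of a single-look intensity image in azimuth and range (block sizes are illustrative; operational processors usually average independent looks formed in the Doppler domain).

import numpy as np

def multilook(intensity: np.ndarray, looks_az: int = 4, looks_rg: int = 1) -> np.ndarray:
    """Average non-overlapping pixel blocks; reduces speckle at the cost of resolution."""
    rows = (intensity.shape[0] // looks_az) * looks_az   # trim to a whole number of blocks
    cols = (intensity.shape[1] // looks_rg) * looks_rg
    trimmed = intensity[:rows, :cols]
    blocks = trimmed.reshape(rows // looks_az, looks_az, cols // looks_rg, looks_rg)
    return blocks.mean(axis=(1, 3))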
