Thesis PiersTitus Final 11-11-29
supervisors:
Ramon Hanssen
Sami Samiei Esfahany
cover image:
Deformation speed change rate over Delft, as measured with Persistent
Scatterer InSAR (PSI) using TerraSAR-X radar images from April 2009
till November 2011. Featured in red is the construction of a train tunnel,
started in 2010. See section 6.2 for more details.
Legend: deformation speed change [mm/y²], scale from −3 to 3.
ABSTRACT
In this study methods are developed for improved analysis and processing of PSI data. PSI, or radar interferometry, makes it possible to use satellite radar images to measure deformation of the Earth's surface and objects on it with millimetre accuracy. However, interpretation of the measurements and identification of the actually measured objects is still a common problem.
There are no dedicated tools available for validation, for finding both falsely detected and falsely rejected points, or for deeper analysis of PSI results. Existing algorithms for automatic coherent scatterer selection require many acquisitions to obtain reliable results, which makes it necessary to collect data for many months or years before processing can be done.
In this study a suite of tools is developed that facilitates detailed analysis of results and versatile processing of radar data. This suite consists of a visual inspection tool and a toolbox that handles metadata and can do versatile processing of radar data. Furthermore, among other improvements, a method is developed for reliable point scatterer selection that works for a small number of acquisitions.
CONTENTS
1 Introduction
2 Radar signal theory
 2.1 Introduction
 2.2 Radar
 2.3 Scatterers
 2.4 The impulse response
 2.5 Interferometry
 2.6 Signal, clutter and coherence
3 InSAR investigation tools
 3.1 Introduction
 3.2 Methodology
 3.3 InSAR Inspector
  3.3.1 Zoomable background
  3.3.2 Phase time series plot
  3.3.3 Point information table
  3.3.4 Additional information plot
  3.3.5 The GUI object
 3.4 Object oriented InSAR toolbox
  3.4.1 Slc and SlcStack
  3.4.2 SlcImage
  3.4.3 PointSet
  3.4.4 Spectrum
  3.4.5 Other classes
 3.5 Outlook
  3.5.1 GIS integration
  3.5.2 3D AHN2 viewer
  3.5.3 Bayesian point selection
4 Point scatterer selection
 4.1 Introduction
 4.2 Algorithm
  4.2.1 Implementation
  4.2.2 Stack processing
  4.2.3 Subpixel position
 4.3 Results
  4.3.1 Focus on point scatterers
  4.3.2 Single observation per scatterer
  4.3.3 More accurate quality measure for smaller stacks
  4.3.4 Quality measure per acquisition
  4.3.5 Subpixel precision location
  4.3.6 Amplitude calibration is not necessary
 4.4 Relation to SCR
 4.5 Conclusions
5 Geocoding improvements
 5.1 Introduction
 5.2 Geocoding
 5.3 Topographic phase computation
 5.4 Conclusions
6 Cases
 6.1 Introduction
 6.2 Delft train tunnel
 6.3 Storage tank monitoring
 6.4 Corner reflector experiments
7 Conclusions and recommendations
Bibliography
I Appendix
A Toolbox manual
ACRONYMS
1 INTRODUCTION
In many places on Earth the surface, or objects on it, move due to natural phenomena or human activities, such as earthquakes, volcano eruptions, mining, and water extraction. Often these movements are very subtle, in the order of millimetres. Even so, they may be precursors to larger events, or accumulate until infrastructure or houses fail. Measuring them is important for a better understanding of the Earth and can give early warnings for more serious events.
The traditional way of measuring these movements is with geodetic surveying techniques such as levelling. While these techniques are very accurate and reliable, they are very labour intensive per measured point. Because of this, the resolution in both space and time is usually low.
In the last decades a new technology has been researched which overcomes these problems while allowing similar accuracy. This is achieved using satellite radar. The type of radar used is an imaging side-looking radar, which can achieve a very high resolution using Synthetic Aperture Radar (SAR).
Satellite radar collects complex scattering signals from the Earth's surface and objects on it, which results in a radar image. The phase difference between two such images is related to topography, deformation and atmosphere. By processing time series those parameters can be estimated. This technique is called radar interferometry (InSAR).
Radar interferometry makes it possible to use satellite radar images to measure deformation of the Earth's surface and objects on it with millimetre accuracy [Hanssen, 2001]. Since new images are continuously acquired all over the world, and have been acquired already for almost two decades, the amount of available data offers many opportunities. It has been successfully applied for measuring deformations caused by earthquakes, volcano eruptions, gas and salt mining, and for dike monitoring [Hooper et al., 2004, Ketelaar and Hanssen, 2006, Hanssen and van Leijen, 2008].
However, interpretation of the measurements and identifying the actually measured objects is still a common problem. This is caused by the nature of radar scatterers, which are the 'objects' that are measured by the radar. As opposed to geodetic surveying techniques, where a measurement is done on a known point, with imaging radar the geometry and material properties of nearby objects define the measurement. The geometry of such a scatterer can be quite complex, and can include parts which are not directly expected. For example, the wall of a building and the street in front of it can together form a scatterer, of which the movement can be related to either one. This is because all radar signal that travels the same distance arrives at the same time, and hence ends up in the same measurement. Since this way of observing is so different from human observation, interpretation errors are very probable.
1 The phase of the radar signal is the part containing the deformation signal. It is ambiguous because, as it is only the phase, it is impossible to distinguish signals one wavelength apart. The phase is influenced by elevation, atmosphere and deformation, making multiple interpretations possible. A phase value only has meaning if it can be related to the same point at another observation, and to another point in these observations, hence to another phase value in space and time. Finally, multiple properties of each point can be extracted: besides the position in the image also properties such as height, size, orientation, and behaviour in time, resulting in a multivariate dataset.
2.2 radar
Figure 1: The way radar observes the Earth. High points get overlaid on low points, and behind buildings a shadow is observed.
Figure 3: Scatterers, sampled signal, and fully reconstructed signal. This gives
an idea of how a scattering surface could look and what signal that
would give.
2.3 scatterers
[Figure: spectrum and impulse response for different window functions, with main lobe widths: (a) rectangular window, 1.20 samples; (b) Hamming window, 1.80 samples; (c) raised cosine window with α = 0.6, 1.60 samples; (d) Kaiser window with β = 4, 1.60 samples.]
lobe width. This is the same as the distance that two point scatterers need to be apart to be separable. This is shown in Figure 5.
2.5 interferometry
So SCR is defined for one image, while coherence is defined for a pair of images. In formula form SCR becomes:

SCR = s² / c²   (2.2)

where s² is the power of the signal and c² the power of the clutter. One inconvenience is that SCR has a range of [0, ∞], while coherence has a range of [0, 1]. To relieve us from this inconvenience the signal to signal+clutter ratio (SSCR) is defined:

SSCR = s² / (s² + c²) = SCR / (SCR + 1)   (2.3)

SCR = SSCR / (1 − SSCR).

So coherence is closely related to SCR, but for two images. As such it can be related to SCR or SSCR [Just and Bamler, 1994, Hanssen, 2001]:

γ = √(SSCR1 · SSCR2) = 1 / √( (1 + SCR1⁻¹) · (1 + SCR2⁻¹) )   (2.4)

where SSCR1 and SCR1 relate to the first image and SSCR2 and SCR2 to the second.
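The conversions between these quantities are straightforward; a small MATLAB sketch with illustrative values:

% Converting between SCR, SSCR and two-image coherence, eqs. (2.2)-(2.4).
SCR1  = 10;  SCR2 = 5;                     % illustrative signal-to-clutter ratios
SSCR1 = SCR1 / (SCR1 + 1);                 % eq. (2.3)
SSCR2 = SCR2 / (SCR2 + 1);
gamma = sqrt(SSCR1 * SSCR2);               % eq. (2.4), about 0.87 for these values
% equivalently: gamma = 1 / sqrt((1 + 1/SCR1) * (1 + 1/SCR2))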
Figure 7: SCR can be related to the standard deviation of the phase error of a scatterer. Two different formulas for doing this are shown here, both only valid for SSCR close to 1: formula 2.6 (CEOS93) and formula 2.7 (Hanssen01).
When both images have the same SCR, this reduces to

γ = SSCR.   (2.5)
The phase error can be related to the SCR; for high SCR, or SSCR close to 1, the following is valid [CEOS, 1993]:

σψ ≈ 1 / √(2 · SCR),   σ²ψ ≈ 1 / (2 · SCR) = (1 − SSCR) / (2 · SSCR).   (2.6)

An alternative approximation is [Hanssen, 2001]:

σ²ψ ≈ (1 − SSCR²) / (4 · SSCR²) = (2 · SCR + 1) / (4 · SCR²).   (2.7)

The last formula is a bit closer to the experimental values in the sensible range of SSCR > 0.5, but both are very accurate above SSCR = 0.9, as can be seen in figure 7.
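A quick numeric check of these approximations, with an illustrative value:

% Phase error approximations of eqs. (2.6) and (2.7) for SCR = 100 (20 dB).
SCR = 100;
sigma_ceos    = sqrt(1 / (2 * SCR));               % eq. (2.6): about 0.071 rad
sigma_hanssen = sqrt((2 * SCR + 1) / (4 * SCR^2)); % eq. (2.7): about 0.071 rad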
The SCR of a scatterer can be estimated using various methods. For
SCR estimation of corner reflectors on even fields the clutter level is
directly estimated from a neighbourhood that is out of reach of the
mainlobe and sidelobes of the scatterer. The best signal level estimation
is the signal at the location of the corner reflector. See figure 8. This
method is used for calibration of test corner reflectors [CEOS, 1993].
Also the commonly used metric amplitude dispersion can be related
to SCR, assuming the SCR is the same in all observations. Amplitude
Figure 8: SCR estimation window used for calibration of corner reflectors. The
hatched area is used for clutter level estimation, the cross for signal
level.
[Figure 9: amplitude dispersion DA as a function of SSCR, experimental and theoretical.]
This relation to SCR is also only valid for high SCR, as can be seen in
figure 9.
The coherence between two images can be estimated too. This is done
by comparing the images using a spatial window. This window is often
chosen to cover an area containing a reasonable number of scatterers,
which are then assumed to be stationary as a whole. This window can
also be narrowed down to contain just about one scatterer, by using
a weighted window which puts more weight in the centre. However,
the smaller the window, the more bias will appear in the estimation
[Hanssen, 2001, 4.3.4]. The formula for coherence estimation including
the optional window can be constructed as follows:
γ̂(t) = [ ∫_{−k}^{k} s1(t + u) · s2*(t + u) · w(u) du ] / √( ∫_{−k}^{k} |s1(t + u)|² · w(u) du · ∫_{−k}^{k} |s2(t + u)|² · w(u) du )   (2.10)
where s1 and s2 are the radar images, k is the window size, and w the optional window weights. This is the 1D case; the 2D case follows analogously.
This gives the complex coherence, and is often converted to real coher-
ence by just taking the absolute value.
Another option is to project the complex coherence on the observation
phase difference. This will cause the coherence estimate to be reduced
when the interferometric phase of the observation is different from the
average of the observation window.
|γ|(t) = Re{ γ(t) · e^(−i φ(t)) }   (2.11)
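As an illustration, a minimal MATLAB sketch of this estimator (not the toolbox implementation; the variable names and the window choice are illustrative):

% Windowed coherence estimate of eq. (2.10) and its projection of eq. (2.11).
% s1 and s2 are coregistered complex SLC crops of equal size.
w          = kaiser(9, 4) * kaiser(9, 4)';                % 9x9 weighting window w(u)
num        = conv2(s1 .* conj(s2), rot90(w, 2), 'same');  % windowed cross product
den        = sqrt(conv2(abs(s1).^2, rot90(w, 2), 'same') .* ...
                  conv2(abs(s2).^2, rot90(w, 2), 'same'));
gamma_cpx  = num ./ den;                                  % complex coherence, eq. (2.10)
gamma_abs  = abs(gamma_cpx);                              % real coherence via the absolute value
phi        = angle(s1 .* conj(s2));                       % observed interferometric phase
gamma_proj = real(gamma_cpx .* exp(-1i * phi));           % projected coherence, eq. (2.11)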
that is needed for creating such an environment. Since all data come
preprocessed from DORIS the processing functions are not available
in an interactive environment, so to use different settings for filtering,
oversampling, or other DORIS functions, DORIS must be called again.
The functional requirements for this environment are set as follows:
• Give easy access to all data, metadata and functions.
• Quick and interactive loading, calculating and displaying of data
and metadata.
• Ability to handle large datasets fast and efficiently.
• Reimplement DORIS functionality to be accessible from Matlab.
• Minimize possible error sources, such as geoid, continental drift,
nonlinearity of topographic phase.
• Processing functions with sufficient precision and the fewest possible error sources.
• Consistent coordinates: coordinates should not depend on the used crop or oversampling ratio.
In this study a suite of tools is developed that satisfies these require-
ments, which is described in this chapter.
First the methodology is explained, followed by sections about the
graphical InSAR Inspector tool and the underlying object-oriented InSAR
processing and metadata toolbox. The chapter concludes with a section
about the outlook to the future of the tools, describing where it is used
and experiments with features to be added.
3.2 methodology
The InSAR Inspector is a GUI that provides a way to make the spatially
dense, multivariate, and relative information that is involved in InSAR
processing fully explorable.
An overview map is shown at the main screen, on which all scatterers
are shown. When a scatterer is selected, either on this overview map
or using a unique point ID, all information about a scatterer is shown,
both as numerical data and using plots. All information that is added
during processing is shown, such as quality measures, height, and
deformation. Information that varies per acquisition is plotted, such as
amplitude or deformation. For deeper inspection several options are
available, such as showing the underlying radar image data, calculating
a coherence matrix, and data access from the command line.
The base components for the GUI are:
• Zoomable background.
• Phase time series plot.
• Point information table.
• Additional information plot.
The state and contained information is exposed to the programming
environment as an object. This way it is possible to interact with the
GUI in a programmatic way, and do custom processing on information
shown in the GUI.
The GUI is initiated with a PointSet. This is an object that contains all
information that is necessary for operation, such as access to the radar
images, the mean amplitude, and a set of selected points.
In the following sections the GUI components are described.
The largest panel contains a zoomable map on which the mean ampli-
tude of the radar images or mean reflectivity map (MRM) is shown, with
the selected scatterers on top, see figure 11. The canvas can be dragged
and zoomed using the mouse. On loading a medium resolution MRM
is shown, a higher resolution version can be loaded when zooming in
to see the full resolution details of the radar image. On the canvas the
following features are shown:
• Extracted scatterers, coloured with a property. Any property can
be used.
Figure 10: The elements that appear in the InSAR Inspector GUI.
Figure 12: Automatic unwrapping clearly makes a mistake in the case shown
here. Points with a suspicious linear deformation velocity can be
inspected and errors can be tracked down.
Figure 13: The phase shown in the plot is corrected for topographic phase. If the amount of topographic phase is wrong, the height can be changed until the residual phase is plausible to be deformation. By changing the height smoothly by sliding the mouse this can be done very quickly.
The point information table gives a quick overview of the point proper-
ties extracted during processing. All point properties that are available
in the PointSet are shown.
The table starts with the point ID, a unique identifier based on the
radar coordinates. This doubles as input field to quickly return to a
previously selected point. This way interesting points can be written
down and found back later on.
Following are all numerical point properties, such as the height, defor-
mation rate and ensemble coherence. Only properties with one value
per scatterer are shown, other properties, which have one value per
acquisition, can be shown in the additional information plot.
Using this slider the currently selected radar acquisition can be changed.
This selection is shown in the phase plot, additional plot, and subpixel
position plot on the background. It also determines the displayed image
of the SLC stack. This way changes in any of those properties can be
related.
Settings
Some frequently used settings are shown in the GUI, other settings
need to be accessed through the GUI object via the command line. The
GUI contains toggles for:
Figure 14: The additional information plot is used for finding a relation between properties. a) Perpendicular baseline shows a clear relation with amplitude, which is useful for determining the scatterer size. b) Multiple relations can be shown together by using colour or size in addition to the x- and y-axis.
The GUI object gives access to the selected points and all their prop-
erties, and also to all GUI elements and display functions. Interaction
via the command line is necessary when more flexibility or more user
input is required than can be given through the GUI. For example to
show a custom property on the background, make high quality plots of
selected points, or estimate new parameters or relations. See appendix
A for the implementation details and examples.
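A minimal, hypothetical session could look as follows; the constructor name follows appendix A (with the underscore assumed), the PointSet object ps is assumed to exist already, and the property names follow chapter 3 and the UML diagram:

% Hypothetical example; `ps` is an existing PointSet object.
gui = pointset_inspector(ps);    % start the InSAR Inspector and keep the GUI object
gui.loadbackgrounddetail(2);     % load a 2x oversampled background around the selected point
gui.loadaerial(18);              % open the secondary aerial view at zoom level 18
h = gui.selected1.pnt_height;    % read a property of the currently selected point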
3.4 object oriented insar toolbox
The information that is shown in the GUI, and also used while processing or post-processing, needs to be stored in an accessible way. Functions, which are also used by the GUI and processing routines, must be able to access this information too, so it doesn't have to be supplied manually. All this is done in an object oriented structure. All information is stored in objects, and many processing functions are added to the object classes, such as algorithms for radar image processing, scatterer selection, and geocoding.
The toolbox is able to perform the following actions:
• Load metadata from DORIS .res files
• Calculate metadata dependent properties
• Load SLC data with on the fly oversampling and filtering
Figure 15: The most important classes and their connections are shown in a UML class diagram. This shows for each class the properties and functions, and how they relate to each other.
angle, but also pixel spacing and resolution, which depend on the
signal parameters. Also functions for geocoding and inverse geocoding
are included, which transform given coordinates using orbit and signal
parameters.
The information is imported from DORIS result files, complemented
with some satellite specific information from a small database, which is
not available in the result files.
The SlcStack is mainly a container for a stack of Slc objects. It provides
convenient access to properties that are either the same for the whole
stack, or most often needed from only the master. These are mostly
properties that are related to the coordinates, for example geocoding or
ground resolution.
To use a stack of radar images the radar data should be coregistered and resampled using DORIS, so that from that moment on only the master coordinates have to be used. Any following DORIS processing steps, such as interferogram creation or topographic phase correction, are not used, as those steps can be performed more flexibly using the toolbox. Oversampling in DORIS is allowed, and the coordinates are automatically converted back to non-oversampled coordinates, but it is not recommended due to the large increase in disk usage.
Topographic phase computation is done using the SlcStack, since the master SLC is necessary for this.
Geocoding and topographic phase computation is done using 3D polynomials. This way it is very fast even for millions of points, and an otherwise computationally expensive geoid model can be included at no extra cost. In section 5.2 the theory and implementation of the computation of these polynomials is discussed.
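As a small sketch of how these polynomials are typically used through the SlcStack wrappers documented in appendix A (the stack directory and the coordinate are illustrative):

% Convert a geographic coordinate to radar coordinates and back using the
% geocoding polynomials wrapped by SlcStack; path and coordinate are illustrative.
stack = SlcStack('/data/delft_tsx_dsc');                  % load stack metadata from a DORIS stack directory
lp    = stack.latlonh2linepixel([52.0095, 4.3560, 2.0]);  % [lat, lon, height w.r.t. geoid] -> [line, pixel]
xyz   = stack.linepixelheight2xyz([lp, 2.0]);             % [line, pixel, height] -> WGS84 XYZ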
3.4.2 SlcImage
3.4.3 PointSet
Figure 16: A normalised spectrum is defined with three properties: window
shape, bandwidth, and center frequency shift.
3.4.4 Spectrum
For processing radar images optimally, the signal properties are necessary. Most of the radar signal properties are defined by its spectrum, such as the impulse response, resolution and spectral shift. For processing data only the normalised spectrum is important, which is relative to the sampling rate and so always goes from −1 to 1. This normalised spectrum is stored in this object. The absolute frequencies and bandwidth can be found in the Slc object, so they can also easily be accessed when necessary.
The spectrum is defined by the following properties:
• Window shape. Defined as a polyline.
• Bandwidth. Relative to the full bandwidth, so a value in the range
(0,1).
• Centre frequency shift. When the spectrum is not zero centred, as
with zero doppler processed SAR images, this property defines
the shift.
The spectrum class is used to represent the actual state of a radar image.
This might be the original spectrum, or a modified spectrum due to
filtering or oversampling.
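As a small illustration of how such a modified spectrum arises, the sketch below loads an oversampled crop of a resampled SLC and refilters it to a Kaiser window; the method names and crop coordinates follow the examples in appendix A and are illustrative:

% Load a 2x oversampled crop and refilter it; after this the image carries a
% modified, no longer original, spectrum. Crop coordinates are illustrative.
imc = slc.readDataRsmp(Raster(5101:6100, 2401:2500).oversample(2));
imc = imc.rewindow(kaiser(1025, 3));   % new spectrum shape: Kaiser window with beta = 3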
Furthermore some other useful classes are included, which are more general and are used in various places. These are described below.
Poly3D is a class that stores a 3D polynomial, and includes a function to efficiently evaluate it.
A Raster object stores the range and step size of gridded data, and provides several functions to make intersections and subdivisions, or to convert between coordinates and indices.
RasterData contains gridded data, and the corresponding coordinates in a Raster object. It is similar to SlcImage, and contains the processing functions of SlcImage that can also be applied to general images, such as peak selection or Gaussian blur.
hdf5prop is a class which handles very large arrays by transparently reading and writing data on disk. These variables can be used and indexed just like normal variables. The data is stored in a Hierarchical Data Format 5 (HDF5) file, a format designed for storing large amounts of scientific data in a fast and structured way.
dependentprop makes dependent properties in a PointSet act as normal
variables. It contains a reference to a function which calculates the
desired property.
3.5 outlook
There are also several future projects that may improve the selection
and characterisation of the point measurements. Below, a few of these
future improvements are outlined.
4 POINT SCATTERER SELECTION
4.1 introduction
4.2 algorithm
Figure 18: Estimation window. Only the main lobe is used for SCR estimation.
have higher accuracy, but has a lower resolution. The window is chosen
so that only the mainlobe is included, as shown in figure 18.
The complex impulse response function (IRF) correlation at position x is defined in 1D as:

ρirfcpx(x) = [ ∫_{−∞}^{∞} s(x + u) · f*(u) · w(u) du ] / √( ∫_{−∞}^{∞} |s(x + u)|² · w(u) du · ∫_{−∞}^{∞} |f(u)|² · w(u) du )   (4.1)

where s(x) is the signal at position x, f is the IRF and f* its complex conjugate, and w is the weighting function.
This equation is constructed by taking the (cross) correlation function of the signals s and f [Cumming and Wong, 2005, p.23]:

ρ(x) = ∫_{−∞}^{∞} s(u) · f*(u − x) du = ∫_{−∞}^{∞} s(x + u) · f*(u) du   (4.2)

To this equation a weight term w is added that is aligned with the IRF.
For the weighting function the power |f(u)|² or the absolute value |f(u)| of the impulse response is used. This way the weight is zero at places where nearby scatterers have no influence, and high where scatterers have more influence. In other words: it equals the amount of influence of other scatterers at each distance from the point scatterer. These windows give good performance in resolution and accuracy. Other window functions may shift the balance between resolution and accuracy.
The result is a complex correlation. For a real valued correlation, signifying the correlation of the signal with an ideal point scatterer with the same phase as the signal, this complex correlation is projected on the signal phase:

ρirf(x) = Re{ ρirfcpx(x) · s*(x) / |s(x)| }.   (4.4)
This is an alternative to taking the absolute value; this way the value of ρirf is reduced when the signal phase differs from the average phase of the estimation window, which is the case when the scatterer is influenced by clutter. The value of ρirf can become zero or negative when the phase difference is very large.
4.2.1 Implementation
The given functions are in 1D; in reality the 2D case is used, taking both azimuth and range direction into account.
These functions are defined in the continuous domain, because the radar signal is a continuous signal. For evaluation of the functions a sampled signal is used. For sufficient accuracy the signal is oversampled at least 2 times. It can be implemented efficiently using two convolutions, in azimuth and range direction respectively. The impulse response is truncated to the main lobe, which has little influence since the weighting values are very low outside the main lobe. With 2 times oversampling this results in a window size of about 5 pixels, which is small enough to be fast in computation.
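A minimal 2D sketch of this computation, assuming an oversampled complex crop slc_data and an impulse response irf truncated to its main lobe (the variable names are illustrative, not the toolbox implementation):

% IRF correlation of eq. (4.1), projected on the signal phase as in eq. (4.4).
% slc_data and irf are complex 2D arrays at the same oversampling rate.
w       = abs(irf).^2;                                      % weighting window w(u) = |f(u)|^2
g       = rot90(conj(irf) .* w, 2);                         % flipped kernel, so conv2 yields a correlation
num     = conv2(slc_data, g, 'same');                       % sum_u s(x+u) f*(u) w(u)
den_s   = conv2(abs(slc_data).^2, rot90(w, 2), 'same');     % sum_u |s(x+u)|^2 w(u)
den_f   = sum(sum(abs(irf).^2 .* w));                       % sum_u |f(u)|^2 w(u)
rho_cpx = num ./ sqrt(den_s * den_f);                       % complex IRF correlation, eq. (4.1)
rho_irf = real(rho_cpx .* conj(slc_data) ./ abs(slc_data)); % projected correlation, eq. (4.4)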
where σ2az and σ2ra are the standard deviations of the subpixel position
in all of the images in azimuth and range direction, respectively.
4.3 results
Figure 19: Point scatterers selected using IRF correlation are shown on a MRM. The sidelobes of the bright scatterer in the centre are not selected, but nearby scatterers which are stronger than the sidelobes are selected. In this case the strong scatterer with visible sidelobes is at the bottom of a building, while at each storey above it there is another scatterer. The building is the highrise at the Bosboom Tousaintplein in Delft, a 16 storey building; the building has an interior corner, as can be seen on the photo, which causes strong reflections. The image is acquired by TerraSAR-X in descending orbit with a resolution of 3x3 m.
Figure 20: Overview of the processed area of 50x30 km, including the
Maasvlakte, Rotterdam and Delft. The stack consists of 69 images
and is acquired by Envisat in the descending pass. The colour shows
IRF correlation, according to eq. (4.4), all points with ρirf > 0.6 are
shown.
Figure 21: The accuracy of IRF correlation as quality estimate, compared with amplitude dispersion, for a stack size of 8, 20 and 69 images. For each stack size overlaid histograms of the ensemble coherence distribution are shown, with: in gray the total amount of included pixels, where it is clipped the values are shown in the bars; in blue to orange the distribution of the best 20 000, 60 000, 120 000, 200 000 and 300 000 points respectively, selected by IRF correlation or amplitude dispersion. In the legends the thresholds corresponding to those amounts are shown. (Instead of amplitude dispersion, the amplitude consistency 1 − DA is used here.) Roughly, points with an ensemble coherence below 0.5 can be regarded as bad.
[Diagram belonging to Table 2: IRF correlation versus ensemble coherence (linear deformation), with regions for point scatterers, partially persistent point scatterers, semi distributed scatterers, stable distributed scatterers, distributed scatterers, and noise.]
Table 2: Several scatterer classes based on IRF correlation and ensemble coherence. The classes that are most prominent in the test data are shown. One example is water: while water surface scattering just ends up as noise, dual bounce reflection on a water surface-wall corner gives a valid point scatterer, but one that moves along with the water level, which even in the Netherlands is not stable at the mm level.
to get the same quality of point selection results with smaller stacks, or better quality results with large stacks. This is especially important for monitoring campaigns, where it is a big burden to have to wait until more than 20 images are acquired, as that can easily take 2 years.
For each point scatterer the position is estimated with sub pixel preci-
sion, allowing more precise geocoding.
Besides this, the position can also be estimated per acquisition, which can be used as an additional quality measure, for investigating larger movements, or to improve coregistration. Figure 23 shows the estimated position through time.
The position standard deviation, as defined in equation 4.5, shows a
clear correlation with ensemble coherence, as can be seen in figure 24.
As such it is useful as additional quality estimate.
Figure 22: IRF correlation through time can be used for change detection, to detect appearing and disappearing point scatterers. High IRF correlation values also occur during the incoherent period; this is because in a single observation random clutter has a fair chance of reflecting concentrated energy. Here two scatterers from the spoorzone Delft are shown, where construction of a train tunnel started in early 2011.
Figure 23: Detected scatterer position through time for a newly appearing point scatterer in early 2011. The positions before the change vary strongly, while after the change the position stays almost constant. On closer inspection a movement in range and azimuth can be seen during the coherent period. This movement is about 30 cm in azimuth, to the south, and it is unclear what is causing it, since it seems unlikely that the point has moved so much in reality (it is in the new bus station near Delft CS). In slant range the movement is about 4 cm, while the deformation measured with the phase is less than 1 cm, as seen in figure 22a. These positions are all estimated relative to the coregistration, so care should be taken when interpreting them; however, a shift in coregistration correlated with time seems improbable.
Figure 24: Position standard deviation, as defined in equation 4.5, can be used as a quality measure. In this figure the relation with ensemble coherence is shown for the stack used in section 4.3.3. The problems with ensemble coherence as a quality measure that are shown in table 2 are applicable here too.
where γ can be exchanged with ρirf , SSCR1 is the SSCR of the scatterer,
and SSCR2 = 1. This is because IRF correlation is in fact the coherence
between the signal and a perfect clutter-free signal. So this leads to:
ρirf = √SSCR   (4.6)

SSCR = ρirf²
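For example, a scatterer with ρirf = 0.9 then corresponds to SSCR ≈ 0.81 and hence SCR ≈ 4.3 (about 6.3 dB).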
Figure 25: The relation for ρirf² as an estimator for SSCR. It is very accurate for SSCR above 0.7; for lower values it is biased due to the small number of looks. The thick black line shows the experimentally obtained relation, the thin black line the theoretical relation.
4.5 conclusions
The new method for selection of stable point scatterers based on their
theoretical shape is shown to be very effective. As a quality estimate it
creates new possibilities for processing small stacks, and for large stacks
improvement is shown compared to amplitude dispersion. Besides
being a quality estimate it also neglects sidelobes, solving the problem
of sidelobe filtering inherently. Another benefit is the subpixel precision
position estimation, which is shown to be useful as quality measure
and for increasing geocoding accuracy. Furthermore a quality estimate
per acquisition is available, creating opportunities for change detection.
Since it could be used as a replacement for amplitude dispersion, both methods are compared. The IRF correlation method differs from amplitude dispersion in the following ways:
• Only point scatterers are selected, with sub pixel precision loca-
tion and without sidelobes.
5.2 geocoding
trend is corrected with about 2 mm/y. While this reduces the mysteri-
ous trend seen in Envisat results, it doesn’t explain the whole 15 mm/y
that is observed.
5.4 conclusions

6 CASES

6.1 introduction
The techniques presented in this paper are beneficial for almost all InSAR processing; in this chapter several cases which benefit most are described.
The first case shown is the case that was the main reason for this
research, monitoring the construction of the Delft train tunnel using
InSAR.
Second, a case involving detailed monitoring of unstable storage tanks
in harbours is treated, combining speckle tracking with interferometry
on a very detailed scale.
Finally, the benefits for the corner reflector experiments undertaken by the TU Delft are shown.

6.2 delft train tunnel
Figure 26: Selected scatterers shown on top of AHN2 and GBKN as very accurate reference data. Light points are below 3 m NAP, while dark points are above. A stack of 16 TerraSAR-X images in descending orbit is used. a) Scatterers selected using amplitude dispersion with sidelobe filtering. This is the data used in Arroyo et al. [2009]. Even with strong filtering there are points in the water. There is a shift in geocoding that varies spatially. b) Scatterers selected using IRF correlation. The density has been improved manyfold, while no scatterers in water are selected. Clear straight lines of scatterers are visible, making it easy to relate them to buildings. There is still a spatially varying shift due to a small trend in the height estimation which has to be removed.
Figure 27: Scatterers selected using IRF correlation, in green: descending stack of 72 images, in pink: ascending stack of 68 images. Dark points are high and light points low. The accuracy is clearly improved compared to the 16 image stack in figure 26b. The accuracy of geocoding is a lot better, because the trend in the height estimation is greatly reduced. A lot of scatterers disappeared because the area on the left has been demolished for the train tunnel construction. In both ascending and descending data a shift is still visible, both relating to about a metre of atmospheric delay in range.
images. Since the results using 16 images are already so good, the improvement is less obvious than the impact of using the new method. The scatterers in the 72 image results lie more on straight lines, which reveals improved local accuracy. The absolute accuracy is also higher: the points are accurately positioned in the whole processed area, while with 16 images a trend in the estimated height appeared, influencing the geocoding. The difference in both relative and absolute positioning accuracy is mainly caused by the height estimation, which is less accurate with fewer acquisitions, and depends on the correctness of the used deformation model.
Besides better point selection and improved geolocation, another major improvement is the ability to easily inspect all scatterers in detail using the InSAR Inspector. This way it is easy to search for interesting points. For finding points that go down because of the tunnel construction, a quadratic term is fitted to the deformation in addition to the linear one, and this quadratic term is selected for the point colour. It is immediately visible that the area around the spoorzone is accelerating, so one of the points, near the Binnenwatersloot, is selected to see its time series and other properties, as seen in figure 28. Until January 2011 it is stable; after that it goes down at about 10 mm/y. To better identify the selected scatterer a secondary screen can be opened, showing the scatterers on an aerial image, see figure 29. This secondary screen can also be zoomed in further, and points can be selected from it. As can be seen in the table, the point is on the ground, and it appears to be on the street.
Figure 28: Spoorzone data loaded into the InSAR Inspector, with the quadratic
deformation term selected for display.
Figure 29: For further identification a secondary screen with an aerial photo is
shown. The white circle denotes the selected point. It is on ground
level on the Binnenwatersloot.
[Figure 30: map of the quadratic deformation term (latitude versus longitude) with phase difference time series for two selected points.]
Some other points have been selected on a house, for one of the people who will read this; see figure 30. Neither the top nor the bottom front of the house is moving alarmingly, but clear movement is visible in both.

6.3 storage tank monitoring
In harbours very large tanks are used for storage of liquids. These tanks have to be monitored accurately to guarantee their safety. As an alternative to traditional measuring techniques, a feasibility study on the use of InSAR for monitoring these tanks has been done by Hansje Brinker.
Processing with the standard processing chain did not give satisfying results; almost all scatterers on the tanks were discarded because of unwrapping problems.
A detailed analysis follows, exploring the raw data, alternative unwrapping techniques and other new estimation methods. Auxiliary information such as tank diameter and height is also incorporated. For this analysis the toolbox proved to be extremely useful, being flexible enough to allow completely custom processing, while still taking care of all the metadata, coordinate transformations and easy data access.
The rest of this section shows some results, while demonstrating the ease of implementation.
Both Envisat and TerraSAR-X are investigated, but the resolution of Envisat turned out to be too low to distinguish separate tanks. Using TerraSAR-X the tanks are clearly distinguishable, each tank having at least a few bright scatterers, and the bigger tanks having the entire top edge clearly visible, as can be seen in figure 31. The downside of TerraSAR-X is the short wavelength, which means that a phase ambiguity (half the wavelength of about 31 mm) already occurs after 16 mm of deformation.
Figure 31: Overview of the inspected tanks. a) Reference optical image for
interpretation purposes. b) Selected scatterers on the mean amplitude
radar image. The image was acquired in descending orbit, hence the
satellite is looking from the left. Reference point 1 was chosen for its recognisable location, but point 2 turned out to be more stable, so it is used instead for further investigation.
Figure 32: The results for one of the tanks, a tank 18.5 m high and 39 m wide. a) Point scatterers selected on this tank, red = high, blue = low. b) Phase difference with the reference point; a possible interpretation is that the deformation is accelerating at about 32 mm/y², which would result in such a pattern. c) The top-bottom difference shows a seasonal pattern, which can be explained by thermal expansion; the variation of 8 mm on 18.5 m relates to a temperature variation of 36°. d) The subpixel movement shows about 7 cm/y, but the accuracy is probably not high enough to fix ambiguities. e) The tilt of the top is estimated using the phase of the entire top edge, shown here for each acquisition, with the first acquisition as reference. f) The estimated tilt through time, both in azimuth and range direction. The red dots indicate discarded observations due to low quality of fit.
6.4 corner reflector experiments
Figure 33: The corner reflector experiment in Cabauw. It lasted only 5 observations before the final Envisat image was acquired.
BIBLIOGRAPHY
APPENDIX
A TOOLBOX MANUAL
On the following pages a full reference manual of the created toolbox
classes is given, including the visual InSAR Inspector tool, and the
SlcStack, Slc, SlcImage, PointSet, Filter, Poly3D, Raster, RasterData,
hdf5prop, and dependentprop classes.
Chapter 1
Visual inspector tool for analysis of PointSet objects. The InSAR Inspector is a GUI that provides
a way to make the spatially dense, multivariate, and relative information that is involved in InSAR
processing fully explorable.
An overview map is shown at the main screen, on which all scatterers are shown. When a scatterer
is selected, either on this overview map or using a unique point ID, all information about a scatterer
is shown, both as numerical data and using plots. All information that is added during processing is
shown, such as quality measures, height, and deformation. Information that varies per acquisition is
plotted, such as amplitude or deformation. For deeper inspection several options are available, such
as showing the underlying radar image data, calculating a coherence matrix, and data access from
the command line.
The GUI object gives access to the selected points and all their properties, and also to all GUI elements
and display functions. Interaction via the command line is necessary when more flexibility or more
user input is required than can be given through the GUI. For example to show a custom property on
the background, make high quality plots of selected points, or estimate new parameters or relations.
psinspector = pointset_inspector( pointset ) Start an instance of the PointSet Inspector on the given PointSet. The returned object can be used to interact with the GUI via the command line.
1.1 Properties
handles Struct containing all graphics handles. Using this, custom data can be added to the plots.
pointset The PointSet with which the GUI was initiated.
Convenience copies:
slc SlcStack object from the pointset.
poly Polygon selected using the polygon selection tool.
Image buffers:
mrm Stack of RasterData objects shown on the background.
selected1 PointSet containing the selected point.
selected2 PointSet containing the reference point.
timesel Selected acquisition index.
pnt_custom Custom point attribute for visualisation, can for example be set to interferometric phase using pnt_custom = wrap(diff(pointset.phase_corrected(:,[master slave]),[],2));
obs_custom Custom observation attribute for visualisation.
filt Optional selection of the points that are displayed on the background.
Settings:
use_points Setting whether to use points from the pointset or raw data.
show_slc Setting whether to load and show the slc stack on selecting a point.
estimate_height Setting whether to estimate the height difference on the fly or use the given point height.
subtr_atmo Setting whether to correct the phases in the phase plot for atmosphere.
norefpoint Setting whether to subtract the reference point or not.
1.2 loadbackgrounddetail
Load high resolution background around selected point.
gui.loadbackgrounddetail(ovs) Use given oversampling rate for the background. If omitted
it defaults to 1. If ovs == 0 the background will be removed.
1.3 loadaerial
Load an aerial view with selectable points.
gui.loadaerial(zoom) Load an aerial view with Google zoomlevel zoom. Default is 17.
Chapter 2
Class: Slc
2.1 Properties
id Short identification string: acquisition date as number.
sensor Sensor description string.
date The date of observation (including time) [days].
timeOfDay The time of day of the first pixel, for ms accuracy, also the time used for orbits [s].
timeToFirstPixel Two way travel time to first pixel [s].
wavelength Radar wavelength [m].
PRF Pulse Repetition Frequency [Hz].
RSR Range Sampling Rate [Hz].
bandwidthAz Bandwidth in azimuth direction [Hz].
bandwidthRa Bandwidth in range direction [Hz].
poly_fDC Doppler centroid frequency [Hz]. Poly2D object with input in seconds from first pixel, [line / PRF, pixel / RSR]. For TerraSAR it is given 2D (but not in the Doris .res file), for Envisat only 1D.
preciseOrbit Satellite orbit, n-by-4 array with columns [time of day, x, y, z].
is_ascending True if the orbit is ascending, false if descending.
orig Info about the original file: .filename, .fileformat, .window [Raster].
crop Info about the crop file: .filename, .fileformat, .window [Raster].
rsmp Info about the resampled file (to master coordinates): .filename, .fileformat, .window [Raster].
bperp Perpendicular baseline (from DORIS).
bpar Parallel baseline (from DORIS).
btemp Temporal baseline (from DORIS).
poly_coreg Coregistration polynomial, 1x2 Poly2D object for line and pixel.
n_coregwin Number of coregistration windows.
poly_h2ph h2ph polynomial as calculated by DORIS. Poly2D object. Not used anymore, superseded by poly3_h2ph.
poly_flat Flat earth polynomial as calculated by DORIS. Poly2D object. Not used anymore, superseded by poly3_refpha.
poly3_refpha Topographic phase polynomial. Poly3D object. Returns the topographic phase at a [line, pixel, height] coordinate. Height is relative to the geoid.
poly3_h2ph h2ph polynomial. Poly3D object. Returns the derivative of poly3_refpha to height at a [line, pixel, height] coordinate.
poly_geocoding Geocoding polynomial for geocoding to WGS84 XYZ coordinates. 1x3 Poly3D object. Returns the XYZ coordinates for a [line, pixel, height] coordinate. Height is relative to the geoid.
poly_latlon Geocoding polynomial for geocoding to WGS84 [lat,lon] coordinates. 1x2 Poly3D object. Returns the [lat,lon] coordinates for a [line, pixel, height] coordinate. Height is relative to the geoid.
poly_xyz2linepixel Inverse geocoding polynomial from WGS84 XYZ coordinates. 1x2 Poly3D object. Returns the [line,pixel] coordinates for a [X,Y,Z] coordinate.
poly_latlonh2linepixel Inverse geocoding polynomial from WGS84 [lat,lon] coordinates. 1x2 Poly3D object. Returns the [line,pixel] coordinates for a [lat,lon,height] coordinate.
geocoding_shift Shift in [line,pixel].
calibrationFactor General calibration factor to apply when reading data.
res_slave Content of the DORIS slave .res file. Only for debugging.
res_ifg Content of the DORIS interferogram .res file. Only for debugging.
res_master Content of the DORIS master .res file. Only for debugging.
spectrumAz Spectrum in azimuth, Filter object. Contains relative bandwidth and spectrum shape. Not shifted to the doppler centroid because that depends on location.
spectrumRa Spectrum in range, Filter object. Contains relative bandwidth and spectrum shape.
velocity Indication of satellite velocity.
heading Satellite flight direction, calculated at the center of the original frame.
preciseOrbit_etrs Satellite orbit converted from ITRS to ETRS.
look_angle Indication of satellite look angle (angle between earth center and look direction).
incidence_angle Indication of satellite signal incidence angle (angle at which the radar signal reaches the ground).
spacing_az Pixel spacing on the ground in azimuth/flight direction, or line spacing [m]. Calculated at the center of the original frame.
spacing_ra_slant Range pixel spacing, in looking direction [m].
spacing_ra_ground Range pixel spacing projected to the geoid [m]. Calculated at the center of the original frame.
fDC_mean
fDC
poly_coregRa
poly_coregAz
2.2 clone
Create a clone of self.
2.4 readDataRsmp
Read resampled SLC data from disk, and return as an SlcImage.
imc = slc.readDataRsmp( crop, filterwindow ) Read the specified crop from the SLC.
crop must be given as a Raster object. If omitted the whole file is read. If the crop is
oversampled the SLC is automatically oversampled to meet the crop. If filterwindow is
specified the SLC data is spectrally filtered to the given window, which can be used to reduce
sidelobe levels or increase resolution.
Example:
imc = slc.readDataRsmp( Raster(5101:6100,2401:2500).oversample(2), kaiser(100, 3) )
Load a crop of 1000x100 pixels, oversampled 2 times, and filtered to a Kaiser window with
beta = 3.
2.6 lph2xyz
Calculate the XYZ coordinate from line, pixel and height above the ellipsoid.
[x,y,z] = slc.lph2xyz( line, pixel, height, refframe ) refframe can be ETRS to convert the orbits to ETRS before usage.
In general the geocoding polynomials can be used instead of this function, either directly or via the geocoding functions in SlcStack.
height_range Valid height range, default [-500 9000], which includes all places on the earth surface from the Dead Sea to Mount Everest.
geoid Whether to use the geoid as reference height, otherwise the ellipsoid is used. Default true.
refframe Can be ETRS to convert the orbits to ETRS before usage.
The returned structure includes quality descriptions of the polynomials, with the fields std for standard deviation and max for maximum error.
2.8 xyz2azra
Calculate the [line, pixel] coordinates from XYZ coordinates.
[line,pixel,dist] = slc.xyz2azra( xyz, refframe ) refframe can be ETRS to convert the orbits to ETRS before usage.
In general the geocoding polynomials can be used instead of this function, either directly or via the geocoding functions in SlcStack.
2.9 makeInterferograms
Make interferograms of an SLC stack and save them as PNG images.
Input parameters:
dirname Directory where to save the images.
master Master index for single master ifgs, or incremental, positive, all, or magnitude.
crop Raster object of the crop.
doFilter Do range and azimuth filtering.
window Apply window, kaiser(N,4) is default.
flip lr, ud.
ovs Oversample factor.
range Magnitude range (for mixed), logarithmic. Adaptive is default.
ref Reference coordinate.
phaseonly Don't mix phase and magnitude.
cohAsMag Estimate coherence and use that as magnitude.
TODO: Too many input parameters...
Chapter 3
Class: SlcStack
The SlcStack class represents a stack of Slc images. Properties of the class apply to the stack as a whole, such as wavelength and pixel spacing. Properties that are acquisition-specific can be found in the individual Slc objects in the slc array.
3.1 Properties
n_images The number of Slc images in the stack.
slc An array of Slc objects with length equal to n_images.
date Array of dates (datenum) of all acquisitions.
dir The base directory of the raw Slc data.
master_idx Index in the Slc array of the master image, to which all the other images are coregistered.
master The master Slc object.
sensor Sensor(s) description string.
wavelength Radar wavelength [m].
PRF Pulse Repetition Frequency [Hz].
RSR Range Sampling Rate [Hz].
is_ascending True if the orbits are ascending, false if the orbits are descending.
poly_geocoding Geocoding polynomial for geocoding to WGS84 XYZ coordinates. 1x3 Poly3D object. Returns the XYZ coordinates for a [line, pixel, height] coordinate. Height is relative to the geoid. See Slc/make_poly_geocoding for technical details.
poly_latlon Geocoding polynomial for geocoding to WGS84 [lat,lon] coordinates. 1x2 Poly3D object. Returns the [lat,lon] coordinates for a [line, pixel, height] coordinate. Height is relative to the geoid.
poly_xyz2linepixel Inverse geocoding polynomial from WGS84 XYZ coordinates. 1x2 Poly3D object. Returns the [line,pixel] coordinates for a [X,Y,Z] coordinate.
poly_latlonh2linepixel Inverse geocoding polynomial from WGS84 [lat,lon] coordinates. 1x2 Poly3D object. Returns the [line,pixel] coordinates for a [lat,lon,height] coordinate.
poly_h2ph h2ph polynomial as calculated by DORIS. Poly2D object. Not used anymore, superseded by poly3_h2ph.
poly_flat Flat earth polynomial as calculated by DORIS. Poly2D object. Not used anymore, superseded by poly3_refpha.
poly3_refpha Topographic phase polynomial. Poly3D object. Returns the topographic phase at a [line, pixel, height] coordinate. Height is relative to the geoid. See Slc/make_poly_refpha_h2ph for technical details.
poly3_h2ph h2ph polynomial. Poly3D object. Returns the derivative of poly3_refpha to height at a [line, pixel, height] coordinate. The difference with the DORIS-computed one is that multiplying with m2ph is not necessary.
poly_accuracy Polynomial accuracy and range information.
spacing_az Pixel spacing on the ground in azimuth/flight direction, or line spacing [m]. Calculated at the center of the original frame.
spacing_ra_slant Range pixel spacing, in looking direction [m].
spacing_ra_ground Range pixel spacing projected to the geoid [m]. Calculated at the center of the original frame.
velocity Satellite velocity.
heading Satellite flight direction, calculated at the center of the original frame.
look_angle Indication of satellite look angle (angle between earth center and look direction).
incidence_angle Indication of satellite signal incidence angle (angle at which the radar signal reaches the ground).
3.2 SlcStack
Create an SlcStack object. The first argument is mandatory, the rest is optional and follows the
key-value convention. Any of the optional arguments can be combined.
slcstack = SlcStack(dir) Creates an SlcStack object from data in directory dir, using
autodetection to determine the data format. Supported formats are DORISstack, DePSI and
Stamps, relying on the specific constructors fromDorisStackDir, fromDepsiDir and
fromStampsDir.
slcstack = SlcStack(dir,'reject',reject) Rejects images based on a boolean or integer
reject array.
slcstack = SlcStack(dir,'fromdate',from,'todate',to) Rejects images taken before
from or after to, both specified as datenum.
slcstack = SlcStack(dir,'refframe','ETRS') Use ETRS reference frame for orbits in
computation of refpha and h2ph.
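A combined example (the directory and dates are illustrative):
% Load a DORIS stack, rejecting the first image and everything acquired before 1 April 2009.
stack = SlcStack('/data/delft_tsx_dsc', 'reject', 1, ...
                 'fromdate', datenum(2009, 4, 1));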
3.3 linepixelheight2xyz
Convert the given coordinates to WGS84 XYZ.
Height is interpreted as height w.r.t. the geoid.
xyz = slcstack.linepixelheight2xyz([line, pixel, height])
[x, y, z] = slcstack.linepixelheight2xyz([line, pixel, height])
3.4 linepixelheight2latlon
Convert the given coordinates to WGS84 [latitude, longitude].
Height is interpreted as height w.r.t. the geoid.
latlon = slcstack.linepixelheight2latlonh([line, pixel, height])
[lat, lon] = slcstack.linepixelheight2latlonh([line, pixel, height])
3.5 xyz2linepixel
Convert the given coordinates to line, pixel. Coordinates are interpreted as WGS84 XYZ coordinates.
linepixel = slcstack.xyz2linepixel([x, y, z])
[line, pixel] = slcstack.xyz2linepixel([x, y, z])
3.6 latlonh2linepixel
Convert the given coordinates to line, pixel. Coordinates are interpreted as WGS84 coordinates with
height w.r.t. the geoid.
linepixel = slcstack.latlonh2linepixel([lat, lon, height])
[line, pixel] = slcstack.latlonh2linepixel([lat, lon, height]) If the height
column is omitted it is set to 0.
Chapter 4
Class: SlcImage
4.1 Properties
raster Coordinates and spacing of the data. Raster object.
data Data array.
slc Slc object of the SLC.
coordinateSystem Coordinate system in which the data is sampled. Can be original or
resampled. resampled means resampled to master coordinates.
subtrflat True if flatearth is removed.
filtAz Azimuth spectrum of data.
filtRa Range spectrum of data.
filtFreq
crop
4.2 oversample
Oversample Slc data. The filter used for oversampling is the MATLAB default with a filter order of
10. The shift of the spectrum in the azimuth direction, the Doppler shift, is taken into account by shifting
the filter.
imc = imc.oversample(ovs) Oversample data ovs times.
4.3 cropTo
Crop the data to the given raster.
imc = imc.cropTo(raster) Crop the data to raster, where raster is a Raster object.
4.4 filterRa
Filter the data in range direction. The filter to be applied is given as a Filter object, which can be
used to construct spectral filters.
imc = imc.filterRa(filter, n) Filter the data in range direction, using the given Filter
with an n-point filter. If n is omitted, n = 64.
4.5 filterAz
Filter the data in azimuth direction. The filter to be applied is given as a Filter object, which can
be used to construct spectral filters.
imc = imc.filterAz(filter, n) Filter the data in azimuth direction, using the given Filter
with an n-point filter. If n is omitted, n = 64.
4.6 rewindow
Filter the data to the given window. A very useful window is the Kaiser window, constructed in
MATLAB with window = kaiser(N, beta), where N is sufficiently large and beta defines the
tradeoff between resolution and side lobe levels.
imc = imc.rewindow(window, n) Filter the data so that the spectrum shape is equal to
window, using an n-point filter. If n is omitted, n = 64. Example:
imc = imc.rewindow(kaiser(1025, 3))
4.7 subtractReferencePhase
Subtract the reference phase from the data. The reference phase is calculated using the polynomial
in slc.poly3_refpha, evaluated at height = 0. See Slc.make_poly_refpha_h2ph() for the polynomial
calculation.
imc = imc.subtractReferencePhase() Subtract the reference phase from the data.
4.8 show
Show the image or stack of images using the Image class.
4.9 plotSpectrumRange
Plot the real and theoretical spectrum of the data in range direction.
4.10 plotSpectrumAzimuth
Plot the real and theoretical spectrum of the data in azimuth direction.
4.11 mean
Return mean of a stack.
4.12 std
Return std of a stack.
4.15 value
Return the values at the given coordinates.
val = imc.value(lines, pixels) Return the values at the given coordinates. The values are
interpolated using a cubic kernel if they are not exact on data points.
4.16 clone
Create a clone of self.
4.17 filterRaOverlap
Filter all SLC images to have the same spectrum in range (pixel) direction, i.e. to the part of the
spectrum that is overlapping. In this direction the spectrum can be shifted due to incidence angle
differences. For this filtering the inclination and pointing angle of the ground are needed; currently
poly_flat is used for this, which follows the ellipsoid.
By supplying an additional slope value the incidence angle can be scaled to obtain a different angle.
This is still experimental.
4.20 coherence
Calculates the coherence between two SLC images.
coh = imc1.coherence(imc2 [, win, type]) Calculate the coherence between imc1 and
imc2, using the window win. If win is omitted a window is constructed based on the impulse
response of the signal, giving a high calculation resolution, and independence from
deformation or other phase signals. type can be projected, complex, or absolute, where
projected is the default.
4.21 peakcoords
Find the coordinates of all local maxima in an image with subpixel precision.
[lines,pixels] = im.peakcoords( mask ) Return the line and pixel coordinates of all local
maxima, based on absolute values. The coordinates are estimated with subpixel precision. An
optional mask can be supplied to mask out unneeded areas.
4.22 interferogram
ifg = interferogram( self, master, doFilter, window, doSubtrflat, cohAsMag ) TODO: must be cleaned
up/generalised
4.23 filterAzOverlap
Filter all SLC images to have the same spectrum in azimuth (line) direction, i.e. to the part of the
spectrum that is overlapping. In this direction the spectrum can be shifted due to squint angle or
Doppler centroid differences.
4.24 coherencematrix
Calculate the coherence between each pair of SlcImages in an array. The upper triangle gives the average
coherence over the area, the lower triangle gives the maximum coherence in the area.
cohmat = coherencematrix([imc1, imc2, ...], crop, win, type) Calculate the
coherence between each pair of SlcImages in the array. crop defines the area that should be
included in the calculation, and must be a boolean matrix of the same size as imc.data. win
and type are described in coherence and are usually omitted.
Chapter 5
Class: PointSet
The PointSet class represents a database of point data such as phase, coordinates, etc.
Point data is organized in MATLAB arrays, or similarly behaving objects such as hdf5prop or dependentprop,
and follows the convention that the first dimension equals the number of points in the pointset
(n_points) and the second the number of observations (n_observations), i.e. the number of
images in the stack. Data that is the same for all points (such as the acquisition date) has its first
dimension equal to one (row vector) and a name that starts with obs_, such as obs_date. Data
that is independent of the acquisition has its second dimension equal to one (column vector) and a name
that starts with pnt_, such as pnt_quality. Data that changes with both point and acquisition
has no prefix, e.g. amplitude and phase. Properties can be added dynamically, and should follow
this convention. The only properties that are exempted are n_points, n_observations, meta, log,
and slc.
The amplitude is not an element of the PointSet by default as it is not required in any of the main
functions and dependent properties, but it is usually present as a dynamic property. The pnt_quality
can mean different things at different processing stages, but should be the most recent quality
measure of the points in the set. For example, initially it can be the amplitude consistency, and later,
after network estimation, it will be the ensemble coherence. Separate, specific properties for these
values will likely be added in future versions.
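The dimension convention can be summarised with the following minimal sketch; the size checks assume a filled pointset and are only illustrative:

    % Per-acquisition data: 1 x n_observations row vector, prefixed obs_.
    size(pointset.obs_date)      % == [1, pointset.n_observations]
    % Per-point data: n_points x 1 column vector, prefixed pnt_.
    size(pointset.pnt_quality)   % == [pointset.n_points, 1]
    % Data varying with both point and acquisition: no prefix.
    size(pointset.phase)         % == [pointset.n_points, pointset.n_observations]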
5.1 Properties
n_points Number of points.
n_observations Number of observations.
meta Structure of meta data that is not attributed to points or observations, such as the mrm or a
polygon that was used to select the current set of points.
log A Logger object that maintains a history of standard output messages.
slc The SlcStack object that the point data is obtained from.
phase The real-valued phase of each point per acquisition.
pnt_line The line coordinate of each point.
pnt_pixel The pixel coordinate of each point.
pnt_quality The quality of each point.
pnt_height The (current estimate of the) height of each point.
pnt_lat The estimated latitude of each point based on the estimated height.
pnt_lon The estimated longitude of each point based on the estimated height.
obs_date The date of each acquisition.
h2ph The height-to-phase of each point per observation.
topo_phase The phase corresponding to the current pnt_height.
atmo_phase The phase corresponding to the estimated atmosphere in meta.
phase_notopo The measured phase corrected for the topographic phase topo_phase.
phase_notopo_noatmo The measured phase corrected for the topographic and atmospheric phase
topo_phase and atmo_phase.
phase_corrected The measured phase corrected for the topographic phase and, if available, the
atmospheric phase. This is the best available phase for model estimations.
defo Line-of-sight deformation w.r.t. the first observation and w.r.t. meta.refpoint [m].
5.2 PointSet
Create a PointSet object.
pointset = PointSet(slcstack) Creates an empty pointset for an SlcStack object, setting
n_observations equal to the number of images in the stack and n_points equal to zero.
pointset = PointSet(slcstack,props) Adds the properties named in the props cell array,
following the naming conventions as outlined in this object's documentation, and initialized
with NaN values.
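For example, a sketch creating an empty pointset with a few pre-allocated properties; the chosen property names follow the naming convention and are only illustrative:

    % Empty pointset with NaN-initialised phase, coordinate and quality fields.
    pointset = PointSet(slcstack, {'phase', 'pnt_line', 'pnt_pixel', 'pnt_quality'});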
5.3 append
Add point data to the pointset, increasing the number of points n_points. This modifies the pointset
in place.
pointset.append(other) Appends the contents of the other pointset to pointset. The new
value of n_points is incremented correspondingly. Data fields of the two pointsets are
merged, and NaN values are put where data is missing due to the merge. For example, if
phase exists in other but not in pointset, then phase is added to pointset, initialized
with NaN values, and the phase data from other is copied into the newly appended part.
5.4 add_points
Allocates space for extra points in the pointset. This modifies the pointset in place.
index = pointset.add_points(n) Increases n_points by n and grows the data fields
accordingly with NaN values. The index vector points into the modified pointset at the newly
allocated space to aid subsequent filling of data.
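A sketch of how the returned index can be used to fill the newly allocated space; the coordinate values are arbitrary:

    % Make room for three extra points and fill their radar coordinates.
    index = pointset.add_points(3);
    pointset.pnt_line(index) = [101; 205; 309];
    pointset.pnt_pixel(index) = [55; 460; 712];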
5.9 clone
Makes a deep copy of the pointset.
cloned = pointset.clone() Creates a new pointset cloned that is a replica of the original
pointset. Of the special properties, log is not copied. meta is copied and may share data
with the original pointset if meta contains objects.
5.11 rdnap
Calculate the coordinates in the Dutch RDNAP coordinate system, computed with sub-cm accuracy.
rdnap = pointset.rdnap(sel)
[rdx, rdy, nap] = pointset.rdnap(sel) Return the rdnap coordinates of the subset sel.
out = pointset.shift_geocoding('shift_xy',[xshift, yshift]) Change the
geocoding by shifting the points. xshift is the shift to be applied in the west-east direction, yshift in
the south-north direction. Both are given in meters.
5.17 subset
Create a subset of a pointset based on a variety of criteria. Arguments are passed by the key-value
convention. Any of the following settings can be combined; a combined example is given after this list.
sub = pointset.subset('index',I) Direct selection of points based on a boolean or index
vector.
sub = pointset.subset('coord',C) Select all points that are (within epsilon radius) on the
given (line, pixel) coordinates. If specified as a matrix, C should consist of two columns. If
specified as a cell array then each item C{i} should be a two-element vector. If specified as a
string then the format is L...P...\nL...P...\n..., where the dots contain the line and
pixel coordinate times 10; for example, 10.23,14.25 becomes L102P143.
sub = pointset.subset(property, limits) Select points based on pnt_property. limits
should be a minimum value or a [minimum, maximum] pair. property should be a property
for which a pnt_ property exists, like quality, height, line, etc.
sub = pointset.subset('crop', raster) Select only points within the extent of raster
sub = pointset.subset('poly', poly) Select only points within the polygon or
multipolygon poly. Also see INMULTIPOLYGON.
sub = pointset.subset('nearlatlon', [lat lon], distance) Select only points within
distance of a given coordinate. Distance can be given in meters (e.g. 100m) or in mainlobes
(e.g. 3).
sub = pointset.subset('nearlinepixel', [line pixel], distance) Select only
points within distance of a given coordinate. Distance can be given in meters (e.g. 100m) or in
mainlobes (e.g. 3).
[sub,sub2,...] = pointset.subset('mindist',d) Select points that are at a certain
minimum distance from each other. Points are added in descending order of quality. After each
addition all points within the radius around the added point are discarded, and the next best point
is added, until the set is exhausted. If d is specified as a number the implied unit is the mainlobe
size: 1 for one times the mainlobe. If the distance is specified as a string then the unit is included,
and currently only meters are supported; a distance of 100m is 100 meters in both azimuth and
range. If two or more outputs are requested, the algorithm is repeated for points that were not
selected to create independent subsets. This is possibly useful for testing purposes.
[sub,sub2,...] = pointset.subset('mindist',d,'seedpoints', points) Make sure
the given seedpoints are present in the subset. Points can be given as either a PointSet,
LxxxxxPxxxxx strings or [lines, pixels].
[sub,sub2,...] = pointset.subset('mindist',d,'seedpoints', points, 'onlytoseedpoints')
Only consider the distance to the given seedpoints, and do not include the seedpoints themselves.
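As announced above, a sketch that combines several of the criteria; the quality threshold and the polygon variable are hypothetical:

    % Keep good-quality points inside a polygon, then thin them to a minimum
    % mutual distance of two mainlobes.
    sub = pointset.subset('quality', 0.75, 'poly', poly, 'mindist', 2);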
Chapter 6
Class: Filter
Spectral metadata and filter class. A Filter object contains information about the spectrum of a
signal, and creates and applies filters taking this information into account. The spectrum is defined
as a relative spectrum, which is in the interval [-1 1]. It is defined by a window shape, a bandwidth,
and a phase shift.
Filter objects are immutable.
f = Filter(windowAmp, bandwidth, phaseshift, windowFreq) Return a new Filter
object. If any input is omitted the following defaults are used: windowAmp = [1 1] (rectangular
window); bandwidth = 1 (full bandwidth); phaseshift = 0 (spectrum centered);
windowFreq = linspace(-1, 1, length(windowAmp)) (equally spaced window amplitudes).
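A sketch of a typical construction and use; the bandwidth and phase shift values are arbitrary, and it is assumed that the resulting object can be passed to SlcImage/filterRa (section 4.4):

    % Rectangular window covering 80% of the Nyquist bandwidth, centre shifted by 0.1.
    f = Filter([1 1], 0.8, 0.1);
    % Filter an SlcImage in range direction with it.
    imc = imc.filterRa(f);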
6.1 Properties
windowAmp Amplitudes of the window shape.
windowFreq Frequencies of the window shape, from -1 to 1.
bandwidth Bandwidth of the signal, relative to Nyquist. From 0 to 1.
phaseshift Phase shift of the center of the spectrum.
6.2 shape
Return the shape of the full spectrum.
[a,f] = filter.shape() Return the amplitudes and frequencies of the spectrum in full
resolution.
[a,f] = filter.shape(f_in) Return the spectrum interpolated to the given frequencies f_in.
If called on a Filter array, a and f are returned as cell arrays.
6.3 design
Design a FIR filter with the frequency response according to self.
6.4 apply
Apply one spectrum to another, which in fact is multiplication.
6.5 overlap
Return the spectrum that must be applied to get the overlapping spectrum.
6.6 rewindow
Return the spectrum that must be applied to get the given window.
6.7 oversample
Return the spectrum of the oversampled signal.
6.8 shift
Change phaseshift.
6.9 noisebandwidth
Calculate the equivalent noise bandwidth of the filter.
6.10 mainlobewidth
Calculate the width of the mainlobe in FFT bins (pixels). The width is defined as up to -3 dB (Cumming
p. 35).
6.11 sideloberatio
Calculate the integrated sidelobe ratio in dB. The width of the main lobe is defined as null-to-null.
6.12 filter
Apply the filter to an image.
6.13 raisedCosine
Raised cosine or generalized Hamming window.
6.14 impulseresponse
Return the impulse response.
irf = filter.impulseresponse(maxdist, oversample) Return the impulse response up to
maxdist samples, oversampled with the given value.
6.15 simulateSidelobes
Simulate the sidelobe positions and values of an impulse response with the defined spectrum.
Chapter 7
Class: Poly3D
7.1 Properties
xshift Shift in 1st dimension
xscale Scale of 1st dimension
yshift Shift in 2nd dimension
yscale Scale of 2nd dimension
zshift Shift in 3rd dimension
zscale Scale of 3rd dimension
poly Array with polynomial values
7.2 Poly3D
Create a Poly3D object.
p = Poly3D( xshift, xscale, yshift, yscale, zshift, zscale, poly ) Create a
Poly3D object with given parameters.
7.3 evaluate
Evaluate the polynomial at the given values.
val = poly3d.evaluate(x, y, z) Evaluate the polynomial at the given values. If poly3d is an
array of Poly3D objects all polynomials are evaluated and the results concatenated. It is
allowed to give a single n-by-3 array as input.
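A sketch, assuming the three dimensions correspond to line, pixel and height as for slc.poly3_refpha; the coordinate values are arbitrary:

    % Evaluate a topographic phase polynomial at one [line, pixel, height] coordinate.
    refphase = slc.poly3_refpha.evaluate(1500, 2300, 12);
    % Equivalent call with a single n-by-3 coordinate array.
    refphase = slc.poly3_refpha.evaluate([1500, 2300, 12]);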
Chapter 8
Class: Raster
The Raster class defines a 2D rectangular window with equidistant divisions in both dimensions.
The dimensions are called line and pixel, following the naming used in radar science.
r = Raster(firstLine:stepLine:lastLine, firstPixel:stepPixel:lastPixel)
Returns an inclusive window (lastLine and lastPixel are included).
r = Raster.ex(firstLine:stepLine:endLine, firstPixel:stepPixel:endPixel)
Returns an exclusive window (endLine and endPixel are excluded).
stepLine and stepPixel default to 1 when they are omitted.
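For illustration, a minimal sketch of both construction styles:

    % Inclusive raster: lines 1..1000 and pixels 1..2000, step size 1.
    r1 = Raster(1:1000, 1:2000);
    % Exclusive raster with step 4: line 1000 and pixel 2000 are not included.
    r2 = Raster.ex(1:4:1000, 1:4:2000);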
8.1 Properties
firstLine First line included in the raster
firstPixel First pixel included in the raster
stepLine Stepsize in line direction
stepPixel Stepsize in pixel direction
nLines Number of raster elements in line direction
nPixels Number of raster elements in pixel direction
lastLine Last line included in the raster
lastPixel Last pixel included in the raster
8.2 isInside
sel = raster.isInside(coords) Returns true for coordinates inside the span of the raster.
coords should be an n-by-2 array [line, pixel].
8.3 contains
b = raster1.contains(raster2) Returns true if raster2 is fully within the span of raster1.
8.4 index
[I,J] = raster.index(lines,pixels) Returns the indices of the raster cells containing the
coordinates (lines,pixels).
I = raster.index(lines,pixels) Returns linear indices.
see also: locate
8.5 locate
[line, pixel] = raster.locate(I) Returns the location of the linear indices I.
[line, pixel] = raster.locate(I,J) Returns the location of the indices (I,J).
see also: index
8.6 oversample
new = raster.oversample(ovs) Returns a copy of the raster oversampled ovs times. If ovs
has two elements ovs(1) is used in line direction, and ovs(2) in pixel direction.
8.7 resample
new = raster.resample(nLines, nPixels) Returns a raster with the same span, but
containing nLines line elements and nPixels pixel elements.
8.8 bbox
new = raster.bbox(lines, pixels) Returns a raster containing all (lines,pixels) coordinates,
while keeping the stepsize.
8.9 setSteps
new = raster.setSteps(steps) Returns raster with same span, but with stepsize equal to
steps. If steps has two elements steps(1) is used in line direction, and steps(2) in pixel
direction. If steps is a raster the stepsizes of that raster are used.
8.10 addBorder
new = raster.addBorder(border) Returns a raster with a border of width border added to
all sides. If border has two elements border(1) is used in line direction, and border(2) in pixel
direction.
8.11 union
new = union([raster1, raster2, ...]) Return a raster spanning all given rasters. Only
works for rasters with same stepsize and compatible start points.
8.12 intersection
Returns intersection of rasters
newraster = intersection([raster1, raster2, ...]) Return Raster containing only
points that are in all given rasters.
8.13 repr
s = raster.repr() Return a string representation of the object.
8.14 makeTiles
tiles = raster.makeTiles( target_elements ) Returns an array of Raster objects
spanning the same area as raster. target_elements is the approximate number of elements
in each Raster in both directions, and will be rounded to make it fit. If target_elements has
two elements target_elements(1) is used in line direction, and target_elements(2) in pixel
direction.
tiles = raster.makeTiles( target_elements, roundfact ) The same, but round the
element count to multiples of roundfact. If roundfact has two elements roundfact(1) is used
in line direction, and roundfact(2) in pixel direction.
Chapter 9
Class: RasterData
9.1 Properties
raster Coordinates and spacing of the data. Raster object.
data Data array.
descr Description string.
orientation Orientation, a two-element array [x_reverse, y_reverse].
scale Scale to convert to [m] or another convenient unit.
9.2 interp
Return the values at the given coordinates.
val = im.interp(lines, pixels) Return the values at the given coordinates. The values are
interpolated using a cubic kernel if they are not exact on data points.
val = im.interp(raster) Return the values at the raster coordinates. The values are
interpolated using a cubic kernel if they are not exact on data points.
9.3 value
Return the values at the given coordinates.
val = im.value(lines, pixels) Return the values at the given coordinates. The values are
interpolated using a cubic kernel if they are not exact on data points.
9.4 cropTo
Crop the data to the given raster.
im = im.cropTo(raster) Crop the data to raster, where raster is a Raster object.
9.5 mean
Return mean of a stack.
9.6 std
Return std of a stack.
9.9 show
Show the image or stack of images using the Image class.
Chapter 10
Class: hdf5prop
Class for transparent file data access: a MATLAB class to create and access HDF5 datasets transparently
as a MATLAB variable. Data can be accessed and written with subscript referencing and assignment
methods, just like a MATLAB variable; only the size must be set or changed explicitly.
prop = hdf5prop(file, dataset) Creates a hdf5prop object for the given dataset in the
given HDF5 file.
prop = hdf5prop(file, dataset, mode) Specify the access mode: mode = 'r' for read-only
access (default), mode = 'rw' for read-write access.
prop = hdf5prop(file, dataset, parameter, value, ...) Creates a hdf5prop object
for a new dataset with given parameters.
Parameters:
size - size of the dataset. Required.
chunk_size - size of the chunks of the dataset.
compression - compression level, 0-9.
type - type of the dataset to create. This can be a string such as double, in which case the datatype
maps to H5T_NATIVE_DOUBLE, or it can be a derived HDF5 datatype.
max_size - the maximum size of the dataset to create.
fill - the fill value.
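A sketch of creating and accessing a dataset; the file and dataset names are hypothetical, and the underscored parameter names (chunk_size, max_size) are assumed:

    % Create a 1000 x 50 double dataset and access it like a MATLAB array.
    prop = hdf5prop('points.h5', '/phase', 'size', [1000 50], 'type', 'double');
    prop(1, :) = zeros(1, 50);   % subscripted assignment writes to the file
    row = prop(1, :);            % subscripted referencing reads from the file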
10.1 Properties
file
dataset
mode
propsize
dataset_id
10.2 close
Close file if open.
10.3 set_extent
Allocate space.
hdf5prop.set_extent(sz) Extend the allocated space to size sz.
Chapter 11
Class: dependentprop
Property which is dynamically calculated. Use with empty () to get the fully evaluated array; without
() the unevaluated property is passed. Indexing works as with normal arrays, except that linear or
boolean indexing of multidimensional arrays is not allowed.
11.1 Properties
parent Object of which the property is dependent.
func Function to call on parent to calculate property data. The function should take care of
indexing.
sizefunc Function to call on parent to calculate property size. Should return array with size in
each dimension.
11.2 dependentprop
Create a dependentprop object.
prop = dependentprop(parent, func, sizefunc) Creates a dependentprop object with
given parameters.
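A minimal sketch with a hypothetical function handle; it is assumed that func receives the requested indices and returns the corresponding data:

    % Property computed on demand from the parent pointset (function name is
    % hypothetical); sizefunc reports the full array size.
    prop = dependentprop(pointset, @computePhaseNotopo, ...
        @(p) [p.n_points, p.n_observations]);
    full = prop();          % empty () returns the fully evaluated array
    part = prop(1:10, :);   % indexing is forwarded to func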