Appl. Sci. 2020, 10, 4764

Article
Estimating Proportion of Vegetation Cover at the
Vicinity of Archaeological Sites Using Sentinel-1 and
-2 Data, Supplemented by Crowdsourced
OpenStreetMap Geodata
Athos Agapiou 1,2
1 Remote Sensing and Geo-Environment Lab, Department of Civil Engineering and Geomatics, Faculty of
Engineering and Technology, Cyprus University of Technology, Saripolou 2-8, 3036 Limassol, Cyprus;
[email protected]; Tel.: +357-25-002471
2 Eratosthenes Centre of Excellence, Saripolou 2-8, 3036 Limassol, Cyprus
Received: 15 June 2020; Accepted: 7 July 2020; Published: 10 July 2020
Abstract: Monitoring vegetation cover is essential for assessing various natural and
anthropogenic hazards that occur in the vicinity of archaeological sites and landscapes. In this study,
we used free and open-access Copernicus Earth Observation datasets. In particular, the proportion
of vegetation cover is estimated from the analysis of Sentinel-1 radar and Sentinel-2 optical images,
after their radiometric and geometric corrections. Here, the proportion of vegetation based on the
Radar Vegetation Index and the Normalized Difference Vegetation Index is estimated. Due to the
medium resolution of these datasets (10 m), the crowdsourced OpenStreetMap service
was used to identify fully and non-vegetated pixels. The case study is focused on the western part
of Cyprus, where various open-air archaeological sites exist, such as the archaeological site of
“Nea Paphos” and the “Tombs of the Kings”. A cross-comparison of the results between the optical
and the radar images is presented, as well as a comparison with ready products derived from the
Sentinel Hub service such as the Sentinel-1 Synthetic Aperture Radar Urban and Sentinel-2 Scene
classification data. Moreover, the proportion of vegetation cover was evaluated with Google Earth
red-green-blue free high-resolution optical images, indicating that a good correlation between the RVI
and NDVI can be generated only over vegetated areas. The overall findings indicate that Sentinel-1
and -2 indices can provide a similar pattern only over vegetated areas, which can be further elaborated
to estimate temporal changes using integrated optical and radar Sentinel data. This study can support
future investigations related to hazard analysis based on the combined use of optical and radar
sensors, especially in areas with high cloud-coverage.
1. Introduction
Various hazards, both anthropogenic and natural, can affect cultural heritage landscapes, sites,
and monuments. These hazards include fires, landslides, soil erosion, agricultural pressure, and urban
expansion [1–5]. In the recent past, several publications have demonstrated the benefits of Earth
Observation sensors, which provide wide and systematic coverage over archaeological sites
through medium- and high-resolution imagery [6–10]. In addition, access to archival
imagery is a unique advantage of Earth Observation repositories, allowing for the analysis of the
temporal evolution of a hazard [11–13].
Nowadays, open and freely distributed satellite images, both radar and optical, have become
available due to the European Copernicus Programme [14]. Images from the Sentinel sensors, with their
frequent revisits and medium spatial resolution, can be downloaded through specialized big data cloud
platforms such as the Sentinel Hub [15]. These kinds of services provide radiometrically and
geometrically corrected Bottom of Atmosphere (BOA) reflectance bands of the Sentinel-2A,B sensors, as well as
orthorectified gamma 0 VV and VH radar images of the Sentinel-1 Synthetic Aperture Radar (SAR)
sensor (ascending and descending). Beyond these calibrated images, other ready products are available
for further re-use, like optical and radar vegetation indices (e.g., the Normalized Difference Vegetation
Index—NDVI and the Radar Vegetation Index—RVI). Pseudo-colour composites can be readily
displayed in the Sentinel Hub service, thus enhancing urban and agricultural areas, while classification
products are also provided.
Monitoring vegetation dynamics and long-term temporal changes of vegetation cover is of great
importance for assessing the risk level of a hazard [16,17]. For instance, the decrease of vegetation
cover, as a change to either the NDVI or RVI, can be a signal for land-use change and a result of the
urbanization sprawl phenomenon [18–21]. In contrast, an increase in the vegetation cover, beyond the
phenological cycle of the crops, can be a warning of agricultural pressure [22–24]. Similarly, vegetation
indices are input parameters for studying other hazards such as soil erosion and fires [25,26]. For soil
erosion, vegetation cover is linked with the land cover factor, which indicates the proportion of water
that runs off the ground surface. A decrease of vegetation within a short period can be an indicator of a fire
event. For instance, Howland et al. [27] used the NDVI to estimate the soil erosion by water, based on
the Revised Universal Soil Loss Equation. In Burry et al.’s study [28], multi-temporal vegetation
indices analysis was used to map fire events and help to understand and interpret archaeological
problems related to past settlement patterns or environmental scenarios. However, in many cases,
optical and radar vegetation indices are applied separately, with no synergistic use, thus limiting
the full potential of existing Earth Observation systems.
In this study, we aim to investigate the use of optical and radar images, from freely available
and open-access datasets, like the Sentinel-1 and Sentinel-2 sensors for estimating the proportion
of vegetation cover. The investigation was supported by other open access services, namely the
crowdsourced OpenStreetMap initiative. OpenStreetMap is an editable geo-service with more
than 6 million users all around the world. These ground-truthed datasets can be used as training areas
in the medium-resolution Sentinel images for detecting fully vegetated and non-vegetated areas. Additionally,
Google Earth, a free platform that provides systematic snapshots of high-resolution optical satellite
data, can be used to evaluate the OpenStreetMap datasets. In short, the current study aims
to investigate cross-cutting applications among the Copernicus Sentinel-1 and Sentinel-2 sensors,
core services, and other third-party data for the estimation of the proportion of vegetation cover in the
vicinity of archaeological sites.
urban areas in a period of 30 years, indicating the dramatic and sudden landscape changes in the
surroundings of these important archaeological sites. Moreover, Panagos et al. [34] reported this area
to have high erodibility due to water.
3. Materials and Methods
In this section, the overall methodology, as well as the datasets used for the needs of the current
study, are presented. The various equations for estimating the proportion of vegetation and the layers
of information are also presented.
3.1. Methodology
For the needs of the study, various sources were used (Figure 2). The first source of information
regards the optical and radar Sentinel datasets, acquired over the area of interest. The second source of
information considers the Sentinel ready products from the Sentinel Hub service. The third component
refers to the crowdsourced vector geodata manually digitized by several users of the area of interest,
available at the OpenStreetMap service [35]. The accuracy assessment of the OpenStreetMap service is
still debated by researchers, while several examples of this service blended with remote sensing can be
found in the literature [36–38]. Finally, the compressed red-green-blue (RGB) high-resolution optical
data from the Google Earth platform are used for validation purposes.
Figure 2. A schematic representation of the four “layers” of information used in this study: the Earth
Observation Sentinel-1 and Sentinel-2 images (a), the Sentinel Hub, an Earth Observation big data
cloud platform (b), crowdsourced geodata from OpenStreetMap (c), and the Google Earth platform (d).
Through the Sentinel Hub service, radar and optical Sentinel images were retrieved (see more
details below). Beyond the calibrated bands from both sensors, additional products were generated.
The NDVI and the RVI were processed using Equations (1) and (2):
NDVI = (ρNIR − ρRED)/(ρNIR + ρRED), (1)

RVI = (VV/(VV + VH))^0.5 × (4 VH)/(VV + VH), (2)
where ρNIR and ρRED refer to the reflectance values (%) of the near-infrared and red bands of the
Sentinel-2 sensor (namely band 8 and band 4), while VV and VH refer to the polarization bands of
the Sentinel-1 sensor. It should be mentioned that the estimation of the RVI was based upon a
custom script available with the Sentinel Hub services at [39]. As mentioned in [39], an equivalent
to the degree of polarization (DOP) is calculated as VV/(total power), where the latter is estimated as
the sum of the VV and the VH polarization of the Sentinel sensor. The DOP is utilized to obtain the
depolarized fraction m = 1 − DOP, which can range between 0 and 1. For pure or elementary targets
this value is close to zero, whereas for fully random canopy (at high vegetative growth) it reaches
close to 1. This m factor is multiplied with the vegetation depolarization power fraction (4 × VH)/(VV + VH)
and is modulated with a square-root scaling for a better dynamic range of the RVI.

From the optical and radar vegetation indices, the proportion of vegetation can be retrieved. In our
study, two different models were applied for both the optical and radar datasets:
Pv1-radar = (RVI − RVI non-veg.)/(RVI veg − RVI non-veg.), (3)

Pv1-optical = (NDVI − NDVI non-veg.)/(NDVI veg − NDVI non-veg.), (4)

Pv2-radar = [(RVI − RVI min)/(RVI max − RVI min)]^0.5, (5)

Pv2-optical = [(NDVI − NDVI min)/(NDVI max − NDVI min)]^0.5, (6)

where vegetation index veg (NDVI veg and RVI veg) and non-vegetation index non-veg. (NDVI non-veg. and
RVI non-veg.) represent the vegetated and non-vegetated pixels of the considered index, respectively,
while vegetation index max (NDVI max and RVI max) and vegetation index min (NDVI min and RVI min)
represent the maximum and minimum histogram values of the vegetation index image.
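Equations (1)–(6) are simple per-pixel arithmetic. The sketch below is a minimal NumPy illustration rather than the author's actual processing chain; the function and variable names are ours, and linear-power (not dB) backscatter is assumed for the Sentinel-1 bands:

```python
import numpy as np

def ndvi(nir, red):
    """Equation (1): NDVI from Sentinel-2 BOA reflectance
    bands 8 (near-infrared) and 4 (red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def rvi(vv, vh):
    """Equation (2): Radar Vegetation Index from Sentinel-1 gamma 0
    VV and VH backscatter (assumed in linear power units)."""
    vv = np.asarray(vv, dtype=float)
    vh = np.asarray(vh, dtype=float)
    dop = vv / (vv + vh)  # equivalent to the degree of polarization (DOP)
    # square-root scaling of the DOP, multiplied by the vegetation
    # depolarization power fraction, as in the custom script of [39]:
    return np.sqrt(dop) * (4.0 * vh) / (vv + vh)

def pv_model1(index, veg_ref, nonveg_ref):
    """Equations (3)-(4): proportion of vegetation from vegetated and
    non-vegetated reference (training) index values."""
    return (index - nonveg_ref) / (veg_ref - nonveg_ref)

def pv_model2(index):
    """Equations (5)-(6): proportion of vegetation from the image
    histogram minimum and maximum, with square-root scaling."""
    index = np.asarray(index, dtype=float)
    scaled = (index - index.min()) / (index.max() - index.min())
    return np.sqrt(scaled)
```

The first model depends on the choice of training pixels, while the second depends only on the image statistics, which is exactly the distinction compared in the Results section.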
It is evident, therefore, that the selection of vegetated and non-vegetated regions is of great
importance for the correct estimation of the proportion of vegetation cover. For this reason,
the crowdsourced OpenStreetMap data were used to define training areas for the equations mentioned above,
once these vector polygons had been confirmed against the Google Earth high-resolution products.
Once these sites had been selected, the results from the optical and radar sensors using Equations
(3)–(6) were retrieved. A comparison between the optical and the radar sensors was then performed
separately for each model (namely the results from Equation (3) against the results from Equation (4),
and the results from Equation (5) against the results from Equation (6)).
3.2. Datasets
For the aims of the current study, radar Sentinel-1 and optical Sentinel-2 images were downloaded
through the EO Browser of the Sentinel Hub service [15]. The characteristics of these images are shown
in Tables 1 and 2.
Table 1. Sentinel-1 characteristics.

Parameter                     Value
Minimum Ground Swath Width    250 km
Incidence Angle Range         29.1°–46.0°
Number of Sub-swaths          3
Azimuth Steering Angle        ±0.6°
Azimuth Resolution            20 m
Ground Range Resolution       5 m
Polarisation Options          Single (HH or VV) or Dual (HH + HV or VV + VH)

Table 2. Sentinel-2 characteristics.

Parameter                     Value
Instrument principle          Pushbroom
Repeat cycle (days)           5 (at the Equator)
Swath width (km)              290
Spectral bands                13
Spatial resolution (metres)   10, 20, 60
In particular, a radar Sentinel-1 image in Interferometric Wide Swath (IW), acquired on 12 May
2020 was processed. In addition, a Sentinel-2 image with processing level L2A (orthorectified BOA) and
limited cloud-coverage acquired on 13 May 2020 was used. Figure 3 visualizes the Sentinel-1 and -2
images as well as other ready products available from the Sentinel Hub service. It should be mentioned
that all these images were downloaded in high-resolution format (which corresponds to 10-m spatial
resolution per pixel) and in a 32-bit tiff image format. Figure 3a,b display the orthorectified Sentinel-1
VH and VV gamma 0, respectively, while Figure 3c shows the Sentinel-1 SAR urban visualization
product. The SAR urban product is an RGB pseudo-composite, in which the combination VV-VH-VV is
used for the red, green, and blue channels. The Radar Vegetation Index (RVI) is displayed in Figure 3d.
Figure 3e–i refer to products and bands of the optical Sentinel sensor as follows: Figure 3e,f show the
Sentinel-2 BOA Band 5 (red-edge) and Band 8 (near-infrared), respectively. The Normalized Difference Vegetation
Index (NDVI) can be observed in Figure 3g. Figure 3h shows the Sentinel-2 Scene Classification
product, which is developed under the Sen2Cor project to support an atmospheric correction module.
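As a rough illustration of how such a VV-VH-VV pseudo-composite is assembled, the stack below is a sketch under our own assumptions (the stretch limits and function name are illustrative and not taken from the Sentinel Hub visualization script):

```python
import numpy as np

def sar_urban_composite(vv, vh, low=0.0, high=0.5):
    """Assemble a VV-VH-VV RGB pseudo-composite from Sentinel-1
    gamma 0 backscatter arrays. Each channel is linearly stretched
    to [0, 1] between the (illustrative) low/high backscatter limits."""
    def stretch(band):
        band = np.asarray(band, dtype=float)
        return np.clip((band - low) / (high - low), 0.0, 1.0)
    r = stretch(vv)  # red and blue channels: VV
    g = stretch(vh)  # green channel: VH
    return np.dstack([r, g, r])  # shape: (rows, cols, 3)
```

Because built-up surfaces backscatter strongly in VV, such a composite renders urban areas in bright magenta-like tones, which is what makes the product useful for visually separating urban from vegetated land.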
In addition, monthly Sentinel-1 and Sentinel-2 ready products (RVI and NDVI) were used for over a year
(May 2019–May 2020) so as to investigate temporal changes of vegetation growth with these indices.

Furthermore, OpenStreetMap data were retrieved through the QGIS open-access software [40].
In our case study area, various vector data are available for download, as shown in Figure 3i. From
these vector polygons, areas characterized as vegetated and non-vegetated were identified and
compared with available images from the Google Earth platform.
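Once the OpenStreetMap polygons have been rasterized onto the 10 m Sentinel grid (for instance with QGIS, or programmatically with a routine such as rasterio's `features.rasterize`), deriving the vegetated and non-vegetated reference values needed by Equations (3) and (4) reduces to masked averaging. A minimal sketch, assuming the boolean masks are already available (the function name is ours):

```python
import numpy as np

def training_reference(index, veg_mask, nonveg_mask):
    """Mean index value over OpenStreetMap-derived vegetated and
    non-vegetated pixels; these serve as the 'veg' and 'non-veg'
    reference values of Equations (3) and (4)."""
    index = np.asarray(index, dtype=float)
    veg_ref = float(index[np.asarray(veg_mask, dtype=bool)].mean())
    nonveg_ref = float(index[np.asarray(nonveg_mask, dtype=bool)].mean())
    return veg_ref, nonveg_ref
```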
Sentinel-1and
Figure 3. Sentinel-1 andSentinel-2
Sentinel-2datasets
datasetsand
andready
readyproducts
products downloaded
downloaded from
from thethe
EOEO Browser
Browser of
of
thethe Sentinel
Sentinel HubHub Service
Service used used
in thisinstudy.
this study. (a) orthorectified
(a) orthorectified Sentinel-1,Sentinel-1, VH polarization,
VH polarization, gamma 0,
gamma 0, (b) orthorectified
(b) orthorectified Sentinel-1, Sentinel-1, VV polarisation,
VV polarisation, gamma 0, (c)gamma 0, (c)
Sentinel-1 Sentinel-1
Synthetic Synthetic
Aperture Aperture
Radar (SAR)
Radar
urban (SAR) urban visualization
visualization product, (d) product,
Sentinel-1(d) Sentinel-1
Radar Radar Index,
Vegetation Vegetation Index, (e) Bottom-of-
(e) Sentinel-2, Sentinel-2,
Bottom-of-Atmosphere
Atmosphere (BOA) Band (BOA)
5, (f) Band 5, (f)Bottom-of-Atmosphere
Sentinel-2, Sentinel-2, Bottom-of-Atmosphere
(BOA) Band 8,(BOA) Band 8,
(g) Sentinel-2,
(g) Sentinel-2,
Normalized Normalized
Difference Difference
Vegetation IndexVegetation Index
(NDVI), (h) (NDVI),Scene
Sentinel-2, (h) Sentinel-2, Scene
Classification Classification
product, and (i)
product, and (i) the OpenStreetMap vector data of the area. The scale for all sub-figures is
the OpenStreetMap vector data of the area. The scale for all sub-figures is shown in the right bottom shown in the
right bottom part
part of the Figure 3.of the Figure 3.
4. Results

In this section, the overall results are presented. In Section 4.1, the results from the implementation
of the optical and the radar vegetation indices are presented, while the proportion of vegetation is
estimated and visualized in Section 4.2. The evaluation of these results is finally discussed in Section 4.3.

4.1. Vegetation Indices

As mentioned earlier, the calibrated optical and radar bands were retrieved from the Sentinel
Hub platform. Based on Equations (1) and (2), the NDVI and the RVI were estimated (also see
Figure 3d–g). The range of the NDVI is between −1 and +1, while the RVI ranges between 0 and +1.
With the increase of vegetation cover, the value
of both indices increased to +1. At the same time, non-vegetated areas are close to 0 (and negative for
the NDVI in water bodies). Since the acquisition period for the Sentinel-1 and the Sentinel-2 images
only had a one-day difference, it was hypothesized that both indices were obtained at the same
phenological status of vegetation. The results from the application of the vegetation indices over the
area of interest are depicted in Figure 4. Figure 4a,d,g show the Sentinel-2 true colour composite over
the area of interest, the archaeological site of “Nea Paphos”, and the historic centre of the Paphos
town, respectively. Likewise, Figure 4b,e,h display the results of the RVI, while Figure 4c,f,i show
the results from the NDVI.

Vegetated areas as estimated from the RVI (Figure 4b,h) are shown with a green colour, while
non-vegetated areas are visualized with a black colour. Vegetated areas in the NDVI (Figure 4c,f,i)
are highlighted with a dark-green colour and non-vegetated areas with a light-yellow colour. From
Figure 4, it is evident that optical products are more easily visually interpreted compared to the radar
products; however, the latter could be explored when optical images are not available, such as in the
case of high cloud-coverage over the area of interest.
Figure 4. (a) Sentinel-2 true colour over the area of interest, (b) Radar Vegetation Index (RVI),
(c) Normalized Difference Vegetation Index (NDVI), (d) closer look of (a) over the archaeological site
of “Nea Paphos”, (e) closer look of (b) over the archaeological site of “Nea Paphos”, (f) closer look
of (c) over the archaeological site of “Nea Paphos”, (g) closer look of (a) over the historical Centre of
Paphos town, (h) closer look of (b) over the historical Centre of Paphos town, and (i) closer look of
(c) over the historical Centre of Paphos town. Vegetated areas are shown with green colour while
non-vegetated areas with black colour for the RVI, while for the NDVI vegetation is highlighted with
dark green colour.
To investigate if any correlation between the NDVI and the RVI values can be established,
regression models were applied. Figure 5 shows a scatterplot of 1000 points distributed randomly
over the case study area, at different target types, indicating the NDVI and RVI values at the X and Y
axis, respectively. As shown in this scatterplot, a high variance is observed, indicating that no
strong correlation between these two indices can be achieved. This is also aligned with the previous
findings, indicating that optical and radar indices do not produce similar findings.
Figure 5. Scatterplot of the NDVI (X axis) against the RVI (Y axis) values of the randomly distributed
points over the case study area.
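The correlation check behind the scatterplot can be reproduced numerically by fitting a least-squares line of RVI on NDVI and inspecting the coefficient of determination. A sketch (the input values here stand in for the 1000 sampled pixel pairs; names are ours):

```python
import numpy as np

def linear_r2(x, y):
    """Least-squares fit y = a*x + b and its coefficient of
    determination R^2; a low R^2 indicates weak linear agreement
    between the two indices."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a, b = np.polyfit(x, y, 1)          # slope and intercept
    residuals = y - (a * x + b)
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return a, b, 1.0 - ss_res / ss_tot
```

Applied to NDVI/RVI pairs sampled over mixed target types, a high residual variance (low R^2) corresponds to the weak correlation reported here.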
Figure 7. Ground truth data as polygons extracted from the OpenStreetMap service used as “vegetated”
and “non-vegetated” areas (top). (a–c) show a high-resolution optical preview of the vegetated areas
at Google Earth, while (d–f) the non-vegetated areas.
Figure 8a,b show the proportion of vegetation cover after the RVI (Equations (3) and (5),
respectively), while Figure 8c,d show the results generated from the NDVI using Equations (4) and (6),
respectively. It is evident from Figure 8 that the proportion of vegetation per index using image
statistics or OpenStreetMap data is quite similar. However, the results vary when we use optical and
radar datasets. The general deviation of the results was expected, as we also saw this in the previous
findings (Figures 4–6). The RVI generates a noisier outcome (Figure 8a,b) compared to the results
obtained from the NDVI (Figure 8c,d). Further elaboration of these results is discussed in the
following section.
Figure 8. The proportion of vegetation cover based on the RVI using Equation (3) (a) and Equation (5)
(b), and the NDVI based on Equation (4) (c) and Equation (6) (d).
4.3. Evaluation of the Results
To compare the results, we firstly estimated the difference in the proportion of vegetation cover
produced from the two different models. Figure 9a shows the difference between the OpenStreetMap-based
(Equation (3)) and the image-statistics-based (Equation (5)) estimates for the RVI, while Figure 9b
shows the difference between the proportions of vegetation cover using the NDVI (Equations (4) and (6)).
Higher differences between the two models are shown with a red colour, while lower differences are
estimated with a blue colour. However, examining the findings of Figure 9a,b, it is still difficult to
compare the two outcomes. Differences in the pattern are apparent, indicating that the selection of
the type of sensor (i.e., Sentinel-1, radar, or Sentinel-2, optical) influences the outcome.
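The model-to-model and sensor-to-sensor comparisons above are per-pixel differences between two proportion-of-vegetation maps. A minimal sketch of the difference and agreement maps (the 0.10 threshold mirrors the 10% level discussed in the evaluation; the function name is ours):

```python
import numpy as np

def pv_difference(pv_a, pv_b, threshold=0.10):
    """Absolute per-pixel difference between two proportion-of-
    vegetation maps, plus a boolean map of pixels where the two
    estimates agree to within the given threshold."""
    pv_a = np.asarray(pv_a, dtype=float)
    pv_b = np.asarray(pv_b, dtype=float)
    diff = np.abs(pv_a - pv_b)
    return diff, diff < threshold
```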
Figure 9. (a) Difference of the proportion of vegetation cover as estimated from the Radar Vegetation
Index (RVI) based on Equations (3) and (5) and (b) difference of the proportion of vegetation cover
as estimated from the Normalized Difference Vegetation Index (NDVI) based on Equations (4) and (6).
Higher difference values are indicated with red colour, while lower values with blue colour.
We further estimated the differences between the findings of Figure 9a,b. The results of this analysis are shown in Figure 10b. As shown in this Figure, differences in the estimation of the proportion of vegetation can vary significantly, exceeding 40%, between the optical (NDVI) and radar (RVI) indices. However, a closer look at these findings shows that the differences between these two types of satellite sources are minimal (<10%) over vegetated areas (Figure 10c). This finding suggests that the estimation of the proportion of vegetation obtained by the Sentinel-1 and -2 sensors does not differ significantly over vegetated areas. This is also supported by the findings of Figure 6, where a similar pattern can be extracted for optical and radar products over vegetated areas (but not for other types of targets). Indeed, the factor in Equation (2) can be adjusted for other types of targets. In addition, the scene classification product of the Sentinel-2 image of 13 May 2020 is shown in Figure 10d. Vegetated areas are shown in this Figure with a green colour, which recaps the findings of Figure 10c, however with a lower resolution.
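The per-pixel comparison described above can be sketched as follows. This is a minimal illustration, assuming both proportion-of-vegetation maps are already co-registered arrays scaled to [0, 1]; the function name and toy values are illustrative rather than from the paper, while the 10% threshold mirrors the agreement reported over vegetated areas.

```python
import numpy as np

def proportion_difference(pv_ndvi, pv_rvi, agreement_threshold=0.10):
    """Absolute per-pixel difference between the optical (NDVI) and radar
    (RVI) proportion-of-vegetation maps, plus a boolean mask of pixels
    where the two estimates agree to within the threshold (here <10%)."""
    diff = np.abs(pv_ndvi - pv_rvi)
    agree = diff < agreement_threshold
    return diff, agree

# Toy 2x2 grids: the two sensors agree everywhere except the top-right pixel.
pv_ndvi = np.array([[0.80, 0.10], [0.55, 0.95]])
pv_rvi = np.array([[0.75, 0.60], [0.50, 0.90]])
diff, agree = proportion_difference(pv_ndvi, pv_rvi)
```

Thresholding the difference map this way yields the kind of agreement mask shown in Figure 10c, where low-difference pixels cluster over vegetated areas.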
Figure 10. (a) High-resolution satellite image over the case study area, (b) difference between the results of Figure 9a,b, (c) areas with minimum differences between the optical and the radar sensor indicated with blue colour over agricultural areas and (d) classification scene of Sentinel-2 images, where vegetated areas are shown with green colour.
The results of Figure 10c were compared with a high-resolution optical satellite image available in the ArcGIS environment, at the northern part of the archaeological site of “Nea Paphos” (Figure 11a). Figure 11b shows a closer look at the findings of Figure 10c in this area, indicating that the difference is relatively low (blue colour in Figure 11b) over the vegetated areas. The results of the proportion of vegetation based on the optical Sentinel-2 sensor and the NDVI are shown in Figure 11c.
Figure 11. (a) High-resolution satellite image over the case study area, (b) difference between the
results of Figure 9a,b and (c) Normalized Difference Vegetation Index (NDVI)—proportion of
vegetation.
While a direct comparison between the vegetation proportion from the RVI and the NDVI is not feasible, in an attempt to integrate the findings, we multiplied the RVI and the NDVI results (see Figure 8a–c), where high values are expected in pixels with a high RVI and NDVI proportion of vegetation estimation, indicating vegetation presence. The results are shown in Figure 12. Figure 12a shows the multiplied RVI × NDVI results, where higher values, indicated with a green colour, highlight vegetated areas. Figure 12b–d focuses on the red polygon of Figure 12a at the area of the historical Centre of Paphos (Figure 12b). Figure 12c is the proportion of vegetation as estimated from the NDVI using Equation (4), while Figure 12d is the multiplied RVI × NDVI. As shown, the latter image can better enhance the small differences in vegetation, also capturing some patterns from the buildings of the area.
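The multiplication step can be sketched as below, assuming both index-derived proportions have already been scaled to [0, 1]; the function and sample values are illustrative only. Because the two factors are multiplied, the product stays high only where both the RVI and the NDVI indicate vegetation, which is why it suppresses pixels where the two sensors disagree.

```python
import numpy as np

def rvi_ndvi_product(pv_rvi, pv_ndvi):
    """Multiply the RVI- and NDVI-based proportion-of-vegetation maps
    (both in [0, 1]). High output values require BOTH inputs to be high,
    so pixels where only one sensor reports vegetation are suppressed."""
    return np.clip(pv_rvi, 0.0, 1.0) * np.clip(pv_ndvi, 0.0, 1.0)

# Three pixels: vegetated in both, vegetated in radar only, bare in both.
prod = rvi_ndvi_product(np.array([0.9, 0.8, 0.1]), np.array([0.9, 0.2, 0.1]))
```

Only the first pixel, where both proportions are high, keeps a high product value.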
Figure 12. (a) Multiplied RVI × NDVI results, (b) Red-Green-Blue (RGB) orthophoto of the red polygon of (a) showing the historical Centre of Paphos, (c) the NDVI-related proportion of vegetation and (d) the RVI × NDVI result.
5. Discussion
Overall, it was found that the use of optical and radar vegetation indices, namely the NDVI and the RVI, did not provide comparable results (see Figure 8). Optical indices were more easily interpreted, while the use of the RVI produced noisy data. However, some findings, such as those of Figure 6, suggest that radar products can be used as an alternative to the optical data for detecting patterns (e.g., vegetation growth) in specific areas of interest.
To investigate the potential use of both proportions of vegetation indices derived from Sentinel-1 and -2 sensors, we multiplied the RVI × NDVI, as shown in Figure 12. This new product, which integrates the RVI and NDVI outcomes, can be used with the VV and VH polarizations of Sentinel-1 to create a new pseudo-colour composite. The unique characteristic of the radar sensors to depict urban areas (see also Figure 3c) can be used further to enhance the proportion of vegetation and urban areas. Such a new composite is shown in Figure 13. Using Sentinel-2 spectral bands 4 and 8, we can estimate the NDVI vegetation proportion, while using the Sentinel-1 VV and VH polarizations, we can estimate the RVI vegetation proportion. The combination of the two new products generates the new RVI × NDVI vegetation proportion index, which can then be integrated again with the VV and the VH polarization to highlight vegetated areas (red colour in the pseudo-colour composite of Figure 13) while buildings are also visible (with green colour in the same column).
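A pseudo-colour composite along these lines can be sketched as follows. The channel assignment is an assumption for illustration, since the exact recipe is only shown graphically in Figure 13: the RVI × NDVI product drives the red channel (so vegetated pixels trend red) while the VV and VH backscatter drive green and blue (so strongly scattering built-up pixels trend green), with each band min-max normalized.

```python
import numpy as np

def pseudo_colour_composite(rvi_ndvi, vv, vh):
    """Stack the RVI x NDVI vegetation product with Sentinel-1 VV and VH
    backscatter into an H x W x 3 RGB cube. The channel order (red =
    vegetation product, green = VV, blue = VH) is an illustrative choice,
    not the paper's exact recipe."""
    def normalize(band):
        band = band.astype(float)
        span = band.max() - band.min()
        return (band - band.min()) / span if span > 0 else np.zeros_like(band)
    return np.dstack([normalize(rvi_ndvi), normalize(vv), normalize(vh)])

# Toy 2x2 scene: strong vegetation product top-left, strong backscatter top-right.
rgb = pseudo_colour_composite(
    np.array([[0.9, 0.1], [0.5, 0.0]]),   # RVI x NDVI product
    np.array([[1.0, 5.0], [2.0, 3.0]]),   # VV backscatter (arbitrary units)
    np.array([[0.5, 2.0], [1.0, 1.5]]),   # VH backscatter (arbitrary units)
)
```

In a real workflow the three inputs would be co-registered rasters rather than toy arrays, and radar backscatter is usually converted to dB before stretching.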
6. Conclusions
Estimating vegetation proportion through Earth Observation sensors is important, especially when dealing with the monitoring of hazards in the vicinity of archaeological sites. Vegetation cover is usually extracted from optical vegetation indices such as the NDVI. However, with the increasing availability of radar sensors like the Sentinel-1, radar-related vegetation indices have also been introduced in the literature.
The Sentinel Hub big data cloud platform can provide end-users with readily calibrated reflectance optical bands and VV/VH radar polarizations, as well as other ready products. Here, we explored the use of
Sentinel-1 and -2 datasets for estimating vegetation proportion, as well as crowdsourced geo-datasets
from the OpenStreetMap service, on an area in the western part of Cyprus.
The overall findings indicate that Sentinel-1 and -2 indices can provide a similar pattern only over
vegetated areas, which can be further elaborated to estimate temporal changes using integrated optical
and radar Sentinel data. The use of the OpenStreetMap data was found very helpful as it allowed us
to locate vegetated and non-vegetated areas with high accuracy, which would be difficult to achieve
with the medium-resolution Sentinel data. Additionally, satellite images need to be radiometrically
calibrated and corrected.
RVI and NDVI should be elaborated with caution since no direct correlation can be established.
However, processed products such as differences of the vegetation proportion estimation can uncover
similar patterns between the optical and the radar data. This allows us to fill gaps if no optical data are
available. The combination of RVI × NDVI can enhance the presence of vegetated areas and can be
integrated back to the VV and VH polarization (see Figure 13). The results from these findings can be
further utilized so as to support the extraction of vegetation proportion using integrated radar and
optical datasets, especially in areas with high cloud-coverage, supporting hazard analysis and risk
management studies. In the future, the author will explore ways to better adapt these two types of sensors for risk analysis at archaeological sites.
Funding: This article is submitted under the NAVIGATOR project. The project is being co-funded by the Republic
of Cyprus and the Structural Funds of the European Union in Cyprus under the Research and Innovation
Foundation grant agreement EXCELLENCE/0918/0052 (Copernicus Earth Observation Big Data for Cultural
Heritage).
Acknowledgments: The author would like to acknowledge the “CUT Open Access Author Fund” for covering
the open access publication fees of the paper. The author would like to acknowledge the use of the CORINE
Land Cover 2018 (“© European Union, Copernicus Land Monitoring Service 2018, European Environment Agency (EEA)”), the Sentinel Hub (“Modified Copernicus Sentinel data 2020/Sentinel Hub”), the OpenStreetMap service (© OpenStreetMap contributors) and the Google Earth platform. Thanks are also given to the Eratosthenes
Research Centre of the Cyprus University of Technology for its support. The Centre is currently being upgraded
through the H2020 Teaming Excelsior project (www.excelsior2020.eu).
Conflicts of Interest: The author declares no conflict of interest.
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).