
remote sensing

Article
UAV-Based Terrain Modeling in Low-Vegetation Areas:
A Framework Based on Multiscale Elevation
Variation Coefficients
Jiaxin Fan 1, Wen Dai 2,3, Bo Wang 1,*, Jingliang Li 4, Jiahui Yao 1 and Kai Chen 2

1 School of Remote Sensing & Geomatics Engineering, Nanjing University of Information Science & Technology,
Nanjing 211800, China; [email protected] (J.F.); [email protected] (J.Y.)
2 School of Geographical Sciences, Nanjing University of Information Science & Technology,
Nanjing 211800, China; [email protected] or [email protected] (W.D.); [email protected] (K.C.)
3 Institute of Earth Surface Dynamics (IDYST), University of Lausanne, 1015 Lausanne, Switzerland
4 Changwang School of Honors, Nanjing University of Information Science & Technology,
Nanjing 211800, China; [email protected]
* Correspondence: [email protected]

Abstract: The removal of low vegetation is still challenging in UAV photogrammetry. According to the different topographic features expressed by point-cloud data at different scales, a vegetation-filtering method based on multiscale elevation-variation coefficients is proposed for terrain modeling. First, virtual grids are constructed at different scales, and the average elevation values of the corresponding point clouds are obtained. Second, the amount of elevation change at any two scales in each virtual grid is calculated to obtain the difference in surface characteristics (degree of elevation change) at the corresponding two scales. Third, the elevation variation coefficient of the virtual grid that corresponds to the largest elevation variation degree is calculated, and threshold segmentation is performed based on the relation that the elevation variation coefficients of vegetated regions are much larger than those of terrain regions. Finally, the optimal calculation neighborhood radius of the elevation variation coefficients is analyzed, and the optimal segmentation threshold is discussed. The experimental results show that the multiscale elevation-variation coefficient method can accurately remove vegetation points and preserve ground points in low- and densely vegetated areas. The type I error, type II error, and total error in the study areas range from 1.93% to 9.20%, 5.83% to 5.84%, and 2.28% to 7.68%, respectively. The total error of the proposed method is 2.43–2.54% lower than that of the CSF, TIN, and PMF algorithms in the study areas. This study provides a foundation for the rapid establishment of high-precision DEMs based on UAV photogrammetry.

Keywords: terrain modeling; coefficients of elevation variation; virtual grid; elevation differences; DEM

Citation: Fan, J.; Dai, W.; Wang, B.; Li, J.; Yao, J.; Chen, K. UAV-Based Terrain Modeling in Low-Vegetation Areas: A Framework Based on Multiscale Elevation Variation Coefficients. Remote Sens. 2023, 15, 3569. https://doi.org/10.3390/rs15143569

Academic Editors: Syed Agha Hassnain Mohsan, Pascal Lorenz, Khaled Rabie, Muhammad Asghar Khan and Muhammad Shafiq

Received: 29 May 2023; Revised: 10 July 2023; Accepted: 14 July 2023; Published: 16 July 2023

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

Unmanned aerial vehicles (UAVs) play an important role in digital terrain modeling [1–4]. They support fast and accurate terrain modeling in small areas, thus reducing the typical cost and workload. However, UAV photogrammetry can only provide digital surface models (DSMs). A digital elevation model (DEM) constructed with terrain modeling requires the removal of surface features, such as vegetation [5]. The quality and accuracy of a DEM are determined by the accuracy of surface-feature removal, that is, by the precision of ground-point selection. In nature, most surface features are vegetation; thus, the accuracy of terrain modeling is influenced by the accuracy of vegetation removal. Especially in areas with complex topography, the continuity and inconsistency of topographic undulations and the inconsistency of the density and height of vegetation, particularly the presence of low vegetation, increase the complexity of terrain modeling. Rapid and accurate removal of vegetation areas is a key issue in DEM construction [6].

Remote Sens. 2023, 15, 3569. https://doi.org/10.3390/rs15143569    https://www.mdpi.com/journal/remotesensing



Many point-cloud filtering algorithms have been used for identifying ground points
and nonground points (such as vegetation). These point-cloud filtering algorithms can be
mainly classified as the morphological method [7,8], triangulated irregular network-based
(TIN-based) algorithms [9], machine-learning-based algorithms [10], and mathematical
morphology-based algorithms [11,12].
The morphological method filters off-ground points according to the morphological
characteristics of terrain surface, such as terrain slope [7], curvature [7], and simulation
of a virtual surface [8]. For example, cloth simulation filtering (CSF) filters off-ground
points by simulating a cloth overlaid on terrains [13]. CSF works effectively in flat areas,
but the results are not satisfactory in areas with complex terrain, such as both flat and
steep slopes [14]. The morphological method is generally simple and efficient but does not
perform well in areas with complex terrain, especially in areas with rugged terrain and low
vegetation cover [15]. The TIN-based method gradually approximates the ground surface
by iteratively selecting ground points from seed points and densifies a sparse TIN. Whether
a ground point is selected is determined by its angle and distance from the seed point.
This method yields a better filtering effect in urban areas than other methods do and can
adapt to areas with topographic discontinuity, but the method is ineffective in mountainous
areas with more vegetation cover [16]. The machine-learning-based approach simplifies
the filtering process to a point-cloud binary classification problem. First, a point-cloud
classification model is established, and the model is used to complete the labeling of sample
ground points and nonground points. Although these methods can achieve good accuracy,
the pretraining samples are labor-intensive to label and train, and the generalization
performance is inadequate [13,17]. Mathematical morphology-based algorithms apply
open and closed operations to images [12]. The filtering process is completed based on
changes in image characteristics. The most important aspect of this method is the choice of
the filtering window scale. Large objects cannot be effectively removed when the window
scale is small, and terrain detail is easily missed when it is large.
Different point-cloud filtering algorithms have distinct advantages and disadvantages
for different terrain features and areas [18]. However, these methods are mainly based
on the morphological characteristics of terrain surfaces at a unique scale [19,20]. The
characteristics of the ground and nonground points at different spatial scales are distinctly
different [21–23]. Only considering the morphological characteristics of terrain surfaces at
a unique scale makes it difficult to achieve high accuracy. In recent years, researchers have
proposed various improved filtering methods to address this inadequacy [24]. A parameter-
free progressive TIN densification algorithm was developed to make the selection of
thresholds in progressive TIN densification more flexible and adaptive to fit complex terrain
features [8]. Additionally, an improved simple morphological filter was established using a
linearly increasing window and simple slope thresholding to address morphological filter
inadequacies at a single scale [25,26]. However, these improved multiscale filtering methods
generally extract morphological characteristics at different scales and then combine them
to identify ground points [27]. The differences in surface features at different scales are not
directly considered.
If point-cloud data are integrated at different scales through a virtual grid (VG), the
elevation values produced will be associated with a range of variations. This degree of
variation reflects the surface characteristics (e.g., vegetation and topography) that are reflected
in the point cloud at different scales. If the degree of elevation variation in point-cloud data at
different scales can be quantified, a new approach to terrain modeling could be developed.
The elevation variation coefficient (EVC) is an important topographic factor in digital terrain
analysis [28,29]. Notably, it can be used to quantify the degree of elevation variation within
neighborhood units. Therefore, this paper aims to develop a terrain modeling framework
based on multiscale elevation-variation coefficients in low-vegetation areas.
modeling framework based on multiscale elevation-variation coefficients in low-vegeta-
tion areas.
2. Materials and Methods

2.1. Overview

The terrain features in point-cloud data at different scales vary significantly. For example, point-cloud data with a high spatial resolution (average sampling interval) can not only accurately express the terrain, but can also encompass other features, such as low vegetation, while low-spatial-resolution point-cloud data can only represent large-scale topographic relief. If the original high-precision point-cloud data are used to generate virtual grids (VGs) at different scales, a series of changes (variations) will occur in the elevation values of the VGs, especially in vegetated areas. However, ground points and vegetation points can potentially be differentiated by quantifying the degree of elevation variation at different scales. The methodological flowchart of the approach proposed in this study is shown in Figure 1. First, a multiscale VG was established. The average elevation value of all points in the VG was calculated to assign an elevation attribute value to the VG. Second, the elevation changes of VGs were obtained based on difference operations. Then, the window shape and focal information were used to calculate the standard deviation and mean, the ratio of which was the elevation variation coefficient (EVC) of each grid. Finally, a threshold was selected to discriminate between ground points and vegetation points.

Figure 1. Workflow of the multiscale elevation-variation coefficient algorithm.

2.2. Multiscale Virtual Grid Generation

To improve the accuracy of ground-point selection, multiscale VGs are introduced in this paper. Regular VGs are composed of multiple cubes of equal length and width (Figure 2). First, three-dimensional regular VGs were generated, and the point clouds were included in the corresponding cube according to their coordinates (Figure 2a). Then, the point clouds were segmented with a grid approach, and each grid contained several elevation points. The different scales of the VGs were represented by different colors (Figure 2b). The elevation of each VG was determined according to the average of the points' elevation. The large-scale VGs played a role in smoothing the terrain and obtaining elevation values. The elevation values obtained by the small-scale VGs were similar to the actual elevation of the ground surface.

Figure 2. Schematic diagram of the virtual grid: (a) three-dimensional representation of regular VGs; (b) horizontal projection of a multiscale virtual grid.
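As a concrete illustration of this subsection, the aggregation of a cloud into one VG layer can be sketched in a few lines. This is a minimal sketch, not the authors' implementation: it assumes the cloud is an N×3 numpy array and uses only each cube's horizontal footprint, since the elevation attribute of a VG is the mean z of the points it contains; the function name is illustrative.

```python
import numpy as np

def grid_mean_elevation(points, cell_size):
    """Bin points into square virtual-grid cells of the given size and
    return the mean elevation (z) of each occupied cell.

    points: (N, 3) array of x, y, z coordinates.
    Returns a dict mapping (col, row) cell indices to mean z.
    """
    # Cell index of each point along x and y.
    ij = np.floor(points[:, :2] / cell_size).astype(np.int64)
    sums, counts = {}, {}
    for (i, j), z in zip(map(tuple, ij), points[:, 2]):
        sums[(i, j)] = sums.get((i, j), 0.0) + z
        counts[(i, j)] = counts.get((i, j), 0) + 1
    # Mean elevation per occupied cell.
    return {k: sums[k] / counts[k] for k in sums}
```

Running this once per cell size (e.g., the 0.1–3.2 m range considered later) yields the multiscale VG layers.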

2.3. Elevation Differences

The spatial scale of a VG directly impacts the corresponding data volume and representation accuracy, as well as the calculation of topographic factors such as the elevation variation coefficient, slope, and slope direction. Different spatial scales of VGs encompass different topographic and geomorphological features. As the spatial scale of a VG increases, the description of surface details is gradually smoothed, and the features are gradually integrated. In contrast, as the spatial scale of the VG becomes finer, the description of surface details gradually improves: the smaller the spatial scale is, the more accurately and realistically the detailed features of the surface within the region are reflected. Therefore, calculating the elevation differences of VGs at different spatial scales can reflect discrepancies in the surface features contained in VGs at different scales (e.g., vegetation). As shown in Figure 3, the black points represent the ground points, and the green points represent the vegetation points. A high-precision topographic model generated from a small-scale VG provides high precision for describing surface details. A low-precision topographic model generated with a large-scale VG provides a rough representation of surface details, and surface features tend to be flat. The elevation difference among VGs at different spatial scales can be calculated, and significant fluctuations often appear at the edges of vegetation points. Because of the different sizes of VGs, an output standard is required when calculating the differences between two VGs. We chose the small VG as the output standard. To obtain the optimal parameters, VG scales ranging from 0.1 to 3.2 m were considered.

Figure 3. Schematic diagram of topographic models.
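The difference operation with the small VG as the output standard can be sketched as follows. This is a hedged numpy sketch, not the paper's code: it assumes the fine-scale VG has already been rasterized into a 2D array of cell means and that the coarse cell size is an integer multiple of the fine one; the function name and the edge-padding choice are illustrative.

```python
import numpy as np

def elevation_difference(dsm_small, small_size, large_size):
    """Fine-scale VG surface minus coarse-scale VG surface, reported on
    the fine grid (the 'small VG as output standard' convention)."""
    f = int(round(large_size / small_size))  # aggregation factor
    rows, cols = dsm_small.shape
    # Pad so the grid divides evenly into f x f coarse blocks.
    pr, pc = (-rows) % f, (-cols) % f
    padded = np.pad(dsm_small, ((0, pr), (0, pc)), mode="edge")
    # Coarse surface: block mean over f x f fine cells.
    coarse = padded.reshape(padded.shape[0] // f, f,
                            padded.shape[1] // f, f).mean(axis=(1, 3))
    # Resample the coarse values back onto the fine grid and subtract.
    upsampled = np.repeat(np.repeat(coarse, f, axis=0), f, axis=1)[:rows, :cols]
    return dsm_small - upsampled
```

Block-averaging the fine grid stands in for the coarse VG, and nearest-neighbor upsampling returns the coarse surface to the fine grid, so the difference is expressed at the small scale, where vegetation edges show up as large positive residuals.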
2.4. Multiscale Coefficients of Elevation Variation

The elevation variation coefficient (EVC) is an important terrain factor in digital terrain analysis. As a variable that reflects the dispersion degree of the average elevation, it is the ratio of the standard deviation and the average of the elevations at different points, which can reflect differences in terrain characteristics. The corresponding calculation is shown in Formula (1). The EVC can directly represent the variations in elevation in different-scale VGs and minimize the effect of noise.

EVC = Hstd / Hmean    (1)

where EVC is the elevation variation coefficient of a VG in the analysis area, which objectively reflects differences in elevation in the analysis area, Hstd is the standard deviation of elevation in the VG statistical window, and Hmean is the mean elevation in the VG statistical window.

To calculate the standard deviation and mean of elevation in each VG statistical window, elevation differences are determined in neighborhoods, and the result for a neighborhood grid is used as the new value of the central grid. The multiscale VG statistical windows are shown in Figure 4. To obtain the optimal parameters, neighborhood radii of 1–6 grids were considered.

Figure 4. Schematic diagram of the neighborhood radius: (a) VG statistical window of 1-grid neighborhood radius; (b) VG statistical window of 2-grid neighborhood radius.
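Formula (1) evaluated in a square focal window can be sketched as below. This is a deliberately brute-force illustration, not the authors' implementation; `radius` plays the role of the 1–6-grid neighborhood radius considered above, and edge padding is an assumed boundary treatment.

```python
import numpy as np

def focal_evc(dsm, radius):
    """EVC = H_std / H_mean (Formula (1)) in a square moving window of
    the given grid-cell radius, computed for every cell of a 2D grid."""
    r = radius
    padded = np.pad(dsm.astype(float), r, mode="edge")
    rows, cols = dsm.shape
    evc = np.empty((rows, cols), dtype=float)
    for i in range(rows):
        for j in range(cols):
            win = padded[i:i + 2 * r + 1, j:j + 2 * r + 1]
            m = win.mean()
            # Guard against a zero mean elevation in the window.
            evc[i, j] = win.std() / m if m != 0 else 0.0
    return evc
```

In the proposed framework this ratio is taken over the grid holding the largest elevation-variation degree, after which vegetated cells, whose EVC is much larger than that of terrain cells, are separated by thresholding.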
2.5. Accuracy Assessment

We manually classified the ground and vegetation points as reference data. The results of the proposed method were then compared with the reference data. To evaluate the accuracy of terrain modeling, we used the method recommended by the International Society for Photogrammetry and Remote Sensing (ISPRS) for quantitative analysis [30]. The method proposed by the ISPRS in 2003 is shown in Table 1. The precision of the vegetation-removal results was quantified and evaluated based on type I error, type II error, and total error.

Table 1. Table method.

Reference            Result: Ground Points    Result: Vegetation Points    Sum
Ground points        a                        b                            e = a + b
Vegetation points    c                        d                            f = c + d
Sum                  g = a + c                h = b + d                    n = a + b + c + d

In Table 1, a represents the number of ground points correctly classified as ground


points. b represents the number of ground points incorrectly classified as vegetation points
(affecting type I error). c represents the number of vegetation points incorrectly classified
as ground points (affecting type II error). d represents the number of vegetation points
correctly classified as vegetation points. Additionally, e represents the number of ground
points in the reference dataset used for visual interpretation classification. f represents
the number of vegetation points in the reference dataset used for visual interpretation
classification. g represents the number of ground points used in terrain modeling. h
represents the number of feature points used in terrain modeling. n represents the total
number of point clouds.
Three indices were calculated for accuracy assessment. Type I error represents the
proportion of ground points that were incorrectly classified as vegetation points, also
known as truth-rejection errors. Type II error represents the proportion of vegetation points
that were incorrectly classified as ground points, also known as false-tolerance errors. Total
error represents the overall error proportion, which reflects the inconsistency between
terrain modeling results and actual values. The corresponding formulas are as follows.

Type I error = (b / e) × 100%    (2)

Type II error = (c / f) × 100%    (3)

Total error = ((b + c) / n) × 100%    (4)
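Formulas (2)–(4) follow directly from the counts in Table 1. The following is a small self-contained sketch (the function name is illustrative), with labels encoded as True = ground and False = vegetation:

```python
def filtering_errors(reference, result):
    """Type I, type II, and total error (Formulas (2)-(4)) from paired
    per-point labels; True = ground point, False = vegetation point."""
    a = sum(r and s for r, s in zip(reference, result))          # ground kept as ground
    b = sum(r and not s for r, s in zip(reference, result))      # ground rejected (type I)
    c = sum((not r) and s for r, s in zip(reference, result))    # vegetation accepted (type II)
    d = sum((not r) and (not s) for r, s in zip(reference, result))
    e, f, n = a + b, c + d, a + b + c + d
    type1 = 100.0 * b / e if e else 0.0
    type2 = 100.0 * c / f if f else 0.0
    total = 100.0 * (b + c) / n if n else 0.0
    return type1, type2, total
```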

2.6. Study Areas and Data


Two study areas (T1 and T2) were used to validate the proposed method. The T1 area
was located in Xining city, Qinghai Province, China. A Da-Jiang Inspire1 UAV equipped
with a 20 mm fixed-focus lens and PIX4Dmapper 4.7.5 software were used to perform
image matching, aerial triangulation, and dense point-cloud matching. The study area
was flanked by steep cliffs to the south, and the quality of the point cloud was poor in this
area. Therefore, the study area was cropped using the main southern road as the boundary.
The study area covered 23,439.46 square meters. The point density was 189 points/m2 . In
addition to ground points, the study area contained a large amount of vegetation and a
small number of man-made ground objects. The vegetation mainly included two types of
dense low vegetation, which were connected in strips in some areas and isolated in others.
The orthophoto image and reference cloud points for T1 are shown in Figure 5. The T2 area
was located in Yulin city, Shaanxi Province, China. UAVs were used to collect data and
produce aerial images of Madigou. The study area covered 28,881.2 square meters. The
point density was 27 points/m2 . This area contained a large amount of vegetation. The
vegetation was mainly isolated as single points, with a small amount of densely aggregated
vegetation. The orthophoto image and reference cloud points for T2 are shown in Figure 6.
Field research and expert visual interpretation were used to select the reference data.
Figure 5. Orthophoto and reference cloud points for T1: (a) orthophoto of T1 from the UAVs; (b) reference cloud points for T1 from the UAVs.

Figure 6. Orthophoto and reference cloud points for T2: (a) orthophoto of T2 from the UAVs; (b) reference cloud points for T2 from the UAVs.

3. Results

3.1. Optimal Scale of the Virtual Grid

Figure 7 shows the digital surface model generated for the study area at different VG scales. The surface features vary at different scales. In the small-scale VG, the surface details are obvious, and low vegetation can be clearly expressed. In the 6.4 m VG, only abrupt surface features such as gullies and steep slopes can be observed, and the surface vegetation features are basically smoothed. In VGs with a scale greater than 6.4 m, small-scale surface features can no longer be observed.
Figure 7. Schematic diagram of the VG at different scales: (a) diagram of the VG at the 0.1 m scale for T1; (b) diagram of the VG at the 0.4 m scale for T1; (c) diagram of the VG at the 1.6 m scale for T1; (d) diagram of the VG at the 6.4 m scale for T1; (e) diagram of the VG at the 0.1 m scale for T2; (f) diagram of the VG at the 0.4 m scale for T2; (g) diagram of the VG at the 1.6 m scale for T2; (h) diagram of the VG at the 6.4 m scale for T2.
Figure 8 shows the results of the elevation variations in VGs at different scales in the study area. The scale of the output result is equal to that of the small-scale grid when calculating the elevation change. The VGs can no longer express the surface features with scales larger than 6.4 m, so scales larger than 6.4 m were not considered when selecting the VG scale.

Figure 8. Results of elevation variation for VGs at different scales: (a) elevation variation for a 3.2–1.6 m VG of T1; (b) elevation variation for a 3.2–0.2 m VG of T1; (c) elevation variation for a 1.6–0.8 m VG of T1; (d) elevation variation for a 1.6–0.2 m VG of T1; (e) elevation variation for a 0.8–0.4 m VG of T1; (f) elevation variation for a 0.8–0.2 m VG of T1; (g) elevation variation for a 3.2–1.6 m VG of T2; (h) elevation variation for a 3.2–0.2 m VG of T2; (i) elevation variation for a 1.6–0.8 m VG of T2; (j) elevation variation for a 1.6–0.2 m VG of T2; (k) elevation variation for a 0.8–0.4 m VG of T2; (l) elevation variation for a 0.8–0.2 m VG of T2.

The elevation variations between pairs of VG scales highlight the areas with abrupt changes in the surface in the sample area. To obtain the best VG scale, quantitative comparisons and analyses were performed based on the terrain-modeling error results for the different elevation variations shown above. In the process of terrain modeling, the neighborhood radius and segmentation threshold were selected as constants in all cases. The neighborhood radius was selected as a pixel unit, and the segmentation threshold was determined using the natural breakpoint method. The thresholds of point-cloud filtering are shown in Table 2. The ground feature points included the vegetation area and vegetation boundaries. The threshold value of the vegetation area was (−∞, 0], the threshold value of vegetation boundaries was (1.5, +∞), and the threshold value of ground points was (0, 1.5]. After threshold segmentation, the attributes of ground points and vegetation points were assigned to the point cloud in the corresponding virtual grid to remove the surface vegetation and improve the accuracy of terrain modeling. A comparison of error results is shown in Figure 9.
Table 2. Thresholds of point-cloud filtering.

              Ground Points    Vegetation Points
                               Vegetation area    Vegetation boundary
Threshold     (0, 1.5]         (−∞, 0]            (1.5, +∞)
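The intervals in Table 2 translate directly into a small labeling rule. The interval edges below are taken from the table; the function name and string labels are illustrative only.

```python
def classify_cell(evc_value):
    """Label a virtual-grid cell from its EVC value using the Table 2
    intervals: (-inf, 0] vegetation area, (1.5, +inf) vegetation
    boundary, (0, 1.5] ground."""
    if evc_value <= 0:
        return "vegetation area"
    if evc_value > 1.5:
        return "vegetation boundary"
    return "ground"

labels = [classify_cell(v) for v in (-0.4, 0.7, 2.1)]
print(labels)  # -> ['vegetation area', 'ground', 'vegetation boundary']
```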

Figure 9. Filtering error comparison at different scales: (a) error comparison for T1; (b) error comparison for T2.

As shown in Figures 8 and 9, the contrast between ground points and vegetation points
is gradually enhanced with increasing grid-scale difference. The larger the scale difference
is, the more prominent the vegetation characteristics are. Vegetation tends to be difficult
to identify in areas with considerable elevation variation. The smaller the scale difference
is, the higher the similarity between the top of vegetation and the ground. The variation
trends for the type I error, type II error, and total error were different at different scales. As
the VG scale gap increased, the three kinds of errors generally displayed downward trends.
The error in the elevation variation results obtained at the 3.2 m and 1.6 m scales was the
largest because the output scale at 1.6 m could not express terrain features well, resulting in
a large number of errors. The error decreased when 0.2 m was used as the small-scale VG.
Figure 9a indicates that type I error was the smallest among the error results for elevation
variations between the 3.2 m and 0.2 m scales, followed by that between the 2.8 m and
0.2 m scales. The final terrain modeling result based on the difference between grids at
3.2 m and 0.2 m scales was associated with the smallest type II error, followed by that
between the results at the 3.0 m and 0.2 m scales. The total error was lowest based on the
difference between the 3.2 m and 0.2 m scales, followed by that between 3.0 m and 0.2 m.
Based on the three types of errors, 0.2 m was the VG scale that yielded the smallest error
for T1. Figure 9b indicates that type I error was the smallest among the error results for
elevation variations between the 1.8 m and 0.2 m scales, followed by that between the 2.2 m
and 0.2 m scales. The final terrain modeling result based on the difference between grids
at 2.0 m and 0.2 m scales was associated with the smallest type II error, followed by that
between the results at the 0.8 m and 0.4 m scales. The total error was lowest based on the
difference between the 2.0 m and 0.2 m scales, followed by that between 1.8 m and 0.2 m.
Based on the three types of errors, 0.2 m was the VG scale that yielded the smallest error
for T2. The results indicate that it is possible to not only avoid misjudgments caused by a
low output resolution, but also to avoid the increased noise associated with a high output
resolution. Finally, the 3.2 m and 0.2 m scales of VGs of T1 and the 2.0 m and 0.2 m scales
of VGs of T2 were selected to continue the multiscale neighborhood calculation of EVCs to
determine the best-scaled neighborhoods.
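The scale selection described above is effectively a grid search: run the filtering pipeline for each candidate (large, small) VG pair, score it against reference data, and keep the pair with the lowest total error. A minimal sketch, where `total_error` is a hypothetical stand-in for the full pipeline evaluation:

```python
def best_scale_pair(candidates, total_error):
    """Pick the (large, small) VG scale pair with the lowest total
    filtering error; `total_error` stands in for running the whole
    pipeline and scoring it against reference data."""
    return min(candidates, key=lambda pair: total_error(*pair))

# Hypothetical error table shaped like Figure 9a, where 3.2 m / 0.2 m wins.
errors = {(3.2, 1.6): 21.0, (3.2, 0.2): 7.7, (1.6, 0.2): 9.1, (0.8, 0.2): 10.4}
print(best_scale_pair(errors, lambda l, s: errors[(l, s)]))  # -> (3.2, 0.2)
```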

3.2. Optimal Neighborhood Radius


Figure 10 shows the results of EVCs obtained under different neighborhood radii for
3.2 m and 0.2 m elevation variations for T1 and 2.0 m and 0.2 m elevation variations for
T2. The ground points displayed obvious false negatives once the neighborhood radius reached six VGs. Therefore, only EVCs for neighborhood radii of 1–6 VGs were analyzed.
Figure 10 shows that the areas with negative values correspond to vegetation, and
the boundaries around vegetation are highlighted. The vegetation boundaries and the contrast between ground and vegetation both become more prominent with increasing neighborhood radius. However, the vegetation areas display obvious
false negatives inside boundaries in some cases, and low and small vegetation areas also
correspond to false negatives.
The error results for terrain modeling with different-scale EVCs were quantitatively
compared and analyzed. The same threshold was selected for segmentation in the process
of terrain modeling. The trends of the type I error and total error are the same in the
quantitative error comparison in Figure 11. The error increased with increasing scale. The
error decreased with increasing scale for the type II error in T1 in Figure 11a, but the range
of change was small. The error increased with increasing scale for the type II error in T2 in Figure 11b. Based on the results in Figures 10 and 11, it can be concluded that the
filtering result obtained by calculating the EVC in a one-neighborhood radius was the
most accurate.
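The paper does not state a closed-form EVC definition in this section; one common reading of an "elevation variation coefficient" is the ratio of the standard deviation to the mean of the values inside the (2r + 1) × (2r + 1) cell neighborhood, and the sketch below uses that assumption on the elevation-variation grid:

```python
import statistics

def evc(grid, radius=1):
    """Elevation variation coefficient per cell: std/mean (an assumed
    definition) of the elevation-variation values within a square
    neighborhood of the given radius, in grid cells."""
    out = {}
    for i, j in grid:
        vals = [grid[(i + di, j + dj)]
                for di in range(-radius, radius + 1)
                for dj in range(-radius, radius + 1)
                if (i + di, j + dj) in grid]
        m = statistics.mean(vals)
        out[(i, j)] = statistics.pstdev(vals) / m if m != 0 else 0.0
    return out

# One abrupt, vegetation-like cell inside an otherwise flat 4 x 4 grid.
demo = {(i, j): 1.0 for i in range(4) for j in range(4)}
demo[(1, 1)] = -2.0
print(evc(demo, radius=1)[(1, 1)])  # elevated magnitude near the abrupt cell
```

Under this assumed definition, an abrupt negative cell inside a flat neighborhood yields a large EVC magnitude, consistent with vegetation boundaries being highlighted, while larger radii smooth small patches away.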

Figure 10. Multiscale EVC results: (a) EVC for a 1-VG neighborhood radius of T1; (b) EVC for a 3-VG neighborhood radius of T1; (c) EVC for a 6-VG neighborhood radius of T1; (d) EVC for a 1-VG neighborhood radius of T2; (e) EVC for a 3-VG neighborhood radius of T2; (f) EVC for a 6-VG neighborhood radius of T2.

3.3. Optimal Segmentation Threshold

The optimal threshold value was selected based on the results of the optimal-scale elevation-variation coefficient. The error at different thresholds was analyzed and quantitatively assessed by comparing the terrain-modeling results for the 3.2 m and 0.2 m EVCs in one-pixel neighborhoods. The values in vegetation areas were mostly negative after calculating the elevation variation in the VGs at different scales, and the corresponding EVCs were also negative. The selection of the threshold was mainly affected by vegetation boundaries. A comparison of the filtering error results is shown in Figure 12.

Figure 11. Filtering error comparison at different radii: (a) error comparison for T1; (b) error comparison for T2.

It is obvious from the quantitative error comparison in Figure 12 that the variation in the error was related to the selection of the threshold. The trends of the type I error and total error were roughly the same: as the segmentation threshold increased from 1.0 to 1.5, the error decreased, reaching its minimum at 1.5, and as the threshold increased further from 1.5 to 2.0, the error continually increased. The overall variation trend was the same for the type II error, with a minimum error at a threshold of 1.4, followed by 1.6 and 1.5. After comprehensive consideration, the terrain-modeling accuracy was the highest when the segmentation threshold was 1.5. In the experiment, it was found that the segmentation threshold displayed a certain relationship with the vegetation height in the sample area. The abrupt changes at vegetation edges were the most obvious, and the EVC increased with increasing vegetation height.

Figure 12. Filtering error comparison at different thresholds: (a) error comparison for T1; (b) error comparison for T2.

3.4. Accuracy Analysis


It is obvious from the quantitative error comparison in Figure 12 that the variation in
The terrain-modeling
the error was related to themethod based
selection ofon the EVC was
threshold. The applied
trends of in the
twotype
research areas
I error and tototal
evaluate
error werethe roughly
accuracy of
thethe method.
same. The segmentation
As the specific parameter selection
threshold scheme was
decreased analyzed
from 1.0 to 1.5,
in detail using
minimum thewas
error T1 area as an example,
observed and as
at 1.5. Then, thethe
same approach
threshold was usedfrom
increased for T2.
1.5Table 3 the
to 2.0,
shows the parameters used in the experiment in the two study areas.
error continually increased. The overall variation trend was the same for the type II error,
withThe error results
a minimum based
error on the testof
at a threshold data
1.4,are shown by
followed in Table
1.6 and4. The type Icomprehensive
1.5. After error, type
II error, and total error in the T1 area were 9.20%, 5.83%, and 7.68%, respectively. The
consideration, the terrain modeling accuracy was the highest when the segmentation
type I error, type II error, and total error in the T2 area were 1.93%, 5.84%, and 2.28%,
threshold was 1.5. In the experiment, it was found that the segmentation threshold dis-
respectively. The type I error was larger than the type II error, which was due to the large
played aofcertain
amount relationshipareas
vegetation-covered withinthe
thevegetation
study area. height in theofsample
The purpose area. filtering
point-cloud The abrupt
changes
is in vegetation
to accurately edges were
extract ground pointsthe
andmost
ensureobvious,
terrain and the EVC
accuracy. increased
Therefore, typewith increas-
II errors
ing vegetation height.
should be controlled first. Some type I errors could be sacrificed to ensure that the filtered
data do not contain areas of vegetation.

Table 3. Parameters of our method in the study areas.

Study Areas T1 T2
Large-scale VG 3.2 m 2.0 m
Small-scale VG 0.2 m 0.2 m
Neighborhood radius 1 grid 1 grid
Threshold 1.5 1.5

Table 4. Error results based on the test data.

                              Result
     Reference            Ground Points    Vegetation Points          Sum    Error (%)
T1   Ground points            2,215,181              224,442    2,439,623    Type I: 9.20
     Vegetation points          117,420            1,896,071    2,013,491    Type II: 5.83
     Sum                      2,332,601            2,120,513    4,453,114    Total: 7.68
T2   Ground points              705,636               13,893      719,529    Type I: 1.93
     Vegetation points            4,130               66,601       70,731    Type II: 5.84
     Sum                        709,766               80,494      790,260    Total: 2.28
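The three error measures in Table 4 follow the standard point-cloud filtering error definitions (type I: reference ground points rejected as vegetation; type II: reference vegetation points accepted as ground; total: all misclassified points over all points). The reported percentages can be reproduced directly from the confusion-matrix counts:

```python
def filtering_errors(g_ok, g_as_veg, v_as_ground, v_ok):
    """Type I, type II, and total filtering error (%) from the confusion
    counts: reference ground kept/rejected and reference vegetation
    accepted/removed."""
    type1 = 100.0 * g_as_veg / (g_ok + g_as_veg)
    type2 = 100.0 * v_as_ground / (v_as_ground + v_ok)
    total = 100.0 * (g_as_veg + v_as_ground) / (g_ok + g_as_veg + v_as_ground + v_ok)
    return round(type1, 2), round(type2, 2), round(total, 2)

# T1 counts from Table 4 reproduce the reported errors.
print(filtering_errors(2_215_181, 224_442, 117_420, 1_896_071))  # -> (9.2, 5.83, 7.68)
```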

4. Discussion
The accuracy of terrain modeling based on UAVs is significantly affected by surface
vegetation [31]. Therefore, vegetation removal is an important step in terrain modeling.
Accurate vegetation removal is the key to ensuring the accuracy of terrain models. The
current methods have shortcomings in areas with complex terrain and low vegetation,
and they mainly identify and remove vegetation based on specific scales [32]. However,
the characteristics of terrain and ground features at different spatial scales are obviously
different, and it is difficult to accurately distinguish vegetation by only considering the
characteristics of a single scale [33]. Some scholars have improved modeling methods from
the perspective of multiscale progression [26,34,35], but these methods extract only individual features.
In recent years, point-cloud filtering methods based on deep learning have gradually
increased in popularity [36,37]. In these methods, the point clouds of training samples are
usually labeled with different categories. Not only ground points and vegetation points,
but also multiple classes of objects such as trees, roads, and buildings can be detected.
These methods represent a new research direction. However, the implementation of these
methods requires a large number of samples to be labeled and trained in advance. The
characteristics of training samples and the accuracy of labeling have a considerable impact
on the final results.
In this approach, a series of variations in elevation values after scale synthesis based
on VGs reflect the characteristics of the terrain at different scales. Therefore, a UAV terrain-
modeling method based on multiscale EVCs in low-vegetation areas is proposed using
elevation variability coefficients to quantify and extract differences in topographic features
at different scales.
To measure the final terrain-modeling accuracy of this method, cloth simulation filter-
ing (CSF), triangulated irregular network (TIN) filtering, and progressive morphological
filtering (PMF) were used to model the terrain in the two study areas. The parameter
settings of the three methods referred to the references [6,8,38] to optimize their filtering
results and, thus, the comparison with our method is reasonable. These three methods were
implemented in the software packages CloudCompare, Photoscan, and PCL, respectively.
The point cloud after vegetation removal with our method is shown in Figure 13a. The
parameters of compared methods were set as follows. In the CSF method, the scene was set
to steep slope, the cloth resolution was set to 0.3, the maximum number of iterations was
set to 500, the classification threshold was set to 0.2, and the slope preprocessing condition was set to off. The point cloud after vegetation removal with the CSF method is shown in Figure 13b. Notably, vegetation removal was most effective in areas of dense vegetation cover, but less effective in gully areas with complex topography. In the TIN method, the max angle was set to 10, the max distance was set to 0.5, and the cell size was set to five. The point cloud after vegetation removal with the TIN method is shown in Figure 13c. Topographic features were preserved in gully areas, but the method performed poorly in areas of dense vegetation cover. In the PMF method, the max window size was set to five, the terrain slope was set to 0.5 f, the initial elevation threshold was set to 0.5 f, and the max elevation threshold was set to 0.5 f. The point cloud after vegetation removal with the PMF method is shown in Figure 13d. The method was effective in filtering and removing vegetation in flat areas, but performed poorly in areas of dense vegetation cover and areas with gullies. The DEM obtained by interpolating the point-cloud data after vegetation removal with our method for the T1 area is shown in Figure 13f.

Figure 13. Results for T1: (a) the point cloud after vegetation removal with our method for T1; (b) the
point cloud after vegetation removal with the CSF method for T1; (c) the point cloud after vegetation
removal with the TIN method for T1; (d) the point cloud after vegetation removal with the PMF
method for T1; (e) the DSM of T1; (f) the DEM obtained by interpolating the point-cloud data after
vegetation removal with our method for T1.

The point cloud after vegetation removal with our method is shown in Figure 14a. The
parameters of the compared methods were set as follows. In the CSF method, the scene was
set to steep slope, the cloth resolution was set to 0.3, the maximum number of iterations was
set to 500, the classification threshold was set to 0.3, and the slope preprocessing condition
was set to on. The point cloud after vegetation removal with the CSF method is shown in
Figure 14b. In the TIN method, the max angle was set to 0.3, the max distance was set to
0.5, and the cell size was set to one. The point cloud after vegetation removal with the TIN
method is shown in Figure 14c. In the PMF method, the max window size was set to three, the terrain slope was set to 0.5 f, the initial elevation threshold was set to 0.5 f, and the max elevation threshold was set to 0.5 f. The point cloud after vegetation removal with the PMF method is shown in Figure 14d. The DEM obtained by interpolating the point cloud after vegetation removal with our method for the T2 area is shown in Figure 14f.
Figure 14. Results for T2: (a) the point cloud after vegetation removal with our method for T2; (b) the point cloud after vegetation removal with the CSF method for T2; (c) the point cloud after vegetation removal with the TIN method for T2; (d) the point cloud after vegetation removal with the PMF method for T2; (e) the DSM for T2; (f) the DEM obtained by interpolating the point-cloud data after vegetation removal with our method for T2.

Figures 13a and 14a show that the proposed filtering algorithm can effectively filter the ground points and remove a large amount of vegetation cover. Compared with the other methods, it can better deal with areas with low vegetation coverage and preserve the details of the terrain. A comparison of Figures 13f and 14f with Figures 13e and 14e indicates that, compared with the DSM, the DEM retains the topographic features of the region and effectively eliminates vegetation points. A comparison of Figures 13a–d and 14a–d indicates that CSF can be used to remove most of the vegetation, but the effect is poor for low vegetation distributed in patches, and the terrain boundaries are eliminated where the slope changes greatly. The TIN method preserves the boundaries of the terrain, but the effect of vegetation removal is poor in areas with large terrain gradients. The PMF method works well in flat-terrain areas, but it does not work well in some areas with large topographic relief, and the algorithm has many parameters that are difficult to set. We quantitatively analyzed the results of the different methods, and an error comparison is shown in Table 5.
The results in Table 5 show that the method based on the multiscale EVC is superior
to the CSF, TIN, and PMF methods in terms of the type I error, type II error, and total error
based on the terrain-modeling results for the T1 and T2 study areas. In general, the filtering
algorithm proposed in this study is more suitable for areas with low and dense vegetation.
Notably, low vegetation is filtered, and the ground points are accurately retained, thus
meeting the requirements for generating high-precision DEMs.

Table 5. Filtering error comparison for the three methods.

Sample    Error          Our Method      CSF      TIN      PMF
Xining    Type I (%)           9.20     8.66    22.32     1.64
          Type II (%)          5.83    17.41     7.72    12.70
          Total (%)            7.68    12.62    15.72     6.64
Yulin     Type I (%)           1.93     3.01     2.46     2.57
          Type II (%)          5.84    10.45    27.62    27.73
          Total (%)            2.28     5.97     4.71     4.82

The point-cloud densities in the two study areas differ significantly, at 189 points/m2 and 27 points/m2, respectively. However, the parameter settings for the two study areas
are almost identical, except for the large-scale VG. The point-cloud density may affect the
results. If the point cloud is too sparse, the optimal scale may be difficult to determine be-
cause the sparse point cloud will not include all the vegetation or terrain relief information.
However, in this study, the point-cloud densities were all sufficiently high, so the result
varied little with the selected threshold. In addition, although the point-cloud density
varied, the vegetation type (low-rise vegetation), and topography in the two plots were
similar, and, thus, the threshold values were also similar. This indicates that the effect of
vegetation type may be greater than that of the point-cloud density. In addition, the optimal
size of the small-scale VG in both study areas was 0.2 m. The size of the small-scale VG
determined the size of the output elevation difference. The size of 0.2 m not only ensures
the precision of vegetation removal but also avoids misjudgments regarding areas of low
topographic relief. The optimal value of the neighborhood radius for both sample areas
was one grid. Increasing the neighborhood radius does not have a significant effect on
the type II error and mainly causes rapid increases in the type I error and the total error
(the proportion of ground points misclassified as vegetation points increases). To ensure
accuracy, the optimal solution for the neighborhood radius should be re-explored based
on the study area. The vegetation in both study areas was characterized by low-growing
vegetation. The optimal solution for the splitting threshold in both study areas was 1.5. This
value ensures that the vast majority of low vegetation in both study areas can be identified.
The optimal size of the large-scale VG varied in the two study areas. The optimal value for
T1 was 3.2 m, but the optimal value for T2 was 2.0 m. The vegetation in T1 was densely
distributed, and the vegetation in T2 was characterized by a mostly isolated distribution.
Vegetation distribution patterns may be related to the appropriate selection of the size of the
large-scale VG. In future work, the applicability of the methods in this paper can be verified
in areas with high vegetation coverage or containing artificial features. However, the choice
of parameters may not yield the same results. Future improvements to this method in
terms of the adaptive selection of parameters should be considered. Furthermore, UAV
photogrammetry can obtain optical images of the test areas, and the normalized difference
vegetation index (NDVI) can be used to identify vegetated areas. Notably, the NDVI could
be introduced in follow-up research to improve the filtering accuracy.
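As a sketch of that NDVI follow-up idea (not part of the method evaluated in this paper), the index is computed per pixel from the near-infrared and red bands; the 0.3 vegetation cutoff below is an illustrative choice, not a value from the paper:

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index for a single pixel."""
    return (nir - red) / (nir + red + eps)

def vegetation_mask(nir_band, red_band, cutoff=0.3):
    """Boolean mask of likely-vegetated pixels; the 0.3 cutoff is an
    illustrative value, not one taken from the paper."""
    return [[ndvi(n, r) > cutoff for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]

mask = vegetation_mask([[0.7, 0.4]], [[0.1, 0.38]])
print(mask)  # -> [[True, False]]
```

Such a mask could be intersected with the EVC-based classification so that cells flagged by both sources are removed with higher confidence.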

5. Conclusions
This paper addresses the issue of vegetation removal during terrain modeling. Ac-
cording to the different topographic features expressed by point-cloud data at different
scales, a terrain-modeling method based on multiscale EVCs was proposed in this paper.
The amount of elevation change between any two different scales of VGs was calculated
and the EVC of the VG that yielded the largest elevation variation was determined. The
optimal parameters were analyzed and discussed. The experimental results show that
the multiscale EVC method can accurately remove vegetation points and reserve ground
points in low and dense vegetation-covered areas. The error results were better than those
of the CSF, TIN, and PMF methods. The type I error, type II error, and total error in the
study areas ranged from 1.93 to 9.20%, 5.83 to 5.84%, and 2.28 to 7.68%, respectively. The
Remote Sens. 2023, 15, 3569 19 of 20

total error of the proposed method was 2.43–2.54% lower than that of the CSF, TIN, and
PMF algorithms in the study areas.
The parameters of the proposed method are easier to set than those of other methods.
The parameters minimally changed between the two study areas. The optimal small-scale
VG, neighborhood radius, and threshold in the two study areas were the same (0.2 m for
the small-scale VG, 1 grid for the optimal neighborhood radius, and 1.5 for the threshold),
thus highlighting the robustness of the proposed method. Only the large-scale VG changed
in the two study areas. The larger the scale difference is, the more prominent the vegetation
characteristics are. The optimal large-scale VG size was 3.2 m in T1 and 2.0 m in T2,
which is only a minor difference. The optimal scale of VGs may differ in different areas,
and this scale is related to vegetation distribution patterns. In general, we recommend
2.0–3.0 m as the large-scale VG size. The proposed method provides a foundation for the
rapid establishment of high-precision DEMs based on UAV photogrammetry.
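The workflow summarized above can be caricatured in a few lines. This is only a loose sketch under simplifying assumptions: the `grid_min` helper, the minimum-elevation rule, and the 0.3 m decision value are illustrative stand-ins and do not reproduce the paper's exact EVC formula or neighborhood analysis.

```python
def grid_min(points, cell):
    """Minimum elevation per virtual-grid (VG) cell of the given size (m)."""
    cells = {}
    for x, y, z in points:
        k = (int(x // cell), int(y // cell))
        cells[k] = min(z, cells.get(k, float("inf")))
    return cells

def classify(points, small=0.2, large=2.0):
    """Flag likely vegetation points by elevation variation between two VG scales.

    Intuition (stand-in for the multiscale EVC): a point that sits far above the
    large-scale minimum surface but close to the small-scale one varies strongly
    between scales, which is characteristic of low vegetation.
    """
    zs = grid_min(points, small)
    zl = grid_min(points, large)
    flags = []
    for x, y, z in points:
        dz_small = z - zs[(int(x // small), int(y // small))]
        dz_large = z - zl[(int(x // large), int(y // large))]
        flags.append(dz_large - dz_small > 0.3)  # 0.3 m: illustrative cutoff
    return flags

points = [(0.1, 0.1, 0.0), (1.1, 0.1, 0.0), (0.1, 1.1, 0.0),
          (1.1, 1.1, 0.0), (0.5, 0.5, 0.8)]  # four ground points, one bush
print(classify(points))  # [False, False, False, False, True]
```

In the toy example, only the elevated bush point varies between the 0.2 m and 2.0 m grids, so only it is flagged; a real implementation would follow the EVC definition and neighborhood radius described in the paper.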

Author Contributions: Conceptualization, J.F. and W.D.; methodology, J.F. and W.D.; software, J.F.;
validation, J.F., W.D. and B.W.; formal analysis, J.Y.; investigation, K.C.; resources, B.W.; data curation,
J.F.; writing—original draft preparation, J.F.; writing—review and editing, W.D.; visualization, J.L.;
supervision, W.D.; project administration, B.W.; funding acquisition, J.F. and B.W. All authors have
read and agreed to the published version of the manuscript.
Funding: We are grateful for the financial support provided by the Natural Science Foundation of
the Jiangsu Higher Education Institutions of China (No. 22KJB170016), the National Natural Science
Foundation of China (No. 42171402 and 41930102), and the Graduate Practice Innovation Program of
the Jiangsu Province of China (No. SJCX23_0418).
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Shahbazi, M.; Menard, P.; Sohn, G.; Théau, J. Unmanned aerial image dataset: Ready for 3D reconstruction. Data Brief 2019,
25, 103962. [CrossRef] [PubMed]
2. Berrett, B.E.; Vernon, C.A.; Beckstrand, H.; Pollei, M.; Markert, K.; Franke, K.W.; Hedengren, J.D. Large-scale reality modeling of a
university campus using combined UAV and terrestrial photogrammetry for historical preservation and practical use. Drones
2021, 5, 136. [CrossRef]
3. Dai, W.; Qian, W.; Liu, A.; Wang, C.; Yang, X.; Hu, G.; Tang, G. Monitoring and modeling sediment transport in space in small
loess catchments using UAV-SfM photogrammetry. CATENA 2022, 214, 106244. [CrossRef]
4. Dai, W.; Tang, G.; Hu, G.; Yang, X.; Xiong, L.; Wang, L. Modelling sediment transport in space in a watershed based on topographic
change detection by UAV survey. Prog. Geogr. 2021, 40, 1570–1580. [CrossRef]
5. Meng, X.; Currit, N.; Zhao, K. Ground filtering algorithms for airborne LiDAR data: A review of critical issues. Remote Sens. 2010,
2, 833–860. [CrossRef]
6. Axelsson, P. DEM generation from laser scanner data using adaptive TIN models. Int. Arch. Photogramm. Remote Sens. 2000,
33, 110–117.
7. Yang, Y.B.; Zhang, N.N.; Li, X.L. Adaptive slope filtering for airborne Light Detection and Ranging data in urban areas based on
region growing rule. Surv. Rev. 2016, 49, 139–146. [CrossRef]
8. Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Yan, G. An Easy-to-Use Airborne LiDAR Data Filtering Method Based on Cloth Simulation.
Remote Sens. 2016, 8, 501. [CrossRef]
9. Dong, Y.; Cui, X.; Zhang, L.; Ai, H. An Improved Progressive TIN Densification Filtering Method Considering the Density and
Standard Variance of Point Clouds. Int. J. Geo-Inf. 2018, 7, 409. [CrossRef]
10. Véga, C.; Durrieu, S.; Morel, J.; Allouis, T. A sequential iterative dual-filter for Lidar terrain modeling optimized for complex
forested environments. Comput. Geosci. 2012, 44, 31–41. [CrossRef]
11. Hui, Z.; Jin, S.; Xia, Y.; Nie, Y.; Li, N. A mean shift segmentation morphological filter for airborne LiDAR DTM extraction under
forest canopy. Opt. Laser Technol. 2021, 136, 106728. [CrossRef]
12. Chen, Q.; Gong, P.; Baldocchi, D.; Xie, G. Filtering airborne laser scanning data with morphological methods. Photogramm. Eng.
Remote Sens. 2007, 73, 175. [CrossRef]
13. Huang, S.; Liu, L.; Dong, J.; Fu, X.; Huang, F. SPGCN: Ground filtering method based on superpoint graph convolution neural
network for vehicle LiDAR. J. Appl. Remote Sens. 2022, 16, 016512. [CrossRef]
14. Yilmaz, V. Automated ground filtering of LiDAR and UAS point clouds with metaheuristics. Opt. Laser Technol. 2021, 138, 106890.
[CrossRef]
15. Sithole, G. Filtering of Laser Altimetry Data using a Slope Adaptive Filter. Int. Arch. Photogramm. Remote Sens. 2001, 34, 203–210.
16. Chen, C.; Chang, B.; Li, Y.; Shi, B. Filtering airborne LiDAR point clouds based on a scale-irrelevant and terrain-adaptive approach.
Measurement 2021, 171, 108756. [CrossRef]
17. Xia, T.; Yang, J.; Chen, L. Automated semantic segmentation of bridge point cloud based on local descriptor and machine learning.
Autom. Constr. 2022, 133, 103992. [CrossRef]
18. Chen, C.; Guo, J.; Wu, H.; Li, Y.; Shi, B. Performance comparison of filtering algorithms for high-density airborne Lidar point
clouds over complex landscapes. Remote Sens. 2021, 13, 2663. [CrossRef]
19. Klápště, P.; Fogl, M.; Barták, V.; Gdulová, K.; Urban, R.; Moudrý, V. Sensitivity analysis of parameters and contrasting performance of
ground filtering algorithms with UAV photogrammetry-based and LiDAR point clouds. Int. J. Digit. Earth 2020, 13, 23. [CrossRef]
20. Li, H.; Ye, W.; Liu, J.; Tan, W.; Pirasteh, S.; Fatholahi, S.N.; Li, J. High-resolution terrain modeling using airborne lidar data with
transfer learning. Remote Sens. 2021, 13, 3448. [CrossRef]
21. Dai, W.; Yang, X.; Na, J.; Li, J.; Brus, D.; Xiong, L.; Tang, G.; Huang, X. Effects of DEM resolution on the accuracy of gully maps in
loess hilly areas. CATENA 2019, 177, 114–125. [CrossRef]
22. Li, S.; Dai, W.; Xiong, L.; Tang, G. Uncertainty of the morphological feature expression of loess erosional gully affected by DEM
resolution. J. Geo-Inf. Sci. 2020, 22, 338–350.
23. Xiong, L.; Li, S.; Tang, G.; Strobl, J. Geomorphometry and terrain analysis: Data, methods, platforms and applications. Earth-Sci.
Rev. 2022, 233, 104191. [CrossRef]
24. Cai, S.; Liang, X.; Yu, S. A Progressive Plane Detection Filtering Method for Airborne LiDAR Data in Forested Landscapes. Forests
2023, 14, 498. [CrossRef]
25. Song, D. A Filtering Method for LiDAR Point Cloud Based on Multi-Scale CNN with Attention Mechanism. Remote Sens. 2022,
14, 6170.
26. Hui, Z.; Hu, Y.; Yevenyo, Y.Z.; Yu, X. An Improved Morphological Algorithm for Filtering Airborne LiDAR Point Cloud Based on
Multi-Level Kriging Interpolation. Remote Sens. 2016, 8, 35. [CrossRef]
27. Bailey, G.; Li, Y.; McKinney, N.; Yoder, D.; Wright, W.; Herrero, H. Comparison of Ground Point Filtering Algorithms for
High-Density Point Clouds Collected by Terrestrial LiDAR. Remote Sens. 2022, 14, 4776. [CrossRef]
28. Hiep, N.H.; Luong, N.D.; Ni, C.F.; Hieu, B.T.; Huong, N.L.; Duong, B.D. Factors influencing the spatial and temporal variations of
surface runoff coefficient in the Red River basin of Vietnam. Environ. Earth Sci. 2023, 82, 56. [CrossRef]
29. Huang, F.; Yang, J.; Zhang, B.; Li, Y.; Huang, J.; Chen, N. Regional Terrain Complexity Assessment Based on Principal Component
Analysis and Geographic Information System: A Case of Jiangxi Province, China. Int. J. Geo-Inf. 2020, 9, 539. [CrossRef]
30. Sithole, G.; Vosselman, G. Report: ISPRS Comparison Of Filters; ISPRS Commission III, Working Group: Enschede, The Netherlands, 2003.
31. Wang, Y.; Koo, K.-Y. Vegetation Removal on 3D Point Cloud Reconstruction of Cut-Slopes Using U-Net. Appl. Sci. 2021, 12, 395.
[CrossRef]
32. Ma, H.; Zhou, W.; Zhang, L. DEM refinement by low vegetation removal based on the combination of full waveform data and
progressive TIN densification. ISPRS J. Photogramm. Remote Sens. 2018, 146, 260–271. [CrossRef]
33. Ren, Y.; Li, T.; Xu, J.; Hong, W.; Zheng, Y.; Fu, B. Overall filtering algorithm for multiscale noise removal from point cloud data.
IEEE Access 2021, 9, 110723–110734. [CrossRef]
34. Wang, X.; Ma, X.; Yang, F.; Su, D.; Qi, C.; Xia, S. Improved progressive triangular irregular network densification filtering algorithm
for airborne LiDAR data based on a multiscale cylindrical neighborhood. Appl. Opt. 2020, 59, 6540–6550. [CrossRef] [PubMed]
35. Yang, B.; Huang, R.; Dong, Z.; Zang, Y.; Li, J. Two-step adaptive extraction method for ground points and breaklines from lidar
point clouds. ISPRS J. Photogramm. Remote Sens. 2016, 119, 373–389. [CrossRef]
36. Liu, W.; Sun, J.; Li, W.; Hu, T.; Wang, P. Deep Learning on Point Clouds and Its Application: A Survey. Sensors 2019, 19, 4188.
[CrossRef]
37. Hu, X.; Yuan, Y. Deep-Learning-Based Classification for DTM Extraction from ALS Point Cloud. Remote Sens. 2016, 8, 730.
[CrossRef]
38. Zhang, K.; Chen, S.-C.; Whitman, D.; Shyu, M.-L.; Yan, J.; Zhang, C. A progressive morphological filter for removing nonground
measurements from airborne LIDAR data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 872–882. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
