Article
Classification of 3D Point Clouds Using Color
Vegetation Indices for Precision Viticulture and
Digitizing Applications
Francisco-Javier Mesas-Carrascosa 1, * , Ana I. de Castro 2 , Jorge Torres-Sánchez 2 ,
Paula Triviño-Tarradas 1 , Francisco M. Jiménez-Brenes 2 , Alfonso García-Ferrer 1 and
Francisca López-Granados 2
1 Department of Graphic Engineering and Geomatics, University of Cordoba, Campus de Rabanales,
Crta. IV, km. 396, E-14071 Córdoba, Spain; [email protected] (P.T.-T.); [email protected] (A.G.-F.)
2 Imaping Group, Department of Crop Protection, Institute for Sustainable Agriculture (IAS), Spanish
National Research Council (CSIC), E-14004 Córdoba, Spain; [email protected] (A.I.d.C.);
[email protected] (J.T.-S.); [email protected] (F.M.J.-B.); [email protected] (F.L.-G.)
* Correspondence: [email protected]
Received: 17 November 2019; Accepted: 16 January 2020; Published: 18 January 2020
Abstract: Remote sensing applied in the digital transformation of agriculture and, more particularly, in
precision viticulture offers methods to map field spatial variability to support site-specific management
strategies; these can be based on crop canopy characteristics such as the row height or vegetation cover
fraction, requiring accurate three-dimensional (3D) information. To derive canopy information, a set
of dense 3D point clouds was generated using photogrammetric techniques on images acquired by
an RGB sensor onboard an unmanned aerial vehicle (UAV) in two testing vineyards on two different
dates. In addition to the geometry, each point also stores information from the RGB color model,
which was used to discriminate between vegetation and bare soil. To the best of our knowledge, the
new methodology presented herein, which links point clouds with their spectral information, had
not previously been applied to automatically estimate vine height. Therefore, the novelty of
this work is based on the application of color vegetation indices in point clouds for the automatic
detection and classification of points representing vegetation and the later ability to determine the
height of vines using as a reference the heights of the points classified as soil. Results from on-ground
measurements of the heights of individual grapevines were compared with the estimated heights from
the UAV point cloud, showing high determination coefficients (R2 > 0.87) and low root-mean-square
error (0.070 m). This methodology offers new capabilities for the use of RGB sensors onboard UAV
platforms as a tool for precision viticulture and digitizing applications.
Keywords: UAV imagery; grapevine height; DSM; RGB sensor; structure; vineyard
1. Introduction
Precision agriculture involves the collection and use of large amounts of georeferenced data
relating to crops and their attributes in production areas at a high spatial resolution [1]. Its purpose
is for site-specific management of crop heterogeneity at both time and spatial scales [2] to optimize
agricultural inputs. As a further consequence, this high-quality information can be connected to
the global objective of creating opportunities for digitizing agriculture, renewing processes and
technologies to make the sector more insight-driven and efficient, while still improving yield and
final product quality and reducing the environmental impact of agricultural activity. Factors like
soil, water availability, pests (e.g., incidences related to weeds, fungi, insects,
root-knot nematodes), presence of cover crops between grape rows, topography, and variable climatic
conditions cause different responses in the crop which are ultimately reflected in spatial fluctuations in
yield and grape composition [3]. Precision viticulture (PV) and digitizing-related strategies, which
fall inside precision agriculture and the wider concept of digital transformation of agriculture, could
contribute to solving and managing this spatial heterogeneity in a sustainable way. This is because their
main objectives are the monitoring of vineyard variability and the design of site-specific management
accordingly to improve production efficiency with reduced inputs (e.g., labor, fuel, water, canopy
management, or phytosanitary applications) [4].
The implementation of PV can be considered a process which begins with the observation of
vineyard attributes, followed by the interpretation and evaluation of collected data, implementation of
targeted management (e.g., irrigation, fertilizers, spray, pruning or other canopy management, and even
selective harvesting), and, finally, evaluation of the implemented management [5]. The efficiency of PV,
particularly referring to vineyard zone delineation for site-specific phytosanitary foliar applications,
depends on many interacting factors related to canopy crop characteristics like height, vegetative stage,
or growing habits of the corresponding grape variety, which must be properly combined to adapt the
chemical application to the foliar part of the crop [6]. In this context, georeferenced information on
grapevine height at the field scale is one of the most important structural inputs used to map and
monitor the vineyard and to provide accurate information for rational decision-making [7].
Phenological development stage is related to biophysical processes, and among all phenotypic
characteristics, crop height is an adequate indicator of crop yield [8,9], evapotranspiration [10–12],
health [13,14], and biomass [9,15]. Most of the works conducted to measure tree height or crown
volume among other characteristics using geomatic techniques have been related to forest areas [16,17].
These geometric characteristics are also used, to a lesser degree, in agriculture as indicators to evaluate
pruning, pest effects on crops, or fruit detection [18–21], probably because of the difficulty of
measuring them [22]. Collecting these data at the field scale is time-consuming and offers uncertain results
because of the variability of tree crowns in orchards and the difficulty of fitting to geometric models
such as cones or ovoids. To date, the measurement and characterization of plant structures have been
carried out using different remotely sensed alternatives like radar [23], hemispherical photography [24],
digital photogrammetric techniques [25], light sensors [26], stereo images [27], ultrasonic sensors [28],
and Light Detection and Ranging (LiDAR) sensors [29]. Despite the great variety of technologies
used to characterize the 3D structures of plants, many of them have aspects that limit their use; only
a small group of them are suitable for this purpose, with LiDAR and those based on stereoscopic
images being the most relevant [30]. On the one hand, those methodologies based on terrestrial laser
scanners are very precise in measuring tree architecture [31,32]; however, they are inefficient over
large spatial extents [33]. Similarly, those methods based on stereoscopic images require a very high
spatial resolution to properly model the 3D characteristics of woody crops [34]. In this context, images
registered by sensors on board satellites or piloted aircraft platforms do not satisfy these technical
requirements, with unmanned aerial vehicles (UAVs) instead being the most adequate platform. The
advantages of UAV application in agriculture have been demonstrated in comparison to traditional
platforms regarding their very high spatial and temporal resolution [35] and low cost [36,37], which
make UAV technology an adequate tool to monitor crops at the field scale [35,38].
UAVs are capable of carrying LiDAR [39], RGB [40], thermal [41], multispectral [42],
and hyperspectral [43] sensors. Although LiDAR UAV data, combined with data from Global
Navigation Satellite System (GNSS) and inertial measurement unit (IMU) sensors, provide 3D point
clouds to monitor plant structure information, their use is limited because of the system weight and
economic cost [44], requiring very strong platforms [45]. Alternatively, UAV images registered
by passive sensors can be used for the 3D characterization of crops by producing digital
surface models (DSMs). A DSM can be understood as an image in which pixel values contain
elevation information, or as a set of 3D geometries. DSMs are obtained by Structure from Motion (SfM)
algorithms, which, as part of the processing pipeline, can also produce very dense 3D point clouds
in which each point takes the color of the original image pixel onto which it is projected.
In the case of agricultural applications, one of the most crucial steps is the segmentation of soil
and vegetation. Using orthomosaics, the segmentation of soil and vegetation can be carried out by
vegetation indices (VIs) using different spectral bands and their combinations. Of all the possible
VIs, color vegetation indices (CVIs) using common red, green, and blue (RGB) sensors onboard UAV
platforms are used to accentuate plant greenness [46]. On the other hand, other methods are based
on using DSMs. Many approaches have been developed to detect, delineate, and segment objects in
either raster or vector data [47–49]. For raster DSMs, some strategies have used object-based image
analysis (OBIA) [50,51], local maxima [52,53], or watershed segmentation [54,55], among others. Of
these, the OBIA methods have successfully classified and identified single olive trees [56] and vertical
trellis vineyards [57], although OBIA methods require the design of an effective rule set to assign the
correct scale, shape, and compactness parameters to obtain meaningful objects [58]. For vector DSMs,
authors have used the adaptive clustering algorithm [59] or a top-to-bottom region growing approach
[60], and previous research projects have successfully used DSMs on both herbaceous [61,62] and
woody crops [56,57], based on geometrical approaches. Therefore, methods using vector DSMs have
mainly focused on reducing the DSM to a digital elevation model (DEM), removing height value
objects by a filtering algorithm. Different filter algorithms have been reported based on
morphological filters [63], linear prediction [64], or spline approximation [65], with all of them based
on the geometric characteristics of the point clouds and not using the color information of the points
in the process.
As per the above discussion, we report in this article a new method to classify 3D UAV
photogrammetric point clouds using RGB information through CVIs, tested in two vineyards on two
different dates. Our specific objectives included (1) selecting the most appropriate index, (2)
classifying point clouds by the selected CVI, (3) determining the heights of the vines, and, finally, (4)
assessing the quality of the results obtained by comparing the UAV estimated and on-ground height
measurements.
2. Materials and Methods
2.1. Study Field and UAV Flights
The presented study was carried out in two different commercial vineyards (Vitis vinifera L.)
located in the province of Lleida, Spain (Figure 1). The first vineyard, Field A, was an area of 4925 m2
cultivated with the Merlot vine variety (central coordinates 41°38'41" N; 5°30'34" W, WGS84), while
Field B was an area of 4415 m2 cultivated with the Albariño vine variety (central coordinates
41°39'3.34" N; 5°30'22" W, WGS84). Vines were drip-irrigated and trellis-trained and had inter-row
cover crops composed of grasses. The vineyard design was focused on wine production, with the
distance between rows equal to 3 m, vine spacing equal to 2 m, and north–south row orientation.
Figure 1. Study area: (a) general context and (b) locations of parcels.
A total of four UAV flights, two on each vineyard, was performed; the first was on 29 July 2015,
and the second was on 16 September 2015, depicting two different crop stages. All UAV flights were
performed under similar weather conditions. In July, the grapevine canopies were fully developed,
while in September, grapes were machine-harvested. Flying at two crop stages made it possible to
analyze different situations to test the validity of the proposed methodology. The UAV used was an
MD4-1000 multi-rotor drone (Microdrones GmbH, Siegen, Germany). This UAV is a quadcopter with
a maximum payload equal to 1.2 kg. It uses 4 × 250 W gearless brushless motors and reaches a cruising
speed of 15 m/s. The UAV was equipped with an Olympus PEN E-PM1 (Olympus Corporation, Tokyo,
Japan), which is an RGB (R: Red; G: Green; B: Blue) camera. It has a focal length equal to 14 mm.
Registered images have a dimension equal to 4032 × 3024 pixels and a pixel pitch of 4.3 µm. UAV
flights were performed at 30 m above ground level, with a ground sample distance of 1 cm and a
ground image dimension of 37 × 28 m. Images were registered in continuous mode at one-second
intervals, resulting in 93% and 60% forward and side laps, respectively. These high overlaps allow
us to achieve an accurate 3D reconstruction of woody crops, according to previous investigations [40].
Five ground control points (GCPs) were placed per vineyard, one in each corner and the other in the
center. Each GCP was measured with the stop-and-go technique using a Trimble GeoXH 2008 Series
(Trimble, Sunnyvale, CA, USA) to georeference the DSM and orthomosaic in the photogrammetric
processing.
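As a quick sanity check, the reported ground sample distance and image footprint follow directly from the stated sensor geometry. The short sketch below is not from the original paper; it only re-derives the reported figures from the flight parameters listed above:

```python
# Sanity check of the reported flight parameters using the standard
# photogrammetric relation GSD = pixel_pitch * flying_height / focal_length.
pixel_pitch = 4.3e-6    # m, Olympus PEN E-PM1 pixel pitch as reported
focal_length = 0.014    # m, 14 mm lens
flying_height = 30.0    # m above ground level

gsd = pixel_pitch * flying_height / focal_length
print(f"GSD: {gsd * 100:.2f} cm")  # ~0.92 cm, consistent with the reported 1 cm

# Ground footprint of a 4032 x 3024 pixel image at this GSD
print(f"Footprint: {4032 * gsd:.1f} m x {3024 * gsd:.1f} m")  # ~37 x 28 m, as reported
```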
Figure 2 summarizes the workflow for classifying points. RGB images were photogrammetrically
processed to obtain a 3D RGB point cloud. From this RGB information, a CVI was calculated for each
point, obtaining a 3D CVI point cloud. A sample was extracted to calculate a separation threshold
value between the vegetation and non-vegetation points. Finally, this threshold value was applied
to the 3D CVI point cloud to classify the points.
To produce a very dense point cloud for each UAV flight, aerial triangulation was first performed
to determine the external orientation (position and attitude) of each image of the
photogrammetric block. Afterwards, point clouds were generated using SfM techniques. SfM
works under the principles of stereoscopic photogrammetry, using well-defined geometrical features
registered in multiple images from different points of view [66]. This methodology has been validated
in previous research projects [36,67], and we used Agisoft PhotoScan Professional Edition software
(Agisoft LLC, St. Petersburg, Russia) for photogrammetric processing.
Taking into account that each individual point has information in the RGB color space, prior to the
calculation of the indices, a color space normalization was applied to each point following
the normalization scheme described in [74]. As a result, normalized spectral components r, g, and b
ranging in [0,1] were obtained according to Equation (1):
$$r = \frac{R}{R + G + B}, \qquad g = \frac{G}{R + G + B}, \qquad b = \frac{B}{R + G + B} \tag{1}$$
where R, G, and B are normalized RGB values ranging in [0,1] obtained according to Equation (2):
$$R = \frac{R}{R_{\max}}, \qquad G = \frac{G}{G_{\max}}, \qquad B = \frac{B}{B_{\max}} \tag{2}$$
A value of M lower than 1 means that the histograms overlap significantly and therefore offer
poor discrimination. On the other hand, a value of M higher than 1 means that the histograms are
well separated, providing adequate discrimination. To calculate the mean and standard deviation
for each class, a stratified systematic unaligned strategy was used as the sampling method to select
points. For that, the UAV point clouds from every field and date were divided into regularly spaced
regions of 10 × 10 m, these regions being divided into smaller sampling unit areas of 0.1 × 0.1 m. For
each region, units with points belonging to a single class were selected manually, taking into account
in-field differences. To do this, points were displayed using their RGB color in top, frontal, and side
views. Of all the CVIs in Table 1, the one that showed the highest value in the M-statistic test was used
as the index to classify the point cloud.
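The defining equation falls outside this excerpt, but the M-statistic of Kaufman and Remer [75] is conventionally the difference of the two class means divided by the sum of their standard deviations. A minimal sketch under that assumption; applied to the Table 3 values for EXG in Field A in July, it reproduces the reported M of about 1.774:

```python
import numpy as np

def m_statistic(class_a, class_b):
    """Separability of two sample distributions: M = |mean_a - mean_b| / (sd_a + sd_b).
    M > 1 indicates well-separated histograms, M < 1 strong overlap [75]."""
    mu_a, mu_b = np.mean(class_a), np.mean(class_b)
    sd_a, sd_b = np.std(class_a, ddof=1), np.std(class_b, ddof=1)
    return abs(mu_a - mu_b) / (sd_a + sd_b)

# Check against Table 3 (EXG, Field A, July):
print((0.32473 - 0.0086) / (0.15767 + 0.02055))  # ~1.774, matching 1.77385
```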
Once the most appropriate CVI was selected, the next step was to determine a threshold to
separate both classes. To binarize the grey-scale point cloud, a threshold value which maximizes the
variance between the vegetation and non-vegetation points was chosen using Otsu’s method [76].
Otsu’s method analyzes the histogram of CVI values. The bimodal distribution, with two normal
distributions, one representing vegetation and the remainder representing non-vegetation, was verified
through the Sarle’s bimodality coefficient (SBC). Otsu’s methods provide an index thresholding value
by maximizing the between-class variance and minimizing the within-class variance of the values.
Due to the very high point density, threshold determination was performed on a sample of points from
the original point cloud. This reduced point cloud was formed by reading one point out of ten (10% of
the total points). All points with a CVI value equal to or below the calculated threshold were assigned
to the vegetation class, while all points with a CVI value greater than this threshold were assigned to
the non-vegetation class.
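A sketch of this thresholding step, assuming scikit-image's implementation of Otsu's method and the usual sample formula for Sarle's bimodality coefficient; both are stand-ins for the paper's own Matlab implementation:

```python
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.filters import threshold_otsu  # Otsu's method [76]

def sarle_bimodality(values):
    """Sarle's bimodality coefficient for a finite sample; values well above
    ~0.555 (the value for a uniform distribution) suggest bimodality."""
    n = len(values)
    s, k = skew(values), kurtosis(values)  # k is excess kurtosis
    return (s**2 + 1) / (k + 3 * (n - 1)**2 / ((n - 2) * (n - 3)))

def classify_cvi(cvi, sample_step=10):
    """Split a per-point CVI array into vegetation / non-vegetation.
    The threshold is computed on a 10% sample, as in the text; the <=
    orientation follows the rule stated for the selected index."""
    threshold = threshold_otsu(cvi[::sample_step])
    return cvi <= threshold, threshold
```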
Matlab software (MathWorks, Natick, MA, USA) was employed to perform the point cloud classification,
and R software (R Development Core Team, 2012) was used to perform the data analysis.
Figure 3. An example of vine height measurement.
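The height computation itself is only summarized in the text: vine heights are referenced to the heights of the points classified as soil. A plausible minimal sketch of that idea, taking the local soil elevation as the mean of the k nearest soil points in plan view; this kNN averaging is our illustrative choice, not necessarily the authors' exact procedure:

```python
import numpy as np
from scipy.spatial import cKDTree

def vine_heights(veg_xyz, soil_xyz, k=10):
    """Height of each vegetation point above the local soil surface,
    estimated from the k nearest soil points in plan view (illustrative)."""
    tree = cKDTree(soil_xyz[:, :2])               # index soil points by (x, y)
    _, idx = tree.query(veg_xyz[:, :2], k=k)      # k nearest soil neighbors
    local_soil_z = soil_xyz[idx, 2].mean(axis=1)  # local soil elevation
    return veg_xyz[:, 2] - local_soil_z
```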
In addition, one-way and two-way Analysis of Variance (ANOVA) tests were applied to evaluate
significant differences in height errors. The Shapiro–Wilk test and Bartlett's test were used to assess
normality and homoscedasticity of variances, respectively, to meet ANOVA assumptions.
3. Results
Figure 4 shows a partial top view of each of the point clouds generated for each UAV flight over
the two fields in July and September. From visual analysis, it can be observed that the vines had
a larger cross-section in July (Figure 4a,c) than in September (Figure 4b,d) because of the different
phenological stages, setting (young berries growing) and harvest, respectively. In addition, the green
color of the vegetation was much more intense in July than in September. On the other hand, it can be
observed that, although both fields had cover crops between the crop rows, this appears to be much
more widespread in Field A than in Field B. Of the four UAV flights, the one carried out on Plot B in
July (Figure 4c) offers the best conditions for a manual interpretation to separate the vegetation from
the bare soil, being also easier to differentiate between the cover crop and the vineyard.
Figure 4. A partial top view of the point clouds generated from unmanned aerial vehicle (UAV) flights
over fields A and B in the months of July and September: (a) Field A July, (b) Field A September,
(c) Field B July and (d) Field B September.
Table 2. Sample size of points per plot, UAV flight date and class (vegetation and non-vegetation).
Figure 5. M-Statistic values for each UAV flight per Field (A and B) and month (1: July and 2: September).
Table 3. Mean and standard deviation (SD) of each color vegetation index (CVI) per field and UAV
flight date for the vegetation and non-vegetation classes and result of the M-statistic test.
Field—UAV Flight Date   CVI     Vegetation Mean   Vegetation SD   Non-Vegetation Mean   Non-Vegetation SD   M-Statistic
A–July                  EXG     0.32473           0.15767         0.0086                0.02055              1.77385
                        EXR     0.04268           0.07627         0.21251               0.03267              1.55906
                        EXB     −0.14404          0.11568         0.04441               0.02727              1.31834
                        EXGR    0.28206           0.22842         −0.20391              0.04747              1.76143
                        CIVE    18.66369          0.0635          18.7923               0.00873              1.78067
                        NGRDI   0.11897           0.07445         −0.07587              0.0315               1.83896
A–September             EXG     −0.003            0.02099         0.21128               0.09185              1.89905
                        EXR     0.21739           0.03187         0.09187               0.04643              1.60307
                        EXB     0.05267           0.02171         −0.06466              0.0754               1.20829
                        EXGR    −0.2204           0.04935         0.11941               0.13054              1.88897
                        CIVE    18.79697          0.00906         18.70938              0.03758              1.87791
                        NGRDI   −0.08266          0.03015         0.06504               0.04723              1.90891
B–July                  EXG     0.00393           0.01493         0.27952               0.12134              2.02242
                        EXR     0.22984           0.02356         0.08768               0.05069              1.91471
                        EXB     0.03238           0.02103         −0.1378               0.10041              1.40132
                        EXGR    −0.22591          0.03361         0.19184               0.16538              2.09936
                        CIVE    18.7948           0.00628         18.68292              0.04843              2.04517
                        NGRDI   −0.09258          0.02186         0.07311               0.05187              2.24719
B–September             EXG     0.0111            0.01794         0.21911               0.09513              1.8397
                        EXR     0.20106           0.01717         0.12787               0.03742              1.34068
                        EXB     0.05303           0.01948         −0.10953              0.08987              1.4867
                        EXGR    −0.18996          0.03012         0.09123               0.12171              1.85206
                        CIVE    18.78438          0.00726         18.7079               0.03759              1.7051
                        NGRDI   −0.06546          0.01514         0.03075               0.03644              1.86511
affected by shadows, presenting a different color than green. As a result, the points belonging to the
vegetation class were extracted for a two-stage classification process (Figure 6f).
Figure 6. An example of the results obtained from the point cloud classification process. Cloud points:
(a) 3D RGB, (b) vegetation points, (c) preliminary non-vegetation, (d) vegetation points,
(e) non-vegetation, (f) final vegetation points.
3.3. Vine Height Quantification
Figure 7 shows the accuracy and graphical comparisons between the measured and UAV vine
heights for each field and date. Independently of flight date and field, all fitted models reported
a high correlation (R2 higher than 0.871) and a low RMSE (lower than 0.076 m). In addition, most
of the points were close to the 1:1 line, which indicated an adequate fit between the UAV estimated
vine height and the measured height. The best results were obtained in the UAV flight over Field B in
July (Figure 7c), matching the visual analysis carried out on the four point clouds (Figure 4). The
Shapiro–Wilk test (W = 0.956, p-value = 0.154) and Bartlett's test (p-value = 0.184) confirmed the
normality and homoscedasticity of the vine height error distribution. A one-way ANOVA test was used
to study the differences in vine height error among the UAV flights, with the conclusion that there
were no significant differences in the vine height errors among the UAV flights (d.f. = 3/156, F =
1.965, p > 0.05). In addition, the results of the two-way ANOVA test (Table 4) showed there was no
statistically significant interaction between flight dates and fields.
Figure 7. UAV vine height versus measured vine height by field and date and 95% confidence interval.
The red line is the fitted linear function and the black line represents the 1:1 line. The root-mean-square
error (RMSE) and the coefficient of determination (R2) derived from the linear regression fit are included
(p < 0.0001). Field A: (a) July and (b) September; Field B: (c) July and (d) September.
Table 4. F and p-values of the two-way ANOVA for vine height error.

Factor              F       p
Field               3.735   0.1075
Flight date         0.157   0.6929
Field: Flight date  2.784   0.1972
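For reference, the analysis behind Table 4 can be reproduced with standard statistical tooling. A minimal sketch with synthetic stand-in data (the per-vine error values are not published in this excerpt, and the original analysis was done in R):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from scipy.stats import shapiro, bartlett

rng = np.random.default_rng(0)
# Synthetic stand-in for per-vine height errors over the 2 x 2 design
df = pd.DataFrame({
    "field": np.repeat(["A", "B"], 80),
    "flight_date": np.tile(np.repeat(["July", "September"], 40), 2),
    "height_error": rng.normal(0.0, 0.07, 160),
})

# ANOVA assumptions: normality (Shapiro-Wilk) and homoscedasticity (Bartlett)
print(shapiro(df["height_error"]))
print(bartlett(*[g["height_error"] for _, g in df.groupby(["field", "flight_date"])]))

# Two-way ANOVA with interaction, as reported in Table 4
model = smf.ols("height_error ~ C(field) * C(flight_date)", data=df).fit()
print(anova_lm(model, typ=2))
```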
Figure 8 shows the accuracy and graphical comparisons between the measured and UAV vine
heights, taking into account jointly all vine heights from the four UAV flights. The linear regression
model estimates a slope and an intercept equal to 0.93502 (p-value < 2 × 10⁻¹⁶) and 0.052 m (p-value =
0.006), respectively, with a residual standard error of 0.063 m. The estimated plant heights of the vines
showed a very high coefficient of determination (R2 = 0.91) and a low RMSE (equal to 0.070 m). These
results are in the same range as those obtained in other research works using other methodologies
such as OBIA in the case of vineyards [57] or olive groves [40]. The points were close to the 1:1 line,
indicating an adequate fit between the UAV estimated height and the on-ground measured height.
Figure 9 shows a diagnosis of the linear model. Residuals were equally spread around the dotted
horizontal line (Figure 9a), which indicates that there was a linear relationship between the fitted
adjusted height and the residuals. The normal Q–Q plot (Figure 9b) shows that the residuals were
normally distributed, close to the straight line and not deviating severely. In addition, Shapiro–Wilk
normality testing was applied to the residuals, with a result of W = 0.974 (p = 0.149); taking into account
that the p-value was higher than 0.05, the residual distribution was not significantly different from a
normal distribution. Figure 9c shows that the residuals were equally spread along the range of
predictors and therefore have equal variance. In addition, Bartlett's test, analyzing the homogeneity
of variances taking into account the four UAV flights, had a p-value equal to 0.268 and therefore
showed no evidence to suggest differences between UAV flights. Finally, the extreme values used
were not influential in determining the linear regression model: Figure 9d shows that no residual
exceeds the Cook's distance threshold and, therefore, the regression results would not be altered if
any measurement was excluded.
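A compact sketch of this fit and its diagnostics (the helper below is ours and assumes statsmodels; the original analysis was performed in R):

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import shapiro

def height_regression(measured, estimated):
    """Fit estimated = a + b * measured and collect the diagnostics
    discussed for Figures 8 and 9."""
    X = sm.add_constant(np.asarray(measured))
    fit = sm.OLS(np.asarray(estimated), X).fit()
    rmse = float(np.sqrt(np.mean(fit.resid ** 2)))
    return {
        "intercept": fit.params[0],                           # cf. 0.052 m
        "slope": fit.params[1],                               # cf. 0.93502
        "r2": fit.rsquared,                                   # cf. 0.91
        "rmse": rmse,                                         # cf. 0.070 m
        "shapiro_p": shapiro(fit.resid).pvalue,               # residual normality
        "max_cooks_d": fit.get_influence().cooks_distance[0].max(),  # influence
    }
```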
Figure 8. A graphical comparison of UAV estimated and on-ground measured vine heights for all UAV
flights corresponding to Fields A and B and both dates (1: July and 2: September). The red line is the
fitted linear function and the black line represents the 1:1 line. The RMSE and R2 values (p < 0.0001)
obtained from adjustment are included.
Figure 9. Residual analysis of the linear model of UAV estimated and on-ground measured vine
heights: (a) residuals vs. fitted values, (b) normality of residuals, (c) scale–location, and (d) leverage
vs. residuals.
4. Discussion
The results shown are slightly better than those obtained in other works using UAV flights.
Other authors [57,78,79] report R2 values close to 0.8 and an RMSE equal to 0.17 m. All these authors
estimated vineyard height from raster vegetation height models instead of from the 3D point cloud,
which may explain their slightly poorer height estimates. De Castro et al. (2018) [57] used UAV flights
for height determination based on the use of DSM and OBIA, obtaining good results, although this
approach entails rasterizing the information, with the results, therefore, depending on the pixel size
used. On the other hand, taking into account the measurements of on-ground sensors, such as LiDAR,
R2 values similar to those obtained in this work are reached, but with a lower RMSE, around 3 cm [80].
Although the results obtained show a slightly higher RMSE than the on-ground LiDAR measures,
this does not affect decision-making in the management of the crop [80]. In addition, using point
clouds generated from UAV flights, other authors have obtained an RMSE in row height determination
between 3 cm and 29 cm [81], with manual intervention necessary in the classification processes,
which could make the process less time-efficient. To solve this problem, other authors have developed
unsupervised methods without manual intervention [82], which required the establishment of a series
of parameters in the classification process, meaning that the quality of the process depends directly on
the values selected for these parameters. Based on the results presented herein, the use of a CVI in 3D
point clouds
Figure 10. Example of a vineyard height-map based on the proposed methodology.
5. Conclusions
A fully automatic and accurate method was developed for the classification of 3D point clouds
based only on the color information of the points; this method was then tested on two commercial
vineyards at two different growth stages. Using the RGB information from each point as input for
the classification algorithm, the color index value was first calculated. The threshold of separation
between classes in the study area was then automatically determined, allowing us to isolate those
points that belonged to the vegetation class. The process was carried out completely automatically,
with no need to select any parameter or for previous training, eliminating any possible error inherent
to manual intervention. In this way, the results are independent of the conditions and state of the crop.
The results were used to calculate the heights of the vines with satisfactory quality, as validated
in two fields and at two development stages of the crop. Future work will, therefore, be dedicated to
determining more structural characteristics of the vines, such as volume or crown width, to help in
the monitoring of the crop and support decision-making. In addition, future work should use CVIs
for vines to estimate variables like biomass by constructing accurate 3D vineyard maps in each
phenological phase.
Author Contributions: F.-J.M.-C., A.I.d.C., J.T.-S., and F.L.-G. conceived and designed the experiments; J.T.-S.,
F.M.J.-B. and F.L.-G. performed the experiments; F.-J.M.-C., A.G.-F., J.T.-S., P.T.-T. and F.L.-G. analyzed the data;
A.I.d.C. and F.L.-G. contributed equipment and analysis tools; F.-J.M.-C. and F.L.-G. wrote the paper; A.I.d.C.,
J.T.-S., and F.M.J.-B. collaborated in the discussion of the results and revised the manuscript. All authors have read
and approved the manuscript.
Funding: This research was funded by the AGL2017-82335-C4-4R project (Spanish Ministry of Science, Innovation
and Universities, AEI-EU FEDER funds).
Acknowledgments: The authors thank RAIMAT S.A. for allowing the fieldwork and UAV flights in its vineyards.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Cook, S.E.; Bramley, R.G.V. Precision agriculture—opportunities, benefits and pitfalls of site-specific crop
management in Australia. Aust. J. Exp. Agric. 1998, 38, 753–763. [CrossRef]
2. Whelan, B.M.; McBratney, A.B. The “Null Hypothesis” of Precision Agriculture Management. Precis. Agric.
2000, 2, 265–279. [CrossRef]
3. Bramley, R.G.V.; Hamilton, R.P. Understanding variability in winegrape production systems. 1. Within
vineyard variation in yield over several vintages. Aust. J. Grape Wine Res. 2004, 10, 32–45. [CrossRef]
4. Schieffer, J.; Dillon, C. The economic and environmental impacts of precision agriculture and interactions
with agro-environmental policy. Precis. Agric. 2015, 16, 46–61. [CrossRef]
5. Bramley, R.; Pearse, B.; Chamberlain, P. Being profitable precisely—A case study of precision viticulture from
Margaret River. Aust. N. Zealand Grapegrow. Winemak. Available online: http://www.nwvineyards.net/docs/
PVProfitabiltyPaper.pdf (accessed on 1 October 2019).
6. Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Variable rate dosing in precision viticulture: Use of electronic devices to
improve application efficiency. Crop Prot. 2010, 29, 239–248. [CrossRef]
7. Ballesteros, R.; Ortega, J.F.; Hernández, D.; Moreno, M.Á. Characterization of Vitis vinifera L. Canopy Using
Unmanned Aerial Vehicle-Based Remote Sensing and Photogrammetry Techniques. Am. J. Enol. Vitic. 2015,
66, 120. [CrossRef]
8. Boomsma, C.R.; Santini, J.B.; West, T.D.; Brewer, J.C.; McIntyre, L.M.; Vyn, T.J. Maize grain yield responses to
plant height variability resulting from crop rotation and tillage system in a long-term experiment. Soil Tillage Res.
2010, 106, 227–240. [CrossRef]
9. Tilly, N.; Hoffmeister, D.; Cao, Q.; Huang, S.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Multitemporal crop
surface models: Accurate plant height measurement and biomass estimation with terrestrial laser scanning
in paddy rice. J. Appl. Remote Sens. 2014, 8, 1–23. [CrossRef]
10. Pereira, A.R.; Green, S.; Villa Nova, N.A. Penman–Monteith reference evapotranspiration adapted to estimate
irrigated tree transpiration. Agric. Water Manag. 2006, 83, 153–161. [CrossRef]
11. Cohen, S.; Fuchs, M.; Moreshet, S.; Cohen, Y. The distribution of leaf area, radiation, photosynthesis and
transpiration in a Shamouti orange hedgerow orchard. Part II. Photosynthesis, transpiration, and the effect
of row shape and direction. Agric. Meteorol. 1987, 40, 145–162. [CrossRef]
12. Fuchs, M.; Cohen, Y.; Moreshet, S. Determining transpiration from meteorological data and crop characteristics
for irrigation management. Irrig. Sci. 1987, 8, 91–99. [CrossRef]
13. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop
height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment.
J. Appl. Remote Sens. 2017, 11, 1–20. [CrossRef]
14. Sio-Se Mardeh, A.; Ahmadi, A.; Poustini, K.; Mohammadi, V. Evaluation of drought resistance indices under
various environmental conditions. Field Crop. Res. 2006, 98, 222–229. [CrossRef]
15. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining
UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass
monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [CrossRef]
16. Dempewolf, J.; Nagol, J.; Hein, S.; Thiel, C.; Zimmermann, R. Measurement of within-season tree height
growth in a mixed forest stand using UAV imagery. Forests 2017, 8, 231. [CrossRef]
17. Birdal, A.C.; Avdan, U.; Türk, T. Estimating tree heights with images from an unmanned aerial vehicle.
Geomat. Nat. Hazards Risk 2017, 8, 1144–1156. [CrossRef]
18. Johansen, K.; Raharjo, T.; McCabe, M. Using multi-spectral UAV imagery to extract tree crop structural
properties and assess pruning effects. Remote. Sens. 2018, 10, 854. [CrossRef]
19. Del-Campo-Sanchez, A.; Ballesteros, R.; Hernandez-Lopez, D.; Ortega, J.F.; Moreno, M.A.; Agroforestry and
Cartography Precision Research Group. Quantifying the effect of Jacobiasca lybica pest on vineyards with
UAVs by combining geometric and computer vision techniques. PLoS ONE 2019, 14, e0215521. [CrossRef]
20. Lin, G.; Tang, Y.; Zou, X.; Li, J.; Xiong, J. In-field citrus detection and localisation based on RGB-D image
analysis. Biosyst. Eng. 2019, 186, 34–44. [CrossRef]
21. Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Li, J. Guava detection and pose estimation using a low-cost RGB-D sensor
in the field. Sensors 2019, 19, 428. [CrossRef]
22. Lee, K.-H.; Ehsani, R. A Laser Scanner Based Measurement System for Quantification of Citrus Tree Geometric
Characteristics. Appl. Eng. Agric. 2009, 25, 777–788. [CrossRef]
23. Bongers, F. Methods to assess tropical rain forest canopy structure: An overview. In Tropical Forest
Canopies: Ecology and Management: Proceedings of ESF Conference, Oxford University, 12–16 December 1998;
Linsenmair, K.E., Davis, A.J., Fiala, B., Speight, M.R., Eds.; Springer: Dordrecht, The Netherlands, 2001;
pp. 263–277. [CrossRef]
24. Côté, J.-F.; Fournier, R.A.; Verstraete, M.M. Canopy Architectural Models in Support of Methods Using
Hemispherical Photography. In Hemispherical Photography in Forest Science: Theory, Methods, Applications;
Fournier, R.A., Hall, R.J., Eds.; Springer: Dordrecht, The Netherlands, 2017; pp. 253–286. [CrossRef]
25. Phattaralerphong, J.; Sinoquet, H. A Method for 3D Reconstruction of Tree Canopy Volume from Photographs:
Assessment from 3D Digitised Plants. Available online: https://www.researchgate.net/publication/281471747_
A_method_for_3D_reconstruction_of_tree_canopy_volume_photographs_assessment_from_3D_digitised_
plants (accessed on 1 October 2019).
26. Giuliani, R.; Magnanini, E.; Fragassa, C.; Nerozzi, F. Ground monitoring the light–shadow windows of a
tree canopy to yield canopy light interception and morphological traits. Plant Cell Environ. 2000, 23, 783–796.
[CrossRef]
27. Kise, M.; Zhang, Q. Development of a stereovision sensing system for 3D crop row structure mapping and
tractor guidance. Biosyst. Eng. 2008, 101, 191–198. [CrossRef]
28. Schumann, A.W.; Zaman, Q.U. Software development for real-time ultrasonic mapping of tree canopy size.
Comput. Electron. Agric. 2005, 47, 25–40. [CrossRef]
29. Llorens, J.; Gil, E.; Llop, J. Ultrasonic and LIDAR sensors for electronic canopy characterization in vineyards:
Advances to improve pesticide application methods. Sensors 2011, 11, 2177–2194. [CrossRef] [PubMed]
30. Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in
agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141. [CrossRef]
31. Fernández-Sarría, A.; Martínez, L.; Velázquez-Martí, B.; Sajdak, M.; Estornell, J.; Recio, J.A. Different
methodologies for calculating crown volumes of Platanus hispanica trees using terrestrial laser scanner
and a comparison with classical dendrometric measurements. Comput. Electron. Agric. 2013, 90, 176–185.
[CrossRef]
32. Llorens, J.; Gil, E.; Llop, J.; Queraltó, M. Georeferenced LiDAR 3D vine plantation map generation. Sensors
2011, 11, 6237–6256. [CrossRef]
33. Andújar, D.; Moreno, H.; Bengochea-Guevara, J.M.; de Castro, A.; Ribeiro, A. Aerial imagery or on-ground
detection? An economic analysis for vineyard crops. Comput. Electron. Agric. 2019, 157, 351–358. [CrossRef]
34. Díaz-Varela, R.; de la Rosa, R.; León, L.; Zarco-Tejada, P. High-resolution airborne UAV imagery to assess
olive tree crown parameters using 3D photo reconstruction: Application in breeding trials. Remote Sens.
2015, 7, 4213–4232. [CrossRef]
35. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review.
Precis. Agric. 2012, 13, 693–712. [CrossRef]
36. Mesas-Carrascosa, F.; Rumbao, I.; Berrocal, J.; Porras, A. Positional quality assessment of orthophotos
obtained from sensors onboard multi-rotor UAV platforms. Sensors 2014, 14, 22394–22407. [CrossRef]
[PubMed]
37. Mesas-Carrascosa, F.-J.; Notario García, M.; Meroño de Larriva, J.; García-Ferrer, A. An analysis of the
influence of flight parameters in the generation of unmanned aerial vehicle (UAV) orthomosaicks to survey
archaeological areas. Sensors 2016, 16, 1838. [CrossRef] [PubMed]
38. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.;
Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for
row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [CrossRef]
39. Guo, Q.; Su, Y.; Hu, T.; Zhao, X.; Wu, F.; Li, Y.; Liu, J.; Chen, L.; Xu, G.; Lin, G.; et al. An integrated UAV-borne
lidar system for 3D habitat mapping in three forest ecosystems across China. Int. J. Remote Sens. 2017, 38,
2954–2972. [CrossRef]
40. Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring
of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10,
e0130479. [CrossRef] [PubMed]
41. Mesas-Carrascosa, F.-J.; Pérez-Porras, F.; Meroño de Larriva, J.; Mena Frau, C.; Agüera-Vega, F.;
Carvajal-Ramírez, F.; Martínez-Carricondo, P.; García-Ferrer, A. Drift correction of lightweight microbolometer
thermal sensors on-board unmanned aerial vehicles. Remote Sens. 2018, 10, 615. [CrossRef]
42. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and
vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047.
[CrossRef]
43. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral imaging: A review on
UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110.
[CrossRef]
44. Jozkow, G.; Totha, C.; Grejner-Brzezinska, D. UAS Topographic Mapping with Velodyne Lidar Sensor.
Available online: https://www.researchgate.net/profile/Grzegorz_Jozkow/publication/307536902_UAS_
TOPOGRAPHIC_MAPPING_WITH_VELODYNE_LiDAR_SENSOR/links/57f7ddf608ae280dd0bcc8e8/
UAS-TOPOGRAPHIC-MAPPING-WITH-VELODYNE-LiDAR-SENSOR.pdf (accessed on 1 October 2019).
45. Nagai, M.; Chen, T.; Shibasaki, R.; Kumagai, H.; Ahmed, A. UAV-Borne 3-D Mapping System by Multisensor
Integration. IEEE Trans. Geosci. Remote Sens. 2009, 47, 701–708. [CrossRef]
46. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation
fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
[CrossRef]
47. Jakubowski, M.; Li, W.; Guo, Q.; Kelly, M. Delineating individual trees from LiDAR data: A comparison of
vector-and raster-based segmentation approaches. Remote Sens. 2013, 5, 4163–4186. [CrossRef]
48. Chen, M.; Tang, Y.; Zou, X.; Huang, K.; Li, L.; He, Y. High-accuracy multi-camera reconstruction enhanced by
adaptive point cloud correction algorithm. Opt. Lasers Eng. 2019, 122, 170–183. [CrossRef]
49. Tang, Y.; Li, L.; Wang, C.; Chen, M.; Feng, W.; Zou, X.; Huang, K. Real-time detection of surface deformation
and strain in recycled aggregate concrete-filled steel tubular columns via four-ocular vision. Robot. Comput.
Integr. Manuf. 2019, 59, 36–46. [CrossRef]
50. Whiteside, T.G.; Boggs, G.S.; Maier, S.W. Comparing object-based and pixel-based classifications for mapping
savannas. Int. J. Appl. Earth Obs. Geoinf. 2011, 13, 884–893. [CrossRef]
51. Guo, Q.; Kelly, M.; Gong, P.; Liu, D. An Object-Based Classification Approach in Mapping Tree Mortality
Using High Spatial Resolution Imagery. Gisci. Remote Sens. 2007, 44, 24–47. [CrossRef]
52. Chang, A.; Eo, Y.; Kim, Y.; Kim, Y. Identification of individual tree crowns from LiDAR data using a circle
fitting algorithm with local maxima and minima filtering. Remote Sens. Lett. 2013, 4, 29–37. [CrossRef]
53. Hyyppa, J.; Kelle, O.; Lehikoinen, M.; Inkinen, M. A segmentation-based method to retrieve stem volume
estimates from 3-D tree height models produced by laser scanners. IEEE Trans. Geosci. Remote Sens. 2001, 39,
969–975. [CrossRef]
54. Kwak, D.-A.; Lee, W.-K.; Lee, J.-H.; Biging, G.S.; Gong, P. Detection of individual trees and estimation of tree
height using LiDAR data. J. For. Res. 2007, 12, 425–434. [CrossRef]
55. Jing, L.; Hu, B.; Li, J.; Noland, T. Automated Delineation of Individual Tree Crowns from Lidar Data by
Multi-Scale Analysis and Segmentation. Photogramm. Eng. Remote Sens. 2012, 78, 1275–1284. [CrossRef]
56. Jiménez-Brenes, F.M.; López-Granados, F.; de Castro, A.I.; Torres-Sánchez, J.; Serrano, N.; Peña, J.M.
Quantifying pruning impacts on olive tree architecture and annual canopy growth by using UAV-based 3D
modelling. Plant Methods 2017, 13, 55. [CrossRef] [PubMed]
57. De Castro, A.; Jiménez-Brenes, F.; Torres-Sánchez, J.; Peña, J.; Borra-Serrano, I.; López-Granados, F. 3-D
characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture
applications. Remote Sens. 2018, 10, 584. [CrossRef]
58. Suárez, J.C.; Ontiveros, C.; Smith, S.; Snape, S. Use of airborne LiDAR and aerial photography in the
estimation of individual tree heights in forestry. Comput. Geosci. 2005, 31, 253–262. [CrossRef]
59. Lee, H.; Slatton, K.C.; Roth, B.E.; Cropper, W.P. Adaptive clustering of airborne LiDAR data to segment
individual tree crowns in managed pine forests. Int. J. Remote Sens. 2010, 31, 117–139. [CrossRef]
60. Li, W.; Guo, Q.; Jakubowski, M.K.; Kelly, M. A New Method for Segmenting Individual Trees from the Lidar
Point Cloud. Photogramm. Eng. Remote Sens. 2012, 78, 75–84. [CrossRef]
61. De Castro, A.; Torres-Sánchez, J.; Peña, J.; Jiménez-Brenes, F.; Csillik, O.; López-Granados, F. An automatic
random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery.
Remote Sens. 2018, 10, 285. [CrossRef]
62. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned
Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237. [CrossRef]
63. Eckstein, W.; Muenkelt, O. Extracting Objects from Digital Terrain Models. Available online:
https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2572/0000/Extracting-objects-from-
digital-terrain-models/10.1117/12.216942.short (accessed on 15 October 2019).
64. Kraus, K.; Pfeifer, N. Determination of terrain models in wooded areas with airborne laser scanner data.
ISPRS J. Photogramm. Remote Sens. 1998, 53, 193–203. [CrossRef]
65. Axelsson, P. Processing of laser scanner data—Algorithms and applications. ISPRS J. Photogramm. Remote Sens.
1999, 54, 138–147. [CrossRef]
66. Snavely, N.; Seitz, S.M.; Szeliski, R. Modeling the World from Internet Photo Collections. Int. J. Comput. Vis.
2008, 80, 189–210. [CrossRef]
67. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.;
López-Granados, F. Assessing optimal flight parameters for generating accurate multispectral orthomosaicks
by UAV to support site-specific crop management. Remote Sens. 2015, 7, 12793–12814. [CrossRef]
68. Mao, W.; Wang, Y.; Wang, Y. Real-time detection of between-row weeds using machine vision. In Proceedings
of the 2003 ASAE Annual Meeting, Las Vegas, NV, USA, 27–30 July 2003; p. 1.
69. Woebbecke, D.; Meyer, G.; Von Bargen, K.; Mortensen, D. Shape features for identifying young weeds using
image analysis. Trans. ASAE-Am. Soc. Agric. Eng. 1995, 38, 271–282. [CrossRef]
70. Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine Vision Detection Parameters for Plant Species Identification.
In Proceedings of the Precision Agriculture and Biological Quality, Boston, MA, USA, 14 January 1999;
Volume 3543, pp. 327–335.
71. Camargo Neto, J. A Combined Statistical-Soft Computing Approach for Classification and Mapping Weed
Species in Minimum-Tillage Systems. 2004. Available online: https://search.proquest.com/openview/
c9d042c0b775871973b4494b3233002c/1?cbl=18750&diss=y&pq-origsite=gscholar (accessed on 1 October 2019).
72. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision.
In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics
(AIM 2003), Kobe, Japan, 20–24 July 2003; Volume 2, pp. b1079–b1083.
73. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Plant Species Identification, Size, and
Enumeration Using Machine Vision Techniques on Near-Binary Images. In Proceedings of the Applications
in Optical Science and Engineering, Boston, MA, USA, 12 May 1993.
74. Gée, C.; Bossu, J.; Jones, G.; Truchetet, F. Crop/weed discrimination in perspective agronomic images.
Comput. Electron. Agric. 2008, 60, 49–59. [CrossRef]
75. Kaufman, Y.J.; Remer, L.A. Detection of forests using mid-IR reflectance: An application for aerosol studies.
IEEE Trans. Geosci. Remote Sens. 1994, 32, 672–683. [CrossRef]
76. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9,
62–66. [CrossRef]
77. Fox, J. The R commander: A basic-statistics graphical user interface to R. J. Stat. Softw. 2005, 14, 1–42. [CrossRef]
78. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J. Multi-Temporal Vineyard
Monitoring through UAV-Based RGB Imagery. Remote Sens. 2018, 10, 1907. [CrossRef]
79. Caruso, G.; Tozzini, L.; Rallo, G.; Primicerio, J.; Moriondo, M.; Palai, G.; Gucci, R.J.V. Estimating biophysical
and geometrical parameters of grapevine canopies (‘Sangiovese’) by an unmanned aerial vehicle (UAV) and
VIS-NIR cameras. Vitis 2017, 56, 63–70.
80. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerlé, M.; Colombeau, G.;
Comar, A. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and
Ground LiDAR Estimates. Front. Plant Sci. 2017, 8. [CrossRef]
81. Weiss, M.; Baret, F. Using 3D point clouds derived from UAV RGB imagery to describe vineyard 3D
macro-structure. Remote Sens. 2017, 9, 111. [CrossRef]
82. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Gay, P. Unsupervised detection of vineyards by 3D point-cloud
UAV photogrammetry for precision agriculture. Comput. Electron. Agric. 2018, 155, 84–95. [CrossRef]
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).