Unmanned Aerial Vehicle Localization Using Distributed Sensors
Nam and Joshi (2017)
Abstract
The use of unmanned aerial vehicles of varying types and sizes has increased drastically in recent years. Finding the exact
geometric location of these unmanned aerial vehicles is important for several reasons. In this work, we investigate a new
approach to localizing unmanned aerial vehicles using multiple image sensors, such as cameras connected through sensor
networks, so that sensitive areas can be protected. In the proposed approach, image sensors deployed on the ground
measure the azimuth angle and angle of elevation of the target vehicle and send that information to a collector node,
which estimates the location of the target vehicle based on the collected samples. Two estimation methods are proposed
to solve the localization problem. The first scheme localizes the projection of the target vehicle on the horizontal plane
and estimates the altitude later. The second scheme estimates the location of the target vehicle directly in three-
dimensional space. The performance of the proposed schemes is evaluated through simulation.
Keywords
Distributed sensors, unmanned aerial vehicles, aerial vehicles, drones, localization
... of UAVs developed by many companies at very affordable prices. Thus, it is very important to develop drone detection techniques to protect a given building or geographic location from possible threats by UAVs.

Based on the type of sensor used, there are three main UAV detection techniques. The first is UAV detection by radio frequency (RF). This technique works well if a UAV is controlled over an RF channel. However, if a drone is preprogrammed to follow a fixed route using global positioning system (GPS) information without RF communications,2 RF detection simply does not work.

A second approach is detection by radar, the traditional technology for detecting flying objects. Evidence shows that in some cases, radar fails to detect UAVs. In particular, radar cannot detect an object within a radar shadow behind a building, a mountain, any large obstacle, or masking terrain, and it cannot detect objects flying at low altitude, where the electromagnetic waves from radar cannot reach. Radar also has a hard time detecting small, plastic, electric-powered UAVs, because that is not what it was designed to detect.3 It might be possible to design new radar to detect such small drones. However, if radar is modified to also detect small objects, the amount of noise will increase, as birds and all other small flying objects will surface above the detection threshold. This drawback limits the use of radar for identifying small UAVs.4

A third approach is detection by vision sensors and image processing. In this approach, a camera usually plays the role of vision sensor to detect UAVs. An experiment by Nistér et al.5 showed that vision can function as a promising navigation sensor that provides accurate localization. Modern cameras can see long distances at a usable resolution. An advantage of vision sensor and image processing–based approaches is that motion detection combined with a speeded-up robust features (SURF) algorithm can properly detect small UAVs while successfully ignoring other flying objects, such as birds.4 Our proposed schemes come under this category.

In this article, we propose new methods to localize UAVs that overcome the limitations of radar detection. In the proposed schemes, the radar shadow problem is resolved by deploying a sufficiently large number of image sensors, such as cameras connected through sensor networks, around the region needing protection. Each image sensor measures the azimuth angle and angle of elevation of the target UAV and sends that angle information to the data collector (sink) node through the sensor network.6 Then, the data collector node estimates the location of the target UAV based on the collected angle samples. It is necessary to detect a UAV in the image in order to measure the angles.

The remainder of this article is organized as follows. Section "Literature review" reviews the related literature. In section "UAV localization scheme," we describe our proposed aerial vehicle localization scheme. In section "Analysis results," we compare the estimation accuracy of the different schemes for low-altitude and high-altitude target UAVs and present analysis results. In section "Conclusion and future work," we summarize the work in this article and describe possible future research that builds upon this study.

Literature review

Localization is the process of finding the physical location of a UAV in accordance with some real or virtual coordinate system. Localization is an important task when direct measurement of the UAV location is not available. Generally, the performance of a localization system is evaluated based on the accuracy of the estimated location information at a given time.

In the literature, vision-based UAV detection has been investigated intensively in order to avoid collisions between UAVs.7–9 Most of those vision-based approaches focused only on the detection of other UAVs, not on their localization. Although distance estimation has been considered,7 that research used a training data set for a specific UAV, and thus the estimation is not likely to work reliably for an arbitrary UAV.

Boddhu et al.2 considered detection and tracking of hostile drones based on the human-as-sensor paradigm. In this scheme, each user with a specific smartphone application contributes to small UAV detection by manually sending small UAV-related information (e.g. flight direction, user position) into the sensor cloud. The sensor cloud then notifies other users involved in the project and attempts to track the UAV. Although this scheme is similar to ours in that it uses multiple sensors, the detailed localization method is very different, since localization is done based on the users' manual feedback.

In contrast to our purpose, there is a large body of work in the literature on localizing ground-based objects imaged from UAVs.10–15 In these methods, the target is basically localized using its pixel location in an image, together with measurements of the UAV position and attitude and the camera pose angle. Such vision-based localization techniques in unknown environments were classified and reviewed by Ben-Afia et al.16

To the best of our knowledge, the issue of localizing a UAV with multiple distributed sensors has not yet been discussed intensively, because most UAVs specifically designed for civil and commercial applications were used to localize targets on the ground. In this work, we investigate the issue of localizing a UAV with multiple ground sensors in detail.
Figure 1. (a) Deployment of image sensors for UAV localization and (b) measurement of azimuth angle (θ) and angle of elevation (φ) of a UAV by sensor nodes.
Unlike most of the UAV localization schemes in the literature, which focus on localizing objects on the ground with image sensors mounted on UAVs,10–15 in the proposed schemes the image sensors are deployed on the ground, that is, in the region needing protection. Figure 1(a) shows the deployment of image sensors for UAV localization.
When two sensors monitor a selected target UAV, if we know the angle of the target UAV measured by each sensor and the distance between the two sensors, the location of the target UAV can be determined by triangulation.2 However, if there is error in the measured angles, the accuracy of the triangulation method degrades significantly. This issue is discussed in section "Analysis results."
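For concreteness, the following is a minimal sketch of this two-sensor triangulation in the horizontal plane. It uses the line coefficients that equation (1) derives later in the article; the azimuth convention (an angle θ whose direction vector is (cos θ, sin θ) in the x–y plane) and all function names are our own illustration, not code from the paper.

```python
import math

def tpl_coefficients(x_i, y_i, theta_i):
    """Coefficients (a, b, c) of the target-pointing line a*x + b*y + c = 0
    through sensor position (x_i, y_i) with measured azimuth theta_i (radians)."""
    return (math.sin(theta_i),
            -math.cos(theta_i),
            -x_i * math.sin(theta_i) + y_i * math.cos(theta_i))

def triangulate(sensor_i, theta_i, sensor_j, theta_j):
    """Intersect two such lines to get the horizontal target position;
    returns None when the bearings are (near-)parallel."""
    a1, b1, c1 = tpl_coefficients(*sensor_i, theta_i)
    a2, b2, c2 = tpl_coefficients(*sensor_j, theta_j)
    det = a1 * b2 - a2 * b1  # zero when the two lines are parallel
    if abs(det) < 1e-9:
        return None
    # Cramer's rule on a1*x + b1*y = -c1 and a2*x + b2*y = -c2.
    return ((b1 * c2 - b2 * c1) / det, (a2 * c1 - a1 * c2) / det)
```

Even a small angular error rotates each line about its sensor, so the computed intersection can move far from the true position when the bearings are nearly parallel; this is exactly the sensitivity that the multi-sensor schemes below are designed to mitigate.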
In this work, we investigate two new methods to improve localization accuracy by incorporating multiple sensors and solving an optimization problem based on their measurement results, while overcoming the limitations of the simple triangulation method. We analyze the proposed schemes by numerical analysis and leave a comparison with experimental results for future work.

UAV localization scheme

In this section, we discuss detailed methods to estimate the location of a UAV using the angle measurement values provided by image sensors deployed over a selected range of a geographic region. N denotes the total number of sensors in the selected region, (x_i, y_i, z_i) represents the position of the ith sensor node S_i, and (x_U, y_U, z_U) is the position of target UAV node U at time t. Each sensor measures the azimuth angle θ_i and the angle of elevation φ_i of the target UAV at time t, as shown in Figure 1(b), and sends that information to the centralized collector (sink) node through the sensor network. Then, the sink node estimates the location of the UAV based on the collected angle values. The location estimation methods are described below.

To simplify the problem, we make the following assumptions. (a) Each sensor node has the ability to detect an aerial moving object and extract the azimuth angle and angle of elevation of the target UAV using image-processing techniques on a picture it took.7–9 (b) The clocks of the sensor nodes are synchronized, and thus all of the working sensor nodes can take a picture at the same time. (c) The entire set of sensor nodes shares a common coordinate system, as shown in Figure 1(b), and they face in the same direction, that is, the direction of the positive y-axis. The collector node knows the position of each sensor, for example, the coordinates (x_i, y_i, z_i) for the ith sensor S_i.
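As an illustration of the data each sensor contributes under these assumptions, a per-sensor report to the sink node might look like the following sketch; the paper does not specify a message format, so the structure and field names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AngleReport:
    """Hypothetical per-sensor measurement report sent to the sink node."""
    sensor_id: int    # index i of sensor S_i; the sink knows (x_i, y_i, z_i)
    timestamp: float  # synchronized capture time t, per assumption (b)
    theta: float      # measured azimuth angle of the target UAV (radians)
    phi: float        # measured angle of elevation of the target UAV (radians)
```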
The location of the UAV needs to be determined in three-dimensional (3D) space, that is, in the reference coordinate system. We attempt to solve this problem in two different ways. In the first approach, we localize the projection of the target UAV on the horizontal plane and estimate the location of the UAV in 3D space by calculating its altitude afterwards. In the second approach, the estimation is done directly in 3D space.

Localization of the target UAV through the projection image on the horizontal plane

In this subsection, we attempt to find the location of the target UAV in two steps. In the first step, we estimate the location of the projection of the target UAV on the horizontal two-dimensional (2D) space using the azimuth angles measured by the sensors. In the second step, we estimate the altitude of the target UAV using the measured angles of elevation.

Figure 1(b) shows two sensors, S_i and S_j, and the target UAV U, along with their coordinates.
Let us consider the projection of the points S_i, S_j, and U on the horizontal plane, which is defined by the equation z = 0. By projection, S_i, S_j, and U are mapped to S_i′ = (x_i, y_i, 0), S_j′ = (x_j, y_j, 0), and U′ = (x_U, y_U, 0), respectively. From the azimuth angle θ_i measured by S_i and the coordinates of S_i′, the equation of the line l_i emanating from S_i′ toward U′, which is referred to as the target-pointing line (TPL) of S_i′ in this article, can easily be obtained as

    a_i x + b_i y + c_i = 0    (1)

where a_i = sin θ_i, b_i = −cos θ_i, and c_i = −x_i sin θ_i + y_i cos θ_i. The equation of the TPL l_j passing through S_j′ can be obtained in a similar manner from θ_j and the coordinates of S_j′. Then, the coordinates of U′, (x_U, y_U, 0), can be calculated from the intersection of l_i and l_j. Thus, the x and y coordinates of target U can be obtained from the intersection of two TPLs on the horizontal plane, and the altitude then follows from the angle of elevation φ_i measured by S_i:

    z_U = tan(φ_i) · sqrt((x_U − x_i)² + (y_U − y_i)²)    (2)

Thus, we find that the location of target UAV U can be determined by the above method if two distinct sensors can provide accurate information about the azimuth angle (θ) and angle of elevation (φ). However, the angle measurements of the sensor nodes may not be accurate in practical situations, due to the limited resolution of the images, distortion of the images by an unclean lens, and so on. Thus, we investigate localization of the UAV under such noisy measurement conditions in more detail hereafter.

As we have seen in Figure 1(b), when the location of U′ (i.e. the projection of U on the horizontal plane) is determined reliably, the calculation of the altitude z_U becomes trivial with equation (2). Thus, we concentrate on estimating the location of U′ based on possibly noisy angle measurement samples from many sensor nodes.

When the number of sensor nodes is three or more and the azimuth angle measurements are corrupted by noise, there is not likely to be a single point where all the TPLs from the sensor nodes meet. In such a case, we estimate the location of U′ as the point Ū that minimizes the sum of the squared distances between the selected point and the TPL of each sensor node. This can be mathematically formulated as follows.

The distance between a point b = (t_1, t_2)^T and the TPL l_i given in equation (1), d((t_1, t_2), l_i), can be expressed as

    d((t_1, t_2), l_i) = |a_i t_1 + b_i t_2 + c_i| / sqrt(a_i² + b_i²) = |a_i t_1 + b_i t_2 + c_i|

where the second equality is valid because a_i² + b_i² = sin²θ_i + cos²θ_i = 1 by the definition of a_i and b_i in equation (1). Let S(U) denote the set of sensors that detected the specific target UAV U, let |S(U)| = M, and let the sensors belonging to S(U) be numbered sequentially from 1 to M. Then Ū, which estimates U′, can be expressed as

    Ū = arg min_{b = (t_1, t_2)^T} Σ_{i ∈ S(U)} d((t_1, t_2), l_i)²    (3)

Stacking the coefficients into the M × 2 matrix X, whose ith row is (a_i, b_i), and the M × 1 vector C = −(c_1, ..., c_M)^T, equation (3) can be rewritten as

    Ū = arg min_{b = (t_1, t_2)^T} ‖C − Xb‖²    (4)

This is a well-known linear least squares optimization problem, and its optimal solution can be obtained as follows17:

    Ū = (X^T X)^{−1} X^T C    (5)

From the definition of X in equation (4), X^T X is a 2 × 2 matrix, and thus it is not complicated to calculate the inverse of X^T X in equation (5).
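The following sketch assembles X and C from the angle measurements and solves equation (5), then recovers the altitude with equation (2). It is a minimal illustration under the same conventions as above; in particular, averaging the per-sensor altitude estimates is our own choice, since the text does not prescribe how the M elevation samples are combined.

```python
import numpy as np

def localize_2d_tpl(sensors, azimuths, elevations):
    """2D-TPL scheme: least-squares estimate of U' via equation (5),
    then altitude via equation (2), averaged over the M sensors.

    sensors:    (M, 3) array of positions (x_i, y_i, z_i), z_i assumed ~ 0
    azimuths:   (M,) array of theta_i in radians
    elevations: (M,) array of phi_i in radians
    """
    p = np.asarray(sensors, dtype=float)
    theta = np.asarray(azimuths, dtype=float)
    phi = np.asarray(elevations, dtype=float)

    # Rows of X are (a_i, b_i); C stacks -c_i, as in equation (4).
    X = np.column_stack((np.sin(theta), -np.cos(theta)))
    C = p[:, 0] * np.sin(theta) - p[:, 1] * np.cos(theta)

    # Equation (5): a 2x2 normal-equation solve.
    x_u, y_u = np.linalg.solve(X.T @ X, X.T @ C)

    # Equation (2), averaged across sensors to damp elevation noise.
    horiz = np.hypot(x_u - p[:, 0], y_u - p[:, 1])
    z_u = float(np.mean(np.tan(phi) * horiz))
    return x_u, y_u, z_u
```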
Localization of the target UAV in 3D space

As a second approach to UAV localization, we solve an optimization problem directly in 3D space. A line that passes through sensor S_i and has the azimuth angle and elevation angle measured by S_i is called a 3D-TPL and is denoted L_i. An arbitrary point v on the 3D-TPL L_i can be represented as

    v = ( x_i + cos(φ_i) cos(θ_i) t,  y_i + cos(φ_i) sin(θ_i) t,  z_i + sin(φ_i) t )^T

where t is a real number corresponding to each point.
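As a small illustration of this parametrization (function and variable names are ours), a point on L_i can be generated for any t, with t = 0 returning the sensor position itself:

```python
import numpy as np

def point_on_3d_tpl(sensor, theta, phi, t):
    """Point v on the 3D-TPL through sensor = (x_i, y_i, z_i)
    with azimuth theta and elevation phi, at parameter t."""
    x, y, z = sensor
    return np.array([x + np.cos(phi) * np.cos(theta) * t,
                     y + np.cos(phi) * np.sin(theta) * t,
                     z + np.sin(phi) * t])
```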
The squared distance between a point g = (t_1, t_2, t_3)^T and the 3D-TPL L_i, d((t_1, t_2, t_3), L_i)², can be expressed as follows18:

    d((t_1, t_2, t_3), L_i)² = (t_1 − x_i)² + (t_2 − y_i)² + (t_3 − z_i)²
        − ((x_i − t_1) cos(φ_i) cos(θ_i) + (y_i − t_2) cos(φ_i) sin(θ_i) + (z_i − t_3) sin(φ_i))²    (6)

Since L_i is determined from the measurement of S_i, we define f_i(t_1, t_2, t_3) = d((t_1, t_2, t_3), L_i)². We estimate the location of target UAV U as Û, defined as

    Û = arg min_{g = (t_1, t_2, t_3)^T} F(t_1, t_2, t_3)    (7)

where F(t_1, t_2, t_3) = Σ_{1 ≤ i ≤ M} f_i(t_1, t_2, t_3) and M is the number of sensors that detected U. In other words, U is estimated as the point Û that minimizes the sum of the squared distances between the selected point and the 3D-TPL of each sensor in 3D space.

Let u_i = (cos φ_i cos θ_i, cos φ_i sin θ_i, sin φ_i)^T denote the unit direction vector of L_i; the Hessian matrix of f_i is then ∇²f_i = 2(I − u_i u_i^T), and its eigenvalues can be obtained by solving

    0 = |∇²f_i(t_1, t_2, t_3) − λI|

      = | (2 − λ) − 2cos²φ_i cos²θ_i    −2cos²φ_i cosθ_i sinθ_i      −2cosφ_i sinφ_i cosθ_i |
        | −2cos²φ_i cosθ_i sinθ_i      (2 − λ) − 2cos²φ_i sin²θ_i    −2cosφ_i sinφ_i sinθ_i |
        | −2cosφ_i sinφ_i cosθ_i       −2cosφ_i sinφ_i sinθ_i        (2 − λ) − 2sin²φ_i     |

      = −λ(λ − 2)²

where I is the 3 × 3 identity matrix. Since the minimum eigenvalue is zero, the Hessian matrix of f_i(t_1, t_2, t_3) is positive semidefinite for any (t_1, t_2, t_3), and thus f_i defined with equation (6) is convex.19 Since each f_i is convex, their sum, that is, F in equation (7), is convex as well.
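This eigenvalue structure is easy to check numerically: for any pair of angles, the Hessian 2(I − u_i u_i^T) should have eigenvalues {0, 2, 2}. A small sanity check, assuming NumPy:

```python
import numpy as np

def hessian_eigenvalues(theta, phi):
    """Eigenvalues of the Hessian of f_i, i.e. of 2*(I - u u^T),
    where u is the unit direction vector of the 3D-TPL."""
    u = np.array([np.cos(phi) * np.cos(theta),
                  np.cos(phi) * np.sin(theta),
                  np.sin(phi)])
    return np.linalg.eigvalsh(2.0 * (np.eye(3) - np.outer(u, u)))

print(np.round(hessian_eigenvalues(0.7, 0.3), 12))  # -> [0. 2. 2.]
```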
Since F is quadratic and convex, F has a global minimum at the point g = (t_1, t_2, t_3)^T where the following equations are simultaneously satisfied:

    ∂F/∂t_1 = 2 Σ_{i=1}^{M} { (t_1 − x_i) + ((x_i − t_1) cos φ_i cos θ_i + (y_i − t_2) cos φ_i sin θ_i + (z_i − t_3) sin φ_i) cos φ_i cos θ_i } = 0

    ∂F/∂t_2 = 2 Σ_{i=1}^{M} { (t_2 − y_i) + ((x_i − t_1) cos φ_i cos θ_i + (y_i − t_2) cos φ_i sin θ_i + (z_i − t_3) sin φ_i) cos φ_i sin θ_i } = 0

    ∂F/∂t_3 = 2 Σ_{i=1}^{M} { (t_3 − z_i) + ((x_i − t_1) cos φ_i cos θ_i + (y_i − t_2) cos φ_i sin θ_i + (z_i − t_3) sin φ_i) sin φ_i } = 0

The above set of simultaneous equations can be summarized as

    (MI − Y^T Y) g = G    (8)

where Y and G are M × 3 and 3 × 1 matrices, respectively, given by

    Y = [ cos φ_1 cos θ_1   cos φ_1 sin θ_1   sin φ_1 ]
        [ cos φ_2 cos θ_2   cos φ_2 sin θ_2   sin φ_2 ]
        [       ...               ...           ...   ]
        [ cos φ_M cos θ_M   cos φ_M sin θ_M   sin φ_M ]

    G = [ Σ_{i=1}^{M} { x_i (1 − cos²φ_i cos²θ_i) − y_i cos²φ_i cos θ_i sin θ_i − z_i cos φ_i sin φ_i cos θ_i } ]
        [ Σ_{i=1}^{M} { −x_i cos²φ_i cos θ_i sin θ_i + y_i (1 − cos²φ_i sin²θ_i) − z_i cos φ_i sin φ_i sin θ_i } ]
        [ Σ_{i=1}^{M} { −x_i cos φ_i sin φ_i cos θ_i − y_i cos φ_i sin φ_i sin θ_i + z_i (1 − sin²φ_i) } ]

Since the solution of equation (8) can be obtained by multiplying both sides by (MI − Y^T Y)^{−1}, combining equations (7) and (8) yields

    Û = (MI − Y^T Y)^{−1} G
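A compact sketch of the resulting closed-form estimator follows (our own illustration of equation (8) and the expression above, assuming NumPy):

```python
import numpy as np

def localize_3d_tpl(sensors, azimuths, elevations):
    """3D-TPL scheme: closed-form estimate U_hat = (M*I - Y^T Y)^(-1) G.

    sensors:    (M, 3) array of positions (x_i, y_i, z_i)
    azimuths:   (M,) array of theta_i in radians
    elevations: (M,) array of phi_i in radians
    """
    p = np.asarray(sensors, dtype=float)
    theta = np.asarray(azimuths, dtype=float)
    phi = np.asarray(elevations, dtype=float)
    m = len(p)

    # Rows of Y are the unit direction vectors u_i of the 3D-TPLs.
    Y = np.column_stack((np.cos(phi) * np.cos(theta),
                         np.cos(phi) * np.sin(theta),
                         np.sin(phi)))

    # G = sum_i (I - u_i u_i^T) p_i, vectorized over the M sensors.
    G = p.sum(axis=0) - Y.T @ np.einsum('ij,ij->i', Y, p)

    return np.linalg.solve(m * np.eye(3) - Y.T @ Y, G)
```

The 3 × 3 matrix MI − Y^T Y is invertible as long as the measured 3D-TPLs are not all parallel, which holds whenever at least two sensors view the target from genuinely different directions.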
Analysis results

In this section, we evaluate the accuracy of our proposed UAV localization schemes through simulation.

Figure 3. Comparison of estimation accuracies of the 2D-TPL and 3D-TPL schemes for a high-altitude target UAV at target location (500, 200, 550): (a) σ_e = 1.0 and (b) σ_e = 5.0.
Conclusion and future work

In this work, we proposed two schemes for localizing UAVs based on measurements of the azimuth angle and angle of elevation by many image sensors distributed over a wide geographic region. The first scheme (2D-TPL) localizes the projected image of the target UAV on the horizontal plane and finds the altitude of the target UAV as the next step. The second scheme (3D-TPL) localizes the target UAV directly in 3D space. The simulation results show that both schemes exhibit very similar performance in terms of localization error when the altitude of the target UAV is low. However, 3D-TPL outperforms 2D-TPL, especially when the altitude of the target UAV is high and the measurement error in the angle of elevation is large. Comparison of the simulation results with experimental results is left for future work.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2015R1D1A1A01058595). This research was also supported in part by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2017-2016-0-00313) supervised by the IITP (Institute for Information & communications Technology Promotion).

References

1. Wallace RJ and Loffi JM. Examining unmanned aerial system threats & defenses: a conceptual analysis. Int J Aviat Aeronaut Aerosp 2015; 2(4): 1.
2. Boddhu SK, McCartney M, Ceccopieri O, et al. A collaborative smartphone sensing platform for detecting and tracking hostile drones. In: SPIE defense, security, and sensing, Baltimore, MD, 29 April–3 May 2013, Article ID 874211. Bellingham, WA: SPIE.
3. Li CJ and Ling H. Synthetic aperture radar imaging using a small consumer drone. In: 2015 IEEE international symposium on antennas and propagation & USNC/URSI national radio science meeting, Vancouver, BC, Canada, 19–24 July 2015, pp.685–686. New York: IEEE.
4. Ganti SR and Kim Y. Implementation of detection and tracking mechanism for small UAS. In: 2016 IEEE international conference on unmanned aircraft systems (ICUAS), Arlington, VA, 7–10 June 2016, pp.1254–1260. New York: IEEE.
5. Nistér D, Naroditsky O and Bergen J. Visual odometry for ground vehicle applications. J Field Robot 2006; 23(1): 3–20.
6. Joshi GP and Kim SW. A distributed geo-routing algorithm for wireless sensor networks. Sensors 2009; 9(6): 4083–4103.
7. Gökçe F, Üçoluk G, Şahin E, et al. Vision-based detection and distance estimation of micro unmanned aerial vehicles. Sensors 2015; 15(9): 23805–23846.
8. Dey D, Geyer C, Singh S, et al. Passive, long-range detection of aircraft: towards a field deployable sense and avoid system. In: Howard A, Iagnemma K and Kelly A (eds) Field and service robotics. Berlin: Springer-Verlag, 2010, pp.113–123.
9. Dey D, Geyer C, Singh S, et al. A cascaded method to detect aircraft in video imagery. Int J Robot Res 2011; 30(12): 1527–1540.
10. Redding JD, McLain TW, Beard RW, et al. Vision-based target localization from a fixed-wing miniature air vehicle. In: 2006 IEEE American control conference, Minneapolis, MN, 14–16 June 2006, pp.2862–2867. New York: IEEE.
11. Vidal R and Sastry S. Vision-based detection of autonomous vehicles for pursuit-evasion games. IFAC Proc Vol 2002; 35(1): 391–396.
12. Chroust SG and Vincze M. Fusion of vision and inertial data for motion and structure estimation. J Field Robot 2004; 21(2): 73–83.
13. Rysdyk R. UAV path following for constant line-of-sight. In: 2nd AIAA "Unmanned Unlimited" conference and workshop & exhibition, San Diego, CA, 15–18 September 2003, p.6626. Reston, VA: AIAA.
14. Yang J, Dani A, Chung SJ, et al. Inertial-aided vision-based localization and mapping in a riverine environment with reflection measurements. In: AIAA Guidance, Navigation, and Control (GNC) conference, Boston, MA, 19–22 August 2013, p.5246. Reston, VA: AIAA.
15. Mahboubi Z, Kolter Z, Wang T, et al. Camera based localization for autonomous UAV formation flight. In: Infotech@Aerospace conference, St. Louis, MO, 29–31 March 2011, p.1658. Reston, VA: AIAA.
16. Ben-Afia A, Deambrogio L, Salós D, et al. Review and classification of vision-based localisation techniques in unknown environments. IET Radar Sonar Nav 2014; 8(9): 1059–1072.
17. Koo J and Cha H. Localizing WiFi access points using signal strength. IEEE Commun Lett 2011; 15(2): 187–189.
18. Wolfram MathWorld. Point-line distance—3-dimensional, http://mathworld.wolfram.com/Point-LineDistance3-Dimensional.html (accessed 7 April 2017).
19. Boyd S and Vandenberghe L. Convex optimization. Cambridge: Cambridge University Press, 2004.