
sensors

Article
Obstacle Avoidance of Multi-Sensor Intelligent Robot Based on
Road Sign Detection
Jianwei Zhao 1,† , Jianhua Fang 1, *,† , Shouzhong Wang 2,† , Kun Wang 1 , Chengxiang Liu 1 and Tao Han 1

1 School of Mechatronic Engineering, China University of Mining and Technology, Beijing 100089, China;
[email protected] (J.Z.); [email protected] (K.W.);
[email protected] (C.L.); [email protected] (T.H.)
2 Beijing Special Engineering and Design Institute, Beijing 100028, China; [email protected]
* Correspondence: [email protected]
† These authors contributed equally to this work.

Abstract: The existing ultrasonic obstacle avoidance robot only uses an ultrasonic sensor in the process of obstacle avoidance, so it can only avoid obstacles along a fixed obstacle avoidance route and cannot take additional information into account. At the same time, existing robots rarely involve an obstacle avoidance strategy for avoiding pits. In this study, on the basis of ultrasonic sensor obstacle avoidance, visual information is added so that the robot can refer to the direction indicated by road signs while avoiding obstacles; an infrared ranging sensor is also added so that the robot can avoid potholes. Aiming at this situation, this paper proposes an intelligent obstacle avoidance design for an autonomous mobile robot based on multiple sensors in a multi-obstruction environment. A CascadeClassifier is used to train positive and negative samples for road signs with similar color and shape. Multi-sensor information fusion is used for path planning, and the obstacle avoidance logic of the intelligent robot is designed to realize autonomous obstacle avoidance. The infrared sensor is used to obtain the environmental information of the ground depression on the wheel path, the ultrasonic sensor is used to obtain the distance information of the surrounding obstacles and road signs, and the information of the road signs obtained by the camera is processed by the computer and transmitted to the main controller. The environment information obtained is processed by the microprocessor and the control command is output to the execution unit. The feasibility of the design is verified by analyzing the distance acquired by the ultrasonic sensor and infrared distance measuring sensors and the model obtained by training the road sign samples, as well as by experiments in a manually constructed complex environment.

Keywords: intelligent robot; multi-sensor; Adaboost; obstacle avoidance; object detection

Citation: Zhao, J.; Fang, J.; Wang, S.; Wang, K.; Liu, C.; Han, T. Obstacle Avoidance of Multi-Sensor Intelligent Robot Based on Road Sign Detection. Sensors 2021, 21, 6777. https://doi.org/10.3390/s21206777

Academic Editors: Anastassia Angelopoulou, Jude Hemanth, Peter M. Roth, Epameinondas Kapetanios and Jose Garcia Rodriguez

Received: 6 September 2021; Accepted: 8 October 2021; Published: 12 October 2021

Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction
With the development of artificial intelligence technology, mobile robots are widely used in intelligent factories, modern logistics, security, precision agriculture and other aspects [1–4]. Wheeled mobile robots have been widely used in storage and transportation fields. The focus of this research is to avoid obstacles in complex environments while the path is optimal. The most important thing in realizing the autonomous motion control of a mobile robot is to obtain the information of the surrounding environment and transfer it to the main controller to convert it into control commands, so as to ensure that the robot can safely and stably avoid all obstacles while moving to the destination, which can be achieved when the mobile robot has a strong perception system. Different types of sensors are required for different information, including proprioceptive sensors that measure information such as joint angle and wheel speed, and exteroceptive sensors that sense external data such as sound, light and distance [5–10]. The sensing technologies of mobile robots include passive sensing based on multiple cameras, stereo vision and
infrared cameras and active sensing using lidar and sonar sensors to detect dynamic or
stationary obstacles in real time [11]. Laser ranging is used to analyze the wheel skid of
the four-wheel sliding steering mobile robot. Some other studies have proposed target
tracking of wheeled mobile robots based on visual methods [12,13].
For an unknown environment, sensors are usually used for intelligent obstacle avoid-
ance and path planning. The early method of obstacle avoidance and path planning is to
detect the stickers on the ground by infrared ray for navigation. This method can only be
used in a known environment [14]. Jiang et al. [15] utilized six ultrasonic sensors to capture relative information about the environment around a wheeled robot and to identify a parking space for automatic parking. In 1995, Yuta and Ando [16] installed ultrasonic sensors on
the front of a robot and in various locations on the left and right sides. In Refs. [17,18]
multiple ultrasonic data were used to create the map of the surrounding environment or
establish the surface shape of obstacles.
At present, research on obstacle avoidance robots mostly concerns the motor driving principle, the motor speed regulation scheme and the ranging principle, and existing obstacle avoidance studies mainly address raised obstacles. Few people study mobile robots that encounter pits during automatic travel. In this paper, ultrasonic sensor information, infrared distance measuring sensor information and camera information are fused. In addition to solving the above problems, the function of road sign recognition is also introduced, which allows the mobile robot to move accurately based on the traffic sign information.

2. Establishing Kinematic Model


The object of this paper is a wheeled sliding steering mobile robot, which is driven
independently by four symmetrical wheels and has both rolling and sliding in the move-
ment process, and its mechanical structure is simple and has high flexibility. The car body
has no steering gear and relies on changing the left and right wheel speeds to make the
wheels skid, achieving a different radius steering or even zero radius steering. However,
due to the nonlinear, time-varying, multi-variable and strong coupling characteristics of
the system, its motion is more uncertain than that of the car with a steering device. So, it is
necessary to build a kinematic model for the system instead of using a simple kinematic
model to represent its motion characteristics.
Figure 1 shows the kinematic model of the robot considered in this study. Without
considering the mass transfer, the following assumptions are made for the model:
1. The vehicle body is completely symmetrical and the center of mass coincides with the
center of assembly;
2. The car moves in plane motion.
We established the fuselage coordinate system Ob and the ground coordinate system
XOY, where the fuselage coordinate system moves with the vehicle.
In the fuselage coordinate system:

$$v_{Gx} = v_G \cos\alpha, \qquad v_{Gy} = v_G \sin\alpha \tag{1}$$

In the process of moving, the relative sliding between the mobile robot and the ground
is inevitable. We used the slip rate to describe the wheel slip and its calculation formula is:

$$s_i = \frac{w_i r - v_{ix}}{w_i r} \times 100\% \tag{2}$$

where $w_i$ is the rotational speed of wheel $i$ and $v_{ix}$ is the x-direction partial velocity of the center of wheel $i$.
When $s_i > 0$, that is, the wheel linear velocity is greater than the wheel center velocity, the frictional force between the wheels and the ground is the driving force, and slip occurs.
When $s_i < 0$, that is, the wheel linear velocity is less than the wheel center velocity, the frictional force between the wheels and the ground is the braking force, and slippage occurs.
When $s_i = 0$, that is, the linear speed of the wheel is equal to the speed of the wheel center, the robot is in a complete rolling state.

Figure 1. Kinematic model.


We defined the side near the center of rotation as the inside side and the other side as the outside side. When the robot turns, both the inner and the outer wheels slip. The y coordinate of the instantaneous center of the contact points between the wheels on both sides and the ground is equal to the y coordinate of the rotating center.
When the robot rotates, the longitudinal velocities at the centers of the same-side wheels are equal:

$$w_1 r(1 - s_1) = w_2 r(1 - s_2) \tag{3}$$

$$w_3 r(1 - s_3) = w_4 r(1 - s_4) \tag{4}$$

Because the wheels on the same side rotate at the same speed:

$$s_1 = s_2 = s_l, \qquad s_3 = s_4 = s_r \tag{5}$$

Formula (6) gives the relationship between the linear velocity, the yaw angular velocity and the left and right wheel rotation speeds:

$$v = \frac{w_l r + w_r r}{2}, \qquad \dot{\varphi} = \frac{w_r r - w_l r}{b} \tag{6}$$

Since there is sliding, the linear velocity is represented by the longitudinal velocity of the wheel center, while the lateral velocity can be represented by the sideslip angle, so formula (7) is obtained:

$$\begin{bmatrix} v_{Gx} \\ v_{Gy} \\ \dot{\varphi} \end{bmatrix} = \frac{r}{2} \begin{bmatrix} 1 - s_l & 1 - s_r \\ \tan\alpha\,(1 - s_l) & \tan\alpha\,(1 - s_r) \\ -\dfrac{2(1 - s_l)}{b} & \dfrac{2(1 - s_r)}{b} \end{bmatrix} \begin{bmatrix} w_l \\ w_r \end{bmatrix} \tag{7}$$

Through coordinate system transformation, the kinematic equation of the mobile robot in the ground coordinate system XOY can be expressed by formula (8):
$$\begin{bmatrix} \dot{X} \\ \dot{Y} \\ \dot{\theta} \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} v_{Gx} \\ v_{Gy} \\ \dot{\varphi} \end{bmatrix} \tag{8}$$

The meanings of parameters in the formula are shown in Table 1.


Table 1. Parameter list.

Parameter  Meaning
r  The wheel radius
b  Distance between the left and right wheel centroids
$w_l$  Left wheel speed
$w_r$  Right wheel speed
$s_l$  Slip rate of the left wheel
$s_r$  Slip rate of the right wheel
α  Sideslip angle
$\dot{\varphi}$  Yaw velocity of rotation about the z axis in the XOY plane
$v_{Gx}$  Longitudinal velocity of the center of mass
$v_{Gy}$  Lateral velocity of the center of mass
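As a quick illustration of how Equations (6)–(8) map wheel speeds to robot motion, the following Python sketch (not part of the original paper; the wheel radius, track width and wheel speeds are placeholder values) evaluates the body-frame velocities and transforms them into the ground frame.

```python
import numpy as np

def body_velocity(w_l, w_r, r, b, s_l=0.0, s_r=0.0, alpha=0.0):
    """Body-frame velocities from wheel speeds, following Eqs. (6)-(7)."""
    v_gx = r / 2 * ((1 - s_l) * w_l + (1 - s_r) * w_r)
    v_gy = r / 2 * (np.tan(alpha) * (1 - s_l) * w_l + np.tan(alpha) * (1 - s_r) * w_r)
    yaw_rate = r / 2 * (-2 * (1 - s_l) / b * w_l + 2 * (1 - s_r) / b * w_r)
    return v_gx, v_gy, yaw_rate

def ground_velocity(v_gx, v_gy, yaw_rate, theta):
    """Transform body-frame velocities to the ground frame XOY, Eq. (8)."""
    rot = np.array([[np.cos(theta), np.sin(theta), 0.0],
                    [-np.sin(theta), np.cos(theta), 0.0],
                    [0.0, 0.0, 1.0]])
    return rot @ np.array([v_gx, v_gy, yaw_rate])

# Example with hypothetical values: wheel radius 0.1 m, track width 0.4 m.
print(ground_velocity(*body_velocity(w_l=5.0, w_r=5.0, r=0.1, b=0.4), theta=0.0))
```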

3. System Structure of the Mobile Robot


The control system of the mobile robot is composed of the power module, main
controller, industrial personal computer (IPC), detection module and drive module. The
power supply module is responsible for the energy supply of the whole system. Since
the voltage of each module is different, voltage conversion and regulation are performed by the step-up/step-down module. The supply voltage of the ultrasonic sensor and the infrared distance measuring sensors can be provided directly by the controller. The detection module comprises an
ultrasonic sensor, an infrared distance measuring sensor and a camera. The detection
module is mainly used for detecting the surrounding environment, and feeds the detected
environment information back to the main controller. The camera data shall be first
handed over to the IPC, and then fed back to the main controller after processing by
the IPC. The main controller processes the environment information. The drive module
consists of a motor driver as well as an encoder motor. The motor driver controls the
speed of the encoder motor. The motor encoder detects the speed of the motor and
feeds it back to the driver for closed-loop control. The main controller is responsible for
the information processing of the system using the Arduino Mega 2560. The detection
system detects that the environment information is fed back to the main controller, and the
main controller processes the information. The driving system is driven according to the
environment information to control the movement speed and attitude of the robot, so as
to avoid obstacles in the range of activity. Figure 2 shows the system composition of the
mobile robot.

Figure 2. Composition of mobile robot system.

4. Detection System
In the research of obstacle avoidance of a mobile robot, the processing of the surrounding environment information is especially important. The environment is dynamic and unknown in real life. At the same time, in some environments there are signs that require the robot to move in a specified direction. It is important that the robot moves safely to its destination in a complex location. By selecting appropriate sensors to collect and analyze environmental information, the robot can realize the above functions. In this design, the HC-SR04 ultrasonic sensor, the GP2YA02 infrared distance measuring sensor and a USB driver-free camera are selected as the components of the detection system.

4.1. Sensor Layout
In order to make the robot work normally in both static and dynamic environments, nine ultrasonic sensors, two infrared distance measuring sensors and a USB driverless camera were installed on the robot body. An ultrasonic sensor was used to detect obstacle information of the surrounding bulge; an infrared distance measuring sensor was positioned between two wheels in front of the bottom wheel for detecting the ground pit; a camera was used to detect the road sign information. The information detected by the sensors is transmitted to the main controller for processing, and a command is sent to the motor driver to control the robot for the corresponding movement. The sensor layout of the mobile robot is shown in Figure 3.
4.2. Target Detection Based on Adaboost Algorithm
4.2.1. Sample Pretreatment
The training sample is divided into a positive sample and negative sample, the positive
sample is a road sign sample picture and the negative sample is any other picture. In this
paper, 1000 positive samples and 2000 negative samples were selected; the samples were grayed and normalized to a 128 × 72 gray-scale image to form the training sample set, so that different pictures do not yield different numbers of features. The picture shown in
Figure 4 is the sample of the three road signs to be trained on separately. Figure 5 is the
picture of the negative sample.
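A minimal preprocessing sketch in Python with OpenCV is shown below; it is an illustration rather than the authors' code, and the folder names are assumptions. It converts each sample to the 128 × 72 gray-scale format described above.

```python
import cv2
import glob

def preprocess(pattern, out_dir):
    """Gray and resize every sample to 128 x 72 so all samples yield the same feature count."""
    for i, path in enumerate(glob.glob(pattern)):
        img = cv2.imread(path)
        if img is None:
            continue
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        norm = cv2.resize(gray, (128, 72))        # width x height
        cv2.imwrite(f"{out_dir}/{i:05d}.png", norm)

preprocess("positives/*.jpg", "train/pos")   # road-sign pictures (assumed folders)
preprocess("negatives/*.jpg", "train/neg")   # any other pictures
```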
Figure 3. Sensor Layout.

Figure 4. Positive sample picture.

Figure 5. Negative Sample Picture.

4.2.2. CascadeClassifier Training Based on Adaboost
The Adaboost algorithm is an adaptive boosting algorithm. The basic idea of Adaboost is to use weak classifiers and sample spaces with different weight distributions to build a strong classifier [19–22]. The Adaboost algorithm synthesizes a strong CascadeClassifier with a strong classification ability by superposing a large number of simple CascadeClassifiers with general classification ability. A strong CascadeClassifier is formed by selecting weak CascadeClassifiers with the best resolution performance and the least error. The principle is to carry out T cycle iterations, select an optimal weak CascadeClassifier each time, then update the sample weights, reducing the weight of correctly resolved samples and increasing the weight of incorrectly resolved samples. The specific algorithm is as follows [23–25]:
Step 1: given a set of data sets for training:

$$\{\{x_1, y_1\}, \{x_2, y_2\}, \ldots, \{x_n, y_n\}\} \tag{9}$$

where $x_i$ is the input training sample image, $y_i$ is the result of classification and $y_i \in [0, 1]$; 1 means a positive sample, 0 means a negative sample;
Step 2: specify the number of loop iterations;
Step 3: initialize the weights of the samples:

$$w_1 = \{w_{1,1}, \ldots, w_{1,N}\}, \qquad w_{1,j} = d(i) \tag{10}$$

where $d(i)$ is the distribution probability used to initialize the sample weights;
Step 4: for $t = 1, 2, \ldots, T$ (T is the number of training times, which determines the number of final weak CascadeClassifiers):
(1): Initialization weight:

$$p^t = \{p^t_1, \ldots, p^t_N\} \tag{11}$$

(2): The initial sample is trained by a learning algorithm to obtain a weak CascadeClassifier.

$$h_t : X \to [0, 1] \tag{12}$$

(3): The error rate of each weak CascadeClassifier under the current weights is found:

$$\varepsilon_t = \sum_{i=1}^{N} p_{t,i}\,|h_t(X_i) - y_i| \tag{13}$$

The CascadeClassifier with the smallest error rate among the obtained weak CascadeClassifiers is selected and added to the strong CascadeClassifier.
(4): Update the weights:

$$\omega_{t+1,i} = \omega_{t,i}\,\beta_t^{\,1-|h_t(x_i)-y_i|} \tag{14}$$

If sample $i$ is classified correctly:

$$|h_t(x_i) - y_i| = 0 \tag{15}$$

Otherwise:

$$|h_t(x_i) - y_i| = 1 \tag{16}$$

where:

$$\beta_t = \frac{\varepsilon_t}{1 - \varepsilon_t} \tag{17}$$

(5): After T rounds, the strong CascadeClassifier obtained is:

$$H(x) = \begin{cases} 1 & \sum_{t=1}^{T} \alpha_t h_t(x) \ge \dfrac{1}{2} \sum_{t=1}^{T} \alpha_t \\ 0 & \text{otherwise} \end{cases} \tag{18}$$

where

$$\alpha_t = \log \frac{1}{\beta_t} \tag{19}$$
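The boosting loop in Steps 1–4 can be summarized by the following Python sketch. It is a simplified illustration under the assumption that a pool of candidate weak classifiers (callables returning 0/1 predictions) is already available; it is not the training code used by the authors.

```python
import numpy as np

def adaboost_train(X, y, candidate_classifiers, T):
    """Minimal boosting loop for binary labels y in {0, 1}.

    Returns the selected weak classifiers with their weights alpha_t.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                     # Step 3: initial sample weights d(i)
    strong = []
    for t in range(T):                          # Step 4: T rounds
        p = w / w.sum()                         # (1) normalize weights
        errors = [np.sum(p * np.abs(h(X) - y)) for h in candidate_classifiers]
        best = int(np.argmin(errors))           # (3) smallest weighted error
        eps = errors[best]
        beta = eps / (1.0 - eps)                # Eq. (17)
        h_best = candidate_classifiers[best]
        miss = np.abs(h_best(X) - y)            # 0 if correct, 1 if wrong
        w = w * beta ** (1.0 - miss)            # Eq. (14): shrink weights of correct samples
        strong.append((h_best, np.log(1.0 / beta)))   # alpha_t, Eq. (19)
    return strong

def adaboost_predict(strong, X):
    """Eq. (18): weighted vote against half of the total weight."""
    total = sum(a for _, a in strong)
    score = sum(a * h(X) for h, a in strong)
    return (score >= 0.5 * total).astype(int)
```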

4.2.3. Road Sign Identification Process


Figure 6 shows the identification flow chart. The whole process can be divided into
two steps: training and identification. In the training process, the Haar features are used
to extract the features of a large number of road sign samples, and then the Adaboost
algorithm is used to select the effective features to form the CascadeClassifier. In the
recognition process, the key features of the samples to be identified are extracted first, and
then the features and the trained CascadeClassifier are used for road sign recognition.
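In practice, OpenCV's CascadeClassifier can load the trained model and scan camera frames directly. The sketch below is an assumed usage example; the model file name and camera index are placeholders.

```python
import cv2

# Load a cascade trained with opencv_traincascade (file name is an assumption)
cascade = cv2.CascadeClassifier("road_sign_cascade.xml")
cap = cv2.VideoCapture(0)                      # USB driver-free camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    signs = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in signs:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("road signs", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```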

Figure 6. Identification flow chart.

5. Obstacle Avoidance Strategy for Mobile Robots
In this study, the obstacle avoidance is mainly realized in the following two situations, namely, obstacle avoidance for a ground bulging obstacle and obstacle avoidance for a ground sag obstacle. Figure 7 shows the schematic diagram of the obstacles and obstacle avoidance routes in this study. We made the following assumptions about the obstacles:
1. Ground raised obstacles and a ground pit will only meet the conditions shown in Figure 8;
2. Ground raised obstacles and a ground pit do not appear simultaneously;
3. The road sign is present on a raised obstacle;
4. The ground pit width is less than the wheel spacing of the robot;
5. There is a stop road sign at the destination that prompts the robot to stop.

Figure 7. Schematic diagram of obstacle, (a) is the raised obstacle on the ground; (b) is a sunken obstacle on the ground.
Figure 8 shows a block diagram of the motion of a mobile robot. During the operation
of the intelligent vehicle, first initialize the data and give the vehicle an initial forward
speed; then start the ultrasonic sensor in front and the infrared distance measuring sen-
sors; if the ultrasonic sensor detects an obstacle, stop and start the camera to judge whether
there is a road sign indication; if there is a road sign indication, avoid the obstacle accord-
ing to the road sign indication; if there is no road sign, avoid the obstacle autonomously
according to the built-in program; if the infrared distance measuring sensors detect a
ground pit, use the built-in program to perform the corresponding movement of avoiding
the ground pit.

Figure 8. Block diagram of obstacle avoidance program.
When the robot avoids obstacles, it needs to leave a certain space so that the robot can
turn safely. Since the length and width of the robot are both 70 cm, the distance between
its rotating center and the furthest point is about 50 cm. The safe distance is set as 60 cm
because the robot has deviation when rotating. When the distance measured by the sensor
in the front is equal to 60 cm, it indicates that there is an obstacle in front. At this time, open
the camera to detect whether there is a road sign. If the road sign is detected, the obstacle
should be avoided in the direction indicated by the type of road sign. If no road sign is
detected, turn off the camera and perform obstacle avoidance. The reason cameras are used
only when obstacles are detected is that road signs are fixed to the surface of obstacles on
the ground, and because there is no other way of ranging other than by ultrasonic sensors,
the cameras cannot determine the distance of road signs once they are detected. Therefore,
the ultrasonic sensor detects the obstacle and determines the distance of the obstacle, and
then turns on the camera to determine whether there is a road sign on the obstacle and the
distance of the road sign.
The obstacle avoidance movement of raised obstacles on the ground is as follows:
when the distance of obstacles detected by the ultrasonic sensor on the front side is 60 cm,
the left and right ultrasonic waves start to detect for obstacles. If the distance measured by
the ultrasonic on the left is greater than that measured by the ultrasonic sensor on the right,
it turns to the left. At this time, the left and right wheels rotate in opposite directions, with the left wheel reversing. If the distance measured by the ultrasonic on the right is greater than that measured by the ultrasonic sensor on the left, it turns to the right. At this time, the left and right wheels again rotate in opposite directions, with the left wheel turning forward. After successful turning, the robot drives forward until it is
out of the range of the obstacle. At this point, the robot turns in the opposite direction of
the previous turning direction, and then continues driving and finally leaves the obstacle.
When the robot leaves the obstacle range, the detection value of the ultrasonic sensor on
the side of the robot is greater than 60 cm.
The obstacle avoidance movement of the ground pits is as follows: the distance
between the infrared sensor and the ground is 6 cm, and the distance between the chassis
and the ground is 3 cm, so the safe distance is less than 9 cm, which is set to 8 cm in this
experiment. So, when the distance detected by an infrared ranging sensor is greater than 8 cm, pit avoidance is triggered. When the detection distance of the left infrared sensor is greater than or equal to 8 cm, the left wheel slows down and the right wheel accelerates, turning to the left to avoid the pit. When the detection distance of the right infrared sensor is greater than or equal to 8 cm, the right wheel slows down and the left wheel accelerates, turning to the right
to avoid the pit. When pits are detected on both sides, the vehicle stops and waits for
manual movement.
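The decision logic described in this section can be condensed into a single per-cycle function. The following Python sketch uses the 60 cm ultrasonic safety distance and the 8 cm infrared pit threshold from the text; the function and action names are hypothetical and only illustrate the priority of the checks.

```python
SAFE_FRONT_CM = 60        # safe distance for raised obstacles (Section 5)
PIT_THRESHOLD_CM = 8      # infrared threshold for ground pits (Section 5)

def avoidance_step(front_cm, left_cm, right_cm, ir_left_cm, ir_right_cm, sign):
    """One decision cycle of the obstacle-avoidance strategy (illustrative sketch)."""
    if ir_left_cm >= PIT_THRESHOLD_CM and ir_right_cm >= PIT_THRESHOLD_CM:
        return "stop_and_wait"               # pits on both wheel paths
    if ir_left_cm >= PIT_THRESHOLD_CM:
        return "turn_left"                   # left wheel slows, right wheel speeds up
    if ir_right_cm >= PIT_THRESHOLD_CM:
        return "turn_right"                  # right wheel slows, left wheel speeds up
    if front_cm <= SAFE_FRONT_CM:            # raised obstacle ahead
        if sign is not None:
            return f"turn_{sign}"            # follow the detected road sign
        return "turn_left" if left_cm > right_cm else "turn_right"
    return "forward"
```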

6. Experiment and Analysis


Figure 9 is the physical prototype used in the experiment, and Table 2 is the mark in
the physical prototype diagram. The prototype includes a vehicle body, drive assembly,
detection assembly and control assembly. The drive assembly includes a wheel and a drive member for driving the wheel to rotate relative to the vehicle body. A plurality of detection
assemblies is connected to the vehicle body, a portion of the detection assemblies (ultrasonic
sensors) are used to detect obstacles around the vehicle body, a portion of the detection
assemblies (infrared sensors) are used to detect obstacles at the lower end of the vehicle
body, and a portion of the detection assemblies (cameras) are used to detect road sign
information. The control assembly is connected to a drive member and a detection assembly
to receive a signal from the detection assembly and control the drive member to drive a wheel to turn when the detection assembly detects an obstacle, so as to avoid the obstacle.


Figure 9. Physical prototype.

Table 2. Notes to physical prototype.

Label  11  111  12  2  22  31  32  33
Name  First mount  Mounting hole  2nd mounting block  Rear wheel  Front wheel  IR sensor  Ultrasonic sensor  Camera
6.1. Sensor Ranging Experiment
Figure 10 is an ultrasonic sensor. The HC-SR04 ultrasonic ranging module can provide a 2 cm–450 cm non-contact ranging function with a ranging accuracy of up to 3 mm; the module includes an ultrasonic transmitter, receiver and control circuit.

Figure 10. Ultrasonic transducer.

The ultrasonic module has four pins: VCC, Trig (control end), Echo (receiving end) and GND;
VCC and GND are connected to 5 V power supply, Trig (control end) controls the ultra-
sonic signal sent, and Echo (receiving end) receives the reflected ultrasonic signal.
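Once the Echo pulse width has been measured by the controller, converting it to a distance only requires the time-of-flight formula used in this section (S = 340·t/2). A small Python sketch, assuming the echo high time is already available in seconds, is given below.

```python
SPEED_OF_SOUND_M_S = 340.0   # propagation speed in air used in this study

def echo_to_distance_cm(echo_high_time_s):
    """Convert the Echo pulse width into a distance, S = 340 * t / 2."""
    return SPEED_OF_SOUND_M_S * echo_high_time_s / 2.0 * 100.0

# Example: a 3.5 ms echo corresponds to roughly 59.5 cm.
print(round(echo_to_distance_cm(0.0035), 1))
```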
Figure 11 is the principle of ultrasonic sensor ranging. The ultrasonic sensor ranging
is based on the reflection characteristics of the ultrasonic sensor. The transmitter end of
the ultrasonic sensor emits a beam of ultrasonic waves and at the same time starts timing, and the ultrasonic wave is transmitted in the medium at the same time. Because sound waves have reflective properties, they bounce back when they encounter obstacles. When the receiving end of the ultrasonic sensor receives the reflected ultrasonic wave back, it stops the timing. The propagation medium in this study is air, and the propagation speed of sound in air is 340 m/s. According to the recorded time t, the distance S between the launching position and the obstacle can be calculated according to the formula S = 340t/2.

Figure 11. Ranging principle.

Figure 12 is a sequence diagram of the ultrasonic sensor. As shown in Figure 12, a pulse trigger signal of more than 10 µs needs to be provided, for 8 cycle levels of 40 kHz to be emitted inside the module and the echo to be detected. Once the echo signal is detected, the echo signal is output. The pulse width of the echo signal is proportional to the measured distance. The formula for calculating the distance can be obtained from the time interval between the transmitting signal and the receiving echo signal: distance = high level time × sound speed/2. In order to avoid the influence of the transmitting signal on the recall signal, the measurement period is above 60 ms.

Figure 12. Sequence diagram of ultrasonic sensor.

An ultrasonic sensor was used to measure the value of distance from the object. The
test distance and actual distance of ultrasonic sensor obtained through multiple experi-
ments are shown in Table 3.
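The least-squares calibration described below can be reproduced with a few lines of Python; the sketch uses a subset of the (actual, test) pairs from Table 3 and is only an illustration of the y = ax + b fit, not the authors' MATLAB script.

```python
import numpy as np

# A few (test, actual) distance pairs from Table 3, in cm.
measured = np.array([9.93, 14.34, 19.29, 24.17, 29.12, 34.0350, 38.9855, 44.31, 48.93])
actual = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])

# Least-squares line, the same y = a*x + b form as the MATLAB fit in the text.
a, b = np.polyfit(measured, actual, 1)

def corrected(raw_cm):
    """Apply the fitted correction to a raw ultrasonic reading."""
    return a * raw_cm + b

print(a, b, corrected(58.74))   # the corrected value should be close to 60 cm
```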

Table 3. Actual distance and test distance.

Actual Distance (y)/cm  Test Distance (x)/cm
10  9.93
15  14.34
20  19.29
25  24.17
30  29.12
35  34.0350
40  38.9855
45  44.31
50  48.93
55  53.79
60  58.74
65  63.88
70  68.69
75  73.52
80  78.14
85  83.29
90  88.22
95  93.17
100  98.03

In order to improve the accuracy of the distance value measured by the sensor, MATLAB is used to fit the distance values in the table by the least square method, and the fitting curve is y = ax + b, where a = 1.0171, b = 0.262, as shown in Figure 13. In Figure 13 the longitudinal axis shows the measured value and the horizontal axis shows the actual value; the unit is cm. Table 4 shows the actual values and the values after fitting. It can be seen from the table that the error is very small.

Figure 13. Curve after fitting.

Table 4. Actual distance and fitted distance.

Actual Distance /cm  Fitting Distance /cm
10  10.3448
15  14.8298
20  19.8639
25  24.8269
30  29.8610
35  34.8545
40  39.887
45  45.3093
50  50.0078
55  54.9504
60  59.9846
65  65.2120
70  70.1037
75  75.0158
80  79.7144
85  84.9519
90  89.9657
95  94.9999
100  99.9425

6.2. Road Sign Detection Experiment


Figure 14 shows the experimental effect diagram of detecting the actual road sign
using the camera and the model after training. Figure 15 shows the actual environment of
the detection experiment, and the background of the actual detection environment is not

pure color, which can meet the requirements of the robot. In Figure 16, from left to right,
and from top to bottom, the test distances are 10 cm, 20 cm, 30 cm, 40 cm, 50 cm, 60 cm,
70 cm, 80 cm, 90 cm, and 100 cm. In this study, experiments were conducted on road signs
at different distances. The test results show that in the detection environment where the
camera is 20 cm away from the road sign and 25 cm away from the road sign, the road
sign cannot all be in the field of view due to the influence of the camera parameters, it is
only partially in the field of view. Tables 5 and 6 are the experimental data of road sign
recognition. From these two tables, it can be concluded that the success rate of recognition
Sensors 2021, 21, x FOR PEER REVIEW
is lower when the distance between the camera and the road sign is less 14 than 14 cm.
25 of 18
Sensors 2021, 21, x FOR PEER REVIEW of 18
The success rate of recognition is higher when the distance between the camera and the
road sign is greater than or equal to 30 cm, reaching an average of 99.625%, which can meet
the requirements of accurate obstacle avoidance.

Figure 14. Identification effect of road signs.

Figure 15. Road sign detection environment.
Table 5. Experimental data of short distance road sign recognition.

Detection Distance /cm  Number of Inspections  Number of Successful Tests  Number of Errors Detected  Detection Success Rate (Successes/Total Tests)
20  100  63  37  63%
25  100  80  20  80%

Table 6. Experimental data of long distance road sign recognition.

Detection Distance /cm  Number of Inspections  Number of Successful Tests  Number of Errors Detected  Detection Success Rate (Successes/Total Tests)
30  100  100  0  100%
40  100  99  1  99%
50  100  100  0  100%
60  100  99  1  99%
70  100  100  0  100%
80  100  99  1  99%
90  100  100  0  100%
100  100  100  0  100%
Average success rate of detection: 99.625%

6.3. Physical Test


Figure 16 shows the intelligent PID motor driver module with a built-in controller capable of PID computation and trapezoidal control; the DC motors are driven by a drive circuit on the circuit board. Through the serial port, an 8-byte command can be sent
to control the positive and negative movement of the dual motor. Figure 17 is the motor
speed curve after PID adjustment. It can be observed from the figure that when the speed
of the motor increases from 0 to the maximum speed suddenly, the overshoot is very small
and the curve reaches dynamic equilibrium in a very short time.
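The driver's internal controller is not described in detail in the paper; as a rough illustration of the kind of discrete PID loop it performs, a generic Python sketch is given below (the gains, time step and the simple first-order plant are assumptions for demonstration only).

```python
class PID:
    """Minimal discrete PID sketch; the gains are not the driver's actual values."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive the wheel speed towards a sudden setpoint change, as in Figure 17.
pid = PID(kp=1.2, ki=0.5, kd=0.05, dt=0.01)
speed = 0.0
for _ in range(300):
    speed += 0.01 * pid.update(setpoint=100.0, measured=speed)   # crude first-order plant
print(round(speed, 1))
```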
Figure 16. Motor driver.

Figure 17. Motor speed curve.

Figure 18 shows the artificially constructed experimental environment, including the environment of obstacles and road signs. The obstacle avoidance experiment of the physical prototype was conducted in the constructed experimental environment, and the experimental results are shown in Figure 19. The experimental results show that this method can successfully identify road signs and realize autonomous obstacle avoidance in complex environments. Figure 20 shows the real-time picture of road sign detection (not the experimental environment as shown in Figure 19).
Figure 18. Experimental Environment.

Figure 19. Physical prototype experiment.

Figure 20. Real-time picture of road sign detection.

7. Conclusions
In the physical prototype experiment, the mobile robot can pass through the narrow
gap between obstacles stably and safely, and can run correctly according to the direction
indicated by the road signs and finally reach the given destination position. The experi-
mental results verify the feasibility of the design, the accuracy of the road sign detection
and obstacle avoidance. The method for information fusion of multiple sensors can not
only make up for the error generated by a single sensor, but also sense the information
of multi-directional and multi-type obstacles of the robot at this moment and realize the
obstacle avoidance function. Therefore, it can be widely used in mobile robot systems.

Author Contributions: Conceptualization, J.Z., J.F. and S.W.; methodology, J.F. and S.W.; software,
J.F.; validation, J.F., C.L. and K.W.; formal analysis, J.F.; investigation, K.W., S.W.; resources, T.H.;
data curation, J.F. and C.L.; writing—original draft preparation, T.H. and S.W.; writing—review and
editing, J.Z. and T.H.; visualization, J.F.; supervision, J.Z. and T.H.; project administration, J.Z., S.W.
and K.W.; funding acquisition, K.W. All authors have read and agreed to the published version of
the manuscript.
Funding: This research was supported in part by the National Social Science Foundation of China
under Grant BIA200191.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: The data are available upon request.
Acknowledgments: This study is supported by the National Social Science Foundation of China
(Grant/Award Numbers: BIA200191). We also thank the editors and reviewers.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Liu, C.; Tomizuka, M. Real time trajectory optimization for nonlinear robotic systems: Relaxation and convexification. Syst.
Control. Lett. 2017, 108, 56–63. [CrossRef]
2. Das, P.; Behera, H.; Panigrahi, B. A hybridization of an improved particle swarm optimization and gravitational search algorithm
for multi-robot path planning. Swarm Evol. Comput. 2016, 28, 14–28. [CrossRef]
3. Zhao, J.; Gao, J.; Zhao, F.; Liu, Y. A search-and-rescue robot system for remotely sensing the underground coal mine environment.
Sensors 2017, 17, 2426. [CrossRef]
4. Milioto, A.; Lottes, P.; Stachniss, C. Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging
background knowledge in CNNs. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA),
Brisbane, QLD, Australia, 21–25 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2229–2235.
5. Goodin, C.; Carrillo, J.T.; Mcinnis, D.P.; Cummins, C.L.; Durst, P.J.; Gates, B.Q.; Newell, B.S. Unmanned ground vehicle sim-
ulation with the virtual autonomous navigation environment. In Proceedings of the 2017 International Conference on Military
Technologies (ICMT), Brno, Czech Republic, 31 May–2 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 160–165.
6. Peterson, J.; Li, W.; Cesar-Tondreau, B.; Bird, J.; Kochersberger, K.; Czaja, W.; McLean, M. Experiments in unmanned aerial
vehicle/unmanned ground vehicle radiation search. J. Field Robot. 2019, 36, 818–845. [CrossRef]
7. Rivera, Z.B.; de Simone, M.C.; Guida, D. Unmanned ground vehicle modelling in gazebo/ROS-Based environments. Machines
2019, 7, 42. [CrossRef]
8. Galati, R.; Reina, G. Terrain Awareness Using a Tracked Skid-Steering Vehicle With Passive Independent Suspensions. Front.
Robot. AI 2019, 6, 46. [CrossRef]
9. Dogru, S.; Marques, L. Power Characterization of a Skid-Steered Mobile Field Robot with an Application to Headland Turn
Optimization. J. Intell. Robot. Syst. 2018, 93, 601–615. [CrossRef]
10. Figueras, A.; Esteva, S.; Cufí, X.; De La Rosa, J. Applying AI to the motion control in robots. A sliding situation. IFAC-PapersOnLine
2019, 52, 393–396. [CrossRef]
11. Almeida, J.; Santos, V.M. Real time egomotion of a nonholonomic vehicle using LIDAR measurements. J. Field Robot. 2012, 30,
129–141. [CrossRef]
12. Kim, C.; Ashfaq, A.M.; Kim, S.; Back, S.; Kim, Y.; Hwang, S.; Jang, J.; Han, C. Motion control of a 6WD/6WS wheeled plat-form
with in-wheel motors to improve its maneuverability. Int. J. Control. Autom. Syst. 2015, 13, 434–442. [CrossRef]
13. Reinstein, M.; Kubelka, V.; Zimmermann, K. Terrain adaptive odometry for mobile skid-steer robots. In Proceedings of the 2013
IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany, 6–10 May 2013; IEEE: Piscataway, NJ,
USA, 2013; pp. 4706–4711.

14. Petriu, E. Automated guided vehicle with absolute encoded guide-path. IEEE Trans. Robot. Autom. 1991, 7, 562–565. [CrossRef]
15. Jiang, K.; Seneviratne, L.D. A sensor guided autonomous parking system for nonholonomic mobile robots. In Proceedings of the
1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999.
[CrossRef]
16. Ando, Y.; Yuta, S. Following a wall by an autonomous mobile robot with a sonar-ring. In Proceedings of the 1995 IEEE
International Conference on Robotics and Automation, Nagoya, Japan, 21–27 May 1995. [CrossRef]
17. Han, Y.; Hahn, H. Localization and classification of target surfaces using two pairs of ultrasonic sensors. Robot. Auton. Syst. 2003,
33, 31–41. [CrossRef]
18. Silver, D.; Morales, D.; Rekleitis, I.; Lisien, B.; Choset, H. Arc carving: Obtaining accurate, low latency maps from ultrasonic range
sensors. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1
May 2004; Volume 2, pp. 1554–1561. [CrossRef]
19. Mu, Y.; Yan, S.; Liu, Y.; Huang, T.; Zhou, B. Discriminative local binary patterns for human detection in personal album. In
Proceedings of the 2008 IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, AK, USA, 23–28 June 2008;
IEEE: Piscataway, NJ, USA, 2008; pp. 1–8.
20. Zhou, S.; Liu, Q.; Guo, J.; Jiang, Y. ROI-HOG and LBP Based Human Detection via Shape Part-Templates Matching Procs. Lect.
Notes Comput. Sci. 2012, 7667, 109–115.
21. Bar-Hillel, A.; Levi, D.; Krupka, E.; Goldberg, C. Part-Based Feature Synthesis for Human Detection. In European Conference on
Computer Vision; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6314, pp. 127–142. [CrossRef]
22. Walk, S.; Schindler, K.; Schiele, B. Disparity Statistics for Pedestrian Detection: Combining Appearance, Motion and Stereo; Springer:
Berlin/Heidelberg, Germany, 2010; pp. 182–195. [CrossRef]
23. Liu, Y.; Shan, S.; Chen, X.; Heikkilä, J.; Gao, W.; Pietikäinen, M. Spatial-Temporal Granularity-Tunable Gradients Partition (STGGP)
Descriptors for Human Detection; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6311, pp. 327–340. [CrossRef]
24. Geronimo, D.; Sappa, A.D.; Ponsa, D. Computer vision and Image Understanding (Special Issue on Intelligent Vision Systems).
Comput. Vis. Image Underst. 2010, 114, 583–595.
25. Cho, H.; Rybski, P.E.; Bar-Hillel, A.; Zhang, W. Real-Time Pedestrian Detection with Deformable Part Models; Springer:
Berlin/Heidelberg, Germany, 2012; pp. 1035–1042. [CrossRef]
