Mixed Test Environment-based Vehicle-in-the-loop Validation - A New Testing Approach for Autonomous Vehicles
Abstract— The current test of autonomous driving technology requires extensive experimental verification, whether in simulation or on real roads. Nevertheless, how to test autonomous vehicles thoroughly in a safe and comprehensive manner remains a major challenge. To achieve safer and more effective autonomous driving testing, this paper proposes a novel mixed test environment-based validation method with vehicle in the loop (ViL): (1) our method supports more realistic drive safety tests in mixed scenarios which integrate synthetic and real-world scenarios. Synthetic scenarios offer complex traffic simulation with diverse road conditions. The real-world scenarios introduce the real autonomous driving vehicle, the real sensor suite as well as the test field to the test loop, further bridging the gap between Hardware-in-the-Loop (HiL) testing and real road tests; (2) virtual perception results are simulated directly and delivered to the real vehicle in Unified Fusion Data Format (UFDF), without rendering virtual detection data, for reduced resource consumption; (3) diverse test scenarios are configurable and reproducible with an OSM-based High Definition (HD) map, enabling the simulation to be decoupled from a specific test field or traffic facilities. A series of experiments on the application of our method have been demonstrated, and our approach is proved to be a promising drive safety testing technique before actual road testing.

*This work was supported by the National Natural Science Foundation of China (NO.61773312, 61790563).
1 Y. Chen, S. Chen, T. Xiao, S. Zhang, Q. Hou, N. Zheng are with the Department of Electrical and Information Engineering, Xi'an Jiaotong University, Xi'an, Shaanxi 710049, P.R. China; Email: alan1996, chenshitao, xiao1998, zhangsongyi, [email protected]; [email protected]
∗ Correspondence: [email protected]

I. INTRODUCTION

The stability and safety of self-driving vehicles must be ensured before deployment in practice, which demands comprehensive and repeated testing of software, hardware, and all possible contingencies. An autonomous vehicle consists of a number of components, such as sensors for different detection purposes, a computing unit, an operating system, algorithmic software, etc. For conventional vehicles, manufacturers detect and correct vehicle defects in performance by road testing with mileage accumulation of millions of kilometers, but this method tends to be dangerous and time-consuming for real-time embedded systems with multiple components, such as self-driving vehicles. Therefore, the development of more reliable, low-cost and repeatable testing methods is of paramount significance [1].

To achieve high-level vehicle automation, Software-in-the-Loop (SiL) [2], [3] and HiL simulation [4] tests are necessary prerequisites before road testing [5]. In the SiL simulation, the vehicle's dynamic model is constructed and set in the simulation environment to simulate the real vehicle driving process by configuring the relevant virtual sensors and controllers. The whole process is repeatable to test the vehicle's algorithmic capabilities of perception, decision making, path planning and vehicle control in various scenarios, but lacks the validation of technical functionality and reliability on all hardware. Further, the HiL test methodology focuses on validating embedded software on an automotive hardware ECU in a closed loop, while ignoring the real test environment. Restricted by safety concerns, road tests are constrained to limited traffic situations and are also incapable of simulating complex or extreme scenarios, which are difficult to replicate. Faced with this problem, ViL [6], [7] testing is now a widely-used method that works as a transition phase between HiL testing and on-road testing. The ADAS system [8] is integrated into the real vehicle while the roads, the corresponding traffic scene and the sensor signals are simulated through real-time simulation software. Thus, a closed-loop test system is formed by coupling the autonomous vehicle, the virtual scene and the test bench. In this way, ADAS functional verification, scenario testing and integration testing of the real car-related electronic control system can all be effectively implemented. However, some functions are still not included in the test scope. For example, the information for environment modelling comes from synthetic sensor data but not from real sensors; that is, the functionality of real sensors cannot be fully verified.

In this context, the purpose of this paper is to propose a safer and more universal adaptation for the testing of intelligent vehicles, via the combination of virtual factors with real-world scenarios. Our method enables the reconstruction of any real scene based on the corresponding HD map to create a precise semantic map and a realistic rendered visualization for the virtual world. Plus, when merging the real world with virtual elements, the real vehicle states are synchronized to a virtual equivalent in simulation using the internal communication mechanism of our software stack; data of the "perceived" virtual elements as well as the real environment are then transmitted to the computing unit on the vehicle for evaluation. Not only is easy configuration in real time allowed for varied and specific test needs, experiments on dangerous situations such as collision, emergency barrier avoidance and so on can also be carried out in unrestricted test fields using our method. There is no necessity of utilizing cloud servers or remote communication with fixed facilities, thus contributing to higher test flexibility and resource efficiency. Another major advantage of this approach over existing self-driving test methods is that, by directly
sending the synthetic fusion data of virtual traffic to the upper-level tracking and decision-making module rather than simulating the raw data of different sensors, it saves the resource consumption of simulating data of multiple sensors as well as the subsequent fusion computation.

Fig. 1 is a diagram that illustrates the framework of the proposed method, with a closed loop consisting of a real autonomous vehicle and a mixed test environment. On the one hand, the vehicle is equipped with a computing unit and is loaded with necessary sensors, such as GPS, Lidar, camera, etc., for positioning and perception. When the vehicle receives a motion command, its status is constantly updated in the environment. On the other hand, the mixed environment consists of two parts. One is the real-world scenario in free space or on structured roads, where it is free to add a certain number of normal vehicles, pedestrians or dummy obstacles to improve the randomness of the test within a safe range; the other part is the computer-generated elements with rendered visualization in simulation. At the same time, the simulation visualization interface also contains structured roads reconstructed from HD maps. The raw data perceived by real sensors will then be transmitted to the core algorithm module for fusion processing. The virtual elements, excluding virtual roads, are directly processed as fused results to be combined with the real fusion results and fed into the subsequent algorithmic processing module. Finally, the generated vehicle driving commands, including throttle, brake, steering, etc., will be sent to the vehicle to complete a whole cycle of closed-loop validation. This article will start with the related research work in recent years, and then demonstrate the main components of our approach. Finally, the feasibility and effectiveness of our method are verified by several experiments.

II. RELATED WORK

As autonomous driving has become a hot subject in the field of artificial intelligence [9], [10], Google, Uber, Tesla, Ford and many other enterprises have introduced their autonomous driving platforms. Nevertheless, self-driving car accidents have kept emerging in recent years. The safety of self-driving cars is not yet guaranteed, and how to comprehensively and efficiently test the self-driving system is particularly important.

Conventional simulation methods [11], [12] are of low cost with reproducible scenarios. Fernandes et al. [13] created proxy entities for self-driving vehicles on a traffic simulator and deployed them in a relatively real traffic stream while simulating all of the vehicle sensors and actuators. The simulator tested the self-driving vehicle's perception, decision-making, planning, and control capabilities in a simulated
1284
Authorized licensed use limited to: Indian Institute of Technology Hyderabad. Downloaded on November 01,2024 at 10:05:05 UTC from IEEE Xplore. Restrictions apply.
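The closed loop of Fig. 1 (the real vehicle's state mirrored onto a virtual twin, virtual elements delivered directly as already-fused objects, merged with the real sensors' fusion output, and a driving command returned to the vehicle) can be sketched as follows. This is a minimal illustration under stated assumptions: every function and data-structure name here is hypothetical, not the authors' software stack.

```python
# Hypothetical sketch of one cycle of the mixed-environment closed loop.

def sync_to_simulation(real_state, virtual_world):
    """Mirror the real vehicle's pose/speed onto its virtual twin."""
    virtual_world["ego"] = dict(real_state)

def perceive_virtual(virtual_world):
    """Virtual elements are emitted directly as fused objects, i.e. no
    raw virtual sensor data is rendered (the resource saving the paper
    describes)."""
    return [obj for name, obj in virtual_world.items() if name != "ego"]

def closed_loop_step(real_state, real_fused_objects, virtual_world, planner):
    """One validation cycle: sync state, merge real and virtual fused
    objects, and compute a driving command for the real vehicle."""
    sync_to_simulation(real_state, virtual_world)
    merged = real_fused_objects + perceive_virtual(virtual_world)
    return planner(real_state, merged)

def toy_planner(state, objects):
    """Stand-in for the on-vehicle tracking/decision module: brake if any
    object (real or virtual) is closer than 10 m, else cruise."""
    near = any(o["distance"] < 10.0 for o in objects)
    return {"throttle": 0.0 if near else 0.3, "brake": 1.0 if near else 0.0}
```

A virtual obstacle 6 m ahead triggers braking exactly as a real one would, which is the point of delivering virtual elements as fused results into the same downstream pipeline.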
environment without taking into account the hardware component of the real vehicle. In order to combine the simulation environment with the hardware test, some research works have been conducted based on the HiL test method [14]–[16]. Using the CARLA simulator to get access to various simulated sensor data, [14] incorporated the exact hardware used on the real autonomous driving platform for computation on the fake sensor data to validate high-level software. HiL allows incorporating part of the hardware into the simulation, but in fact it still ignores the actual road environment.

In the real road environment, on-road testing is the most direct and comprehensive method, while being highly risky and time-consuming. In response to this problem, researchers developed test methods based on the concept of virtual-actual combination. Butenuth et al. [17] augmented the real-world test with simulated sensors for active safety system validation, but did not pay attention to the evaluation of real sensors in real-world scenarios. Yiheng Feng et al. [18] used augmented reality testing technology to enable a real-world vehicle and virtually connected vehicles to interact in real time at a closed Mcity test site, using V2X [19], [20] communication and 5G connectivity to manage the intelligent traffic system. The method is efficient but also costly for testing and has high requirements on the connected test site.

Based on our previous works on a HiL-based integrated simulation and testing platform [21], [22], this paper mainly studies more realistic validation of a single autonomous driving vehicle's intelligence in a mixed test environment that does not entail high resource consumption or depend on the test field. Test scenarios can be open space or real roads with simple traffic flow. The synchronization and transition of both the virtual and real world is achieved through the inner computing platform without relying on remote communication.

III. VIRTUAL TEST ENVIRONMENT GENERATION AND CONFIGURATION

A. Prior Semantic Map Generation

In order to achieve free experimental scene configuration and testing without field restrictions, we support the simulation reconstruction of custom scenes or real scenes in the real world. The scenario reconstruction gets rid of the dependence on a supporting test bench or the requirement of a pre-designed test track with fixed road information. Thus, more comprehensive and realistic test results can be achieved. On the one hand, when using a custom scene according to the test requirements, a free space test scenario without special road restrictions is the easiest to simulate, since no particular road information is needed, such as the number of lanes, lane width, junctions, and so on. By contrast, when special design requirements for road information should be taken into account, we use self-constructed OpenStreetMap (OSM) semantic maps to help complete the corresponding details. By means of an open-source Java OSM editing tool, apart from the above-mentioned road attributes, road form, road length, orientation, zebra crossings, road signs, etc., can also be easily edited in OSM format. In addition, we also added some private attributes, such as lane offset (the distance of the lane center to the road segment center), the branch level of the road, the number of main road lanes in the presence of a ramp, etc., to supplement the existing tag library of the software and enrich the semantic map with a more precise description. Afterwards, a 3D model of the custom scene is generated and exported based on the semantic map, to be further rendered and visualized in our simulation platform.

Fig. 2. Creation of HD map. (a) Projected 2D point cloud map of a highway segment with ramp. (b) HD map visualization.

On the other hand, to better take advantage of the existing real-world structured traffic network and achieve larger follow-up route tracking tests, our method also allows building a high definition semantic map based on acquired real road data of point clouds. Firstly, through the utilization of real laser sensors loaded during the vehicle driving process, the road and the surrounding information of the whole real scene are precisely scanned and plotted with high positioning accuracy. Secondly, we still use Java OSM to label the scene information according to the 2D map obtained from point cloud projection to form an OSM semantic map. The final accuracy is confirmed to be within 10 cm. Finally, the whole scene can be quickly exported and rendered smoothly in simulation for a clear supplementary visualization. As shown in Fig. 2, the simulation scene is based on high-precision map reconstruction.

B. Road Participants Construction

To enhance the richness and flexibility of test scenarios, road participants such as pedestrians and four-wheeled vehicles are essential components, which can be designed with 3D modeling software on demand. At present, our model library provides a variety of realistic simulated automobiles, including personal car, SUV, ambulance, bus and so on, providing different physical information of road participants. Collision, friction, inertia and other essential physical properties of different automobiles are well specified. At the same time, we provide a simulation interface with multi-vehicle dynamics simulation and multi-sensor simulation to achieve the control of throttle, brake and steering wheel, as well as environmental perception of the virtual vehicle. Pedestrians and other necessary models are also available for usage. Given the rational use of computing resources, the user can
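As an illustration of attaching such private attributes to an OSM road element, the snippet below builds a small OSM `way` carrying both standard and custom tags. The custom tag keys (`lane_offset`, `branch_level`) are hypothetical: the paper describes the meaning of its private attributes but does not publish the exact tag library.

```python
import xml.etree.ElementTree as ET

def make_osm_way(way_id, node_ids, lanes, lane_offset, branch_level):
    """Build an OSM <way> element with standard and (hypothetical) private tags."""
    way = ET.Element("way", id=str(way_id))
    for nid in node_ids:                       # ordered node references
        ET.SubElement(way, "nd", ref=str(nid))
    tags = {
        "highway": "tertiary",                 # standard OSM tag
        "lanes": str(lanes),                   # standard OSM tag
        "lane_offset": str(lane_offset),       # private: lane center offset (m)
        "branch_level": str(branch_level),     # private: branch level of road
    }
    for k, v in tags.items():
        ET.SubElement(way, "tag", k=k, v=v)
    return way

way = make_osm_way(1, [101, 102, 103], lanes=2, lane_offset=1.75, branch_level=0)
xml_text = ET.tostring(way, encoding="unicode")
```

Tools that ignore unknown tag keys will still read such a file as a plain OSM way, which is what lets a standard Java OSM editor be reused for the enriched semantic map.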
choose whether to turn on the simulation of vehicle dynamics and sensor information or not. We also provide a real-time user interface that allows users to freely adjust parameters such as the location, orientation, and speed of traffic participants during the test process.

Meanwhile, in order to better track the state changes of the real vehicle (e.g. position, orientation, acceleration) in simulation, the real vehicle states will be synchronized to the corresponding mapped virtual model in the simulation to complete the update of the real vehicle information in the virtual world.

C. Virtual Traffic Setup

[Fig. 3: virtual traffic simulation built from an imported binary file or real-time model configuration, on a self-defined semantic map or a prior high definition map; the remaining embedded figure text is unrecoverable.]

The construction and visualization of virtual traffic scenarios are achieved with the Gazebo simulator. In the case of lacking prior scenario data, real-time addition, adjustment or deletion of various traffic participants at the designated location can be carried out according to the test purpose as well as the real-time pose of the autonomous vehicle observed in simulation. Further, the motion of a dynamic obstacle such as a vehicle or a pedestrian can be achieved by giving consecutive linear and angular velocity commands. However, when a semantic map of the scene is provided, a larger-scale and changeable traffic flow can be established to complete a series of driving safety tests on structured roads, as shown in Fig. 3. When the user specifies the initial position and target position of the virtual vehicle to be added, the simulation system automatically loads a simulated vehicle model. Then the global planning module searches a global optimal path according to the given points and sends the encapsulated path points to the corresponding vehicle for tracking. Moreover, it is worth mentioning that the design of all simulation scenarios can be exported to binary files after the completion of the current test and can be quickly reloaded for repeated testing. When multiple simulated vehicles run in the scene based on the reconstructed semantic map, they naturally form a complex traffic flow, which is convenient for carrying out tests on our ego vehicle, such as turning at the intersection, vehicle following, converging into traffic flow, emergency braking, etc.

In addition to following the given route, the virtual vehicles are also capable of reacting to the surrounding environment and other vehicles. According to the road information, traffic signals from the road network and the virtually perceived data in simulation, virtual vehicles are capable of achieving simple obstacle avoidance and multi-vehicle interaction while respecting the traffic rules. The core idea is to predict the possible driving area of the simulated vehicle according to its driving states. The possible driving area is divided into the deceleration area and the braking area, outside of which is the normal driving area. Based on a virtual front laser scanner, the obstacle or vehicle position in the possible driving area is obtained, and the vehicle is maintained at a relatively stable speed by using continuous speed adjustment. Therefore, a safe distance is maintained between the vehicle and the vehicle or obstacle in front.

Fig. 4. Vehicle collision avoidance and multi-agent interaction based on laser scanner detection.

1) Selection of detection range: Define the flag for a vehicle's turning status as T, so

$$T = \begin{cases} -1, & \text{turn left} \\ 1, & \text{turn right} \\ 0, & \text{go straight} \end{cases} \quad (1)$$

As indicated in (2), the angular detection range R of the laser scanner is selected according to T. In Fig. 4, the green area corresponds to driving straight. The green area and grey area are selected when there is a left turning tendency; otherwise, the blue area and green area are selected. The vehicle brakes whenever an object is detected in the red range.

$$R = \begin{cases} \left[\frac{\pi}{2} - \theta,\ \pi\right], & T = -1 \\ \left[0,\ \frac{\pi}{2} + \theta\right], & T = 1 \\ \left[\frac{\pi}{2} - \theta,\ \frac{\pi}{2} + \theta\right], & T = 0 \end{cases} \quad (2)$$

2) Adjustment of speed: After determining the detection range, the speed adjustment logic can be defined as follows:

$$v_{t+1} = \begin{cases} v_t - \lambda, & \text{if } d \in [d_{min}, d_{max}] \\ 0, & \text{if } d \in [0, d_{min}) \\ v_t + \Delta v, & \text{else} \end{cases} \quad (3)$$

where d is the current distance to the obstacle, d_min is the minimum braking distance of the vehicle, and d_max is the maximum
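As a concrete illustration of the virtual-vehicle collision-avoidance rules in Section III-C, the sketch below implements the turning-flag and detection-range selection of equations (1) and (2) and the speed update of equation (3), together with the pre-defined speed interval the paper also imposes. Parameter values used in it are illustrative assumptions, not the authors' calibration; treating the full stop in the braking area as overriding the lower speed bound is likewise our interpretation.

```python
import math

def detection_range(turn_flag, theta):
    """Equation (2): angular detection range R of the front laser scanner.
    turn_flag T follows equation (1): -1 = turn left, 1 = turn right,
    0 = go straight. Angles are in radians over the [0, pi] scanner field."""
    half = math.pi / 2
    if turn_flag == -1:                      # left turn: widen toward the left
        return (half - theta, math.pi)
    if turn_flag == 1:                       # right turn: widen toward the right
        return (0.0, half + theta)
    return (half - theta, half + theta)      # straight: symmetric range

def next_speed(v_t, d, d_min, d_max, alpha, delta_v, v_min, v_max):
    """Equation (3) plus the [v_min, v_max] restriction: one update cycle.
    d is the distance to the nearest obstacle inside the detection range."""
    if d < d_min:                            # braking area: full stop
        return 0.0
    if d <= d_max:                           # deceleration area (continuous)
        lam = (d_max - d) / alpha            # per-cycle deceleration lambda
        v_next = v_t - lam
    else:                                    # normal driving area: speed up
        v_next = v_t + delta_v
    # Keep the adjusted speed inside the pre-defined interval.
    return min(max(v_next, v_min), v_max)
```

With d_min = 3 m, d_max = 15 m and alpha = 10, a vehicle at 5 m/s that sees an obstacle 10 m ahead sheds lambda = 0.5 m/s in that cycle, while an obstacle inside 3 m forces an immediate stop.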
deceleration distance of the vehicle (i.e., the minimum safe distance). v_t is the current vehicle speed at time t and v_{t+1} is the speed at time t+1. When d is in the range for deceleration, which is set to be a continuous process, the deceleration value of each calculation cycle is λ = (d_max − d)/α, with parameter α for acceleration regulation. On the contrary, when the vehicle is supposed to accelerate with d larger than d_max, the speed increases by Δv.

3) Restriction on speed: [v_min, v_max] is the pre-defined speed interval for the corresponding agents. Thus, we have

$$v_{t+1} = \begin{cases} v_{max}, & \text{if } v_{t+1} > v_{max} \\ v_{min}, & \text{if } v_{t+1} < v_{min} \end{cases} \quad (4)$$

In this way, each agent in the simulation is able to maintain a smooth driving speed while interacting with other agents running in the same traffic scenario. Further, pedestrian movement following road and traffic light conditions can also be configured according to concrete testing demands.

D. Real World Scenario Design

1) Real vehicle and sensors: The traditional ViL test method transfers digital sensor information from the virtual scene to the real vehicle for decision making and path planning, and maps the final vehicle motion states to the computer-generated scene to form a loop. Nevertheless, by introducing the sensing results of real vehicle sensors into the loop to replace the virtual sensor signals, we can not only obtain more realistic visual and spatial environmental data, but also verify the robustness of related algorithms in real circumstances. It is also supported to verify the redundancy and stability of the sensing system in case of partial sensor failure.

The perceptual data from cameras, the multi-beam Lidar system, millimeter wave radars and other sensors are collected and processed into the UFDF format by fusion algorithms for subsequent object tracking and upper layer decision making. The UFDF is a specially designed data format for precise information description of fused objects, including necessary parameters such as position, orientation, 3D bounding box, shape polygon, speed, acceleration, etc. The simulation is synchronized with reality according to the time stamp of the data. Messages with earlier time stamps will be dropped to avoid inconsistency of data communication.

2) Real environment setup: In addition to the real autonomous vehicle and a variety of sensor equipment, static or low-speed real vehicles and obstacles can also be added and combined with a complex and dangerous virtual scene to form a mixed test scenario. In this way, we can achieve a multi-module algorithm verification effect approximate to that of real road tests, but under the premise of guaranteed security.

The test site is not limited and also does not require fixed road design, traffic equipment or communication facilities, mainly focusing on the validation of a single autonomous driving vehicle's capability of coping with a variety of scenarios. The intelligence of every single autonomous vehicle is indeed the fundamental prerequisite of autonomous driving development, while V2V and V2X are technologies that supplement the overall traffic security. At the same time, whether the driving situation is rough as on a rural road or smooth as on an urban road, and whether there is rain, fog, snow or bad lighting, a simple real scene with certain obstacles is enough for completing a relatively safe test process. We can even directly overlay the OSM map of a real scene on an empty test site, to not only profit from real road network information but also avoid the risk caused by the huge environmental uncertainty of direct testing on the public road. For example, we have constructed the HD map of the autonomous driving test field in Changshu, Jiangsu. Part or all of the test field map can be transferred to the current coordinate system. In this manner, even in an empty field, the fine road information can be reproduced and superposed to realize effective tests, while the resource cost of actual road tests is reduced to a large extent.

IV. EXPERIMENTS AND APPLICATIONS

In order to verify that our testing method can support the safety performance evaluation of autonomous driving-related modules, we have conducted repeated experiments on a variety of challenging scenarios. The following part demonstrates three different test scenarios, including an unprotected left turn at an intersection, safe traversal in roundabouts, and collision detection and instant reaction in blind road areas. The experiments proved that our method can not only provide a reliable mixed environment for testing, but also complete the assessment of important module functions such as object perception, planning, and control in the process of driving.

A. Unprotected Left Turn at An Intersection

Fig. 5. Simulation of traffic flow in an unprotected intersection with real autonomous vehicle and test field in the loop.

An unprotected left turn refers to the completion of a left turn operation at an intersection without traffic signal lights or parking signs for guidance. This maneuver makes a human driver prone to mistakes, since at that
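The UFDF fused-object record and the time-stamp rule described in Section III-D could look like the following sketch. The field names mirror the parameters the paper lists (position, orientation, 3D bounding box, shape polygon, speed, acceleration), but the actual UFDF layout is not published, so both the record and the channel class are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FusedObject:
    """Hypothetical UFDF-style description of one fused object."""
    stamp: float                              # message time stamp (s)
    position: Tuple[float, float, float]      # x, y, z in a map frame
    orientation: float                        # heading (rad)
    bbox_3d: Tuple[float, float, float]       # length, width, height (m)
    polygon: List[Tuple[float, float]] = field(default_factory=list)
    speed: float = 0.0
    acceleration: float = 0.0

class FusionChannel:
    """Drops messages whose stamp is older than the newest one seen,
    mirroring the paper's rule for keeping simulation and reality in sync."""
    def __init__(self):
        self.latest = float("-inf")
        self.accepted = []

    def push(self, obj: FusedObject) -> bool:
        if obj.stamp < self.latest:
            return False                      # stale message: dropped
        self.latest = obj.stamp
        self.accepted.append(obj)
        return True
```

Because virtual objects arrive in the same record shape as the real fusion output, the downstream tracking and decision modules need no special case for simulated traffic.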
moment he must give priority to all the other traffic. If a self-driving car is planning to turn left at a crossroad, it will need to find the right time to merge into the fast traffic flow and ensure that once it does so it will not collide with other vehicles, which differs a lot from a simple right-turn scenario. Therefore, driving strategies for deciding when to turn, what path to take and at what speed to drive should be trained in repeated and safe tests. In this test scenario, we intercepted a part of the semantic map that contains a well-constructed intersection and superimposed it on our free space test site, with virtual traffic flow coming from the opposite side, the left side and the right side of our ego vehicle to simulate the real road condition, as shown in Fig. 5. The ego vehicle is initially located on lane 1 of road section A, and the target point is located on lane 7 of section D. Multiple virtual vehicles are running in the scene with designated driving paths and speeds which are convenient to configure in real time. The ego vehicle should predict the possible trajectories of traffic participants according to the perceptual data, and make reasonable decisions such as stopping and waiting temporarily to give the right of way when the risk factor is high, or executing the left turn command towards the target lane to avoid unnecessary waiting when the driving risk is of low grade.

In addition, due to the introduction of real sensors, there is no necessity to model the surrounding environment of the test site containing no road information. During the test, the sensors can update the surrounding environment information in real time as a supplement to the overall perception results. Experiments show that our mixed test scenario can operate normally and support repeated unprotected left-turn driving strategy verification and improvement based on real road information more safely and conveniently.

B. Safe Traversal in Roundabouts

Fig. 6. Simulation of roundabout situations.

Roundabouts are a means to regulate and smooth the traffic flow. For an autonomous driving vehicle, the problem to be solved is how to correctly understand the scene, how to enter, and how to move on smoothly and how to drive out at last. In order to achieve a driving performance similar to that of human drivers, the physical properties of the roundabout including shape, curvature and lane number, the driving tendency of the surrounding vehicles, the current state of the ego vehicle itself and other factors need to be considered comprehensively.

In a free real test site, we set up a custom virtual OSM-based roundabout scene with configurable curvature and number of lanes, as shown in Fig. 6. Circular virtual traffic flow is running in order. When the target point of the ego vehicle is specified, the dynamic planning module should generate a global path through the roundabout with smooth curvature change. In the process of advancing along the given path, real-time scenario changes require the autonomous vehicle to enter the roundabout at the right time. The vehicle should adjust the current steering angle as well as its speed to maintain an appropriate safe distance between the front and rear vehicles to avoid possible collision. The experiment proves that our method can help to check the reliability of algorithms to undertake roundabout maneuvers by configuring different physical parameters of the scene and adjusting the running state of the traffic flow.

C. Collision Detection and Instant Reaction in Blind Road Areas

Situations that occur frequently on the road, such as hard braking of the front vehicle and the sudden cut-in of a vehicle in the adjacent lane, put forward strict requirements on the vehicle's ability to respond to an emergency. Moreover, the sudden appearance of pedestrians, motor vehicles or static obstacles from the blind spot of the ego vehicle are all tough situations for the self-driving car to cope with. The car is supposed to be capable of dealing with the acts of pedestrians or motor vehicles that do not comply with traffic rules, and perform timely identification, judgment and response following a reasonable logic.

Fig. 7. Validation of decision making and control algorithms in a simulated emergency situation.

In the target test scenario, a virtual pedestrian will suddenly appear and cross the road from the front of a stopped bus on the adjacent lane of our ego vehicle. The
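The staged reaction expected of the ego vehicle in this blind-area scenario (a precautionary first deceleration on detecting the stopped bus, then either a second deceleration or a full stop depending on the distance to the emerging pedestrian) can be sketched as a small decision function. The threshold value and the names are illustrative assumptions, not the authors' decision module.

```python
def blind_area_decision(bus_detected, pedestrian_detected, distance,
                        stop_distance=8.0):
    """Return the manoeuvre the ego vehicle should take.

    distance: current distance (m) to the nearest relevant object.
    stop_distance: assumed limit safe distance; inside it, only a full
    stop is acceptable (the value 8.0 m is a placeholder).
    """
    if pedestrian_detected:
        # Second stage: stop if the pedestrian is inside the limit safe
        # distance, otherwise a further deceleration is enough.
        return "stop" if distance < stop_distance else "second_deceleration"
    if bus_detected:
        # First stage: slow down in case the bus pulls into our lane.
        return "first_deceleration"
    return "keep_speed"
```

A replanned trajectory around the hazard, as the test also requires, would then be generated conditioned on the returned manoeuvre and the road constraints.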
autonomous vehicle is a distance behind the left of the [6] Y. Laschinsky, K. von Neumann-Cosel, M. Gonter, C. Wegwerth,
simulated bus, and the bus keeps still at the platform. When R. Dubitzky, and A. Knoll, “Evaluation of an active safety light
using virtual test drive within vehicle in the loop,” in 2010 IEEE
the autonomous vehicle detected the bus in front, it is International Conference on Industrial Technology, March 2010, pp.
supposed to make the first stage of deceleration in case 1119–1112.
the bus starts and enters the current lane. Subsequently, when the vehicle approaches the bus, the pedestrian model generated by the computer suddenly appears from the front of the bus and crosses to the opposite side of the road. At this moment, the vehicle should quickly make a decision according to the perceived results, deciding whether to perform a second stage of deceleration or to stop directly, considering the distance to the obstacle. Finally, it is also required that the vehicle can plan a new driving trajectory regarding its current state as well as the road constraints. Not only is the logic of the decision-making module verified in this way, but the experimental limit safe distance and the vehicle's control performance can also be verified in the testing process. Fig. 7 shows the vehicle speed change during a safe collision avoidance action. Experiments integrated with the virtual environment have effectively improved the robustness of the control algorithms.

V. CONCLUSION

In this paper, we proposed a ViL-based test method for autonomous driving vehicles in mixed test environments. By organizing a computer-generated virtual scenario with rich virtual traffic participants for guaranteed driving safety, we introduce the real ego vehicle, sensor suites, as well as simple real-world test scenarios for comprehensive hardware validation. What's more, single-vehicle intelligence is the ultimate foundation for road safety and further vehicle collaboration on the road. The utilization of a projectable HD map reconstructed from real driving scenarios helps to achieve flexible and more comprehensive drive tests that are not constrained by specific test fields or facilities. In addition, we directly transfer the virtual perception results in a unified fused data format without fake sensor data simulation, thus reducing unnecessary computing resources. Future work will focus on how to achieve more intelligent interaction between simulated traffic participants and more diverse scenario designs to meet the testing needs of multiple autonomous driving modules.

REFERENCES

[1] W. Huang, K. Wang, Y. Lv, and F. Zhu, "Autonomous vehicles testing methods review," in 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Nov. 2016, pp. 163–168.
[2] A. Bayha, F. Grüneis, and B. Schätz, "Model-based software in-the-loop-test of autonomous systems," in Proceedings of the 2012 Symposium on Theory of Modeling and Simulation - DEVS Integrative M&S Symposium, 2012.
[3] K. Lee, M. Jeong, and D. H. Kim, "Software-in-the-loop based modeling and simulation of unmanned semi-submersible vehicle for performance verification of autonomous navigation," 2017.
[4] W. Deng, Y. Lee, and A. Zhao, "Hardware-in-the-loop simulation for autonomous driving," in Proceedings of the 34th Annual Conference of the IEEE Industrial Electronics Society (IECON 2008), Nov. 2008.
[5] S. Baltodano, S. Sibi, N. Martelaro, N. Gowda, and W. Ju, "RRADS: Real road autonomous driving simulation," 2015.
[7] T. Tettamanti, M. Szalai, S. Vass, and V. Tihanyi, "Vehicle-in-the-loop test environment for autonomous driving with microscopic traffic simulation," in 2018 IEEE International Conference on Vehicular Electronics and Safety (ICVES), Sep. 2018, pp. 1–6.
[8] R. Okuda, Y. Kajiwara, and K. Terashima, "A survey of technical trend of ADAS and autonomous driving," in 2014 International Symposium on VLSI Technology, Systems and Application (VLSI-TSA), 2014.
[9] S. Chen, Z. Jian, Y. Huang, Y. Chen, Z. Zhuoli, and N. Zheng, "Autonomous driving: cognitive construction and situation understanding," Science China Information Sciences, vol. 62, Aug. 2019.
[10] S. Chen, S. Zhang, J. Shang, B. Chen, and N. Zheng, "Brain-inspired cognitive model with attention for self-driving cars," IEEE Transactions on Cognitive and Developmental Systems, pp. 1–1.
[11] J. Pereira and R. Rossetti, "Autonomous vehicles simulation: A comprehensive review," pp. 217–224, Jan. 2011.
[12] F. Rosique, P. Navarro Lorente, C. Fernandez, and A. Padilla, "A systematic review of perception system and simulators for autonomous vehicles research," Sensors, vol. 19, p. 648, Feb. 2019.
[13] P. Fernandes and U. Nunes, "Platooning of autonomous vehicles with intervehicle communications in SUMO traffic simulator," in 13th International IEEE Conference on Intelligent Transportation Systems, Sep. 2010, pp. 1313–1318.
[14] C. Brogle, C. Zhang, K. L. Lim, and T. Bräunl, "Hardware-in-the-loop autonomous driving simulation without real-time constraints," IEEE Transactions on Intelligent Vehicles, vol. 4, no. 3, pp. 375–384, Sep. 2019.
[15] Ş. Y. Gelbal, S. Tamilarasan, M. R. Cantaş, L. Güvenç, and B. Aksun-Güvenç, "A connected and autonomous vehicle hardware-in-the-loop simulator for developing automated driving algorithms," in 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Oct. 2017, pp. 3397–3402.
[16] S. Zhang, Y. Chen, S. Chen, and N. Zheng, "Hybrid A*-based curvature continuous path planning in complex dynamic environments," Oct. 2019, pp. 1468–1474.
[17] M. Butenuth, R. Kallweit, and P. Prescher, "Vehicle-in-the-loop real-world vehicle tests combined with virtual scenarios," vol. 119, no. 9, pp. 52–55, 2017.
[18] Y. Feng, C. Yu, S. Xu, H. X. Liu, and H. Peng, "An augmented reality environment for connected and automated vehicle testing and evaluation," in 2018 IEEE Intelligent Vehicles Symposium (IV), June 2018, pp. 1549–1554.
[19] L. Hobert, A. Festag, I. Llatser, L. Altomare, F. Visintainer, and A. Kovacs, "Enhancements of V2X communication in support of cooperative autonomous driving," IEEE Communications Magazine, vol. 53, no. 12, pp. 64–70.
[20] S.-W. Ko, H. Chae, K. Han, S. Lee, and K. Huang, "V2X-based vehicular positioning: Opportunities, challenges, and future directions."
[21] Y. Chen, S. Chen, T. Zhang, S. Zhang, and N. Zheng, "Autonomous vehicle testing and validation platform: Integrated simulation system with hardware in the loop," June 2018, pp. 949–956.
[22] S. Chen, N.-N. Zheng, Y. Chen, and S. Zhang, "A novel integrated simulation and testing platform for self-driving cars with hardware in the loop," IEEE Transactions on Intelligent Vehicles, vol. PP, pp. 1–1, May 2019.
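As an illustration, the two-branch stopping decision described above can be sketched as follows. The function name, thresholds, and deceleration values are hypothetical placeholders for exposition, not parameters taken from our test platform.

```python
def stopping_decision(ego_speed_mps: float, obstacle_distance_m: float) -> str:
    """Choose between a second-stage deceleration and a direct stop.

    Illustrative sketch only: if the ego vehicle can shed its speed within
    the available gap at a comfortable braking rate, a gradual second-stage
    deceleration suffices; otherwise maximum braking is required.
    """
    COMFORT_DECEL = 2.0   # m/s^2, assumed comfortable braking rate
    MAX_DECEL = 6.0       # m/s^2, assumed emergency braking rate
    SAFETY_MARGIN = 2.0   # m, standstill gap kept to the obstacle

    usable_gap = max(obstacle_distance_m - SAFETY_MARGIN, 0.0)
    # Distance to stop from speed v at constant deceleration a: v^2 / (2a)
    comfort_stop_dist = ego_speed_mps ** 2 / (2 * COMFORT_DECEL)
    if comfort_stop_dist <= usable_gap:
        return "second_stage_deceleration"

    emergency_stop_dist = ego_speed_mps ** 2 / (2 * MAX_DECEL)
    if emergency_stop_dist <= usable_gap:
        return "direct_stop"
    return "collision_unavoidable"
```

In the test loop, the returned label would select the longitudinal control profile, after which the planner generates a new trajectory subject to the road constraints.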