A Command and Control System for Air Defense Forces with Augmented Reality and Multimodal Interaction
Long Chen*, Wei Wang, Jue Qu, Songgui Lei and Taojin Li
Air Force Engineering University, Xi'an, Shaanxi, China
* E-mail: [email protected]
Abstract. This work proposes a framework for a command and control system for air defense forces based on augmented reality technology. Using Microsoft HoloLens as the hardware carrier, the system presents battlefield situation information as a holographic display. It helps the commander perceive and control the battlefield situation efficiently, reducing decision-making time. Commanders interact with the system through gestures and voice: they can deploy simulated military units by gesture and send commands to weapon systems through the system. The system also supports real-time collaboration among multiple users, enabling commanders to share information and make decisions jointly. The system is of practical value for improving operational efficiency and reliability, and it provides an open platform for operational process planning, deduction, simulation, and verification evaluation.
1. Introduction
In the field of military science, the term "command and control" (C2) originated in The Art of War (1838), in which Jomini used "command" and "control of operations" when expounding army command and operational control [1]. A command and control system is the information system with which commanders and their command organs exercise command and control over combat personnel and main battle weapons and equipment. It consists of hardware platforms for information processing, display, transmission, and monitoring, together with the software that implements the system's command and combat functions. A command and control system provides intelligence reception and processing, situation generation, decision support, combat simulation and evaluation, information display and distribution, tactical calculation, command issuance, security and confidentiality, force management, training simulation, and other functions. Its basic task is to help commanders grasp the battlefield situation in time, formulate operational plans scientifically, and issue operational commands to the troops quickly and accurately.
On the modern battlefield, the situation changes rapidly, and the explosive growth of information poses new challenges to the commander's perception of the battlefield situation. Battlefield information must reach the commander quickly through the information display interface; once the commander makes a decision, the control information must be fed back to the command and control system promptly through appropriate means. Traditional systems display information mainly on two-dimensional computer screens. This display is neither intuitive nor efficient, which increases the commander's cognitive load and lowers the efficiency of human-computer interaction. Augmented reality, as a tool that assists cognition and interaction in the real environment, has technical characteristics that offer new solutions for the situational awareness of commanders. Augmented reality systems applied to military command and control can greatly improve the warfighter's situational perception [2]. This article uses VR and MR technology to develop an air defense and anti-missile holographic battlefield command and control system. The system provides a method for constructing holographic battlefield situation information and implementing command interaction. It enhances the realism and immersion of situation awareness, raises the system's level of intelligent information service, and efficiently provides commanders and staff with an intuitive battlefield picture and a visual information interaction interface, realizing an intelligent information service method for future advanced command posts that supports different combat personnel and different combat forms.

Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI. Published under licence by IOP Publishing Ltd.
CGDIP 2020. Journal of Physics: Conference Series 1627 (2020) 012002 doi:10.1088/1742-6596/1627/1/012002
2. Background
needs to render a small number of objects superimposed in the real space, which greatly reduces the
computation.
(2) Scene positioning. The bottom layer of HoloLens has built-in environment mapping and positioning algorithms, which adapt to various environments without additional auxiliary markers. A hologram in HoloLens does not change its position in real space even when the user temporarily leaves, because the environmental information is already stored in the HoloLens.
(3) There are many ways to interact. In the virtual scene, users can operate on virtual objects in several ways: voice, gesture, and gaze. Gesture interaction with virtual objects replaces the role played by the mouse in traditional computers.
(4) General application support. The operating system of the HoloLens platform is Windows Holographic, which is customized from Windows 10, so Windows 10 UWP universal applications run smoothly on HoloLens. This reduces development and migration costs and greatly improves development efficiency.
3. Technical Approach
The system architecture is shown in Figure 2. The system supports multiple users online simultaneously; users interact with the system through multiple channels such as gestures and voice. The system consists of a database, a Server/Client server, and a Host domain. The database stores the data required for system operation and for constructing the system's virtual simulation scenarios, including combat scenario data, 3D model libraries of combat weapons, 3D terrain libraries, and various model control scripts. The Server/Client server realizes the real-time interaction of multiple HoloLens devices and provides the conditions for communication between them. The Host domain contains the HoloLens interaction devices and the users. The system is constructed using the following technology.
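The three-tier structure described above can be sketched as plain data holders. All class and field names below are illustrative assumptions for this sketch, not the paper's actual implementation (which runs on HoloLens):

```python
from dataclasses import dataclass, field

@dataclass
class Database:
    """Holds what the paper lists: scenario data, weapon 3D model
    libraries, terrain libraries, and model control scripts."""
    scenarios: dict = field(default_factory=dict)
    weapon_models: dict = field(default_factory=dict)
    terrain_tiles: dict = field(default_factory=dict)
    control_scripts: dict = field(default_factory=dict)

@dataclass
class Server:
    """Server/Client layer relaying state between HoloLens devices."""
    clients: list = field(default_factory=list)

    def broadcast(self, message: dict) -> int:
        # Stand-in for a socket send to every connected device.
        for inbox in self.clients:
            inbox.append(message)
        return len(self.clients)

db = Database()
srv = Server(clients=[[], []])          # two HoloLens "inboxes"
n = srv.broadcast({"event": "deploy", "unit": "radar-1"})
print(n)  # -> 2
```

The sketch only captures the data flow: the database feeds scene construction, while the server fans state changes out to every device in the Host domain.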
two-dimensional limitations of traditional interaction and construct a more natural and intuitive three-dimensional interactive environment. Interactive devices currently used for desktop graphical interfaces, such as the mouse, trackball, and touch screen, have only two degrees of freedom (translation along the x and y axes of a plane), whereas objects in three-dimensional space have six degrees of freedom: translation along the X, Y, and Z axes and rotation about them. Because of these additional degrees of freedom, windows, menus, icons, and the conventional two-dimensional cursor destroy the sense of space in a three-dimensional interactive environment and make the interaction process unnatural, so new interaction interfaces need to be designed. The three-dimensional user interface does not replace the traditional two-dimensional graphical user interface paradigm; rather, it addresses the areas where the traditional interaction mode performs poorly. Compared with the traditional two-dimensional graphical user interface, the three-dimensional user interface has the following advantages:
(1) It enhances the user's comprehensive ability to process information, including cognition, perception, learning, and memory;
(2) It provides a new space for organizing, carrying, and presenting more complex information, showing the relationships and differences between different types of information more clearly and intuitively;
(3) It makes the information display more direct and easier to understand, and it can bring many natural, rich behaviors from human reality into traditional human-computer interaction.
The user interface in the mixed reality system includes two parts: the 3D Widget and the 3D model. The 3D Widget is a concept derived from the two-dimensional graphical interface: an entity that encapsulates a three-dimensional geometric shape and its behavior. The three-dimensional geometric shape is the appearance of the 3D Widget itself, generally a rectangular three-dimensional patch that can be scaled, rotated, and stretched in three-dimensional space. The behavior includes the interactive control of the 3D Widget with other objects in the 3D scene and the display of object information. 3D Widget interaction comprises two types: first, the 3D Widget acts as a 3D object in the scene, whose position, size, orientation, and other attributes can be changed through interaction; second, the content displayed by the 3D Widget is manipulated interactively through gestures, gaze, and so on. The specific design is:
(1) Position of the 3D Widget under the viewpoint: as an object in the virtual scene, the 3D Widget can be moved, rotated, and scaled by the user, controlled by gesture signals. In the mobile Widget mode, the incremental matrix of the hand's movement is used to calculate the movement of the 3D Widget:

M_V = M_H · M_H'^(-1) · M_V'    (1)

where M_V is the matrix of the current frame's 3D Widget under the viewpoint, M_H is the virtual hand pose obtained from gesture recognition in the current frame, M_H'^(-1) is the inverse of the virtual hand pose obtained from gesture recognition in the previous frame, and M_V' is the matrix of the previous frame's 3D Widget under the viewpoint.
(2) Movement of the 3D Widget with the head: after the user's head moves, the 3D Widget should remain stationary relative to objects in the real world. Therefore, after obtaining the pose of the head movement, the offset matrix of the 3D Widget caused by head motion is calculated:

M_W = M_H · M_H0^(-1) · M_W0    (2)

where M_W is the position matrix of the 3D Widget relative to the head in the current frame, M_H is the head pose matrix of the current frame, M_H0^(-1) is the inverse of the initial head pose, and M_W0 is the position matrix of the 3D Widget in the initial state.
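Both pose-update formulas are products of 4x4 homogeneous matrices and can be checked numerically. The sketch below assumes NumPy and uses pure translations as the poses; the helper names are my own, not the paper's:

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def move_widget(M_H, M_H_prev, M_V_prev):
    """Equation (1): M_V = M_H * M_H'^-1 * M_V'."""
    return M_H @ np.linalg.inv(M_H_prev) @ M_V_prev

def head_offset(M_H, M_H0, M_W0):
    """Equation (2): M_W = M_H * M_H0^-1 * M_W0."""
    return M_H @ np.linalg.inv(M_H0) @ M_W0

# The hand moves +0.2 m along x between frames;
# the widget follows by the same increment.
M_V = move_widget(translation(0.2, 0.0, 1.0),   # current hand pose
                  translation(0.0, 0.0, 1.0),   # previous hand pose
                  translation(1.0, 0.0, 2.0))   # previous widget pose
print(M_V[:3, 3])  # -> [1.2 0.  2. ]
```

With a stationary head (M_H equal to M_H0), equation (2) leaves the widget at its initial offset, which matches the intent that the widget stays fixed relative to the real world.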
(3) Determination of the 3D Widget activation state: among the 3D Widgets and 3D models in the interface, only one object can be the active object of an interactive operation at a time. The system casts a ray from the center point of the screen along the viewpoint direction, tests it for interference against the 3D Widgets, and takes the first object that the ray intersects as the active 3D Widget.
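The paper does not detail the interference test, so as a hedged sketch the following approximates each widget's extent with a bounding sphere and picks the nearest intersection along the gaze ray; a real implementation would test the widget's rectangular patch instead:

```python
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center| = radius for the nearest t >= 0
    # (direction is assumed to be a unit vector).
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - np.sqrt(disc)
    return t if t >= 0 else None

def pick_active(origin, direction, widgets):
    """Return the first widget hit by the gaze ray: the active one."""
    hits = []
    for name, center, radius in widgets:
        t = ray_sphere_hit(origin, direction, np.array(center, float), radius)
        if t is not None:
            hits.append((t, name))
    return min(hits)[1] if hits else None

# Two widgets in front of the viewer; the closer one becomes active.
widgets = [("map", (0, 0, 5), 1.0), ("menu", (0, 0, 2), 0.5)]
origin = np.zeros(3)
direction = np.array([0.0, 0.0, 1.0])  # unit gaze direction
print(pick_active(origin, direction, widgets))  # -> menu
```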
In the command and control decision-making process, each user's operations carry the same weight in the server's simulation calculations. Each calculation is performed on the latest complete frame of operation data from each user, and the result is visualized in real time once the calculation completes, so as not to affect each user's operating experience. When multiple users work together, they will consciously avoid conflicts and operate in order, and the server's simulation calculation completes within a set period. Generally, different instructions issued for the same object do not conflict, but unavoidable access conflicts can still occur. For example, when two users simultaneously make deployment decisions for the same mobile force and deploy it to different locations, a data conflict arises in the calculation. Therefore, a certain strategy is needed to resolve conflicting operations, as shown in Figure 4. When the server calculates from the latest complete frame of each user's operation data and a conflict of accessed data occurs, the server selects one client's operation as the preferred operation according to a first-come-first-served rule, then re-checks the accessed data for conflicts until none remain, and finally outputs and visualizes the calculation result.
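The first-come-first-served rule can be illustrated with a small sketch; the operation fields and timestamps below are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Op:
    user: str
    target: str        # object the instruction addresses
    timestamp: float   # arrival time at the server
    command: str

def resolve(ops):
    """First-come-first-served: for each target, keep the earliest
    operation and discard later conflicting ones."""
    chosen = {}
    for op in sorted(ops, key=lambda o: o.timestamp):
        chosen.setdefault(op.target, op)   # later ops on same target lose
    return list(chosen.values())

ops = [
    Op("user2", "battery-3", 10.2, "deploy to grid B"),
    Op("user1", "battery-3", 10.1, "deploy to grid A"),
    Op("user1", "radar-1", 10.3, "activate"),
]
winners = resolve(ops)
print([(o.target, o.user) for o in winners])
# -> [('battery-3', 'user1'), ('radar-1', 'user1')]
```

Here the two deployment orders for battery-3 conflict; user1's earlier order wins, while the non-conflicting radar instruction passes through unchanged.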
user 2 sees is the same as the state that user 1 sees; user 2 can then perform his own operations, and the other users participating in the collaboration can see user 2's operation results in real time.
Multi-person collaboration data transfer is achieved through the Socket protocol; the server is built and client data transferred through the Server/Client server interface. In the server-client one-to-one communication mode, the system introduces a message consistency check in the Host domain and adds a real-time message synchronization mechanism between devices on top of the traditional socket network connection, as shown in Figure 5. When any shared message data marked with the [SyncVar] label in the Host domain is modified on the server or a client, the data is shared to all devices in the Host domain through the wireless network. Before responding to the received data, each device publishes the data it received within the Host domain and checks whether the information received by all devices is identical. If it is, the devices render according to the message data; otherwise, all devices give up responding to this message until the shared message data becomes consistent.
4. System functions
We built a multi-user synchronized air defense command and control system. The system realizes situational awareness, command issuance, troop deployment, and shared decision-making, as shown in Figure 6.
information. For example, the deployment of enemy forces, the performance of our weapons, and the battlefield terrain are displayed to the commander in three dimensions to reduce the commander's cognitive load and give better control of the battlefield situation, as shown in Figure 7.
Figure 7. (a) Radar range display; (b) air defence missile fire range display.
(1) A more natural display effect. Although the system realizes battlefield situation virtualization and command and control, it is limited by hardware conditions: the display field of view is limited, and the display delay when multiple users collaborate on the same local area network needs to be improved.
(2) Real-time access to combat information. A truly intelligent holographic situation display and command and control system should connect to battlefield information in real time to provide real-time display and guidance for the commander. Unfortunately, because the data interfaces of the various weapon systems are not unified, this function is difficult to achieve.
(3) A more flexible holographic interaction method. The holographic application developed for the system interacts with 3D menus and virtual objects through gaze, gestures, voice, and so on to control the movement of models. More intelligent interaction methods, such as eye-movement control and brain-computer interfaces, would certainly bring a qualitative change to the operating experience.
References
[1] Henri D J 1838 The Art of War (New York, NY: Greenhill Press)
[2] Julier S, Baillot Y, Lanzagorta M, Brown D G and Rosenblum L 2000 BARS: battlefield augmented reality system NATO Symposium on Information Processing Techniques for Military Systems pp 9-11
[3] Gans E, Roberts D, Bennett M, Towles H, Menozzi A and Cook J 2015 Augmented reality technology for day/night situational awareness for the dismounted soldier Display Technologies and Applications for Defense, Security, and Avionics IX; and Head- and Helmet-Mounted Displays XX 947004
[4] Jenkins M, Wollocko A, Negri A and Ficthl T 2018 Virtual, Augmented and Mixed Reality: Applications in Health, Cultural Heritage, and Industry pp 272-288
[5] Taylor A G 2016 Develop Microsoft HoloLens Apps Now (Apress)
[6] Chen H, Lee A S and Swift M 2015 3D collaboration method over HoloLens™ and Skype™ end points International Workshop on Immersive Media Experiences (ACM)
[7] Hockett P and Ingleby T 2016 Augmented Reality with HoloLens: Experiential Architectures Embedded in the Real World