ABSTRACT
Many future military operations are expected to occur in urban environments. These complex, 3D battlefields introduce many challenges to the dismounted warfighter. Better situational awareness is required for effective operation in urban environments. However, delivering this information to the dismounted warfighter is extremely difficult. For example, maps draw a user's attention away from the environment and cannot directly represent the three-dimensional nature of the terrain.

To overcome these difficulties, we are developing the Battlefield Augmented Reality System (BARS). The system consists of a wearable computer, a wireless network system, and a tracked see-through head-mounted display (HMD). The computer generates graphics that, from the user's perspective, appear to be aligned with the actual environment. For example, a building could be augmented to show its name, a plan of its interior, icons to represent reported sniper locations, and the names of adjacent streets.

This paper surveys the current state of development of BARS and describes ongoing research efforts. We describe four major research areas. The first is the development of an effective, efficient user interface for displaying data and processing user inputs. The second is the capability for collaboration between multiple BARS users and other systems. Third, we describe the current hardware for both a mobile and indoor prototype system. Finally, we describe initial efforts to formally evaluate the capabilities of the system from a user's perspective through scenario analysis. We also will discuss the use of the BARS system in STRICOM's Embedded Training initiative.
Augmented Reality System (BARS) and multi-modal virtual reality projects. His research interests include ubiquitous computing and data distribution.

YOHAN BAILLOT is a computer and electrical engineer of ITT Industries at the Naval Research Laboratory. He received an M.S. in electrical engineering in 1996 from ISIM, France, and an M.S. in computer science in 1999 from the University of Central Florida. His research interests are in computer graphics, 3D displays, tracking, vision, mobile augmented reality and wearable computers. Baillot is a member of the IEEE Computer Society.

J. EDWARD SWAN II is a Research Scientist with the Virtual Reality Laboratory at the Naval Research Laboratory, where he conducts research in computer graphics and human-computer interaction. At the Naval Research Laboratory he is primarily motivated by the problem of battlefield visualization. Currently, he is studying multi-modal input techniques for virtual and augmented reality systems. He received his Ph.D. from The Ohio State University in 1997. Swan is a member of ACM, SIGGRAPH, SIGCHI, IEEE, and the IEEE Computer Society.

JOSEPH L. GABBARD is a senior research associate at Virginia Tech in Blacksburg, VA, where he performs human-computer interaction research in non-traditional interactive systems such as VR and AR. He is currently interested in researching and developing usability engineering methods specifically for VEs. Other interests include developing innovative and intuitive interaction techniques employing ubiquitous input technology. He is currently pursuing his Ph.D. in computer science at Virginia Tech. He received his M.S. in computer science from Virginia Tech. Gabbard is a member of the IEEE and the IEEE Computer Society.

DEBORAH HIX is a Research Scientist at Virginia Tech in Blacksburg, VA. She received her Ph.D. in Computer Science and Applications from Virginia Tech. Hix is a pioneer in the field of human-computer interaction (HCI), focusing on usability engineering. She is co-author of a popular book entitled Developing User Interfaces: Ensuring Usability through Product and Process, published by John Wiley and Sons. This book is used world-wide by both usability practitioners and university students. Hix has done extensive research, teaching, and consulting with a variety of industrial and government organizations in the area of usability engineering for over 20 years. Most recently, Hix has extended her two decades of HCI work into usability of virtual environments. She is a member of several professional organizations and professional honor societies.
Proceedings of the Interservice / Industry Training, Simulation, & Education Conference (I/ITSEC '02),
Orlando, FL, December 2-5, 2002.
INTRODUCTION

Many future military operations will occur in urban environments [CFMOUT-97]. Military operations in urban terrain (MOUT) present many unique and challenging conditions for the warfighter. The environment is extremely complex and inherently three-dimensional. Above street level, buildings serve varying purposes (such as hospitals or communication stations). They can harbor many risks, such as snipers or mines, which can be located on different floors. Below street level, there can be an elaborate network of sewers and tunnels. The environment can be cluttered and dynamic. Narrow streets restrict line of sight and make it difficult to plan and coordinate group activities. Threats, such as snipers, can continuously move, and the structure of the environment itself can change. For example, a damaged building can fill a street with rubble, making a once-safe route impassable. Such difficulties are compounded by the need to minimize the number of civilian casualties and the amount of damage to civilian targets.

In principle, many of these difficulties can be overcome through better situational awareness. The Concepts Division of the Marine Corps Combat Development Command (MCCDC) concludes [CFMOUT-97]:

"Units moving in or between zones must be able to navigate effectively, and to coordinate their activities with units in other zones, as well as with units moving outside the city. This navigation and coordination capability must be resident at the very-small-unit level, perhaps even with the individual Marine."

These conclusions were strengthened in the document "Future Military Operations on Urbanized Terrain", where the MCCDC notes:

"...we must explore new technologies that will facilitate the conduct of maneuver warfare in future MOUT. Advanced sensing, locating, and data display systems can help the Marines to leverage information in ways which will reduce some of the masking effects of built-up terrain."

Finally, in 2001 the DUSD (S&T) identified five critical hard topics, one of which was MOUT. Under MOUT, the use of augmented reality technology to enhance situational awareness was a noted technology improvement.

A number of research programs have explored the means by which navigation and coordination information can be delivered to the dismounted soldier. Many of these approaches are based on handheld maps (e.g., an Apple Newton) or opaque head-mounted displays (HMDs). For example, the Land Warrior program introduced a head-mounted display that combined a map and a "rolling compass" [Gumm-98]. Unfortunately, these methods have a number of limitations. They obscure the user's field of view and do not truly represent the three-dimensional nature of the environment. Moreover, they require the user to mentally integrate the graphical display with the environment to make sense of it. This work is sometimes difficult and distracts from the current task. To overcome these problems, we propose the use of a mobile augmented reality system.

A mobile augmented reality system consists of a computer, a tracking system, and a see-through HMD. The system tracks the position and orientation of the user's head and superimposes graphics and annotations that are aligned with real objects in the user's field of view. With this approach, complicated spatial information can be directly aligned with the environment. For example, the name of a building could appear as a "virtual sign post" attached directly to the side of the building. To explore the feasibility of such a system, the Naval Research Laboratory (NRL) is developing a prototype augmented reality (AR) system known as BARS, the Battlefield Augmented Reality System. This system will network multiple outdoor, mobile users together with a command center.
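The superimposition step described above can be illustrated with a minimal sketch. Assuming a tracked head pose (position plus yaw and pitch) and a simple pinhole model of the display's virtual camera, a world-space point is transformed into head-centered coordinates and projected to pixel coordinates. The function name, axis conventions, and intrinsics below are illustrative assumptions, not the actual BARS implementation:

```python
import math

def world_to_screen(point, head_pos, yaw, pitch,
                    f=500.0, cx=320.0, cy=240.0):
    """Project a world-space point (x, y, z) into pixel coordinates
    for a display rigidly attached to the tracked head.

    yaw/pitch give the head orientation in radians; f, cx, cy are
    hypothetical intrinsics for a 640x480 see-through display.
    Returns (u, v), or None if the point is behind the viewer.
    """
    # Translate the point into head-centered coordinates.
    dx = point[0] - head_pos[0]
    dy = point[1] - head_pos[1]
    dz = point[2] - head_pos[2]

    # Undo the head's yaw (rotation about the "up" y axis) ...
    cy_, sy_ = math.cos(-yaw), math.sin(-yaw)
    x1 = cy_ * dx + sy_ * dz
    z1 = -sy_ * dx + cy_ * dz
    # ... then undo its pitch (rotation about the x axis).
    cp, sp = math.cos(-pitch), math.sin(-pitch)
    y2 = cp * dy - sp * z1
    z2 = sp * dy + cp * z1

    if z2 <= 0.0:          # behind the viewer: nothing to draw
        return None
    # Pinhole projection onto the display plane.
    return (cx + f * x1 / z2, cy - f * y2 / z2)

# A building corner 10 m straight ahead of the user projects to the
# center of the display.
print(world_to_screen((0.0, 0.0, 10.0), (0.0, 0.0, 0.0), 0.0, 0.0))
```

An annotation such as a "virtual sign post" would then be drawn at the returned pixel position every frame, so it stays attached to the building as the head moves.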
To achieve this goal many challenges must be overcome [Julier-99]. This paper surveys the current state of development of BARS and describes ongoing research efforts. We describe four major research areas. The first is the development of an effective, efficient user interface for displaying data and processing user inputs (such as the creation of new reports). The second is the capability for collaboration between multiple BARS users and other systems (CAVEs or Workbenches). Third, we describe the current hardware to provide both mobile and indoor prototype systems. Finally, we describe initial efforts to formally evaluate the capabilities of the system from a user's perspective. We discuss the scenario analysis we have performed for the system and conclusions drawn to date. We will also discuss the use of the BARS system in STRICOM's Embedded Training initiative.

BARS USER INTERFACE

The mobile outdoor system is designed with usability engineering methods to support efficient user task performance. BARS must provide information to the user, and the user must be able to enter data into the system. Neither flow of information can be allowed to distract the user from the primary task. An important feature of the user interface is that BARS must be able to monitor many sources of data about the user and use intelligent heuristics to combine those data with information about the environment and tasks. For example, it might be possible to monitor the user's level of stress in order to tailor the amount of information presented, reducing it to a minimum during high-stress situations.

The Shared Information Database

The system contains a detailed 3D model of objects in the real environment that is used to generate the registered graphical overlay. This model is stored in a shared database that also contains information about the objects, such as a general description, threat classification, etc. Using knowledge representation and reasoning techniques, we can also store in this database information about the objects' relevance to each other and to the user's task.

The Information Filter

The shared database contains much information about the local environment. Showing all of this information can lead to a cluttered and confusing display. We use an information filter to add objects to, or remove objects from, the user's display. We use a spatial filter to show only those objects that lie in a certain zone around the user. This zone can be visualized as a cylinder whose main axis is parallel to the user's "up" vector: objects that fall within the cylinder's walls are shown, and the user can vary the inner and outer diameters of the cylinder walls. We also use semantic filters based on the user's task or orders from a commander; for example, a route associated with a task will be shown regardless of the user's spatial filter settings, and threats will be shown at all times.

Selecting Objects

Early uses of BARS will mainly consist of users observing and selecting objects in the environment, either to find out more about them ("Where is the electrical cut-off switch?") or to add information about them ("I saw a sniper on the third floor of that building."). Thus, the system should include a mechanism that allows the user to easily select items in the environment. Our research on interaction paradigms is guided by two facts. First, many of the objects a user interacts with are distant (greater than 5 m away) and large (e.g., a building). Second, the position and orientation of the user's head is accurately tracked. Therefore, most interactions are via gestures that require a user to point at distant objects. To date, we have utilized a handheld wireless mouse. The gestural input requires two steps. First, the user faces the possible object of interest (adjusting head orientation). Then, using the mouse, the user maneuvers a cursor over the object. When the user presses the mouse button, a "gaze ray" is constructed from the user's head position and the cursor position; this ray is intersected with the shared information database to determine which objects have been selected. Although current tracking methods do not always achieve the accuracy necessary, we find them sufficient and are working to improve the performance of the tracking system.

Speech and Gesture Input

The mouse-based interface described in the previous subsection has two important limitations. First, it is difficult to perform complicated interactions with a handheld mouse; a user must resort to various types of drop-down menus. Second, one of the user's hands is occupied with the need to hold and manipulate the mouse. To overcome these problems, we are researching speech and gesture input techniques. These techniques will support more sophisticated interactions and minimize errors. We are implementing speech and gesture techniques with the Adaptive Agent Architecture, which is part of the QuickSet application suite [Cohen97]. We have already performed a preliminary integration of a 2D handheld gesture display with BARS, and we are investigating how novel 3D tracking technologies can be used to implement 3D gesture recognition.
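The gaze-ray selection described under Selecting Objects can be sketched as follows. Here buildings are approximated by axis-aligned bounding boxes and a standard ray-box slab test picks the nearest hit along the ray; the function, data layout, and building names are illustrative assumptions, not the actual BARS database interface:

```python
def select_object(ray_origin, ray_dir, objects):
    """Return the name of the nearest object hit by the gaze ray, or None.

    `objects` maps names to axis-aligned bounding boxes
    ((min_x, min_y, min_z), (max_x, max_y, max_z)) -- a stand-in for
    the buildings in the shared information database.
    """
    best_name, best_t = None, float("inf")
    for name, (lo, hi) in objects.items():
        # Standard slab test: clip the ray against each pair of box planes.
        t_near, t_far = 0.0, float("inf")
        hit = True
        for axis in range(3):
            o, d = ray_origin[axis], ray_dir[axis]
            if abs(d) < 1e-9:
                # Ray parallel to this slab: must already lie inside it.
                if not (lo[axis] <= o <= hi[axis]):
                    hit = False
                    break
            else:
                t0, t1 = (lo[axis] - o) / d, (hi[axis] - o) / d
                if t0 > t1:
                    t0, t1 = t1, t0
                t_near, t_far = max(t_near, t0), min(t_far, t1)
                if t_near > t_far:
                    hit = False
                    break
        if hit and t_near < best_t:
            best_name, best_t = name, t_near
    return best_name

# A user at the origin gazes down +z; the ray passes through "barracks".
buildings = {
    "barracks": ((5.0, 0.0, 8.0), (9.0, 10.0, 12.0)),
    "tower":    ((20.0, 0.0, 8.0), (24.0, 30.0, 12.0)),
}
print(select_object((7.0, 1.5, 0.0), (0.0, 0.0, 1.0), buildings))  # → barracks
```

Returning the nearest intersection matters in dense urban models, where one gaze ray can pass through several buildings in a row.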
Figure 1: A remote BARS user is highlighted with a box shape. In this example, the user is also physically visible, but the position information is transmitted for all mobile users and can show the location of an occluded user.

Collaboration Mechanism

The BARS collaboration system ensures that the relevant parts of the shared database are replicated on every user's machine. Information is deemed relevant to a particular user based on the information filter described previously. Users join distribution channels that work like IP multicast groups; however, the actual implementation does not depend on IP multicast. Based on the importance of the data, the channels use reliable and unreliable transport mechanisms in order to keep network traffic low. For example, under optimal conditions, user positions are updated in real time (at least 30 Hz) using unreliable transport, but at a frequency of around 5 Hz user positions are also sent reliably, so that those with overloaded connections will at least get positions at a usable rate (Figure 1).

A channel contains a class of objects and distributes information about those objects to members of the channel. Some channels are based on physical areas, and as the user moves through the environment or modifies the spatial filter, the system automatically joins or leaves those channels. Other channels are based on semantic information, such as route information only applicable to one set of users, or phase lines only applicable to another set of users. In this case, the user voluntarily joins the channel containing that information, or a commander can join that user to the channel.

Figure 2: An annotated view of the hardware configuration of the current BARS prototype.

BARS PROTOTYPE

Built from commercial, off-the-shelf (COTS) products, the mobile prototype for BARS is composed of (Figure 2):

• Ashtech GG24-Surveyor (real-time differential kinematic GPS receiver for position tracking)
• InterSense InertiaCube2 (for orientation tracking)
• Sony Glasstron LDI-D100B see-through HMD (when color and stereo rendering are important) or
• MicroVision laser retinal scanning see-through head-worn display (when legibility in very bright or very dim conditions is important)
• Dell Inspiron 7000 notebook computer (main CPU and 3D graphics engine)
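The dual-rate position distribution described under Collaboration Mechanism above can be sketched as follows. The two transports are injected as plain callables, and the class name and rates are illustrative assumptions; the paper does not specify BARS's packet formats or channel API:

```python
class PositionChannel:
    """Hypothetical sketch of the dual-rate update scheme: positions go
    out over an unreliable transport at full rate (~30 Hz) and over a
    reliable transport at ~5 Hz, so that members with overloaded links
    still receive positions at a usable rate.
    """
    def __init__(self, send_unreliable, send_reliable,
                 fast_hz=30, slow_hz=5):
        self.send_unreliable = send_unreliable
        self.send_reliable = send_reliable
        self.ratio = max(1, fast_hz // slow_hz)   # reliable send every Nth tick
        self.tick = 0

    def publish(self, user_id, position):
        # Every update goes out unreliably (cheap, may be dropped) ...
        self.send_unreliable((user_id, position))
        # ... and every Nth update is also sent reliably as a fallback.
        if self.tick % self.ratio == 0:
            self.send_reliable((user_id, position))
        self.tick += 1

# Stand-in transports: just record what would be sent.
unreliable, reliable = [], []
chan = PositionChannel(unreliable.append, reliable.append)
for t in range(30):                     # one simulated second at 30 Hz
    chan.publish("user-1", (float(t), 0.0, 0.0))
print(len(unreliable), len(reliable))   # → 30 5
```

In a real deployment the unreliable callable might wrap a UDP send and the reliable one a TCP stream, matching the paper's point that transport choice follows the importance of the data.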
have, through expert evaluation, designed three potential protocols (Figure 3 gives one example) through which such information can be displayed. We take advantage of classic methods of technical illustration and use combinations of the following parameters:

• solid, dashed, or dotted lines or polygons
• intensity or color
• outlined or filled polygonal representation
• line thickness

Until user-based usability evaluations are conducted, however, all such designs are speculative. We have identified a number of principles, such as using multiple parameters to differentiate different distances or numbers of occluding objects, limiting the number of objects in a given direction, and recognizing that parameters can be confounded or masked by the characteristics of the display. For example, the intensity of the graphics can sometimes be confounded with background intensity, or with stippling (dashed or dotted) effects. We are conducting user-based evaluations in the summer and fall of 2002 to determine how various parameters interact and how the user performs under a variety of designs and tasks. The evaluation will employ representative domain users, performing tasks derived from the BARS use case. To our knowledge, this is one of the first user-based, mobile, outdoor AR usability evaluations. BARS and other non-traditional computer systems are much more difficult to evaluate than their 2D graphical user interface counterparts [Bowman-02] and, as such, will likely require the invention of new evaluation techniques.

In addition, the user-centered requirements identified important performance bounds on known system requirements. For example, by identifying the likely set of objects of interest to BARS users, we discovered that registration (and thus tracking) has to be good enough to accurately position graphical indicators on buildings and streets, but it does not have to be any more accurate than this. This bound is important, because highly accurate tracking is extremely difficult.

EMBEDDED TRAINING AND BARS

So far, this paper has concentrated on the possible uses of BARS as a situational awareness tool. However, BARS and augmented reality have the potential to significantly impact training. As dismounted warrior systems become more sophisticated, the need for detailed, precise, and advanced training and simulation has become paramount. The US Army Simulation, Training, and Instrumentation Command (STRICOM) has initiated an embedded training program [Dumanoir-02] to study how revolutionary techniques can be applied to this domain. STRICOM, in conjunction with NRL, is studying how BARS can impact training at three levels: as a means to blend synthetic and live forces; as a means to provide "training wheels" to show trainees critical information; and as a tool to assist trainers in constructing and operating a training scenario.

The first aspect utilizes BARS to "enrich" an existing scenario. Many MOUT facilities consist of a small group of fairly bare buildings that occupy a self-contained area, typically no more than a few city blocks. However, if a user's position and orientation were accurately tracked, synthetic forces and building features could be inserted into the user's environment. If a user were connected through a wireless network to a simulation system such as OneSAF, users could be presented with reactive entities such as air forces (simulated call for fire) or even with individual combatants. Furthermore, BARS could be used to mix live forces at physically different sites (such as multiple MOUT facilities) into the same environment. However, it should be noted that this application is extremely technically challenging. Registration must be accurate to the nearest pixel to ensure that occlusion by the real world is correct. As noted in the previous section, usability evaluation will help determine what level of accuracy a warfighter requires to complete a (simulated) mission.

The second aspect is to use BARS to provide trainees with a set of "training wheels". For example, BARS could be used to visualize Fatal Funnels or other structural risks in urban environments. Furthermore, it could be combined with recording or playback systems to assist in post-mortem analysis of a training exercise.

The final aspect is to provide the trainer with a BARS system. Through its ability to convey situational awareness information, such as the location of trainees who might not be visible from the trainer's vantage point, BARS could enable the synthesis of more compelling and difficult training scenarios.

Current research plans are considering the first of these training aspects and, in particular, we are beginning to study how to interface BARS with a simulation system.

SUMMARY

We have presented the Battlefield Augmented Reality System in its current research state. The basic goal of BARS is to aid situational awareness for MOUT. To provide a useful and usable system, we are conducting research on the user interface and collaboration methods. We are beginning to use the current prototype to formally evaluate the usefulness and usability of the system, and expect to conduct our first user studies on basic information display research in the coming
months. As we continue to refine the BARS domain analysis and subsequent usability engineering activities, we will iteratively improve the current prototype to a field-deployable prototype in the coming years.

REFERENCES

[Baillot-02] Y. Baillot, E. Gagas, T. Höllerer, S. Feiner and S. Julier, "Wearable 3D Graphics for Augmented Reality: A Case Study of Two Experimental Backpack Computers", unpublished manuscript (technical report in preparation).

[Bowman-02] D. Bowman, J. L. Gabbard and D. Hix, "A Survey of Usability Evaluation in Virtual Environments: Classification and Comparison of Methods", Presence: Teleoperators and Virtual Environments, Vol. 11, No. 4, 2002, pp. 435-455.

[CFMOUT-97] Concepts Division, Marine Corps Combat Development Command, "A Concept for Future Military Operations on Urbanized Terrain," approved July 1997.

[DARPAWV-97] "Warfighter Visualization Program", http://web-ext2.darpa.mil/ETO/wv/index.html

[Darken-99] R. P. Darken and H. Cevik, "Map Usage in Virtual Environments: Orientation Issues," Proceedings of IEEE Virtual Reality '99, Houston, TX, March 13-17, 1999, pp. 133-140.

[Dumanoir-02] P. Dumanoir, P. Garrity, B. Witmer and R. Lowe, "Embedded Training for Dismounted Soldiers (ETDS)", submitted to I/ITSEC 2002, Orlando, FL, December 2002.

[Durbin-98] J. Durbin, S. J. Julier, B. Colbert, J. Crowe, R. Doyle, R. King, LCDR T. King, C. Scannell, Z. J. Wartell and T. Welsh, Proceedings of the SPIE AeroSense Conference, Orlando, FL, 1998.

[Feiner-97] S. Feiner, B. MacIntyre, T. Höllerer and T. Webster, "A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment", Proceedings of ISWC '97 (International Symposium on Wearable Computers), Cambridge, MA, 1997.

[FLEXIPC-98] "Specification of the ViA II Wearable Computer", http://www.flexipc.com

[Gabbard-02] J. L. Gabbard, J. E. Swan, D. Hix, M. Lanzagorta, M. A. Livingston, D. Brown and S. Julier, "Usability Engineering: Domain Analysis Activities for Augmented Reality Systems," Proceedings of The Engineering Reality of Virtual Reality 2002, SPIE (International Society for Optical Engineering) and IS&T (Society for Imaging Science and Technology) Electronic Imaging 2002, San Jose, CA, January 19-25, 2002.

[Gage-97] D. W. Gage, "Network Protocols for Mobile Robot Systems", SPIE AeroSense Conference, Mobile Robotics XII, Vol. 3120, pp. 107-118, October 1997.

[Gold-93] R. Gold, B. Buxton, S. Feiner, C. Schmandt, P. Wellner and M. Weiser, "Ubiquitous Computing and Augmented Reality", Proceedings of SIGGRAPH 93: Computer Graphics, Anaheim, CA, 1993.

[Gumm-98] M. M. Gumm, W. P. Marshak, T. A. Branscome, M. McWesler, D. J. Patton and L. L. Mullins, "A Comparison of Soldier Performance using Current Land Navigation Equipment with Information Integrated on a Helmet-Mounted Display," ARL Report ARL-TR-1604, DTIC Report 19980527081, April 1998.

[Hammel-68] E. Hammel, "Fire in the Streets: The Battle for Hue, Tet 1968", Pictorial Histories Pub. Co, 1968.

[Julier-99] S. Julier, S. Feiner and L. Rosenblum, "Augmented Reality as an Example of a Demanding Human-Centered System", First EC/NSF Advanced Research Workshop, France, 1-4 June 1999.

[Julier-00] S. Julier, M. Lanzagorta, Y. Baillot, L. Rosenblum, S. Feiner, T. Höllerer and S. Sestito, "Information Filtering for Mobile Augmented Reality", 2000 International Symposium on Augmented Reality.

[Julier-00a] S. Julier, Y. Baillot, D. Brown and L. Rosenblum, "BARS: Battlefield Augmented Reality System," NATO Symposium on Information Processing Techniques for Military Systems, Istanbul, Turkey, 9-11 October 2000.

[Lanzagorta-98] M. Lanzagorta, E. Kuo and J. Uhlmann, "GROTTO Visualization for Decision Support", Proceedings of the SPIE 12th Annual International Symposium on Aerospace/Defense Sensing, Simulation, and Controls, Orlando, FL, 1998.

[MacIntyre-98] B. MacIntyre and S. Feiner, "A Distributed 3D Graphics Library", Proceedings of SIGGRAPH 1998, Orlando, FL, 1998.

[SSCOMWL-98] "Land Warrior Home Page", http://www-sscom.army.mil/prodprog/lw/index.htm