
Proceedings of the Interservice/Industry Training, Simulation, & Education Conference (I/ITSEC '02), Orlando, FL, December 2-5, 2002.

AN AUGMENTED REALITY SYSTEM FOR MILITARY OPERATIONS IN URBAN TERRAIN

Mark A. Livingston¹, Lawrence J. Rosenblum¹, Simon J. Julier², Dennis Brown², Yohan Baillot², J. Edward Swan II¹, Joseph L. Gabbard³, and Deborah Hix³

¹ Advanced Information Technology, Naval Research Laboratory, Washington, DC 20375
² ITT Advanced Engineering and Sciences, Alexandria, VA 22303
³ Systems Research Center, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061

ABSTRACT

Many future military operations are expected to occur in urban environments. These complex, three-dimensional battlefields introduce many challenges to the dismounted warfighter. Better situational awareness is required for effective operation in urban environments. However, delivering this information to the dismounted warfighter is extremely difficult. For example, maps draw a user's attention away from the environment and cannot directly represent the three-dimensional nature of the terrain.

To overcome these difficulties, we are developing the Battlefield Augmented Reality System (BARS). The system consists of a wearable computer, a wireless network system, and a tracked see-through head-mounted display (HMD). The computer generates graphics that, from the user's perspective, appear to be aligned with the actual environment. For example, a building could be augmented to show its name, a plan of its interior, icons to represent reported sniper locations, and the names of adjacent streets.

This paper surveys the current state of development of BARS and describes ongoing research efforts. We describe four major research areas. The first is the development of an effective, efficient user interface for displaying data and processing user inputs. The second is the capability for collaboration between multiple BARS users and other systems. Third, we describe the current hardware for both a mobile and an indoor prototype system. Finally, we describe initial efforts to formally evaluate the capabilities of the system from a user's perspective through scenario analysis. We also discuss the use of the BARS system in STRICOM's Embedded Training initiative.

ABOUT THE AUTHORS

MARK A. LIVINGSTON is a Research Scientist in the Virtual Reality Laboratory at the Naval Research Laboratory, where he works on the Battlefield Augmented Reality System (BARS). He received his Ph.D. from the University of North Carolina at Chapel Hill, where he helped develop a clinical augmented reality system for both ultrasound-guided and laparoscopic surgical procedures, focusing on tracking subsystems. His current research focuses on vision-based tracking algorithms and on user perception in augmented reality systems. Livingston is a member of IEEE, ACM, and SIGGRAPH, and is a member of the VR2003 conference committee.

LAWRENCE J. ROSENBLUM is Director of VR Systems and Research at the Naval Research Laboratory (NRL) and Program Officer for Visualization and Computer Graphics at the Office of Naval Research (ONR). Rosenblum received his Ph.D. in mathematics from The Ohio State University. He is on the Editorial Boards of IEEE CG&A and J. Virtual Reality and the Advisory Board of the IEEE Transactions on Visualization and Computer Graphics. He was the elected Chairman of the IEEE Technical Committee on Computer Graphics from 1994-1996 and is currently a TC Director. He is a founder and steering committee member of the IEEE Visualization and IEEE VR conference series. Elected a Senior Member of the IEEE in 1994, Rosenblum is also a member of the IEEE Computer Society, ACM, SIGGRAPH, and the AGU.

SIMON J. JULIER is a Research Scientist for ITT Industries at the Naval Research Laboratory. He received a D.Phil. from the Robotics Research Group, Oxford University, UK. He is a technical lead on the Battlefield Augmented Reality System (BARS) project. His research interests include mobile augmented reality and large-scale distributed data fusion.

DENNIS BROWN is a Senior Software Engineer for ITT Industries at the Naval Research Laboratory. He received his M.S. in Computer Science from the University of North Carolina at Chapel Hill. He works on the Battlefield Augmented Reality System (BARS) and multi-modal virtual reality projects. His research interests include ubiquitous computing and data distribution.

YOHAN BAILLOT is a computer and electrical engineer with ITT Industries at the Naval Research Laboratory. He received an M.S. in electrical engineering in 1996 from ISIM, France, and an M.S. in computer science in 1999 from the University of Central Florida. His research interests are in computer graphics, 3D displays, tracking, vision, mobile augmented reality, and wearable computers. Baillot is a member of the IEEE Computer Society.

J. EDWARD SWAN II is a Research Scientist with the Virtual Reality Laboratory at the Naval Research Laboratory, where he conducts research in computer graphics and human-computer interaction. At the Naval Research Laboratory he is primarily motivated by the problem of battlefield visualization. Currently, he is studying multi-modal input techniques for virtual and augmented reality systems. He received his Ph.D. from The Ohio State University in 1997. Swan is a member of ACM, SIGGRAPH, SIGCHI, IEEE, and the IEEE Computer Society.

JOSEPH L. GABBARD is a senior research associate at Virginia Tech in Blacksburg, VA, where he performs human-computer interaction research in non-traditional interactive systems such as VR and AR. He is currently interested in researching and developing usability engineering methods specifically for VEs. His other interests include developing innovative and intuitive interaction techniques employing ubiquitous input technology. He is currently pursuing his Ph.D. in computer science at Virginia Tech, where he also received his M.S. in computer science. Gabbard is a member of the IEEE and the IEEE Computer Society.

DEBORAH HIX is a Research Scientist at Virginia Tech in Blacksburg, VA. She received her Ph.D. in Computer Science and Applications from Virginia Tech. Hix is a pioneer in the field of human-computer interaction (HCI), focusing on usability engineering. She is co-author of the popular book Developing User Interfaces: Ensuring Usability through Product and Process, published by John Wiley and Sons and used worldwide by usability practitioners and university students. Hix has done extensive research, teaching, and consulting with a variety of industrial and government organizations in the area of usability engineering for over 20 years. Most recently, she has extended her two decades of HCI work into the usability of virtual environments. She is a member of several professional organizations and professional honor societies.

Many future military operations will occur in urban environments [CFMOUT-97]. Military operations in urban terrain (MOUT) present many unique and challenging conditions for the warfighter. The environment is extremely complex and inherently three-dimensional. Above street level, buildings serve varying purposes (such as hospitals or communication stations). They can harbor many risks, such as snipers or mines, which can be located on different floors. Below street level, there can be an elaborate network of sewers and tunnels. The environment can be cluttered and dynamic. Narrow streets restrict lines of sight and make it difficult to plan and coordinate group activities. Threats, such as snipers, can continuously move, and the structure of the environment itself can change; for example, a damaged building can fill a street with rubble, making a once-safe route impassable. Such difficulties are compounded by the need to minimize the number of civilian casualties and the amount of damage to civilian targets.

In principle, many of these difficulties can be overcome through better situational awareness. The Concepts Division of the Marine Corps Combat Development Command (MCCDC) concludes [CFMOUT-97]:

"Units moving in or between zones must be able to navigate effectively, and to coordinate their activities with units in other zones, as well as with units moving outside the city. This navigation and coordination capability must be resident at the very-small-unit level, perhaps even with the individual Marine."

These conclusions were strengthened in the document "Future Military Operations on Urbanized Terrain," where the MCCDC notes:

"...we must explore new technologies that will facilitate the conduct of maneuver warfare in future MOUT. Advanced sensing, locating, and data display systems can help the Marines to leverage information in ways which will reduce some of the masking effects of built-up terrain."

Finally, in 2001 the DUSD (S&T) identified five critical hard topics, one of which was MOUT. Under MOUT, the use of augmented reality technology to enhance situational awareness was a noted technology improvement.

A number of research programs have explored the means by which navigation and coordination information can be delivered to the dismounted soldier. Many of these approaches are based on handheld maps (e.g., on an Apple Newton) or opaque head-mounted displays (HMDs). For example, the Land Warrior program introduced a head-mounted display that combined a map and a "rolling compass" [Gumm-98]. Unfortunately, these methods have a number of limitations. They obscure the user's field of view and do not truly represent the three-dimensional nature of the environment. Moreover, they require the user to mentally integrate the graphical display with the environment to make sense of it; this integration is sometimes difficult and distracts from the current task. To overcome these problems, we propose the use of a mobile augmented reality system.

A mobile augmented reality system consists of a computer, a tracking system, and a see-through HMD. The system tracks the position and orientation of the user's head and superimposes graphics and annotations that are aligned with real objects in the user's field of view. With this approach, complicated spatial information can be directly aligned with the environment. For example, the name of a building could appear as a "virtual sign post" attached directly to the side of the building. To explore the feasibility of such a system, the Naval Research Laboratory (NRL) is developing a prototype augmented reality (AR) system known as BARS, the Battlefield Augmented Reality System. This system will network multiple outdoor, mobile users together with a command center.


To achieve this goal, many challenges must be overcome [Julier-99]. This paper surveys the current state of development of BARS and describes ongoing research efforts. We describe four major research areas. The first is the development of an effective, efficient user interface for displaying data and processing user inputs (such as the creation of new reports). The second is the capability for collaboration between multiple BARS users and other systems (CAVEs or Workbenches). Third, we describe the current hardware for both the mobile and indoor prototype systems. Finally, we describe initial efforts to formally evaluate the capabilities of the system from a user's perspective. We discuss the scenario analysis we have performed for the system and the conclusions drawn to date. We also discuss the use of the BARS system in STRICOM's Embedded Training initiative.

BARS USER INTERFACE

The mobile outdoor system is designed with usability engineering methods to support efficient user task performance. BARS must provide information to the user, and the user must be able to enter data into the system. Neither flow of information can be allowed to distract the user from the primary task. An important feature of the user interface is that BARS must be able to monitor many sources of data about the user and use intelligent heuristics to combine those data with information about the environment and tasks. For example, it might be possible to monitor the level of stress of the user in order to tailor the amount of information presented, reducing it to a minimum during high-stress situations.

The Shared Information Database

The system contains a detailed 3D model of objects in the real environment that is used to generate the registered graphical overlay. This model is stored in a shared database that also contains information about the objects, such as a general description, threat classification, etc. Using knowledge representation and reasoning techniques, we can also store in this database information about the objects' relevance to each other and to the user's task.
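
The paper does not specify the database schema, but since high-level object management is written in Java, one record in the shared database might look like the following sketch. All class and field names are hypothetical, chosen only to mirror the attributes listed above (description, threat classification, and inter-object relevance).

// Hypothetical sketch of one record in the shared information database.
// The names mirror the attributes described above; this is not the
// actual BARS schema. (JDK 1.3 style: no generics.)
public class EnvironmentObject {
    public final String id;                 // unique identifier shared by all users
    public final String description;        // general description, e.g. "hospital"
    public final int threatClass;           // threat classification (0 = benign)
    public final double[] position;         // world coordinates {x, y, z}, z up
    public final java.util.List relatedIds; // ids of objects relevant to this one

    public EnvironmentObject(String id, String description, int threatClass,
                             double[] position, java.util.List relatedIds) {
        this.id = id;
        this.description = description;
        this.threatClass = threatClass;
        this.position = position;
        this.relatedIds = relatedIds;
    }
}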

The Information Filter

The shared database contains much information about the local environment. Showing all of this information can lead to a cluttered and confusing display. We use an information filter to add objects to, or remove objects from, the user's display. We use a spatial filter to show only those objects that lie in a certain zone around the user. This zone can be visualized as a cylinder whose main axis is parallel to the user's "up" vector; objects that fall within the cylinder's walls are shown, and the user can vary the inner and outer diameters of the cylinder walls. We also use semantic filters based on the user's task or orders from a commander; for example, a route associated with a task will be shown regardless of the user's spatial filter settings, and threats will be shown at all times.
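
As a minimal sketch of the filter just described (reusing the hypothetical EnvironmentObject record from the earlier sketch): an object is displayed when its horizontal distance from the user falls between the inner and outer cylinder walls, unless a semantic rule overrides the spatial test. The override conditions shown are illustrative, not the actual BARS rules.

// Illustrative combination of the spatial (cylinder) and semantic filters.
// Method and field names are hypothetical.
public class InformationFilter {
    private double innerRadius;  // inner cylinder wall, adjustable by the user
    private double outerRadius;  // outer cylinder wall, adjustable by the user

    public InformationFilter(double innerRadius, double outerRadius) {
        this.innerRadius = innerRadius;
        this.outerRadius = outerRadius;
    }

    // True if the object should be drawn for a user standing at (ux, uy).
    // Height is ignored because the cylinder's axis is the user's "up" vector.
    public boolean isVisible(EnvironmentObject obj, double ux, double uy,
                             boolean onTaskRoute) {
        // Semantic overrides: threats and task-related routes always show.
        if (obj.threatClass > 0 || onTaskRoute) {
            return true;
        }
        // Spatial test: distance in the ground plane against the annulus
        // formed by the cylinder's inner and outer walls.
        double dx = obj.position[0] - ux;
        double dy = obj.position[1] - uy;
        double d = Math.sqrt(dx * dx + dy * dy);
        return d >= innerRadius && d <= outerRadius;
    }
}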

Selecting Objects

Early uses of BARS will mainly consist of users observing and selecting objects in the environment, either to find out more about them ("Where is the electrical cut-off switch?") or to add information about them ("I saw a sniper on the third floor of that building."). Thus, the system should include a mechanism that allows the user to easily select items in the environment.

Our research on interaction paradigms is guided by two facts. First, many of the objects a user interacts with are distant (greater than 5 m away) and large (e.g., a building). Second, the position and orientation of the user's head are accurately tracked. Therefore, most interactions are via gestures that require a user to point at distant objects. To date, we have utilized a handheld wireless mouse. The gestural input requires two steps. First, the user faces the possible object of interest (adjusting head orientation). Then, using the mouse, the user maneuvers a cursor over the object. When the user presses the mouse button, a "gaze ray" is constructed from the user's head position through the cursor position; this ray is intersected with the shared information database to determine which objects have been selected. Although current tracking methods do not always achieve the accuracy necessary, we find them sufficient and are working to improve the performance of the tracking system.
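
The selection step reduces to a ray cast against the shared database. The sketch below tests bounding spheres for brevity; the actual system intersects the full 3D model, and all names are again hypothetical (EnvironmentObject is the record sketched earlier).

// Illustrative gaze-ray pick. 'origin' is the tracked head position and
// 'dir' is a unit vector through the cursor position. Bounding spheres
// stand in for the real model geometry.
public class ObjectSelector {
    public static EnvironmentObject pick(java.util.List objects,
                                         double[] origin, double[] dir,
                                         double boundingRadius) {
        EnvironmentObject best = null;
        double bestT = Double.MAX_VALUE;
        for (int i = 0; i < objects.size(); i++) {
            EnvironmentObject obj = (EnvironmentObject) objects.get(i);
            // Distance along the ray to the point nearest the object center.
            double t = 0.0;
            for (int k = 0; k < 3; k++) {
                t += (obj.position[k] - origin[k]) * dir[k];
            }
            if (t < 0.0) continue;  // object is behind the user
            // Squared distance from the object center to the ray.
            double dist2 = 0.0;
            for (int k = 0; k < 3; k++) {
                double offAxis = obj.position[k] - (origin[k] + t * dir[k]);
                dist2 += offAxis * offAxis;
            }
            // Keep the nearest object whose bounding sphere the ray pierces.
            if (dist2 <= boundingRadius * boundingRadius && t < bestT) {
                bestT = t;
                best = obj;
            }
        }
        return best;
    }
}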

Speech and Gesture Input

The mouse-based interface described in the previous subsection has two important limitations. First, it is difficult to perform complicated interactions with a handheld mouse; a user must resort to various types of drop-down menus. Second, one of the user's hands is occupied with the need to hold and manipulate the mouse. To overcome these problems, we are researching speech and gesture input techniques. These techniques will support more sophisticated interactions and minimize errors. We are implementing speech and gesture techniques with the Adaptive Agent Architecture, which is part of the QuickSet application suite [Cohen97]. We have already performed a preliminary integration of a 2D handheld gesture display with BARS, and we are investigating how novel 3D tracking technologies can be used to implement 3D gesture recognition.


Figure 1: A remote BARS user is highlighted with a box shape. In this example, the user is also physically visible,
but the position information is transmitted for all mobile users and can show the location of an occluded user.

COLLABORATION BETWEEN USERS

Through its ability to automatically distribute information, BARS can be used to facilitate collaboration between multiple users. Collaboration can occur horizontally (between mobile users) and vertically (between mobile users and a command center).

Collaboration Mechanism

The BARS collaboration system ensures that the relevant parts of the shared database are replicated on every user's machine. Information is deemed relevant to a particular user based on the information filter described previously. Users join distribution channels that work like IP multicast groups; however, the actual implementation does not depend on IP multicast. Based on the importance of the data, the channels use both reliable and unreliable transport mechanisms in order to keep network traffic low. For example, under optimal conditions, user positions are updated in real time (at least 30 Hz) using unreliable transport; in addition, user positions are sent reliably at a frequency of around 5 Hz so that users with overloaded connections will at least receive positions at a usable rate (Figure 1).

A channel contains a class of objects and distributes information about those objects to members of the channel. Some channels are based on physical areas, and as the user moves through the environment or modifies the spatial filter, the system automatically joins or leaves those channels. Other channels are based on semantic information, such as route information only applicable to one set of users, or phase lines only applicable to another set of users. In this case, the user voluntarily joins the channel containing that information, or a commander can join that user to the channel.
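
The dual-rate policy above can be made concrete with a small sketch. The 30 Hz and 5 Hz figures come from the text; the sender interfaces are hypothetical stand-ins for the actual BARS transport layer.

// Illustrative dual-rate position distribution for one channel.
public class PositionChannel {
    public interface Sender { void send(byte[] packet); }

    private final Sender unreliable;  // best-effort, e.g. UDP-style
    private final Sender reliable;    // guaranteed delivery, e.g. TCP-style
    private long lastReliableMillis = 0;

    public PositionChannel(Sender unreliable, Sender reliable) {
        this.unreliable = unreliable;
        this.reliable = reliable;
    }

    // Called for every tracker update (30 Hz or more under good conditions).
    public void publish(byte[] positionPacket) {
        // Every update goes out unreliably; a lost packet is quickly
        // superseded by the next one.
        unreliable.send(positionPacket);
        // About five times a second, also send reliably so that users on
        // overloaded links still receive positions at a usable rate.
        long now = System.currentTimeMillis();
        if (now - lastReliableMillis >= 200) {  // 200 ms, i.e. roughly 5 Hz
            reliable.send(positionPacket);
            lastReliableMillis = now;
        }
    }
}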

Figure 2: An annotated view of the hardware configuration of the current BARS prototype.

BARS PROTOTYPE

Built from commercial, off-the-shelf (COTS) products, the mobile prototype for BARS is composed of (Figure 2):

• Ashtech GG24-Surveyor (real-time differential kinematic GPS receiver for position tracking)
• InterSense InertiaCube2 (for orientation tracking)
• Sony Glasstron LDI-D100B see-through HMD (when color and stereo rendering are important) or
• Microvision laser retinal scanning see-through head-worn display (when legibility in very bright or very dim conditions is important)
• Dell Inspiron 7000 notebook computer (main CPU and 3D graphics engine)
• WaveLAN 802.11 11 Mbps wireless network card and FreeWave 115 Kbps radio modem (currently used just to broadcast GPS differential corrections)
• Interaction devices (currently a wrist-mounted keyboard and a wireless handheld gyroscope-equipped mouse)

The indoor prototype system uses the same displays, although the laser retinal scanning display is rarely needed under controlled lighting. Indoors, we substitute the InterSense IS900 tracking system for the combination of the GPS and inertial units. This system is similar in that it includes its own inertial components, and it uses ultrasonic blips from microphones mounted in rails hanging from the ceiling in place of GPS. The tracking algorithm internal to the device is quite similar to the combined GPS and inertial method on the mobile prototype. We use a Dell PC equipped with dual Xeon 1.7 GHz processors, an ATI FireGL II graphics processor, a standard Ethernet network connection, a standard keyboard, and a wireless handheld gyroscope-equipped mouse.

The software is implemented using Java JDK 1.3 for high-level object management and C for high-performance graphics rendering. The combination of software and hardware yields a system able to register a 3D model in stereo at more than 30 frames per second on the mobile prototype and 85 frames per second on the indoor prototype.
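
The paper does not describe how the Java and C layers are coupled; in JDK 1.3 the standard bridge for this kind of split is the Java Native Interface, so a declaration might look like the following. The class, method, and library names are ours, not from the BARS source.

// Hypothetical JNI bridge between the Java object-management layer and
// the C rendering code. Nothing here is from the actual BARS source.
public class NativeRenderer {
    // Implemented in C for performance; called once per frame with the
    // current head pose so the overlaid graphics stay registered.
    public native void renderFrame(double[] headPosition,
                                   double[] headOrientation);

    static {
        System.loadLibrary("barsrender");  // hypothetical native library
    }
}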

PRELIMINARY BARS EVALUATION

User interaction occurs in user-based and task-based contexts that are defined by the application domain. Domain analysis plays a critical role in laying the groundwork for developing a user-centered system. We performed domain analysis in close collaboration with several subject matter experts (i.e., military personnel who would be candidate BARS users) [Gabbard-02]. Domain analysis helps define specific user interface requirements as well as user performance requirements, or quantifiable usability metrics, that ensure that subsequent design and development efforts respect the interests of users. User information requirements, also identified during domain analysis (and focused through the development of use cases and scenarios), ensure that the resulting system provides useful and often time-critical insight into a user's current task. The most intuitive and usable interface in the world will not make a system useful unless the core content of the system provides value to the end user. Finally, domain analysis may also shape system requirements, typically with respect to system components that affect user performance.

Domain analysis often includes activities such as use case development, user profiles, and user needs analysis. Use cases describe in detail specific usage contexts within which the system will be used, and for which the system should be designed. User profiles characterize an interactive system's intended operators and their actions while using the system. The process of defining representative users in turn yields information that is useful in making design decisions. A user needs analysis further refines the high-level user goals identified by user profiles by decomposing these goals within the context of the developed use cases. Moreover, the user needs analysis provides an assessment of what capabilities are required of the system to assist users in achieving these goals. These capabilities can then be further analyzed to identify specific user interaction requirements as well as information requirements.

The BARS use case gives a platoon the mission to infiltrate an enemy facility and destroy two tanks of suspicious chemical agents. Analysis of this scenario gave a set of requirements, including the information requirements for different BARS users and the generic set of tasks that each user needs to accomplish. This analysis revealed a set of features that cannot be easily delivered by any current AR system. For example, one user-centered requirement says that the system must be capable of conveying the location of hidden and occluded objects to the user. For example, a warfighter on a mission might want to know the location of friendly forces hidden behind a wall. This requirement spurred research on the display of hidden objects.

Figure 3: A sample protocol to show the location of occluded objects. The first three layers are shown with outlines of varying styles. The last three layers are shown with filled shapes of varying styles.

We have, through expert evaluation, designed three potential protocols (Figure 3 gives one example) through which such information can be displayed. We take advantage of classic methods of technical illustration and use combinations of the following parameters (a hypothetical encoding is sketched after the list):

• solid, dashed, or dotted lines or polygons
• intensity or color
• outlined or filled polygonal representation
• line thickness
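
One possible encoding of such a protocol, loosely matching the layer scheme in Figure 3 (outlines for the first layers, filled shapes for deeper ones), is sketched below. The particular mapping is hypothetical; as the next paragraph notes, the real designs remain speculative until the user-based evaluations are complete.

// Hypothetical mapping from occlusion depth to drawing style, combining
// the four parameters listed above. The constants are illustrative only.
public class OcclusionStyle {
    public static final int SOLID = 0, DASHED = 1, DOTTED = 2;

    public int linePattern;   // SOLID, DASHED, or DOTTED
    public boolean filled;    // outlined (false) or filled (true) polygons
    public float intensity;   // dimmer for deeper layers
    public float lineWidth;   // thinner for deeper layers

    // Style for an object hidden behind 'layers' occluding surfaces
    // (0 = directly visible).
    public static OcclusionStyle forLayer(int layers) {
        OcclusionStyle s = new OcclusionStyle();
        int depth = Math.min(layers, 5);     // limit the distinct layers shown
        s.filled = depth >= 3;               // first three layers: outlines;
                                             // deeper layers: filled shapes
        s.linePattern = depth % 3;           // cycle solid / dashed / dotted
        s.intensity = 1.0f - 0.15f * depth;  // fade with depth
        s.lineWidth = Math.max(1.0f, 3.0f - 0.5f * depth);
        return s;
    }
}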

Until user-based usability evaluations are conducted, however, all such designs are speculative. We have identified a number of principles, such as using multiple parameters to differentiate different distances or numbers of occluding objects, and limiting the number of objects displayed in a given direction; we have also observed that parameters can be confounded or masked by the characteristics of the display. For example, the intensity of the graphics can sometimes be confounded with the background intensity, or with stippling (dashed or dotted) effects. We are conducting user-based evaluations in the summer and fall of 2002 to determine how various parameters interact and how the user performs under a variety of designs and tasks. The evaluation will employ representative domain users performing tasks derived from the BARS use case. To our knowledge, this is one of the first user-based, mobile, outdoor AR usability evaluations. BARS and other non-traditional computer systems are much more difficult to evaluate than their 2D graphical user interface counterparts [Bowman-02] and, as such, will likely require the invention of new evaluation techniques.

In addition, the user-centered requirements identified important performance bounds on known system requirements. For example, by identifying the likely set of objects of interest to BARS users, we discovered that registration (and thus tracking) has to be good enough to accurately position graphical indicators on buildings and streets, but it does not have to be any more accurate than this. This bound is important, because highly accurate tracking is extremely difficult.

EMBEDDED TRAINING AND BARS

So far, this paper has concentrated on the possible uses of BARS as a situational awareness tool. However, BARS and augmented reality have the potential to significantly impact training. As dismounted warrior systems become more sophisticated, the need for detailed, precise, and advanced training and simulation has become paramount. The US Army Simulation, Training, and Instrumentation Command (STRICOM) has initiated an embedded training program [Dumanoir-02] to study how revolutionary techniques can be applied to this domain. STRICOM, in conjunction with NRL, is studying how BARS can impact training at three levels: as a means to blend synthetic and live forces; as a means to provide "training wheels" that show trainees critical information; and as a tool to assist trainers in constructing and operating a training scenario.

The first aspect utilizes BARS to "enrich" an existing scenario. Many MOUT facilities consist of a small group of fairly bare buildings that occupy a self-contained area, typically no more than a few city blocks. However, if a user's position and orientation were accurately tracked, synthetic forces and building features could be inserted into the user's environment. If a user were connected through a wireless network to a simulation system such as OneSAF, users could be presented with reactive entities such as air forces (to simulate a call for fire) or even with individual combatants. Furthermore, BARS could be used to mix live forces at physically different sites (such as multiple MOUT facilities) into the same environment. However, it should be noted that this application is extremely technically challenging. Registration must be accurate to the nearest pixel to ensure that occlusion by the real world is correct. As noted in the previous section, usability evaluation will help determine what level of accuracy a warfighter requires to complete a (simulated) mission.

The second aspect is to use BARS to provide trainees with a set of "training wheels". For example, BARS could be used to visualize fatal funnels or other structural risks in urban environments. Furthermore, it could be combined with recording or playback systems to assist in post mortem analysis of a training exercise.

The final aspect is to provide the trainer with a BARS system. Through its ability to convey situational awareness information, such as the location of trainees who might not be visible from the trainer's vantage point, BARS could enable the synthesis of more compelling and difficult training scenarios.

Current research plans are considering the first of these training aspects; in particular, we are beginning to study how to interface BARS with a simulation system.

SUMMARY

We have presented the Battlefield Augmented Reality System in its current research state. The basic goal of BARS is to aid situational awareness for MOUT. To provide a useful and usable system, we are conducting research on the user interface and collaboration methods. We are beginning to use the current prototype to formally evaluate the usefulness and usability of the system, and expect to conduct our first user studies on basic information display research in the coming months. As we continue to refine the BARS domain analysis and subsequent usability engineering activities, we will iteratively improve the current prototype to a field-deployable prototype in the coming years.

REFERENCES

[Baillot-02] Y. Baillot, E. Gagas, T. Höllerer, S. Feiner, and S. Julier, "Wearable 3D Graphics for Augmented Reality: A Case Study of Two Experimental Backpack Computers," unpublished manuscript (technical report in preparation).

[Bowman-02] D. Bowman, J. L. Gabbard, and D. Hix, "A Survey of Usability Evaluation in Virtual Environments: Classification and Comparison of Methods," Presence: Teleoperators and Virtual Environments, Vol. 11, No. 4, 2002, pp. 435-455.

[CFMOUT-97] Concepts Division, Marine Corps Combat Development Command, "A Concept for Future Military Operations on Urbanized Terrain," approved July 1997.

[DARPAWV-97] "Warfighter Visualization Program," http://web-ext2.darpa.mil/ETO/wv/index.html

[Darken-99] R. P. Darken and H. Cevik, "Map Usage in Virtual Environments: Orientation Issues," Proceedings of IEEE Virtual Reality '99, Houston, TX, March 13-17, 1999, pp. 133-140.

[Dumanoir-02] P. Dumanoir, P. Garrity, B. Witmer, and R. Lowe, "Embedded Training for Dismounted Soldiers (ETDS)," submitted to I/ITSEC 2002, Orlando, FL, December 2002.

[Durbin-98] J. Durbin, S. J. Julier, B. Colbert, J. Crowe, R. Doyle, R. King, LCDR T. King, C. Scannell, Z. J. Wartell, and T. Welsh, Proceedings of the SPIE AeroSense Conference, Orlando, FL, 1998.

[Feiner-97] S. Feiner, B. MacIntyre, T. Höllerer, and T. Webster, "A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment," Proceedings of ISWC '97 (International Symposium on Wearable Computers), Cambridge, MA, 1997.

[FLEXIPC-98] "Specification of the ViA II Wearable Computer," http://www.flexipc.com

[Gabbard-02] J. L. Gabbard, J. E. Swan II, D. Hix, M. Lanzagorta, M. A. Livingston, D. Brown, and S. Julier, "Usability Engineering: Domain Analysis Activities for Augmented Reality Systems," Proceedings of the Conference on The Engineering Reality of Virtual Reality 2002, SPIE (International Society for Optical Engineering) and IS&T (Society for Imaging Science and Technology) Electronic Imaging 2002, San Jose, CA, January 19-25, 2002.

[Gage-97] D. W. Gage, "Network Protocols for Mobile Robot Systems," SPIE AeroSense Conference, Mobile Robotics XII, Vol. 3120, pp. 107-118, October 1997.

[Gold-93] R. Gold, B. Buxton, S. Feiner, C. Schmandt, P. Wellner, and M. Weiser, "Ubiquitous Computing and Augmented Reality," Proceedings of SIGGRAPH 93: Computer Graphics, Anaheim, CA, 1993.

[Gumm-98] M. M. Gumm, W. P. Marshak, T. A. Branscome, M. McWesler, D. J. Patton, and L. L. Mullins, "A Comparison of Soldier Performance Using Current Land Navigation Equipment with Information Integrated on a Helmet-Mounted Display," ARL Report ARL-TR-1604, DTIC Report 19980527081, April 1998.

[Hammel-68] E. Hammel, Fire in the Streets: The Battle for Hue, Tet 1968, Pictorial Histories Pub. Co., 1968.

[Julier-99] S. Julier, S. Feiner, and L. Rosenblum, "Augmented Reality as an Example of a Demanding Human-Centered System," First EC/NSF Advanced Research Workshop, France, June 1-4, 1999.

[Julier-00] S. Julier, M. Lanzagorta, Y. Baillot, L. Rosenblum, S. Feiner, T. Höllerer, and S. Sestito, "Information Filtering for Mobile Augmented Reality," 2000 International Symposium on Augmented Reality.

[Julier-00a] S. Julier, Y. Baillot, D. Brown, and L. Rosenblum, "BARS: Battlefield Augmented Reality System," NATO Symposium on Information Processing Techniques for Military Systems, Istanbul, Turkey, October 9-11, 2000.

[Lanzagorta-98] M. Lanzagorta, E. Kuo, and J. Uhlmann, "GROTTO Visualization for Decision Support," Proceedings of the SPIE 12th Annual International Symposium on Aerospace/Defense Sensing, Simulation, and Controls, Orlando, FL, 1998.

[MacIntyre-98] B. MacIntyre and S. Feiner, "A Distributed 3D Graphics Library," Proceedings of SIGGRAPH 1998, Orlando, FL, 1998.

[SSCOMWL-98] "Land Warrior Home Page," http://www-sscom.army.mil/prodprog/lw/index.htm
