1 Milgram's reality–virtuality continuum. (Adapted from Milgram and Kishino.1) The continuum runs from the real environment through augmented reality and augmented virtuality to the virtual environment; the span between the two extremes is mixed reality.
optical see-through displays, virtual objects can't completely occlude real ones. One experimental display addresses this by interposing an LCD panel between the optical combiner and the real world, blocking the real-world view at selected pixels13 (Figure 6).

Second, most video see-through displays have a parallax error, caused by the cameras being mounted away from the true eye location. If you see the real world through cameras mounted on top of your head, your view is significantly different from what you're normally used to, making it difficult to adapt to the display.14
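To make the per-pixel blocking idea behind the LCD-panel display13 concrete, here is a minimal Python sketch of how an occlusion mask could be derived from a rendered virtual layer's alpha channel. The function and array names are invented for illustration; this is not the published display's pipeline.

```python
import numpy as np

def occlusion_mask(virtual_rgba, threshold=0.5):
    """Derive a per-pixel blocking mask for an occlusion-capable
    optical see-through display (illustrative sketch only).

    virtual_rgba: H x W x 4 float array, the rendered virtual layer.
    Returns an H x W boolean array: True where the panel should
    block the real-world view so the virtual pixel appears opaque.
    """
    alpha = virtual_rgba[..., 3]
    return alpha >= threshold

# Example: a frame whose left half is an opaque virtual object.
frame = np.zeros((480, 640, 4), dtype=np.float32)
frame[:, :320, 3] = 1.0           # opaque virtual pixels
mask = occlusion_mask(frame)      # True on the left half only
```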
4 Experimental head-worn projective display using lightweight optics. (Courtesy of Jannick Rolland, University of Central Florida, and Frank Biocca, Michigan State University.)

The MR Systems Lab developed a relatively lightweight (340 grams) video graphics array (VGA) resolution video see-through display, with a 51-degree horizontal field of view, in which the imaging system and display system optical axes are aligned for each eye.15 Finally, most displays have fixed eye accommodation (focusing the eyes at a particular distance). Some prototype video and optical see-through displays can selectively set accommodation to correspond to vergence by moving the display screen or a lens through which it's imaged. One version can cover a range of 0.25 meters to infinity in 0.3 seconds.16

New tracking sensors and approaches
Accurately tracking the user's viewing orientation and position is crucial for AR registration. Rolland et al.17 give a recent overview of tracking systems. For prepared, indoor environments, several systems demonstrate excellent registration. Typically such systems employ hybrid-tracking techniques (such as magnetic and video sensors) to exploit the strengths and compensate for the weaknesses of individual tracking technologies. A system combining accelerometers and video tracking demonstrates accurate registration even during rapid head motion.18 The Single Constraint at a Time (Scaat) algorithm also improved tracking performance. Scaat incorporates individual measurements at the exact time they occur, resulting in faster update rates, more accurate solutions, and autocalibrated parameters.19 Two new scalable tracking systems, InterSense's Constellation20 and 3rdTech's HiBall,21 can cover the large indoor environments needed by some AR applications.
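To make the Scaat idea concrete, the toy filter below folds in each scalar measurement at its own timestamp instead of waiting for a complete pose measurement. This one-dimensional Kalman filter is only an illustrative sketch of the single-constraint-at-a-time principle, not Welch and Bishop's implementation;19 all names and noise values are invented for the example.

```python
class ScaatStyleFilter:
    """Toy 1D Kalman filter illustrating single-constraint-at-a-time
    updates: predict the state forward to each measurement's exact
    timestamp, then immediately fold in that one scalar reading."""

    def __init__(self, x0=0.0, p0=1.0, process_noise=0.01):
        self.x = x0   # state estimate, e.g., one orientation angle
        self.p = p0   # variance of the estimate
        self.q = process_noise  # uncertainty growth per second
        self.t = 0.0  # time of the last incorporated measurement

    def update(self, z, t, r):
        """Incorporate one scalar measurement z taken at time t,
        with measurement noise variance r."""
        # Time update: uncertainty grows while no data arrives.
        self.p += self.q * (t - self.t)
        self.t = t
        # Measurement update with this single constraint.
        k = self.p / (self.p + r)        # Kalman gain
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x

# Readings from different sensors are used as soon as they arrive,
# rather than being batched into one complete pose measurement.
f = ScaatStyleFilter()
f.update(z=1.02, t=0.010, r=0.10)   # e.g., a vision feature
f.update(z=0.98, t=0.015, r=0.50)   # e.g., a magnetic reading
```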
5 Projection display used to camouflage a haptic input device. The haptic input device normally doesn't reflect projected graphics (top); coated with retroreflective material, it does (bottom). (Courtesy Tachi Lab, University of Tokyo)

Visual tracking generally relies on modifying the environment with fiducial markers placed in the environment
compass/gyroscope tracker provides motion-stabilized orientation measurements at several outdoor locations26 (Figure 7). With the addition of video tracking (not in real time), the system produces nearly pixel-accurate results on known landmark features.27,28 The Townwear system29 uses custom packaged fiber-optic gyroscopes for high accuracy and low drift rates.30 Either the Global Positioning System (GPS) or dead reckoning techniques usually track the real-time position outdoors, although both have significant limitations (for example, GPS requires a clear view of the sky).

7 Motion-stabilized labels annotate the Phillips Tower, as seen from two different viewpoints.
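The hybrid trackers described above typically blend a smooth but drift-prone gyroscope with an absolute but noisy compass. Below is a minimal complementary-filter sketch of that style of fusion; the blend weight, units, and sample values are illustrative choices, not parameters of the cited systems.

```python
def fuse_heading(prev_heading, gyro_rate, compass_heading, dt, k=0.98):
    """Complementary filter for yaw, in degrees (sketch only;
    angle wrap-around is ignored for brevity).

    gyro_rate: angular velocity from the gyroscope (deg/s) --
        smooth at high rates but drifts over time.
    compass_heading: absolute heading from the magnetometer (deg)
        -- drift-free but noisy.
    k: blend weight; higher trusts the gyro more.
    """
    gyro_heading = prev_heading + gyro_rate * dt  # integrate rate
    return k * gyro_heading + (1.0 - k) * compass_heading

# Each step, the compass gently pulls the integrated gyro heading
# back toward the true direction, suppressing long-term drift.
heading = 90.0
for gyro_rate, compass in [(10.0, 90.4), (9.5, 91.2), (10.2, 92.1)]:
    heading = fuse_heading(heading, gyro_rate, compass, dt=0.01)
```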
Ultimately, tracking in unprepared environments
10 Heterogeneous displays in Emmie, combining head-worn, projected, and private flat-screen displays. (Courtesy A. Butz, T. Höllerer, S. Feiner, B. MacIntyre, C. Beshers, Columbia Univ.)

11 The Studierstube (top) and Magic Book (bottom) collaborative AR systems, with two users wearing see-through HMDs. (Courtesy Dieter Schmalstieg, Vienna University of Technology, and Mark Billinghurst, Human Interface Technologies Lab.)
■ using heterogeneous devices to leverage the advantages of different displays and
■ integrating with the physical world through tangible interfaces.
14 Virtual and real occlusions. The brown cow and tree are virtual; the rest is real. (Courtesy INRIA)
17 Battlefield Augmented Reality System, a descendant of the Touring Machine. (Courtesy of the Naval Research Lab.)

16 A view through a see-through HMD shows a 3D model of a demolished building at its original location. (Courtesy T. Höllerer, S. Feiner, J. Pavlik, Columbia Univ.)
Figure 19 shows two current examples of AR in sports. In both systems, the environments are carefully modeled ahead of time, and the cameras are calibrated and precisely tracked. For some applications, augmentations are added solely through real-time video tracking. Delaying the video broadcast by a few video frames eliminates the registration problems caused by system latency. Furthermore, the predictable environment (uniformed players on a green, white, and brown field) lets the system use custom chroma-keying techniques to draw the yellow line only on the field rather than over the players. With similar approaches, advertisers can embellish broadcast video with virtual ads and product placements (Figure 20).

20 Virtual advertising. The Pacific Bell and Pennsylvania Lottery ads are AR augmentations. (Courtesy Pacific Video Image)
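The chroma-keyed compositing step described above can be sketched in a few lines of Python. This fragment is illustrative only: the field-color thresholds are invented, and real broadcast systems use carefully calibrated color gamuts and tracked camera models rather than this crude test.

```python
import numpy as np

def composite_on_field(frame, overlay, overlay_alpha):
    """Draw a virtual overlay (e.g., a first-down line) only over
    field-colored pixels, so players and officials occlude it.

    frame: H x W x 3 uint8 broadcast image.
    overlay: H x W x 3 uint8 virtual graphic, already registered
        to the tracked camera pose.
    overlay_alpha: H x W float in [0, 1], where the graphic exists.
    """
    r = frame[..., 0].astype(np.int16)
    g = frame[..., 1].astype(np.int16)
    b = frame[..., 2].astype(np.int16)
    # Crude field-color test (illustrative thresholds): green
    # dominates, as on grass. Real systems key on calibrated
    # gamuts for the green, white, and brown field regions.
    is_field = (g > r + 20) & (g > b + 20)
    a = overlay_alpha * is_field            # key out non-field pixels
    out = frame.astype(np.float32)
    out = (1.0 - a[..., None]) * out + a[..., None] * overlay
    return out.astype(np.uint8)
```

Because the key is computed per pixel from the live video, a player who steps onto the line simply fails the field-color test there, and the broadcast pixels show through untouched.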
Future work
Apart from the few commercial examples described in the last section, the state of the art in AR today is comparable to the early years of VR—many research systems have been demonstrated but few have matured beyond lab-based prototypes.87 We've grouped the major obstacles limiting the wider use of AR into three areas: technological limitations, user interface limitations, and social acceptance.
Technological limitations
Although we've seen much progress in the basic enabling technologies, their limitations still prevent the deployment of many AR applications. Displays, trackers, and AR systems in general need to become more accurate, lighter, and cheaper, and to consume less power. By describing problems from our common experiences in building outdoor AR systems, we hope to impart a sense of the many areas that still need improvement.

Displays such as the Sony Glasstron are intended for indoor consumer use and aren't ideal for outdoor use. The display isn't very bright and completely washes out in bright sunlight. The image has a fixed focus to appear several feet away from the user, which is often closer than the outdoor landmarks. The equipment isn't nearly as portable as desired. Since the user must wear the PC, sensors, display, batteries, and everything else required, the end result is a cumbersome and heavy backpack.

Laptops today have only one CPU, limiting the amount of visual and hybrid tracking that we can do. Operating systems aimed at the consumer market aren't built to support real-time computing, but specialized real-time operating systems don't have the drivers to support the sensors and graphics in modern hardware.

Tracking in unprepared environments remains an enormous challenge. Outdoor demonstrations today have shown good tracking only with significant restrictions in operating range, often with sensor suites that are too bulky and expensive for practical use. Today's systems generally require extensive calibration procedures that an end user would find unacceptably complicated. Many connectors, such as universal serial bus (USB) connectors, aren't rugged enough for outdoor operation and are prone to breaking.

While we expect some improvements to naturally occur from other fields such as wearable computing, research in AR can reduce these difficulties through improved tracking in unprepared environments and calibration-free or autocalibration approaches to minimize set-up requirements.

User interface limitations
We need a better understanding of how to display data to a user and how the user should interact with the data. Most existing research concentrates on low-level perceptual issues, such as properly perceiving depth or how latency affects manipulation tasks. However, AR also introduces many high-level tasks, such as the need to identify what information should be provided, what's the appropriate representation for that data, and how the user should make queries and reports. For example, a user might want to walk down a street, look in a shop window, and query the inventory of that shop. To date, few have studied such issues. However, we expect significant growth in this area because research AR systems with sufficient capabilities are now more commonly available. For example, recent work suggests that the creation and presentation of narrative performances and structures may lead to more realistic and richer AR experiences.88

Social acceptance
The final challenge is social acceptance. Given a system with ideal hardware and an intuitive interface, how can AR become an accepted part of a user's everyday life, just like a mobile phone or a personal digital assistant (PDA)? Through films and television, many people are familiar with images of simulated AR. However, persuading a user to wear a system means addressing a number of issues. These range from fashion (will users wear a system if they feel it detracts from their appearance?) to privacy concerns (the tracking required for displaying information can also be used for monitoring and recording). To date, little attention has been paid to these fundamental issues. However, they must be addressed before AR becomes widely accepted. ■
Acknowledgements
Part of this work was funded by the US Office of Naval Research contracts N00014-99-3-0018, N00014-00-1-0361, N00014-99-1-0249, N00014-99-1-0394, N00014-99-0683, and the US National Science Foundation Grant IIS-00-82961.
References
1. P. Milgram and F. Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Trans. Information Systems, vol. E77-D, no. 12, 1994, pp. 1321-1329.
2. I. Sutherland, "A Head-Mounted Three-Dimensional Display," Fall Joint Computer Conf., Am. Federation of Information Processing Soc. (AFIPS) Conf. Proc. 33, Thompson Books, Washington, D.C., 1968, pp. 757-764.
3. R. Azuma, "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, Aug. 1997, pp. 355-385.
4. H.L. Pryor, T.A. Furness, and E. Viirre, "The Virtual Retinal Display: A New Display Technology Using Scanned Laser Light," Proc. 42nd Human Factors Ergonomics Soc., Santa Monica, Calif., 1998, pp. 1570-1574.
5. M.B. Spitzer et al., "Eyeglass-Based Systems for Wearable Computing," Proc. 1st Int'l Symp. Wearable Computers (ISWC 97), IEEE CS Press, Los Alamitos, Calif., 1997, pp. 48-51.
6. I. Kasai et al., "A Forgettable Near Eye Display," Proc. 4th Int'l Symp. Wearable Computers (ISWC 2000), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 115-118.
7. J. Rekimoto, "NaviCam: A Magnifying Glass Approach to Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, Aug. 1997, pp. 399-412.
8. R. Raskar, G. Welch, and W-C. Chen, "Table-Top Spatially-Augmented Reality: Bringing Physical Models to Life with Projected Imagery," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 64-71.
9. H. Hua et al., "An Ultra-light and Compact Design and Implementation of Head-Mounted Projective Displays," Proc. IEEE Virtual Reality 2001, IEEE CS Press, Los Alamitos, Calif., 2001, pp. 175-182.
10. M. Inami et al., "Visuo-Haptic Display Using Head-Mounted Projector," Proc. IEEE Virtual Reality 2000, IEEE CS Press, Los Alamitos, Calif., 2000, pp. 233-240.
11. S. Mann, "Telepointer: Hands-Free Completely Self Contained Wearable Visual Augmented Reality without Headwear and without Any Infrastructural Reliance," Proc. 4th Int'l Symp. Wearable Computers (ISWC 2000), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 177-178.
12. J. Underkoffler and H. Ishii, "Illuminating Light: An Optical Design Tool with a Luminous-Tangible Interface," Proc. ACM Special Interest Group on Computer–Human Interaction (SIGCHI 98), ACM Press, New York, 1998, pp. 542-549.
13. K. Kiyokawa, Y. Kurata, and H. Ohno, "An Optical See-through Display for Mutual Occlusion of Real and Virtual Environments," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 60-67.
14. F.A. Biocca and J.P. Rolland, "Virtual Eyes Can Rearrange Your Body: Adaptation to Visual Displacement in See-Through, Head-Mounted Displays," Presence: Teleoperators and Virtual Environments, vol. 7, no. 3, June 1998, pp. 262-277.
15. A. Takagi et al., "Development of a Stereo Video See-through HMD for AR Systems," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 68-77.
16. T. Sugihara and T. Miyasato, "A Lightweight 3D HMD with Accommodative Compensation," Proc. 29th Soc. Information Display (SID 98), San Jose, Calif., 1998, pp. 927-930.
17. J.P. Rolland, L.D. Davis, and Y. Baillot, "A Survey of Tracking Technologies for Virtual Environments," Fundamentals of Wearable Computers and Augmented Reality, W. Barfield and T. Caudell, eds., Lawrence Erlbaum, Mahwah, N.J., 2001, pp. 67-112.
18. Y. Yokokohji, Y. Sugawara, and T. Yoshikawa, "Accurate Image Overlay on Video See-through HMDs Using Vision and Accelerometers," Proc. IEEE Virtual Reality 2000, IEEE CS Press, Los Alamitos, Calif., 2000, pp. 247-254.
19. G. Welch and G. Bishop, "Scaat: Incremental Tracking with Incomplete Information," Proc. ACM Siggraph 97, ACM Press, New York, 1997, pp. 333-344.
20. E. Foxlin, M. Harrington, and G. Pfeifer, "Constellation: A Wide-Range Wireless Motion-Tracking System for Augmented Reality and Virtual Set Applications," Proc. Siggraph 98, ACM Press, New York, 1998, pp. 371-378.
21. G. Welch et al., "High-Performance Wide-Area Optical Tracking—The HiBall Tracking System," Presence: Teleoperators and Virtual Environments, vol. 10, no. 1, Feb. 2001, pp. 1-21.
22. Y. Cho, J. Lee, and U. Neumann, "A Multi-Ring Fiducial System and an Intensity-Invariant Detection Method for Scalable Augmented Reality," Proc. Int'l Workshop Augmented Reality (IWAR 98), A.K. Peters, Natick, Mass., 1998, pp. 147-165.
23. F. Sauer et al., "Augmented Workspace: Designing an AR Testbed," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 47-53.
24. Y. Ohta et al., "Share-Z: Client/Server Depth Sensing for See-through Head Mounted Displays," Proc. 2nd Int'l Symp. Mixed Reality (ISMR 2001), MR Systems Lab, Yokohama, Japan, 2001, pp. 64-72.
25. T. Kanade et al., "Virtualized Reality: Digitizing a 3D Time-Varying Event As Is and in Real Time," Proc. Int'l Symp. Mixed Reality (ISMR 99), Springer-Verlag, Secaucus, N.J., 1999, pp. 41-57.
26. R. Azuma et al., "A Motion-Stabilized Outdoor Augmented Reality System," Proc. IEEE Virtual Reality, IEEE CS Press, Los Alamitos, Calif., 1999, pp. 252-259.
27. R. Azuma et al., "Tracking in Unprepared Environments for Augmented Reality Systems," Computers and Graphics, vol. 23, no. 6, Dec. 1999, pp. 787-793.
28. S. You, U. Neumann, and R. Azuma, "Hybrid Inertial and Vision Tracking for Augmented Reality Registration," Proc. IEEE Virtual Reality, IEEE CS Press, Los Alamitos, Calif., 1999, pp. 260-267.
29. K. Satoh et al., "Townwear: An Outdoor Wearable MR System with High-Precision Registration," Proc. 2nd Int'l Symp. Mixed Reality (ISMR 2001), MR Systems Lab, Yokohama, Japan, 2001, pp. 210-211.
30. K. Sawada, M. Okihara, and S. Nakamura, "A Wearable Attitude Measurement System Using a Fiber Optic Gyroscope," Proc. 2nd Int'l Symp. Mixed Reality (ISMR 2001), MR Systems Lab, Yokohama, Japan, 2001, pp. 35-39.
31. U. Neumann and S. You, "Natural Feature Tracking for Augmented Reality," IEEE Trans. Multimedia, vol. 1, no. 1, Mar. 1999, pp. 53-64.
32. R. Behringer, "Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors," Proc. IEEE Virtual Reality, IEEE CS Press, Los Alamitos, Calif., 1999, pp. 244-251.
33. V. Coors, T. Huch, and U. Kretschmer, "Matching Buildings: Pose Estimation in an Urban Environment," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 89-92.
34. B. Jiang, S. You, and U. Neumann, "Camera Tracking for Augmented Reality Media," Proc. IEEE Int'l Conf. Multimedia Expo 2000, IEEE CS Press, Los Alamitos, Calif., 2000, pp. 1637-1640.
35. G. Simon, A.W. Fitzgibbon, and A. Zisserman, "Markerless Tracking Using Planar Structures in the Scene," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 120-128.
36. Y. Akatsuka and G.A. Bekey, "Compensation for End to End Delays in a VR System," Proc. Virtual Reality Ann. Int'l Symp. (VRAIS 98), IEEE CS Press, Los Alamitos, Calif., 1998, pp. 156-159.
37. L. Chai et al., "An Adaptive Estimator for Registration in Augmented Reality," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 23-32.
38. M.C. Jacobs, M.A. Livingston, and A. State, "Managing Latency in Complex Augmented Reality Systems," Proc. 1997 Symp. Interactive 3D Graphics, ACM Press, New York, 1997, pp. 49-54.
39. M. Regan et al., "A Real Time Low-Latency Hardware Light-Field Renderer," Proc. Siggraph 99, ACM Press, New York, 1999, pp. 287-290.
40. R. Kijima, E. Yamada, and T. Ojika, "A Development of Reflex HMD-HMD with Time Delay Compensation Capability," Proc. 2nd Int'l Symp. Mixed Reality (ISMR 2001), MR Systems Lab, Yokohama, Japan, 2001, pp. 40-47.
41. W.R. Mark et al., "Post-Rendering 3D Warping," Proc. 1997 Symp. Interactive 3D Graphics, ACM Press, New York, 1997, pp. 7-16.
42. K.N. Kutulakos and J.R. Vallino, "Calibration-Free Augmented Reality," IEEE Trans. Visualization and Computer Graphics, vol. 4, no. 1, Jan.-Mar. 1998, pp. 1-20.
43. Y. Seo and K-S. Hong, "Weakly Calibrated Video-Based Augmented Reality: Embedding and Rendering through Virtual Camera," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 129-136.
44. B. Hoff and R. Azuma, "Autocalibration of an Electronic Compass in an Outdoor Augmented Reality System," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 159-164.
45. T. Ohshima et al., "RV-Border Guards: A Multi-Player Mixed Reality Entertainment," Trans. Virtual Reality Soc. Japan, vol. 4, no. 4, 1999, pp. 699-705.
46. S. Feiner et al., "A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment," Proc. 1st Int'l Symp. Wearable Computers (ISWC 97), IEEE CS Press, Los Alamitos, Calif., 1997, pp. 74-81.
47. J. Rekimoto and M. Saitoh, "Augmented Surfaces: A Spatially Continuous Workspace for Hybrid Computing Environments," Proc. ACM SIGCHI 99, ACM Press, New York, 1999, pp. 378-385.
48. A. Butz et al., "Enveloping Users and Computers in a Collaborative 3D Augmented Reality," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 35-44.
49. Zs. Szalavári and M. Gervautz, "The Personal Interaction Panel—A Two-Handed Interface for Augmented Reality," Proc. 18th Eurographics, Eurographics Assoc., Aire-la-Ville, Switzerland, 1997, pp. 335-346.
50. H. Kato et al., "Virtual Object Manipulation of a Table-Top AR Environment," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 111-119.
51. T. Ohshima et al., "AR2 Hockey: A Case Study of Collaborative Augmented Reality," Proc. IEEE Virtual Reality Ann. Int'l Symp. (VRAIS 98), IEEE CS Press, Los Alamitos, Calif., 1998, pp. 268-275.
52. M. Billinghurst, H. Kato, and I. Poupyrev, "The MagicBook—Moving Seamlessly between Reality and Virtuality," IEEE Computer Graphics and Applications, vol. 21, no. 3, May/June 2001, pp. 2-4.
53. B. MacIntyre and E.M. Coelho, "Adapting to Dynamic Registration Errors Using Level of Error (LOE) Filtering," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 85-88.
54. A. Fuhrmann et al., "Occlusion in Collaborative Augmented Environments," Computers and Graphics, vol. 23, no. 6, Dec. 1999, pp. 809-819.
55. S. Julier et al., "Information Filtering for Mobile Augmented Reality," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 3-11.
56. B. Bell, S. Feiner, and T. Höllerer, "View Management for Virtual and Augmented Reality," Proc. ACM Symp. User Interface Software and Technology (UIST 2001), ACM Press, New York, 2001, pp. 101-110.
57. V. Lepetit and M.-O. Berger, "Handling Occlusions in Augmented Reality Systems: A Semi-Automatic Method," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 137-146.
58. J. Stauder, "Augmented Reality with Automatic Illumination Control Incorporating Ellipsoidal Models," IEEE Trans. Multimedia, vol. 1, no. 2, June 1999, pp. 136-143.
59. Y. Mukaigawa, S. Mihashi, and T. Shakunaga, "Photometric Image-Based Rendering for Virtual Lighting Image Synthesis," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 115-124.
60. P. Debevec, "Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-Based Graphics with Global Illumination and High Dynamic Range Photography," Proc. Siggraph 98, ACM Press, New York, 1998, pp. 189-198.
61. U. Neumann and A. Majoros, "Cognitive, Performance, and Systems Issues for Augmented Reality Applications in Manufacturing and Maintenance," Proc. IEEE Virtual Reality Ann. Int'l Symp. (VRAIS 98), IEEE CS Press, Los Alamitos, Calif., 1998, pp. 4-11.
62. D. Drascic and P. Milgram, "Perceptual Issues in Augmented Reality," Proc. SPIE, Stereoscopic Displays VII and Virtual Systems III, SPIE Press, Bellingham, Wash., vol. 2653, 1996, pp. 123-134.
63. J.P. Rolland and H. Fuchs, "Optical Versus Video See-through Head-Mounted Displays in Medical Visualization," Presence: Teleoperators and Virtual Environments, vol. 9, no. 3, June 2000, pp. 287-309.
64. R.L. Holloway, "Registration Error Analysis for Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, Aug. 1997, pp. 413-432.
65. S.R. Ellis et al., "Factors Influencing Operator Interaction with Virtual Objects Viewed via Head-Mounted See-through Displays: Viewing Conditions and Rendering Latency," Proc. Virtual Reality Ann. Int'l Symp. (VRAIS 97), IEEE CS Press, Los Alamitos, Calif., 1997, pp. 138-145.
66. L. Vaissie and J. Rolland, "Accuracy of Rendered Depth in Head-Mounted Displays: Choice of Eyepoint Locations," Proc. SPIE AeroSense 2000, SPIE, Bellingham, Wash., vol. 4021, 2000, pp. 343-353.
67. D. Curtis et al., "Several Devils in the Details: Making an AR Application Work in the Airplane Factory," Proc. Int'l Workshop Augmented Reality (IWAR 98), A.K. Peters, Natick, Mass., 1998, pp. 47-60.
68. N. Navab et al., "Scene Augmentation via the Fusion of Industrial Drawings and Uncalibrated Images with a View to Markerless Calibration," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 125-133.
69. N. Navab, A. Bani-Hashemi, and M. Mitschke, "Merging Visible and Invisible: Two Camera-Augmented Mobile C-arm (CAMC) Applications," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 134-141.
70. H. Fuchs et al., "Augmented Reality Visualization for Laparoscopic Surgery," Proc. 1st Int'l Conf. Medical Image Computing and Computer-Assisted Intervention (MICCAI 98), Springer-Verlag, Heidelberg, Germany, 1998, pp. 934-943.
71. S. Weghorst, "Augmented Reality and Parkinson's Disease," Comm. ACM, vol. 40, no. 8, Aug. 1997, pp. 47-48.
72. T. Starner et al., "Augmented Reality through Wearable Computing," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, Aug. 1997, pp. 386-398.
73. T. Höllerer et al., "Exploring MARS: Developing Indoor and Outdoor User Interfaces to a Mobile Augmented Reality System," Computers and Graphics, vol. 23, no. 6, Dec. 1999, pp. 779-785.
74. Y. Baillot, D. Brown, and S. Julier, "Authoring of Physical Models Using Mobile Computers," to appear in Proc. Int'l Symp. Wearable Computers, IEEE CS Press, Los Alamitos, Calif., 2001.
75. W. Piekarski, B. Gunther, and B. Thomas, "Integrating Virtual and Augmented Realities in an Outdoor Application," Proc. 2nd Int'l Workshop Augmented Reality (IWAR 99), IEEE CS Press, Los Alamitos, Calif., 1999, pp. 45-54.
76. B. Thomas et al., "ARQuake: An Outdoor/Indoor Augmented Reality First Person Application," Proc. 4th Int'l Symp. Wearable Computers (ISWC 2000), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 139-146.
77. D. Stricker et al., "Design and Development Issues for Archeoguide: An Augmented Reality based Cultural Heritage On-Site Guide," Proc. Int'l Conf. Augmented Virtual Environments and 3D Imaging (ICAV3D 2001), Publishing ZITI, Greece, 2001, pp. 1-5, http://www.ziti.gr.
78. J. Rekimoto, Y. Ayatsuka, and K. Hayashi, "Augmentable Reality: Situated Communication through Physical and Digital Spaces," Proc. 2nd Int'l Symp. Wearable Computers, IEEE CS Press, Los Alamitos, Calif., 1998, pp. 68-75.
79. R. Behringer et al., "A Wearable Augmented Reality Testbed for Navigation and Control, Built Solely with Commercial-off-the-Shelf (COTS) Hardware," Proc. Int'l Symp. Augmented Reality 2000 (ISAR 00), IEEE CS Press, Los Alamitos, Calif., 2000, pp. 12-19.
80. S. Mann, "Wearable Computing: A First Step Toward Personal Imaging," Computer, vol. 30, no. 2, Feb. 1997, pp. 25-32.
81. M. Billinghurst and H. Kato, "Collaborative Mixed Reality," Proc. Int'l Symp. Mixed Reality (ISMR 99), Springer-Verlag, Secaucus, N.J., 1999, pp. 261-284.
82. J. Rekimoto, "Transvision: A Hand-Held Augmented Reality System for Collaborative Design," Proc. Virtual Systems and Multimedia (VSMM 96), Int'l Soc. Virtual Systems and Multimedia, Gifu, Japan, 1996, pp. 85-90.
83. Zs. Szalavári et al., "Studierstube: An Environment for Collaboration in Augmented Reality," Virtual Reality—Systems, Development and Application, vol. 3, no. 1, 1998, pp. 37-49.
84. S. Julier et al., "The Software Architecture of a Real-Time Battlefield Visualization Virtual Environment," Proc. IEEE Virtual Reality 99, IEEE CS Press, Los Alamitos, Calif., 1999, pp. 29-36.
85. T. Jebara et al., "Stochasticks: Augmenting the Billiards Experience with Probabilistic Vision and Wearable Computers," Proc. 1st Int'l Symp. Wearable Computers (ISWC 97), IEEE CS Press, Los Alamitos, Calif., 1997, pp. 138-145.
86. R. Cavallaro, "The FoxTrax Hockey Puck Tracking System," IEEE Computer Graphics and Applications, vol. 17, no. 2, Mar./Apr. 1997, pp. 6-12.
87. F.P. Brooks, Jr., "What's Real About Virtual Reality?," IEEE Computer Graphics and Applications, vol. 19, no. 6, Nov./Dec. 1999, pp. 16-27.
88. B. MacIntyre et al., "Ghosts in the Machine: Integrating 2D Video Actors into a 3D AR System," Proc. 2nd Int'l Symp. Mixed Reality (ISMR 2001), MR Systems Lab, Yokohama, Japan, 2001, pp. 73-80.
Ronald Azuma is a senior research staff computer scientist at HRL Labs. He received a BS in electrical engineering and computer science from the University of California at Berkeley in 1988, and MS and PhD degrees in computer science from the University of North Carolina at Chapel Hill in 1990 and 1995, respectively. His research interests are in augmented reality, virtual environments, and visualization. He is program co-chair of the International Symposium on Augmented Reality (ISAR 2001).

Yohan Baillot is a software engineer at ITT Industries working as a consultant at the Naval Research Lab, Washington, D.C. He received an MS degree in electrical engineering in 1996 from ISIM, France, and an MS in computer science in 1999 from the University of Central Florida, Orlando. His research interests are in computer graphics, 3D displays, body tracking, 3D reconstruction, and mobile augmented reality. He is a member of the IEEE Computer Society.

Blair MacIntyre is an assistant professor in the College of Computing at the Georgia Institute of Technology. He received BMath and MMath degrees from the University of Waterloo in 1989 and 1991, and received a PhD in computer science from Columbia University in 1999. His research interests are in computer graphics and human–computer interaction, with a focus on the design and implementation of augmented reality systems.