The Biomedical Engineering Handbook: Second Edition.
Ed. Joseph D. Bronzino
Boca Raton: CRC Press LLC, 2000
69
Medical Applications of Virtual Reality Technology
Virtual Reality (VR) is the term commonly used to describe a novel human-computer interface that
enables users to interact with computers in a radically different way. VR consists of a computer-generated,
multi-dimensional environment and interface tools that allow users to:
1. immerse themselves in the environment,
2. navigate within the environment, and
3. interact with objects and other inhabitants in the environment.
The experience of entering this environment—this computer-generated virtual world—is compelling.
To enter, the user usually dons a helmet containing a head-mounted display (HMD) that incorporates
a sensor to track the wearer’s movement and location. The user may also wear sensor-clad clothing that
likewise tracks movement and location. The sensors communicate position and location data to a
computer, which updates the image of the virtual world accordingly. By employing this garb, the user
“breaks through” the computer screen and becomes completely immersed in this multi-dimensional
world. Thus immersed, one can walk through a virtual house, drive a virtual car, or run a marathon in
a park still under design. Recent advances in computer processor speed and graphics make it possible
for even desk-top computers to create highly realistic environments. The practical applications are far
reaching. Today, using VR, architects design office buildings, NASA controls robots at remote locations,
and physicians plan and practice difficult operations.
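At its core, every immersive VR system runs the same loop: sample the tracking sensors, update the viewpoint, and redraw the scene fast enough that the display appears continuous. The following minimal sketch illustrates the idea; the function names and the fixed placeholder pose are hypothetical, and a real system would query the tracker driver and a graphics library instead:

import time

def read_head_pose():
    # Poll the HMD tracker for position and orientation (placeholder:
    # a real implementation would query the tracking hardware).
    return {"position": (0.0, 1.7, 0.0), "orientation": (0.0, 0.0, 0.0)}

def render_scene(pose):
    # Redraw the virtual world from the given viewpoint (placeholder:
    # a real system would hand this pose to a graphics API).
    print("rendering from", pose["position"])

# Core loop: sample the sensor, update the viewpoint, redraw.
# The frame rate must stay high (roughly 30 Hz or better) so that
# the world appears to move with the user's head.
for _ in range(300):
    pose = read_head_pose()
    render_scene(pose)
    time.sleep(1 / 30)  # pace the loop at about 30 frames per second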
Virtual reality is quickly finding wide acceptance in the medical community as researchers and clini-
cians alike become aware of its potential benefits. Several pioneer research groups have already demon-
strated improved clinical performance using VR imaging, planning, and control techniques.
Conventional human-computer interfaces, by contrast, confine interaction to a flat screen, while people perceive and act in a three-dimensional world. Therefore, one long-term challenge for VR developers has been to replace the conventional
computer interface with one that is more natural, intuitive, and allows the computer—not the user—to
carry a greater proportion of the interface burden. Not surprisingly, this search for a more practical
multi-dimensional metaphor for interacting with computers and computer-generated artificial environ-
ments has spawned the development of a new generation of computer interface hardware. Ideally, the
interface hardware should consist of two components: (1) sensors for controlling the virtual world and
(2) effectors for providing feedback to the user. To date, three new computer-interface mechanisms, not
all of which manifest the ideal sensor/effector duality, have evolved: Instrumented Clothing, the Head
Mounted Display (HMD), and 3D Sound Systems.
Instrumented Clothing
The DataGlove™ and DataSuit™ use dramatic new methods to measure human motion dynamically in
real time. The clothing is instrumented with sensors that track the wearer's full range of motion as he or she bends, moves, grasps, or waves.
The DataGlove is a thin Lycra glove with bend sensors running along its dorsal surface. As the joints of the hand bend, the sensors flex and record the angular movement. These
recordings are digitized and forwarded to the computer, which calculates the angle at which each joint
is bent. On screen, an image of the hand moves in real time, reflecting the movements of the hand in
the DataGlove and immediately replicating even the most subtle actions.
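As a concrete illustration, the mapping from raw sensor readings to joint angles can be as simple as a per-sensor linear calibration. The sketch below assumes a two-pose calibration (hand flat, hand fully flexed) and 10-bit sensor values; the numbers are invented for illustration:

def calibrate(raw_flat, raw_fist, angle_flat=0.0, angle_fist=90.0):
    # Build a function mapping a raw bend-sensor reading to degrees,
    # from readings taken with the hand flat and fully flexed.
    scale = (angle_fist - angle_flat) / (raw_fist - raw_flat)
    return lambda raw: angle_flat + scale * (raw - raw_flat)

# Hypothetical index-finger joint sensor read through a 10-bit ADC.
to_degrees = calibrate(raw_flat=180, raw_fist=750)
print(to_degrees(465))  # -> 45.0 degrees of flexion

In practice each joint needs its own calibration, since sensor placement and hand geometry vary from wearer to wearer.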
The DataGlove is often used in conjunction with an absolute position and orientation sensor that
allows the computer to determine the three-space coordinates, as well as the orientation of the hand and
fingers. A similar sensor can be used with the DataSuit and is nearly always used with an HMD.
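Combining the two sensors amounts to a coordinate transformation: points measured in the hand's local frame are rotated by the tracker's reported orientation and translated by its reported position. A minimal sketch, with an invented fingertip offset and an identity orientation for simplicity:

import numpy as np

def quat_to_matrix(q):
    # Convert a unit quaternion (w, x, y, z), the form many trackers
    # report, to a 3x3 rotation matrix.
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

hand_pos = np.array([0.3, 1.2, -0.5])            # tracker position, meters
hand_rot = quat_to_matrix((1.0, 0.0, 0.0, 0.0))  # identity orientation

# Fingertip position in the hand's local frame (derived from the
# glove's joint angles; the offset here is invented).
fingertip_local = np.array([0.0, 0.0, -0.19])

# World-space fingertip: rotate into the room frame, then translate.
fingertip_world = hand_rot @ fingertip_local + hand_pos
print(fingertip_world)  # -> [ 0.3   1.2  -0.69]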
The DataSuit is a customized body suit fitted with the same sophisticated bend-sensors found in the
DataGlove. While the DataGlove is currently in production as both a VR interface and as a data-collection
instrument, the DataSuit is available only as a custom device. As noted, DataGlove and DataSuit are
utilized as general purpose computer interface devices for VR. There are several potential applications
of this new technology for clinical and therapeutic medicine.
3D Spatialized Sound
The impression of immersion within a virtual environment is greatly enhanced by inclusion of 3D
spatialized sound [Durlach, 1994; Hendrix and Barfield, 1996]. Stereo-pan effects alone are inadequate
since they tend to sound as if they are originating inside the head. Research into 3D audio has shown
the importance of modeling the head and pinnae and using this model as part of the 3D sound generation.
A Head-Related Transfer Function (HRTF) can be used to generate the proper acoustics [Begault and
Wenzel, 1992; Wenzel, 1994]. A number of problems remain, such as the “cone of confusion” wherein
sounds behind the head are perceived to be in front of the head [Wenzel, 1992].
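A full HRTF implementation convolves each ear's signal with measured head-related impulse responses. The sketch below approximates only the two dominant localization cues, the interaural time difference (via the Woodworth formula) and a crude level difference; this is enough to pull a sound out of the head, although it cannot resolve front from back:

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
HEAD_RADIUS = 0.0875    # m, an average adult head

def spatialize(mono, sample_rate, azimuth_rad):
    # Crude stereo spatialization from interaural time and level
    # differences only; a real HRTF also encodes pinna filtering.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))
    delay = int(abs(itd) * sample_rate)  # delay in whole samples
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[:len(mono)] * 0.6
    # Positive azimuth: source to the right, so the left ear is far.
    return (far, near) if azimuth_rad > 0 else (near, far)

# Half a second of a 440-Hz tone placed 45 degrees to the right.
sr = 44100
t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)
left, right = spatialize(np.sin(2 * np.pi * 440 * t), sr, np.pi / 4)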
VR Applications
Applications today are diverse and represent dramatic improvements over conventional visualization and
planning techniques:
Public Entertainment: VR made its first major inroads in the area of public entertainment, with
ventures ranging from shopping mall game simulators to low-cost VR games for the home. Major
growth continues in home VR systems, partially as a result of 3D games on the Internet.
Computer-Aided Design: Using VR to create “virtual prototypes” in software allows engineers to test
potential products in the design phase, even collaboratively over computer networks, without
investing time or money for conventional hard models. All of the major automobile manufacturers
and many aircraft manufacturers rely heavily on virtual prototyping.
Military: With VR, the military’s solitary cab-based systems have evolved to extensive networked
simulations involving a variety of equipment and situations. There is an apocryphal story that
General Norman Schwarzkopf was selected to lead the Desert Storm operation on the basis of
his extensive experience with simulations of the Middle East battle arena.
Architecture/Construction: VR allows architects and engineers and their clients to “walk through”
structural blueprints. Designs may be understood more clearly by clients who often have difficulty
comprehending them even with conventional cardboard models. Atlanta, Georgia credits its VR
model for winning it the site of the 1996 Olympics, while San Diego is using a VR model of a
planned convention center addition to compete for the next convention of the Republican Party.
Data Visualization: By allowing navigation through an abstract “world” of data, VR helps users rapidly
visualize relationships within complex, multi-dimensional data structures. This is particularly
important in financial-market data, where VR supports faster decision-making.
VR is commonly associated with exotic “fully immersive applications” because of the over-dramatized
media coverage on helmets, body suits, entertainment simulators, and the like. Equally important are
the “Window into World” applications where the user or operator is allowed to interact effectively with
“virtual” data, either locally or remotely.
One example is the UltraTrainer, a VR-based ultrasonography training system in which the trainee moves a position-tracked virtual probe over a phantom. An ultrasonic image of the appropriate virtual scanfield is presented on the monitor in accordance with
the position of the virtual probe on the phantom. As the probe is moved on the phantom, new virtual
scanfields are extracted from the database and presented on the monitor. The UltraTrainer is able to
present sequential virtual scanfields rapidly enough for the trainee to perceive the virtual ultrasonography
as occurring in real time.
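In essence the trainer is a fast lookup: each tracker update maps the probe pose to a precomputed sonogram. A minimal sketch, with an invented 1-cm grid of stored scanfields standing in for the real database:

# Hypothetical database of sonograms precomputed on a 1-cm lattice
# of probe positions over the phantom.
scanfield_db = {(x, y): "sonogram_%d_%d.png" % (x, y)
                for x in range(10) for y in range(10)}

def nearest_scanfield(probe_xy_cm):
    # Snap the tracked probe position to the nearest stored scanfield;
    # returns None if the probe has moved off the phantom.
    key = tuple(int(round(c)) for c in probe_xy_cm)
    return scanfield_db.get(key)

# Each tracker update selects a new virtual scanfield to display.
print(nearest_scanfield((3.2, 7.8)))  # -> sonogram_3_8.png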
Planned improvements to the UltraTrainer include a larger database of stored sonograms that can be
customized for different medical specialties, and a reduction in cost by porting the system to a PC and by
using a less expensive tracking system. These improvements should allow the UltraTrainer to be accessed
by a broader range of users and to move from the laboratory to the classroom, or even to the home.
Three groups of researchers have taken different approaches to developing VR-based training systems
for needle insertion, each based on feedback to a different sensory system. At Georgetown University,
Lathan and associates [1998] have produced a spine biopsy simulator based on visual feedback; a team
from Ohio State University and the Ohio Supercomputer Center have demonstrated an epidural needle
insertion simulator based on force feedback [Heimenz et al., 1998]; and Computer Aided Surgery in New
York has developed a blind needle biopsy placement simulator that uses 3D auditory feedback [Wenger
and Karron, 1998]. Each of these innovative systems draws on technological advances in virtual reality.
In the aggregate, they disclose that, given appropriate cognitive constructs, humans are capable of using
diverse sensory input to learn very demanding tasks. Imagine how effective virtual reality could be in
training complex tasks if all of the sensory information could be integrated in a single system. That is
currently one of the goals of the virtual reality community.
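To make the force-feedback approach concrete, a common technique is to model each tissue layer with its own stiffness: resistance ramps as the needle advances through a layer and drops when the layer is punctured, and the sudden loss of resistance at the epidural space is the very cue anesthesiologists are trained to feel. The sketch below illustrates the general technique, not the Ohio State model itself; the layer depths and stiffnesses are invented, not clinical values:

# (name, boundary depth in mm, stiffness in N/mm) -- illustrative only.
LAYERS = [
    ("skin",              3.0, 0.9),
    ("subcutaneous fat", 12.0, 0.3),
    ("ligament",         25.0, 1.4),
    ("epidural space",   30.0, 0.05),  # sudden loss of resistance
]

def needle_force(depth_mm):
    # Resisting force (N): within each layer the force ramps with
    # penetration, then drops when the layer boundary is punctured.
    prev = 0.0
    for name, boundary, stiffness in LAYERS:
        if depth_mm <= boundary:
            return stiffness * (depth_mm - prev)
        prev = boundary
    return 0.0  # past all modeled layers

for depth in (1, 10, 20, 28):
    print(depth, "mm ->", round(needle_force(depth), 2), "N")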
One pioneering system couples presurgical planning with intraoperative visualization [Kelly, 1994; Kall, 1994]. The technology offers a technique for surgeons to plan and simulate
the surgical procedure beforehand in order to reach deep-seated or centrally located brain tumors. The
imaging method (volumetric stereotaxis) gathers, stores, and reformats imaging-derived, three-dimen-
sional volumetric information that defines an intracranial lesion (tumor) with respect to the surgical field.
Computer-generated information is displayed intraoperatively on computer monitors in the operating
room and on a “heads up” display mounted on the operating microscope. These images provide surgeons
with a CT (computed tomography) and MRI defined map of the surgical field area scaled to actual size
and location. This guides the surgeon in finding and defining the boundaries of brain tumors. The
computer-generated images are indexed to the surgical field by means of a robotics-controlled stereotactic
frame which positions the patient’s tumor within a defined targeting area. Simulated systems using VR
models are being advocated for high risk techniques, such as the alignment of radiation sources to treat
cancerous tumors. Where registration of virtual and real anatomical features is not an issue, other display
technologies can be employed. Two recent examples, based on totally different display modalities, are
described below.
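Indexing image coordinates to the surgical field, as in the stereotactic system above, is at bottom a rigid-body registration problem: given fiducial points located both in the CT volume and on the stereotactic frame, solve for the rotation and translation that map one set onto the other. A minimal sketch using the standard Kabsch (SVD) solution, with invented fiducial coordinates:

import numpy as np

def solve_rigid(P, Q):
    # Least-squares rigid transform mapping points P (image space)
    # onto their measured counterparts Q (surgical-field space).
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1, 1, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = (U @ D @ Vt).T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

# Fiducials as located in the CT volume and as digitized on the frame
# (coordinates in mm, invented for illustration).
P = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]], float)
Q = np.array([[120, 45, -30], [120, 145, -30],
              [20, 45, -30], [120, 45, 70]], float)

R, t = solve_rigid(P, Q)
tumor_ct = np.array([62.0, 88.0, 41.0])
print(np.round(R @ tumor_ct + t))  # tumor centroid -> [ 32. 107.  11.]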
Parsons and Rolland [1998] developed a non-intrusive display technique that projects images of virtual
inclusions such as tumors, cysts, or abscesses, or other medical image data into the surgeon’s field of
view on demand. The system relies on retroreflective material, perhaps used as a backing for the surgeon’s
glove or on a surgical instrument, to provide an imaging surface upon which can be presented images
extracted from 2D data sources, for example MRI scans. The surgeon can access the data by placing the
retroreflective screen, e.g., his gloved hand, in the path of an imaging beam. The image of the virtual
MRI scan, for example, will appear superimposed along the surgeon’s line of sight. Although image
registration is not currently possible with this real-time display system, the system does provide a means
for the clinician to access critical information without turning to view a monitor or other screen.
For displaying volumetric data, Senger [1998] devised a system based on the FakeSpace Immersive
Workbench™ and position-tracked StereoGraphics CrystalEyes™ that presents stereoscopic images
derived from CT, MRI, and the Visible Human data sets. The viewer can interact with the immersive
data structure through the use of a position/orientation sensed probe. The probe can be used first to
identify a region of interest within the anatomical data set and then to segment the data so that particular structures can be isolated and examined in detail.
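One simple way to implement such probe-seeded segmentation is region growing: starting from the voxel the probe selects, collect connected voxels whose intensities fall within a chosen band. A minimal sketch on a toy volume (the intensity band and the volume itself are invented):

from collections import deque
import numpy as np

def region_grow(volume, seed, low, high):
    # Flood-fill segmentation: from the probe-selected seed voxel,
    # collect 6-connected voxels with intensity in [low, high].
    mask = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        v = queue.popleft()
        if mask[v] or not (low <= volume[v] <= high):
            continue
        mask[v] = True
        z, y, x = v
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (z + dz, y + dy, x + dx)
            if all(0 <= c < s for c, s in zip(n, volume.shape)):
                queue.append(n)
    return mask

# Toy CT volume containing one bright structure near the probe tip.
ct = np.zeros((20, 20, 20))
ct[8:12, 8:12, 8:12] = 1000
print(region_grow(ct, seed=(10, 10, 10), low=500, high=1500).sum())  # 64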
Rehabilitation and Functional Evaluation

DataGlove and DataSuit technologies were originally developed as control devices for VR, but through
improvements they are now being applied to the field of functional evaluation of movement and to
rehabilitation in a variety of ways. One system, for example, uses a glove device coupled with a force
feedback system—The Rutgers Master (RM-I)—to rehabilitate a damaged hand or to diagnose a range
of hand problems [Burdea et al., 1995; Burdea et al., 1997]. The rehabilitation system developed by Burdea
and colleagues uses programmable force feedback to control the level of effort required to accomplish
rehabilitative tasks. This system measures finger-specific forces while the patient performs one of several
rehabilitative tasks, including ball squeezing, DigiKey exercises, and peg insertion.
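The essence of programmable force feedback is that the commanded resistance is a software parameter rather than a fixed property of the device. A minimal sketch, with invented stiffness settings and a safety cap, of how effort levels might be programmed and per-squeeze forces logged:

# Difficulty settings as grip stiffness in N per cm of closure
# (values invented for illustration).
DIFFICULTY_STIFFNESS = {"easy": 0.5, "moderate": 1.5, "hard": 3.0}

def commanded_force(closure_cm, level="easy", max_force=20.0):
    # Force (N) the glove actuators apply against the squeeze,
    # capped for patient safety.
    return min(DIFFICULTY_STIFFNESS[level] * closure_cm, max_force)

# Log the resisting force over one ball-squeeze repetition.
session = [commanded_force(c, "moderate") for c in (0.5, 1.0, 2.0, 3.5)]
print(session)  # -> [0.75, 1.5, 3.0, 5.25]

The therapist can then raise the difficulty level as the patient's strength returns, while the logged forces document progress objectively.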
Another system under development, the RM-II, incorporates tactile feedback into a glove system to
produce feeling in the fingers when virtual objects are “touched” [Burdea, 1994]. In order to facilitate
accurate goniometric assessment, improvements to the resolution of the standard DataGlove have been
developed [Greenleaf, 1992]. The improved DataGlove allows highly accurate measurement of dynamic
range of motion of the fingers and wrist and is in use at research centers such as Johns Hopkins and
Loma Linda University to measure and analyze functional movements.
Closely related to these rehabilitation evaluation systems are systems that use the same measurement technology
to provide ergonomic evaluation and injury prevention. Workplace ergonomics has already received a
boost from new VR technologies that enable customized workstations tailored to individual requirements
[Greenleaf, 1994]. In another area, improved surgical room ergonomics is projected to reduce the hostile and
complicated interface among patients, health care providers, and surgical spaces [Kaplan, 1994].
Motion Analysis Software (MAS) can assess and analyze upper extremity function from dynamic
measurement data acquired by the improved DataGlove. This technology not only provides highly
objective measurement, but also ensures more accurate methods for collecting data and performing subsequent analyses.
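As an illustration of the kind of objective measures such software can extract, the sketch below reduces a recorded joint-angle stream to an active range of motion and a peak angular velocity; the sampled trace is synthetic:

import numpy as np

def rom_report(angles_deg, sample_rate_hz):
    # Summarize a joint-angle recording: active range of motion and
    # peak angular velocity.
    angles = np.asarray(angles_deg, dtype=float)
    velocity = np.diff(angles) * sample_rate_hz  # degrees per second
    return {
        "rom_deg": float(angles.max() - angles.min()),
        "peak_velocity_deg_s": float(np.abs(velocity).max()),
    }

# One synthetic finger-flexion repetition sampled at 60 Hz.
trace = 45 * (1 - np.cos(np.linspace(0, 2 * np.pi, 120)))  # 0 -> 90 -> 0
print(rom_report(trace, 60))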
69.5 Summary
VR tools and techniques are being developed rapidly in the scientific, engineering, and medical areas.
This technology will directly affect medical practice. Computer simulation will allow physicians to
practice surgical procedures in a virtual environment in which there is no risk to patients, and where
mistakes can be recognized and rectified immediately by the computer. Procedures can be reviewed from
new, insightful perspectives that are not possible in the real world.
The innovators in medical VR will be called upon to refine technical efficiency and increase physical
and psychological comfort and capability, while keeping an eye on reducing costs for health care. The
mandate is complex, but like VR technology itself, the possibilities are very exciting. While the possibilities
and the need for medical VR are immense, realizing them will require diligent, cooperative efforts among
technology developers, medical practitioners, and medical consumers to establish where future requirements
and demand will lie.
Defining Terms
For an excellent treatment of the state of the art of VR and its taxonomy, see the ACM SIGGRAPH
publication Computer Graphics, Vol. 26, No. 3, August 1992. It covers the U.S. Government's National
Science Foundation invitational workshop on the Interactive Systems Program, March 23-24, 1992, which
served to identify and recommend future research directions in the area of virtual environments. A more
in-depth exposition of VR taxonomy can be found in the MIT journal Presence, Vol. 1, No. 2.
Further Information
Burdea, G. and Coiffet, P. Virtual Reality Technology. John Wiley & Sons, New York.
HITL (Human Interface Technology Laboratory), University of Washington, FJ-15, Seattle, WA 98195.
UNC Laboratory, University of North Carolina, Chapel Hill, Computer Science Department, Chapel Hill,
NC 27599-3175.
Presence: Teleoperators & Virtual Environments. Professional Tech Papers and Journal, MIT Press Journals,
55 Hayward St, Cambridge MA 02142.