
Proceedings of the 2000 IEEE

International Conference on Robotics & Automation


San Francisco, CA April 2000

Active and Passive Range Sensing for Robotics

Martial Hebert
hebert@ri.cmu.edu

The Robotics Institute


Carnegie Mellon University
5000 Forbes Avenue
Pittsburgh, PA 15213
Abstract

In this paper, we present a brief survey of the technologies currently available for range sensing, of their use in robotics applications, and of emerging technologies for future systems. The paper is organized by type of sensing: laser range finders, triangulation range finders, and passive stereo. A separate section focuses on current development in the area of non-scanning sensors, a critical area to achieve range sensing performance comparable to that of conventional cameras. The presentation of the different technologies is based on many recent examples from robotics research.

1.0 Introduction

The use of three-dimensional information is paramount in robotics, from detecting and avoiding obstacles in a three-dimensional workspace, to recognizing objects, to mapping environments. Although simple solutions such as probes and light beams are sufficient in many industrial applications, robotics problems typically require a much higher sensing rate, longer range, and higher spatial resolution than can be delivered through those conventional means.

Development of range sensing in robotics has used virtually all the possible modalities. Laser range finders were considered as far back as [24], while passive techniques based on stereo vision have been under development for decades. One of the most widely used types of sensing in robotics was ultrasonic sensing, owing to its applicability to a number of problems in mobile robotics.

More advanced range sensors, however, are not as widespread as one would hope. The main reason is that the requirements of workspace, speed, and accuracy often led to systems that were too costly and bulky to be of practical use beyond research experiments. Several advances have changed that picture. First of all, in the area of active sensing, advances in solid-state technology and electronics integration have rendered possible the design of compact and relatively affordable active range sensors whose performance is compatible with the requirements of robotics. Furthermore, current development of non-scanning range sensors as well as of very high speed sensors has opened up new exciting possibilities. Second, in the area of passive range sensing, the tremendous increase in computing power and integration that occurred in the past few years has made passive stereo ranging a viable option for real-time robotic systems.

This paper is a brief survey of the technologies currently available for range sensing, of their use in robotics applications, and of emerging technologies for future systems. The paper is organized by type of sensing: laser range finders, triangulation range finders, and passive stereo. A separate section focuses on current development in the area of non-scanning sensors, a critical area in achieving range sensing performance comparable to that of conventional cameras.¹

Rather than listing sensors and applications with their specs, the paper explores the range sensing technologies through many examples, both from the applications standpoint and from the sensor technology standpoint. Those examples are not intended to constitute an exhaustive list but to provide snapshots of typical activities in the area of range sensing. In particular, many examples are from commercial prototypes, or even products in some cases. Those examples were chosen as typical examples of high-end systems for the robotics applications of the future. That selection is not the result of a systematic quantitative evaluation or of an exhaustive list of available systems. Whenever possible, the source of the information is listed for all those examples.

Finally, the emphasis here is on technologies with demonstrated potential in robotics applications. Except for a brief mention of radar, other ranging techniques are not considered. The presentation is further restricted by emphasizing imaging, or scanning, systems that are most relevant to robotics applications.

1. The paper aims at presenting a broad survey of the technologies available and under development. As a result, it is not possible to include detailed performance tables for each device. Instead, a comprehensive bibliography is included at the end of the paper.

0-7803-5886-4/00/$10.00 © 2000 IEEE


2.0 Range Sensing and Robotics

Because much of the development effort in range sensing is concerned with industrial applications such as metrology, surveying, or inspection, it is important to identify the requirements that are specific to robotics applications.

Broadly speaking, the first distinguishing feature of robotics applications is that they are dynamic, that is, either the robot is in motion or the robot observes a dynamically changing environment. The impact of this requirement on sensing technology is that data acquisition speed is critical and that many of the advanced concepts developed for the metrology and surveying areas cannot be transferred to robotics.

The second consideration in robotics is the physical packaging of the sensor system. Sensors are typically integrated in robot platforms. Current trends in robotics, e.g., personal robotics, are pushing for smaller robots with more limited power storage. As a result, many of the most advanced ranging systems cannot be transferred to robotics unless significant advances are made in the area of sensor integration.

Finally, operation in difficult environmental conditions, such as low light, dust, fog, etc., is critical in many field applications; thus, new developments in robustness are needed before many of the proposed technologies can be used in robotics settings.

In the remainder of the paper, we emphasize those three axes of speed, integration, and robustness through examples of typical existing ranging systems in robotics and of on-going research addressing their limitations.

3.0 Scanning Laser Range Finders

Essentially, three basic technologies are used in active laser ranging. AM-CW lasers use the difference of phase Δφ between the emitted and received beam; time-of-flight (TOF) lasers measure the travel time ΔT of a pulse; and FM-CW lasers use the frequency shift of a frequency-modulated laser for measuring range. Although those are somewhat generalizations, a rough comparison of the technologies can be done as follows: the AM-CW lasers are faster and perform best at close to medium range (e.g., 50m range). They are typically more sensitive to ambient natural light, and therefore more suitable for indoor use. TOF scanners can perform at long range and are best suited for mobile robot applications in outdoor settings; however, their acquisition rate is typically lower. FM-CW sensors can be considerably more accurate, but at the cost of a more complex design and more brittle packaging. Except for a few experimental exceptions, the successes in robotic applications to date have been obtained with AM-CW or TOF sensors; therefore, FM sensors will not be discussed further.

Although the existing systems in this class of laser range finders tend to be costly, many turnkey systems are now available, a number of which have been demonstrated in robotics research systems. We consider below some examples of systems relevant to robotics applications. We limit ourselves to those sensors that use a mechanical scanning system. (Non-mechanical scanning will be briefly discussed in a separate section.)

A major step toward the widespread use of laser range sensors in robotics has been the introduction of affordable and relatively compact range scanners. Those sensors permit the acquisition of considerably more accurate and denser data than was possible with sonars. In the area of indoor mobile robots in particular, this has been coupled with the recent breakthroughs in concurrent mapping and robot localization [36,8,14], which opened new areas of research and application. A popular illustrative example is the SICK laser scanner [http://www.sickoptic.com/], which uses a TOF ranging laser with a 50m range and a resolution of +/- 50mm. The scanner provides a single line scan with 180° coverage. Figure 1 shows the scanner mounted on an RWI robot and a typical result of accurate mapping of a building. This example is taken from the research initiated in [36,14].

Figure 1 Typical example of global mapping using a single-scan laser range finder; (a) SICK laser mounted on a mobile platform; (b) final map of building; (c) initial map from odometry only (from Thrun.)

Because they are based on the detection of a pulse rather than on the integration of a continuous signal which rapidly decays with range, TOF systems can be built to achieve much longer ranges. In the context of robotics, this means that we can now consider new applications in which 3-D data previously could not be directly used, such as AUVs. The Riegl system [www.riegl.co.at], operating in the near-infrared, is a typical example of a TOF system; similar systems exist, for example [32]. The laser can use either single or multiple pulse detection. One-axis or two-axis scanning can be achieved with the basic system. Such a system can easily achieve maximum ranges of one to several hundred meters. The advertised accuracy is +/- 5cm. It should be noted that, for such a TOF system, the accuracy does not decrease with range as rapidly as with the AM-CW systems.

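The two ranging principles compared above, pulsed TOF and AM-CW phase measurement, each reduce to a one-line conversion from the raw measurement to range. As a rough numerical sketch (the values below are illustrative only, not the specifications of any sensor mentioned here):

```python
# Converting raw laser measurements to range: a minimal sketch of the two
# ranging principles described above (illustrative values only).
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(pulse_time_s: float) -> float:
    """TOF: the pulse travels to the target and back, so range = c * dT / 2."""
    return C * pulse_time_s / 2.0

def amcw_range(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """AM-CW: range from the phase difference between emitted and received
    signal; unambiguous only within half the modulation wavelength."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def amcw_ambiguity_interval(mod_freq_hz: float) -> float:
    """Ranges beyond c / (2 * f_mod) wrap around (phase ambiguity)."""
    return C / (2.0 * mod_freq_hz)

# A 333 ns round trip corresponds to roughly 50 m.
print(tof_range(333e-9))              # ~49.9 m
# A hypothetical 10 MHz modulation gives a ~15 m unambiguous interval.
print(amcw_ambiguity_interval(10e6))  # ~15 m
```

The phase-ambiguity interval is one factor behind the range/speed trade-off noted above for AM-CW scanners; practical sensors mitigate it, for example, with multiple modulation frequencies.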
Such approaches to laser ranging are used in robotic applications involving long-range mapping and navigation. One example is the use of an AUV, an autonomous helicopter, for autonomous mapping of large areas [21]. Figure 2 shows the map obtained by accumulating a large number of scans acquired while flying over an area.

Figure 2 Application of laser ranging to an AUV; (left) autonomous helicopter with scanner; (right) map constructed from repeated scanning of an area.

A major objection to the use of laser sensors is their sensitivity to environmental conditions, e.g., scattering due to fog or dust. A major development in TOF technology over the past few years is the introduction of "last pulse measuring" techniques. As shown in Figure 3, the basic concept is to detect the last echo received from a pulse, or a train of pulses, so that the echo from the surface that is the furthest away will be used for range computation. When operating under hostile environmental conditions, such as fog, dust, or smoke, this technique guarantees that the range to the target is returned instead of the range produced by the scattering from the medium.

Figure 3 Last pulse measuring technique. (Diagram: signal strength vs. time, showing echoes from the exit window, a dust cloud, and the target; the last pulse gives the range to the target.)

Although this idea has been around for a long time, operational systems are now available. The implication for robotics is that field applications that were not previously possible can now be developed, thus extending the reach of robotics applications. For example, Figure 4 shows the set of points acquired from such a laser scanner. The scene contains a truck and a dirt pile and is typical of the environment used for robotic field applications [34]. Although the environment could generate a great deal of spurious measurements from scattering from the medium, the data is correct and the shape of the object can be easily recovered.

Figure 4 3-D scan of a truck for a robotic excavation application (from [34])

TOF scanners are inherently limited in speed (typically to a few thousand samples per second) and therefore are limited in the spatial density of the data that can be acquired in a scan. AM-CW technology can provide a much higher sampling rate at the cost of the maximum range, which is typically limited to tens of meters, assuming eye safety limitations. Such scanners are mature and have been used in manufacturing applications. For example, the Perceptron scanner can measure 256x256 images at 2Hz in workspaces of 10m [7]. Further technology development in this area concentrates on applications that require very high resolution and fast scan rates, for example, autonomous high-resolution mapping. One example of such a development effort is the scanner described in [10], which can acquire 150K samples per second at a maximum range of 57m. The range accuracy, i.e., 3σ, is 34mm on reflective targets. An example data set is shown in Figure 5. The high accuracy is obtained by using two lasers offset in modulation frequency. Such a high-end ranging technology does combine high resolution and acquisition speed, which makes it potentially suitable for a variety of robotics applications. A major obstacle, however, is that the rate at which a scene can be scanned is considerably lower than the acquisition rate because of the limitations of the scanning technology. (This issue will be addressed in a separate section.)

Figure 5 Typical example of data from a high-end AM-CW scanner: range and intensity (left) and 3-D view (right)

4.0 Active Triangulation

Historically, active triangulation is one of the first range imaging approaches used in robotics. Recall the basic principle: a stripe (or a single spot) of light is projected
onto the scene. A sensor, usually a CCD camera, views the scene. The depth to the point on the object is found as z = B / (x0/f + tan α), where B is the baseline separation of the laser and the camera optical centers, f is the focal length of the camera lens, x0 is the image location within a row where the laser stripe is detected, and α is the projection angle of the laser with respect to the z axis. In order to ease the detection of the laser in the image, the illumination conditions are usually adjusted so that the projected laser generates the brightest features in the scene.

A fairly wide selection of commercial scanners based on this principle is available. Figure 6 shows one example, the Minolta VIVID scanner, which acquires a complete scan in 0.6 seconds. A complete scan produces a 200x200 range image and a 400x400 color image in an operating range of 0.6 m to 2.5 m. The various incarnations of the basic triangulation technology differ in the beam forming technique, in the focusing technique used to ensure sharpness of the stripe in the image (e.g., the Minolta uses an auto-focus camera to get maximum sharpness on the target), and in the implementation of the scanning mechanism.

Figure 6 Typical data from a triangulation light-stripe sensor -- the Minolta VIVID.

The attractive aspect of the active triangulation approach is the simplicity of its implementation. However, although the basic triangulation technology as described above has been successful in object modeling and recognition applications, it has several major drawbacks that need to be addressed in the context of robotics applications. We now turn to those robotics-specific issues and the solutions currently being investigated.

The first problem is that the resolution of a triangulation system is directly related to the baseline of the system. The resolution relative to depth of field typically needed in robotics applications can be achieved only by increasing the baseline, thus increasing the occurrence of occlusions and missing data, and making the size of the system impractical for on-robot embarked systems, except for very short range applications. The situation is complicated by the fact that the projected beam must stay in focus over a wide depth of field, consistent with the workspace typically required in robotics systems.

There has been considerable effort in addressing this problem. One of the most successful attempts is the synchronized scanning approach, originally introduced in [25]. In this approach, both the emitted beam and the receiver optics are scanned so that the projected beam remains in focus over a large depth of field, and high resolution is maintained despite a short physical baseline. The original idea has been extended to include color by using an RGB laser. The technique can be modified to allow for random access scanning, thus enabling active 3-D target tracking [2]. This example shows the applicability of high-resolution color/range scanners to robotics [3].

A second problem is the frame rate of the scanners. With the basic triangulation approach as described so far, images must be acquired and processed for each position of the beam. Although this can be done efficiently through proper design of the electronics, the basic principle does not allow frame-rate data acquisition. An obvious solution to this problem, proposed many years ago, is to project a structured pattern over the entire scene and recover depth through triangulation by processing only a few -- or even one -- image. Early examples of this technique include the OGIS camera [28]. Although this is an obvious and simple approach, it is dramatically limited in practice for robotics because of its sensitivity to ambient light and its limited range, which stems from the fact that it is difficult to project enough power into the scene at long range. For short range applications in controlled lighting, the projected pattern approach continues to receive some attention.

A more promising approach is that of time-to-range. Suppose that the laser is continuously swept across the scene, say from right to left. Each pixel in the sensor has its own line of sight and "sees" the laser stripe only once as it sweeps by. If we can record the time t at which a particular pixel at location x_t sees the laser, the range is calculated as Z_t = B / (x_t/f + tan ωt), where ω is the angular velocity of the mirror. In other words, we can continuously scan the scene, and the range at each pixel will be reported as soon as the pixel is triggered by the laser stripe, that is, as soon as the pixel sees a maximum of intensity. The acquisition rate in this case is limited only by the maximum speed of the mechanical scanning apparatus and by the minimum integration time at each pixel. In both cases, the theoretical speed is far greater than the speed achievable through conventional means. Most importantly, the speed does not depend on the image size since the "computation" is performed simultaneously at each pixel.

The main challenge in this approach is that each sensing element of the receiver must be equipped with additional triggering electronics, thus causing a serious integration problem. Current VLSI integration technology does make
such an approach viable.

Figure 8 Principle of the time-to-range approach. (Diagram: a laser sheet sweeps across the scene; the intensity observed along the row under consideration peaks at the time the sheet crosses it.)

Theoretical speeds of 1000 frames/sec are ultimately targeted for such time-to-range triangulation systems. While early attempts [41,29,9] did not match those goals, they demonstrated the feasibility of such an approach and a 10 frames/sec acquisition rate independent of image size. Applications to 3-D object tracking for robotics were reported in [33]. More recently, taking advantage of advances in VLSI integration of computational sensors, 100 frames/sec sensors were demonstrated in [4] -- several other efforts are described in the same proceedings. Similar approaches are being investigated for specific robotic applications, e.g., fast range sensing for obstacle detection for mobile robots [13]. Although the obstacles, in particular the integration required for full-frame range sensing, remain formidable, this technology is a promising and attractive path toward fast range sensing for dynamic robotic applications.

5.0 "Scannerless Scanning"

One of the key limitations of today's range sensors is the inability to acquire dense data at a sufficiently high rate. The limitation comes from the mechanical constraints on the scanning mechanism; these prevent the motion of the deflection system from keeping up with the actual range measurement rate. Ideally, one would like a sensor in which the scanning is performed electronically or without any scanning at all. This would also address the issue of sensor size, which is still a major obstacle to the practical use of these sensors.

Several concepts have been proposed for non-mechanical scanning. One example is the use of acousto-optical (AO) devices for beam steering. AOs are crystals placed in front of the optical path of the beam with the property that their optical characteristics change when an acoustical signal is applied to them. By properly modulating the signal, the crystal deflects the beam according to a pre-programmed pattern, thus effectively scanning the scene without moving parts. An extensive literature exists on AOs, which have been used for a variety of applications [37]. Research on beam deflection using AOs was originally motivated by applications to inspection, image formation, and 3-D displays [30,23]. Recent efforts have concentrated on extending both the range and angular deflection of those devices. A comparison of current techniques can be found in [35]. A key issue in this approach is the loss of power incurred as the beam is deflected by the AO device. Complex optical effects affecting focusing and diffraction of the beam are additional challenges under investigation. Other approaches based on solid-state techniques borrowed from the data storage industry do not suffer from those obstacles. However, they still do have limitations in the maximum angle that can be scanned as well as other issues with the allowable transmitted power.

Figure 9 Acousto-optical scanning. (Diagram: an AO element driven by a modulation signal deflects the beam.)

Another approach is to eliminate scanning altogether. The basic idea is to send a laser pulse that covers the entire field of view in the scene. The returned pulse is then observed by an array of receivers. The key is that the returned pulse will be received at different times by receivers corresponding to points at different ranges. The problem is then to efficiently detect the time difference at each receiver simultaneously. Although the principle is straightforward, the technical challenge is to be able to detect such small time intervals over a large array efficiently. One early example of such a device was implemented by Sandia Natl. Lab [1] -- incarnations of this technology are sometimes referred to as "flash lasers." Practical implementations include range gating of the received pulse through fast electronic shuttering, and off-line processing of the time data. It is also possible to compute the time of flight directly at the receiver by handling the charge created during sensing at each pixel of a CCD array [22]. Another benefit of this approach is the simultaneous acquisition of range and color data since the receivers are co-located. Examples of recent implementations of this technology can be found in [26,31,11]. Quantitative evaluation techniques for those devices are proposed in [27].

Progress reported in the literature on such devices is promising. Several difficult technical challenges needed to be
overcome, however. High resolution relative to depth of field is difficult to achieve. In fact, some of the earlier examples of this technology were for measuring ranges much longer than those normally used in robotics applications; short range applications are more challenging. The electronics required for the range measurements is also an issue and will require more integration before being cost-effective and practical.

Although the deployment of those technologies in truly turnkey systems for robotics is still in the future, those lines of development open up the exciting possibility of having true range cameras. For example, the Zcam device from 3DVSystems [38] provides full-frame range/color images at 30Hz with a resolution of up to 10 bits in depth.

Figure 10 Range channel (left) and color channel (right) from a sequence acquired using the Zcam range camera (the images are slightly out of phase because of the capture program used for generating the illustration.)

Although the cost and size of this type of device still place it out of reach of robotics work -- it was designed for the film industry -- it is obvious that the development of such sensors will have a substantial impact on robotics applications.

6.0 Passive Stereo

Passive stereo is also a triangulation technique with the same geometric characteristics as active triangulation, except that the range is computed by triangulation between the locations of matching pixels in images rather than between a known source and an observed pixel. Correspondences between pixels are established by searching through the image and using correlation or sum of squared differences measures to compare local neighborhoods. The raw output of a stereo system is an image of the disparity -- or, equivalently, inverse range -- between images at each pixel. The literature on stereo vision is too massive to be discussed here; we limit ourselves to those considerations that bear directly on the robotics applications of stereo.

Although passive stereo vision is one of the oldest research topics in the Computer Vision community, its use in robotics was limited by the large amount of computation required, basically the equivalent of dozens of correlation operations at each pixel. The rapid advent of computing power has generated a revolution in the way stereo systems are implemented and used. This trend has accelerated owing to the introduction of parallel image computing instructions, such as MMX, in standard hardware. Only a few years ago, stereo systems could barely generate a few frames per second at reduced resolution, even when using custom hardware or special-purpose image processors such as Datacube boards. Today, a large number of operational systems operate at near-frame rate on conventional PC hardware.

Beyond the computation issues, stereo suffers from limitations due to the triangulation geometry. Specifically, the trade-off between high resolution (large baseline) and reduced ambiguity in matching (small baseline), which also existed for active triangulation systems, is exacerbated in the case of passive systems. Stereo systems with three or more cameras do address this problem: the effect of using shorter baselines is compensated for by integrating matching results from multiple cameras. Practical trinocular stereo systems were made possible not only because of the increased computational power but also because of a better understanding of the geometry of the stereo problem, including developments in calibration, a result of fundamental research in the Computer Vision community.

Examples of real-time stereo systems now abound in robotics. For example, the Point Grey system runs on a conventional PC and is able to compute one million pixels per second by taking advantage of the MMX architecture [www.ptgrey.com]. If a reduced frame size is used, 30 frames/sec can be achieved. Figure 11 shows an example of data from that system in an outdoor scene. Range resolution and matching accuracy are achieved by using a trinocular system. Another example is the SVM system from SRI [12], which also uses computation on a conventional PC to deliver 320x240 range images at 12Hz with 16 disparity levels. This system has been used in a number of applications for navigation, localization, and mapping. Interestingly, the acquisition rate is fast enough to do motion detection and people tracking [6], which are greatly simplified by using range information.

As those examples show, recent developments in stereo systems place the possibility of real-time range acquisition on the horizon. Perhaps more important than sensing speed is the fact that, because they rely only on conventional cameras, the physical packaging of stereo systems can be made fairly small. For example, Figure 12 shows the layout of the Point Grey trinocular system, which is a little over 100mm on the side. The rapid decrease in packaging size is amplified by the advent of progressive scan board cameras as standard commercial equipment.
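The correspondence search described above (window comparison by sum of squared differences, with disparity as the output) can be sketched in a few lines. This is a minimal illustration of the principle, not the algorithm of the Point Grey or SRI systems; the window size and disparity search range are arbitrary illustrative choices:

```python
# Minimal SSD block-matching stereo on rectified images: for each left-image
# pixel, slide a window along the same row of the right image and keep the
# disparity with the smallest sum of squared differences.
import numpy as np

def ssd_disparity(left, right, max_disp=8, win=5):
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                ssd = float(np.sum((patch - cand) ** 2))
                if ssd < best:
                    best, best_d = ssd, d
            disp[y, x] = best_d
    return disp

def disparity_to_range(disp, baseline, focal_px):
    # Disparity is inverse range: z = B * f / d (undefined where d == 0).
    return np.where(disp > 0, baseline * focal_px / np.maximum(disp, 1), np.inf)

# Synthetic check: a textured scene shifted 4 pixels between the two views
# should yield a disparity of 4 away from the image borders.
rng = np.random.default_rng(0)
left = rng.uniform(0.0, 255.0, (20, 40))
right = np.roll(left, -4, axis=1)   # right view sees the scene shifted left
disp = ssd_disparity(left, right)
print(disp[10, 20])                 # -> 4
```

Real-time systems add rectification, subpixel interpolation, and consistency checks on top of this core loop; the trinocular systems mentioned above further integrate matching scores from a third camera to ease the baseline trade-off.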
Figure 11 Example data from the Point Grey trinocular stereo system.

Figure 12 Compact layout of a stereo system.

The impact of size reduction on robotics applications is considerable. As we have seen, laser and structured light systems are still too large for many robotic systems and require further integration. In contrast, the small stereo systems can now be easily integrated in the small robots that are becoming increasingly important. For example, [18] describes a man-packable robot from IS Robotics, equipped with a stereo system from JPL (Figure 13). This stereo system is able to generate 120x128 range images at 5Hz. The range images are used for obstacle avoidance in outdoor, unstructured terrain at travel speeds close to 1 m/s. This example is typical of the impact that the recent progress in stereo has had on robotics systems: the stereo package, including computing on a standard PC104 CPU, is small enough and of sufficiently low power to provide ranging capabilities where laser systems would still be too large for rugged terrain driving. As a result, capabilities for rugged terrain driving on a small robot can be demonstrated; those would not have been possible a few years ago, and will lead to new developments in future small-scale robotic systems [39].

Figure 13 The JPL stereo system on the Urbie robot.

The use of stereo for navigation is becoming widespread, both for short-range applications, as shown above, and for longer range, higher speed driving. For example, the work on unmanned ground vehicles reported in [19,20] involves the demonstration of mobile robots with target speeds of 20mph operating in unstructured environments. In that case, ranging and obstacle detection must take place at ranges of tens of meters.

Stereo has even been applied to applications in which ranging at long distances, e.g., 100m, is required. For example, the system described in [40] is able to reliably detect 14cm obstacles at over 100m for on-road obstacle detection. This system, illustrated in Figure 14, computes stereo disparity after rectification of the image with respect to the anticipated ground plane. A similar technique, known as focused stereo [5], had been used before but required specialized hardware for real-time performance [17]. At the current rate of progress in stereo, it is clear that the same real-time algorithm for vehicle guidance applications will soon be operational on conventional processors [15].

As mentioned before, another dimension of many robotics applications is robustness to environmental conditions. In that respect, additional developments [20] include the use of conventional infrared cameras or of light intensifiers for night range sensing, another example of how the use of stereo permits the use of conventional sensing hardware. Those developments open the door to continuous operation of stereo systems.

Figure 14 Typical example of real-time stereo processing for obstacle detection (from [40]): one of the input images; rectified disparity map; detected vertical surfaces (in white); detected obstacles (in black).

Owing to the use of multiple cameras to increase resolution, it is now possible to use stereo for mapping applications which require higher range resolution than that required by pure navigation applications. One example is reported in [16], in which a tracked robot is equipped with a trinocular stereo system which is used for building maps of an unstructured environment; the application is the mapping of the inside of the Chernobyl reactor. The robot, stereo system, and an example map built from multiple range images are shown in Figure 15.
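The range-resolution gain from extra baseline or image resolution can be made concrete with the standard triangulation error model for rectified stereo: depth is Z = fB/d, so a disparity error of Δd pixels maps to a depth error of roughly Z²Δd/(fB), growing quadratically with range. A minimal sketch of this relationship; the camera parameters below are illustrative assumptions, not values from any system cited in the text:

```python
# Stereo triangulation error model for a rectified pair: Z = f*B/d,
# so |dZ| ~ Z^2 * |dd| / (f*B) to first order.
# Parameters here are assumptions for illustration only.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth (m) from disparity (px), given focal length (px) and baseline (m)."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, z_m, disparity_err_px=1.0):
    """First-order depth uncertainty at range z_m for a given disparity error."""
    return (z_m ** 2) * disparity_err_px / (f_px * baseline_m)

f_px = 800.0       # focal length in pixels (assumed)
baseline_m = 0.12  # 12 cm baseline (assumed)

for z in (2.0, 10.0, 50.0):
    e = depth_error(f_px, baseline_m, z)
    print(f"range {z:5.1f} m -> ~{e:.3f} m error per pixel of disparity")
```

Under this model, doubling the baseline or the focal length in pixels (i.e., the image resolution) halves the depth error at a given range, which is the lever that multi-camera heads and higher-resolution imagers exploit to reach mapping-grade accuracy.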
This example is significant because active ranging is typically not a viable option in applications such as robotic remediation of nuclear sites. Thus, the advances in stereo ranging may open up a new arena for more capable robotic systems.

Figure 15 Typical example of autonomous mapping from stereo; robot and trinocular head (left) and reconstructed map of the environment (right).

Finally, stereo has advanced from research work in Computer Vision to a practical solution to many robotic problems through algorithmic advances, increases in computing power, and size reduction in computing and sensing. Fast range imaging with an accuracy compatible with navigation and mapping applications is now possible, enabling the implementation of systems not possible with the larger, more costly laser systems.

Despite all those successes, stereo still has substantial drawbacks which are current topics of development. Since it relies on ambient illumination, stereo is much more sensitive to environmental conditions, in particular for outdoor applications. More limiting is the fact that the range resolution and accuracy of stereo are still far below what can be achieved with active techniques, such as TOF. As a result, despite cost and size, active scanners still remain the preferred solution for applications in which accuracy matters most. Because the only way to alleviate this problem is to increase image resolution, another leap in computing power will probably be necessary before stereo closes that gap.

7.0 Conclusion

Advances in sensor design, electronics integration, and computing power have allowed the development of new solutions for range sensing systems. Performance is dramatically improving on all fronts: acquisition speed, accuracy, packaging size, and robustness. This progress is having an impact on robotics, not just in improved performance of existing robotic approaches, but also in new opportunities and lines of research. For example, affordable and accurate laser scanners now permit the development of practical approaches to 3-D mapping, an advance which has spurred a flurry of activity in this area. Similar systems with long-range capabilities open up the possibility of using 3-D mapping and navigation techniques for AUVs. New techniques for robust operation of range sensors in hostile environmental conditions have substantially increased the possibilities in field robotics. The rapid advances in stereo systems over the past few years offer the opportunity for using range imaging on small-size robots.

Despite all those successes, the field is still quite a way from the "dream" range sensor for robotics: short to long range, better than frame-rate acquisition speed, high resolution, color data, small size comparable to video cameras, and low power draw. Some of the research directions indicated above attempt to bring such a dream closer to reality. For example, active research in scannerless approaches and in on-chip integration of ranging functions will be critical in moving closer to a true range imager. Research and development in stereo is progressing quickly toward general-purpose range imagers.

Acknowledgements

The author wishes to thank the following individuals and institutions for providing illustration and data: Sebastian Thrun (Fig. 1), Omead Amidi (Fig. 2), Tony Stentz (Fig. 3 & 4), Dirk Langer, Z+F Inc. (Fig. 5), Z-Cam Inc. (Fig. 10), Point Grey Research (Fig. 11 & 12), Larry Matthies (Fig. 13), Scott Thayer (Fig. 15).

References

[1] J.P. Anthes et al. Non-scanned RADAR imaging and applications. Proc. SPIE, Vol. 1936, pp. 11-21. 1993.
[2] J.A. Beraldin, F. Blais, M. Rioux, M. Cournoyer, D. Laurin, and S. MacLean. Short and medium range 3D sensing for space applications. Proc. SPIE Aerosense 97, Visual Information Processing VI, Vol. 3074, pp. 29-46, Orlando, FL. 1997.
[3] F. Blais, M. Rioux, J. Domey, and J.A. Beraldin. On the implementation of the BIRIS range sensor for applications in robotic control and autonomous systems. Proc. of the Canadian Conference on Electrical and Computer Engineering, Vol. 1, pp. 31.1.1, Ottawa, Ont. 1990.
[4] V. Brajovic. A Triangulation-Based VLSI Range Mapper. In Proc. ISA/Expo'98. 1998.
[5] P. Burt, L. Wixson, G. Salgian. Electronically directed "focal stereo". In Proc. of the Fifth International Conference on Computer Vision, pp. 94-101. Cambridge. 1995.
[6] C. Eveland, K. Konolige, R.C. Bolles. Background modeling for segmentation of video-rate stereo sequences. In Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 1998.
[7] P. Fornoff. 3-D robot guidance with the PERCEPTRON LASAR System. In Proc. SPIE Vol. 2271, Industrial Applications of Laser Radar. 1994.
[8] D. Fox, W. Burgard, and S. Thrun. Markov localization and applications. In Modeling and Planning for Sensor-Based Intelligent Robot Systems. Springer Verlag, Berlin, 1999.
[9] A. Gruss, L.R. Carley and T. Kanade. Integrated Sensor and Range-Finding Analog Signal Processor. IEEE JSSC, vol. 26, no. 3, March 1991.
[10] J. Hancock, D. Langer, M. Hebert, R. Sullivan, D. Ingimarson, E. Hoffman, M. Mettenleitner, C. Froehlich. Active Laser Radar for High Performance Measurements. In Proc. IEEE International Conference on Robotics and Automation. May 1998.
[11] Y. Huimin, L. Zhukang. Scannerless laser three dimensional imaging method. In Proc. of the SPIE, vol. 3558, pp. 49-52. 1999.
[12] K. Konolige. Small Vision Systems: Hardware and Implementation. Eighth International Symposium on Robotics Research. 1997.
[13] H. Lamela, E. Garcia, A. De La Escalera, M. Salichs. A new laser triangulation processor for mobile robot applications: preliminary results. In Proc. Intelligent Components for Vehicles. 1998.
[14] F. Lu and E. Milios. Globally consistent range scan alignment for environment mapping. Autonomous Robots, Vol. 4. 1997.
[15] Q.T. Luong, J. Weber, D. Koller, J. Malik. An integrated stereo-based approach to automatic vehicle guidance. In Proc. of the Fifth International Conference on Computer Vision, pp. 94-101. Cambridge. 1995.
[16] M. Maimone, L. Matthies, J. Osborn, E. Rollins, J. Teza, S. Thayer. A photo-realistic 3-D mapping system for extreme nuclear environments. In Proc. IEEE/RSJ International Conference on Intelligent Robotic Systems. Victoria, B.C. 1998.
[17] R. Mandelbaum, M. Hansen, P. Burt, S. Baten. Vision for autonomous mobility: image processing on the VFE-200. Proc. of the 1998 IEEE International Symposium on Intelligent Control. 1998.
[18] L.H. Matthies, C. Won, M. Hebert, L.E. Parker, G. Sukhatme. System design of a portable mobile robot for urban reconnaissance. Proc. SPIE AeroSense Conference, Unmanned Ground Vehicle Technology workshop. Orlando. 1999.
[19] L. Matthies, A. Kelly, T. Litwin, G. Tharp. Obstacle detection for unmanned ground vehicles: A progress report. Springer-Verlag. 1998.
[20] L.H. Matthies, T. Litwin, K. Owens, A. Rankin, K. Murphy, D. Coombs, J. Gilsinn, H. Tsai, S. Legowik, M. Nashman, B. Yoshimi. Performance evaluation of UGV obstacle detection with CCD/FLIR stereo vision and LADAR. Proceedings of the 1998 IEEE International Symposium on Intelligent Control. 1998.
[21] J.R. Miller, O. Amidi. 3-D Site Mapping with the CMU Autonomous Helicopter. In Proc. of the 5th International Conference on Intelligent Autonomous Systems (IAS-5). June 1998.
[22] R. Miyagawa, T. Kanade. CCD-Based Range-Finding Sensor. IEEE Transactions on Electron Devices, Vol. 44, No. 10. Oct. 1997.
[23] Y. Nishiyama, H. Tsukahara, Y. Oshima, F. Takahashi, T. Fuse, T. Nishino. Development of high-speed 3D inspection system for solder bumps. In Proc. SPIE Vol. 3306, Machine Vision Applications in Industrial Inspection VI. 1998.
[24] D. Nitzan, A.E. Brain, R.O. Duda. The measurement and use of registered reflectance and range data in scene analysis. Proceedings of the IEEE, vol. 65, no. 2, pp. 206-220. 1977.
[25] M. Rioux, F. Blais, J.A. Beraldin and P. Boulanger. Range imaging sensor development at NRC laboratories. Proceedings of the Workshop on Interpretation of 3-D Scenes, Austin, TX. November 27-29, 1989.
[26] J.T. Sackos, R.O. Nellums, S.M. Lebien, C.F. Diegert, J.W. Grantham, T. Monson. Low-cost, high-resolution, video-rate imaging optical radar. In Proc. of the SPIE, vol. 3380, Laser Radar Technology and Applications III. 1998.
[27] J.T. Sackos. Building accurate geometric models from abundant range imaging information. In Proc. of the SPIE, vol. 3065, Laser Radar Technology and Applications II. 1997.
[28] K. Sato, S. Inokuchi. Range picture input system based on space-encoding. Transactions of the Institute of Electronics and Communication Engineers of Japan, Part D, vol. J68D, no. 3. 1985.
[29] Y. Sato, K. Araki, and S. Parthasarathy. High speed rangefinder. Proc. Optics, Illumination, and Image Sensing for Machine Vision II, SPIE, vol. 850, pp. 184-188, 1987.
[30] V.V. Savelyev, P.E. Tverdokhleb, A.V. Trubetskoi, Y.A. Shchepetkin. Three-dimensional image generation by means of a cascade high-speed acousto-optical deflector. In Optoelectronics, Instrumentation and Data Processing, no. 2. 1997.
[31] W. Schroeder, E. Forgber, S. Estable. Scannerless laser range camera. Sensor Review, vol. 19, no. 4, pp. 285-291. 1999.
[32] Schwartz Electro-Optics, Inc., Autosense Scanner. www.seord.com/comsen/autosen.htm
[33] D. Simon, M. Hebert, T. Kanade. Real-time 3-D pose estimation using a high-speed range sensor. Proc. International Conference on Robotics and Automation. May 1994.
[34] A. Stentz, J. Bares, S. Singh. A robotic excavator for autonomous truck loading. In Proc. IEEE/RSJ International Conference on Intelligent Robotic Systems. Victoria, B.C. 1998.
[35] D.J. Svetkoff, D.B.T. Kilgus. Comparison of beam deflectors for ultra-high speed 3D imaging. Proc. SPIE, Vol. 2909, Three-Dimensional Imaging and Laser-Based Systems for Metrology and Inspection II. 1997.
[36] S. Thrun, D. Fox, and W. Burgard. Probabilistic mapping of an environment by a mobile robot. In Proc. of the IEEE International Conference on Robotics and Automation. 1998.
[37] C.S. Tsai. Recent progress on guided-wave acoustooptic devices and applications. In Proc. First European Conference on Integrated Optics. 1981.
[38] www.3dvsystems.com/zcam_fs.html
[39] C.R. Weisbin, J. Blitch, D. Lavery, E. Krotkov, C. Shoemaker, L.H. Matthies, G. Rodriguez. Miniature robots for space and military missions. IEEE Robotics & Automation Magazine, vol. 6, no. 3. 1999.
[40] T. Williamson, C. Thorpe. A trinocular system for highway obstacle detection. Proc. IEEE International Conference on Computer Vision and Pattern Recognition. 1998.
[41] A. Yokoyama, K. Sato, T. Yoshigahara, S. Inokuchi. Realtime Range Imaging using Adjustment-Free Photo-VLSI - Silicon Range Finder. Proceedings IROS, pp. 1751-1758, 1994.
[42] L. Zhao, C. Thorpe. Qualitative and quantitative car tracking from a range image sequence. In Proc. CVPR'98, pp. 496-501. Santa Barbara. 1998.
[43] L. Zhao, C. Thorpe. Qualitative and quantitative tracking from a range image sequence. Proc. Computer Vision and Pattern Recognition. 1998.