Navigation of An Omni-Directional Mobile Robot With Active Caster Wheels
Abstract—This work deals with the navigation of an omni-directional mobile robot with active caster wheels. Initially, the posture of the omni-directional mobile robot is calculated by using the odometry information. Next, the position accuracy of the mobile robot is measured through comparison of the odometry information and the external sensor measurement. Finally, for successful navigation of the mobile robot, a motion planning algorithm that employs a kinematic redundancy resolution method is proposed. Through experiments with multiple obstacles and multiple moving obstacles, the feasibility of the proposed navigation algorithm was verified.

I. INTRODUCTION

Recently, research on omni-directional mobile robots with active caster wheels has drawn much attention. Moore and Flann [8] and Berkemeier and Ma [9] implemented a six-wheeled omni-directional mobile robot and ODIS (Omni-Directional Inspection System) with the so-called "smart wheel" active caster wheel module, respectively. Yi and Kim [13] initially developed the kinematic model of the omni-directional mobile robot with active caster wheels. Wada et al. [10] developed a mobile robot with two caster wheels and one rotational actuator. Ushimi et al. [11] developed an omni-directional vehicle with two-wheeled casters. Lee et al. [12] implemented and proposed a motion generation algorithm for the caster-type omni-directional mobile robot.
Fig. 1. Modeling of the omni-directional mobile robot

We assume that every wheel retains point contact with the ground and that the wheel does not slip in the horizontal direction, but it is allowed to rotate about the vertical axis. Each wheel mechanism possesses its own driving and steering actuators. Then, the instantaneous motion of each wheel mechanism can be modeled as two revolute joints and one prismatic joint, as shown in Fig. 1(b). The mobility of the omni-directional mobile robot can be calculated as below:

$$ M = 3(8-1) - \sum_{i=1}^{9}(3-1) = 3 . \quad (2) $$

B. First-order Kinematics

Consider the mobile robot depicted in Fig. 2. This system consists of three wheels, three offset links, and a top platform. Assume that the motion of the mobile robot is constrained to the plane and that there is no sliding and skidding friction, but that rotation of the wheel about the axis vertical to the ground is allowed. XYZ represents the global reference frame, and xyz denotes a local coordinate frame attached to the mobile platform; $\mathbf{i}$, $\mathbf{j}$, and $\mathbf{k}$ are the unit vectors of the xyz coordinate frame. C denotes the origin of the local coordinate frame.

Fig. 2. Kinematics description of the omni-directional mobile robot

We define $\theta_i$ as the rotating angle of the wheel and $\varphi_i$ as the steering angle between the steering link and the local x-axis. $\eta_i$ denotes the angular displacement of the wheel relative to the X-axis of the global reference frame. $r$ and $d$ denote the radius of the wheel and the offset length, respectively. The velocity of the point C can then be expressed through each of the three wheel chains as

$$ \mathbf{v}_c = \mathbf{v}_{o1} + \dot{\eta}_1 \mathbf{k} \times \overrightarrow{O_1 A_1} + \omega \mathbf{k} \times \overrightarrow{A_1 C} = \dot{\theta}_1 \left( \sin\varphi_1\, \mathbf{i} + \cos\varphi_1\, \mathbf{j} \right) \times r\mathbf{k} + \dot{\eta}_1 \mathbf{k} \times \left( -d\cos\varphi_1\, \mathbf{i} + d\sin\varphi_1\, \mathbf{j} \right) + \omega \mathbf{k} \times \left( \tfrac{l}{2}\,\mathbf{i} + a\,\mathbf{j} \right), \quad (4) $$

$$ \mathbf{v}_c = \mathbf{v}_{o2} + \dot{\eta}_2 \mathbf{k} \times \overrightarrow{O_2 A_2} + \omega \mathbf{k} \times \overrightarrow{A_2 C} = \dot{\theta}_2 \left( \sin\varphi_2\, \mathbf{i} + \cos\varphi_2\, \mathbf{j} \right) \times r\mathbf{k} + \dot{\eta}_2 \mathbf{k} \times \left( -d\cos\varphi_2\, \mathbf{i} + d\sin\varphi_2\, \mathbf{j} \right) + \omega \mathbf{k} \times \left( -\tfrac{l}{2}\,\mathbf{i} + a\,\mathbf{j} \right), \quad (5) $$

and

$$ \mathbf{v}_c = \mathbf{v}_{o3} + \dot{\eta}_3 \mathbf{k} \times \overrightarrow{O_3 A_3} + \omega \mathbf{k} \times \overrightarrow{A_3 C} = \dot{\theta}_3 \left( \sin\varphi_3\, \mathbf{i} + \cos\varphi_3\, \mathbf{j} \right) \times r\mathbf{k} + \dot{\eta}_3 \mathbf{k} \times \left( -d\cos\varphi_3\, \mathbf{i} + d\sin\varphi_3\, \mathbf{j} \right) + \omega \mathbf{k} \times \left( -b\,\mathbf{j} \right), \quad (6) $$

where $\omega$, representing the angular velocity of the mobile platform, can be described as

$$ \omega = \dot{\eta}_i + \dot{\varphi}_i, \quad i = 1, 2, 3 . \quad (7) $$

The velocity $\mathbf{v}_c = [\, v_{cx} \;\; v_{cy} \,]^T$ of (4)-(6) and $\omega$ of (7) can be expressed in one matrix form, given by

$$ \mathbf{u} = \begin{bmatrix} -d\sin\varphi_1 - a & r\cos\varphi_1 & -a \\ -d\cos\varphi_1 + \tfrac{l}{2} & -r\sin\varphi_1 & \tfrac{l}{2} \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} \dot{\eta}_1 \\ \dot{\theta}_1 \\ \dot{\varphi}_1 \end{bmatrix}, \quad (8) $$

$$ \mathbf{u} = \begin{bmatrix} -d\sin\varphi_2 - a & r\cos\varphi_2 & -a \\ -d\cos\varphi_2 - \tfrac{l}{2} & -r\sin\varphi_2 & -\tfrac{l}{2} \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} \dot{\eta}_2 \\ \dot{\theta}_2 \\ \dot{\varphi}_2 \end{bmatrix}, \quad (9) $$

and

$$ \mathbf{u} = \begin{bmatrix} -d\sin\varphi_3 + b & r\cos\varphi_3 & b \\ -d\cos\varphi_3 & -r\sin\varphi_3 & 0 \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} \dot{\eta}_3 \\ \dot{\theta}_3 \\ \dot{\varphi}_3 \end{bmatrix}, \quad (10) $$

where the operational velocity vector is defined as

$$ \mathbf{u} = \begin{bmatrix} v_{cx} \\ v_{cy} \\ \omega \end{bmatrix}. \quad (11) $$

From (8) through (10), the mobile robot can be visualized as a parallel mechanism. The variable $\eta$, which interfaces the wheel to the ground, enables this model. Thus, the kinematics of the mobile robot is instantaneously equivalent to that of a typical parallel robot connected to a fixed ground.
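As a minimal numerical sketch of how the velocity relations (8)-(10) can be evaluated, the Python snippet below builds the per-wheel matrix and maps a set of joint rates $(\dot{\eta}_i, \dot{\theta}_i, \dot{\varphi}_i)$ to the platform velocity $\mathbf{u} = [v_{cx}\; v_{cy}\; \omega]^T$. The geometric parameters and the joint rates used here are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Illustrative geometric parameters (placeholders, not the paper's values).
r, d, l, a, b = 0.05, 0.03, 0.30, 0.10, 0.20

def wheel_velocity_map(phi, c1, c2):
    """Velocity map of one wheel chain in the common form of (8)-(10).

    (c1, c2) are the wheel-specific platform-offset entries:
    wheel 1: (-a, l/2), wheel 2: (-a, -l/2), wheel 3: (b, 0)."""
    return np.array([
        [-d * np.sin(phi) + c1,  r * np.cos(phi), c1 ],
        [-d * np.cos(phi) + c2, -r * np.sin(phi), c2 ],
        [ 1.0,                   0.0,             1.0],
    ])

offsets = [(-a, l / 2), (-a, -l / 2), (b, 0.0)]

# Joint rates [eta_dot, theta_dot, phi_dot] of wheel 1 (illustrative values).
phi1 = 0.3
q1_dot = np.array([0.1, 2.0, -0.1])

u = wheel_velocity_map(phi1, *offsets[0]) @ q1_dot  # platform velocity [vcx, vcy, omega]
print(u)
```

Because all three maps share the same platform velocity $\mathbf{u}$, any one wheel chain determines $\mathbf{u}$ once its joint rates are known, which is consistent with the parallel-mechanism view noted above.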
The intermediate coordinate transfer method [14], which has been widely employed in the parallel robot community, will be employed to derive the forward kinematic relation. Taking the inverses of (8)-(10), we have

$$ \begin{bmatrix} \dot{\eta}_1 \\ \dot{\theta}_1 \\ \dot{\varphi}_1 \end{bmatrix} = \frac{1}{dr} \begin{bmatrix} -r\sin\varphi_1 & -r\cos\varphi_1 & \tfrac{l}{2} r\cos\varphi_1 - a r\sin\varphi_1 \\ d\cos\varphi_1 & -d\sin\varphi_1 & \tfrac{l}{2} d\sin\varphi_1 + a d\cos\varphi_1 \\ r\sin\varphi_1 & r\cos\varphi_1 & dr + a r\sin\varphi_1 - \tfrac{l}{2} r\cos\varphi_1 \end{bmatrix} \mathbf{u}, \quad (12) $$

$$ \begin{bmatrix} \dot{\eta}_2 \\ \dot{\theta}_2 \\ \dot{\varphi}_2 \end{bmatrix} = \frac{1}{dr} \begin{bmatrix} -r\sin\varphi_2 & -r\cos\varphi_2 & -\tfrac{l}{2} r\cos\varphi_2 - a r\sin\varphi_2 \\ d\cos\varphi_2 & -d\sin\varphi_2 & -\tfrac{l}{2} d\sin\varphi_2 + a d\cos\varphi_2 \\ r\sin\varphi_2 & r\cos\varphi_2 & dr + a r\sin\varphi_2 + \tfrac{l}{2} r\cos\varphi_2 \end{bmatrix} \mathbf{u}, \quad (13) $$

and

$$ \begin{bmatrix} \dot{\eta}_3 \\ \dot{\theta}_3 \\ \dot{\varphi}_3 \end{bmatrix} = \frac{1}{dr} \begin{bmatrix} -r\sin\varphi_3 & -r\cos\varphi_3 & b r\sin\varphi_3 \\ d\cos\varphi_3 & -d\sin\varphi_3 & -b d\cos\varphi_3 \\ r\sin\varphi_3 & r\cos\varphi_3 & dr - b r\sin\varphi_3 \end{bmatrix} \mathbf{u}. \quad (14) $$

The determinant of each velocity-map matrix in (8)-(10) equals $dr$, which gives the common factor $1/(dr)$ in (12)-(14). The posture of the mobile robot is given by the $(3 \times 1)$ vector

$$ \boldsymbol{\xi} = \begin{bmatrix} x \\ y \\ \phi \end{bmatrix}. \quad (15) $$

Fig. 3. The omni-directional mobile robot with active caster wheels
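A small sketch of the inverse relation, using wheel 1 as an example: given a commanded platform velocity $\mathbf{u}$, the joint rates follow from the closed-form matrix of (12), and the result can be checked against a numerical inverse of (8). The parameter values and the commanded velocity are illustrative assumptions.

```python
import numpy as np

# Illustrative geometric parameters (assumed, not taken from the paper).
r, d, l, a = 0.05, 0.03, 0.30, 0.10

def forward_map_wheel1(phi):
    """Velocity map of wheel 1, eq. (8): u = M1(phi) @ [eta_dot, theta_dot, phi_dot]."""
    return np.array([
        [-d*np.sin(phi) - a,    r*np.cos(phi), -a  ],
        [-d*np.cos(phi) + l/2, -r*np.sin(phi),  l/2],
        [ 1.0,                  0.0,            1.0],
    ])

def inverse_map_wheel1(phi):
    """Closed-form inverse of wheel 1, eq. (12), with the 1/(d*r) factor."""
    return (1.0 / (d*r)) * np.array([
        [-r*np.sin(phi), -r*np.cos(phi),  l/2*r*np.cos(phi) - a*r*np.sin(phi)],
        [ d*np.cos(phi), -d*np.sin(phi),  l/2*d*np.sin(phi) + a*d*np.cos(phi)],
        [ r*np.sin(phi),  r*np.cos(phi),  d*r + a*r*np.sin(phi) - l/2*r*np.cos(phi)],
    ])

phi1 = 0.4                              # current steering angle (illustrative)
u = np.array([0.2, 0.0, 0.1])           # commanded [vcx, vcy, omega] (illustrative)

q1_dot = inverse_map_wheel1(phi1) @ u   # joint rates [eta_dot, theta_dot, phi_dot]
assert np.allclose(inverse_map_wheel1(phi1), np.linalg.inv(forward_map_wheel1(phi1)))
```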
chains, as described in Fig. 5. The rotation of the caster wheel with respect to the base frame, which is represented by the angle $\eta_i$, can be expressed as

$$ \eta_i = \varphi_i - \phi . \quad (16) $$

Then, if $\phi$ is given, the robot posture can be determined as follows.

Step 1. Set the initial positions $O_1: (x_{O1}, y_{O1})$ and $O_2: (x_{O2}, y_{O2})$ of the wheels and the angles of the caster wheels:

$$ \eta_i[0] = \varphi_i[0] = \theta_i[0] = 0, \qquad x_{O1}[0] = d - l/2, \;\; y_{O1}[0] = -a, \qquad x_{O2}[0] = d + l/2, \;\; y_{O2}[0] = -a . \quad (17) $$

$$ x_2 = x_{O2} - d\cos(\eta_2), \qquad y_2 = y_{O2} + d\sin(\eta_2) . \quad (19) $$

Step 4. Obtain the posture of the mobile robot, $C: (x_c, y_c)$ and $\phi$:

$$ \phi = \operatorname{atan2}\left( y_2 - y_1,\; x_2 - x_1 \right), \qquad x_c = x_1 + b\cos\!\left( \frac{\pi}{6} + \phi \right), \qquad y_c = y_1 + b\sin\!\left( \frac{\pi}{6} + \phi \right). \quad (20) $$

The odometry information of the omni-directional mobile robot with active caster wheels can be obtained by applying this procedure iteratively at every encoder sampling time.
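A minimal sketch of the posture update of Step 4 / eq. (20), assuming the two wheel reference points $(x_1, y_1)$ and $(x_2, y_2)$ have already been updated from the encoder readings (the updates of the intermediate steps are represented here only by placeholder inputs):

```python
import math

def posture_from_wheel_points(x1, y1, x2, y2, b):
    """Step 4, eq. (20): platform heading and position of the origin C
    from the two wheel reference points (x1, y1) and (x2, y2)."""
    phi = math.atan2(y2 - y1, x2 - x1)           # platform orientation
    xc = x1 + b * math.cos(math.pi / 6 + phi)    # origin C, x
    yc = y1 + b * math.sin(math.pi / 6 + phi)    # origin C, y
    return xc, yc, phi

# Illustrative values only; in the real loop (x1, y1, x2, y2) come from the
# encoder-driven updates at every sampling instant.
xc, yc, phi = posture_from_wheel_points(0.10, -0.05, 0.40, -0.02, b=0.20)
```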
B. Experiment

In order to evaluate the performance of the proposed odometry method, an experiment was conducted for a rectangular trajectory. In the experiment, we measure the real position of the mobile robot by using a laser range finder that is fixed at a known position in the global frame. Fig. 6 displays the external measurement system, and the specification of the laser range finder is given in Table I. The two coordinates x and y of the origin C of the moving frame are taken as the center position of the vertical plate on top of the mobile robot. Fig. 7 shows the positions of the vertical plate in the laser range finder frame during the path following. Fig. 8 shows the external measurement of the vertical plate transformed to the global frame by (21). The horizontal bars denote the shape of the vertical plate.

Fig. 6. The external measurement system

Fig. 7. Positions of the vertical plate in the laser range finder frame (raw laser range finder data; X and Y positions in mm)
Fig. 8. Positions of the vertical plate in the global frame (transformed laser range finder data; X and Y positions in mm)

Fig. 9. Trajectory property of the omni-directional mobile robot (given trajectory, internal odometer, and external measurement)

The transformation from the laser sensor frame to the global frame is

$$ \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_L \\ y_L \end{bmatrix} + \begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix}, \quad (21) $$

where $\theta$ denotes the rotation of the laser sensor frame with respect to the global frame, $x_L$ and $y_L$ denote coordinates in the laser sensor frame, and $\Delta x$ and $\Delta y$ denote the position of the laser sensor in the global coordinate frame.
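A short sketch of applying (21) to a batch of laser range finder points; the sensor pose (theta, dx, dy) used below is an assumed example, not the calibration from the paper.

```python
import numpy as np

def laser_to_global(points_L, theta, dx, dy):
    """Apply eq. (21): rotate laser-frame points by theta and translate by (dx, dy)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points_L @ R.T + np.array([dx, dy])

# Example: three laser-frame points in mm, with an assumed sensor pose.
points_L = np.array([[500.0, 800.0], [510.0, 805.0], [520.0, 810.0]])
points_G = laser_to_global(points_L, theta=np.deg2rad(90.0), dx=-1000.0, dy=0.0)
```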
Fig. 9 shows the trajectory-following property of the omni-directional mobile robot. In the result of the rectangular trajectory tracking control, the solid line denotes the given trajectory, the dotted line denotes the position history of the mobile robot measured by the odometer inside the robot, and the dashed line denotes the position history measured by a URG laser sensor from the outside. The external measurement depicted in Fig. 9 is the collection of the center points of the horizontal bars of Fig. 8. Although there is some noise owing to the resolution of the LRF sensor, the results show that the omni-directional mobile robot tracks the given rectangular trajectory well. Fig. 10 shows the odometry error near the global origin after the path following; a 1 cm error occurs after traveling 4 m. All types of mobile robots inevitably have such odometry errors. The mobile robot we developed has a more complicated structure than a differential-drive robot, which means that it can have more complicated systematic odometry errors. Correction of systematic odometry errors in the omni-directional mobile robot is left as future work.

Fig. 10. Odometry error of the omni-directional mobile robot (X and Y positions in mm)

IV. NAVIGATION

A. Navigation Algorithm

For navigation, mobile robots should know their postures with respect to the global reference frame as well as the positions of obstacles. Because the mobile robot we developed has three degrees of freedom, more dexterous obstacle-avoidance motion is possible than with a differential-type mobile robot. The first goal of navigation is to move to the goal point, and the second is to avoid obstacles. To satisfy these purposes, the positions of the mobile robot, the obstacles, and the goal point should be estimated. The navigation algorithm ought to minimize the distance between the current position of the mobile robot and the goal point, and to maximize the distances between the current position of the mobile robot and the obstacles. The performance indices are set as follows:
$$ \min \left\{ P_G = (x - x_G)^2 + (y - y_G)^2 \right\}, \qquad \max \left\{ P_i = (x - x_i)^2 + (y - y_i)^2 \right\}, \quad (22) $$

where $x$ and $y$ denote the position of the mobile robot, $x_G$ and $y_G$ denote the position of the goal point, and $x_i$ and $y_i$ denote the position of the $i$-th obstacle. The task variable is defined as

$$ u = \omega = \mathbf{J} \begin{bmatrix} v_{cx} \\ v_{cy} \\ \omega \end{bmatrix}, \quad (23) $$

where $\mathbf{J} = [\, 0 \;\; 0 \;\; 1 \,]$ and $[\, v_{cx} \;\; v_{cy} \;\; \omega \,]^T$ denotes the velocity of the mobile platform with respect to the global reference frame. The general solution for (23) is given by

$$ \begin{bmatrix} v_{cx} \\ v_{cy} \\ \omega \end{bmatrix} = \mathbf{J}^{+} u + \left( \mathbf{I} - \mathbf{J}^{+} \mathbf{J} \right) \nabla H, \quad (24) $$

where $H$, defined in the homogeneous solution, is the potential function that needs to be optimized. When the gradient of $H$ is given as

$$ \nabla H = -\left( K_G \frac{\partial P_G}{\partial \boldsymbol{\theta}}^{T} + \sum_i K_i \frac{\partial P_i}{\partial \boldsymbol{\theta}}^{T} \right), \quad (25) $$

it implies that $P_G$ and $P_i$ are to be optimized. Depending upon the signs of the coefficients $K_G$ and $K_i$, the mobile robot is controlled in such a way as to minimize or maximize the given performance index. In this example, $K_G$ should be positive to minimize the distance between the robot and the goal, and $K_i$ should be negative to maximize the distance between the robot and the obstacles. Here, the vector $\boldsymbol{\theta} = [\, x \;\; y \;\; \phi \,]^T$ denotes the posture of the mobile robot, and Fig. 11 shows the weight factor $K_i$ with respect to the obstacle distance (for $k = 10$, $h = 700$).

Fig. 11. Weight factor $K_i$ property with respect to the distance
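The gradient-projection step of (23)-(25) can be sketched as below. The gains K_G and K_i, the goal, and the obstacle positions are illustrative assumptions; the distance-dependent weight of Fig. 11 is replaced by a constant K_i for brevity, and omega_cmd stands in for whatever heading rate the planner requests as the primary task.

```python
import numpy as np

def navigation_velocity(pose, goal, obstacles, omega_cmd, K_G=1.0, K_i=-1.0):
    """One step of the redundancy-resolution law (24)-(25).

    pose      : current posture [x, y, phi]
    goal      : goal point [xG, yG]
    obstacles : list of obstacle points [xi, yi]
    omega_cmd : commanded platform rotation rate (primary task u in (23))
    Returns [vcx, vcy, omega]."""
    J = np.array([[0.0, 0.0, 1.0]])          # task Jacobian of (23)
    J_pinv = np.linalg.pinv(J)               # J+ = [0, 0, 1]^T here

    x, y, _ = pose
    # Gradient of H as in (25): P_G = ||p - goal||^2, P_i = ||p - obstacle_i||^2.
    grad = -K_G * np.array([2*(x - goal[0]), 2*(y - goal[1]), 0.0])
    for ox, oy in obstacles:
        grad -= K_i * np.array([2*(x - ox), 2*(y - oy), 0.0])

    # General solution (24): particular term + null-space (self-motion) term.
    return J_pinv @ np.array([omega_cmd]) + (np.eye(3) - J_pinv @ J) @ grad

# Illustrative call: robot at the origin, goal at (1.0, 5.0) m, one obstacle.
v = navigation_velocity(pose=[0.0, 0.0, 0.0], goal=[1.0, 5.0],
                        obstacles=[[0.5, 2.0]], omega_cmd=0.0,
                        K_G=1.0, K_i=-1.0)
```

In practice one would scale or saturate the resulting velocity before sending it to the wheel-level inverse kinematics of (12)-(14).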
B. Experiment

Experimentation has been performed to verify the navigation performance of the developed mobile robot. In the experiment, the odometry information was used for locating the mobile robot, and a laser range finder was used for detecting obstacles and building the map.

Fig. 12 shows the experimental environment. The primary task is that the mobile robot starts at (0, 0) and moves to the goal point at (1000, 5000) (in mm). While the mobile robot is navigating, it gathers the geometric information of the environment and the obstacles through the laser range finder. The results, the environment map and the trace of the mobile robot, are shown in Fig. 13. The curved line in the center of the map denotes the trace that the robot has followed, and the dots denote the environment.

In usual differential-drive mobile robots, the orientation of the mobile robot should be changed so that they avoid obstacles.
Fig. 12. The picture of the real environment (starting point and goal point marked)

Fig. 13. Map building and moving path (X and Y positions in mm)

ACKNOWLEDGMENT

This work was supported by the GRRC Project of Gyeonggi Province Government, Republic of Korea (2007-041-0003-001).

REFERENCES

[1] G. Campion, G. Bastin, and B. D'Andrea-Novel, "Structural properties and classification of kinematic and dynamic models of wheeled mobile robots," IEEE Trans. on Robotics and Automation, vol. 12, no. 1, pp. 47-62, 1996.
[2] P. F. Muir and C. P. Neuman, "Kinematic modeling of wheeled mobile robots," Journal of Robotic Systems, vol. 4, no. 2, pp. 281-340, 1987.
[3] S. K. Saha and J. Angeles, "Kinematics and dynamics of a three-wheeled 2-DOF AGV," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1572-1577, 1989.
[4] P. Chen, S. Mitsutake, T. Isoda, and T. Shi, "Omni-directional robot and adaptive control method for off-road running," IEEE Trans. on Robotics and Automation, vol. 18, no. 2, pp. 251-256, 2002.
[5] K. Tadakuma, S. Hirose, and R. Tadakuma, "Development of VmaxCarrier2: omni-directional mobile robot with function of step-climbing," in Proc. IEEE Int. Conf. on Robotics and Automation, pp. 3111-3118, 2004.
[6] S. Ziaie-Rad, et al., "A practical approach to control and self-localization of Persia omni directional mobile robot," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3473-3479, 2005.
[7] J. E. M. Salih, et al., "Designing omni-directional mobile robot with mecanum wheel," American Journal of Applied Sciences, vol. 3, no. 5, pp. 1831-1835, 2006.
[8] K. L. Moore and N. S. Flann, "A six-wheeled omnidirectional autonomous mobile robot," IEEE Control Systems Magazine, vol. 20, no. 6, pp. 53-66, 2000.
[9] M. D. Berkemeier and L. Ma, "Discrete control for visual servoing the ODIS robot to parking lot lines," in Proc. IEEE Int. Conf. on Robotics and Automation, pp. 3149-3154, 2005.
[10] M. Wada, A. Takagi, and S. Mori, "Caster drive mechanism for