
Hand Tracking-based Motion Control for Robot Arm Using Stereo Camera


Budi Sugandi, Handri Toar, Ary Alfianto
Electronics Engineering, Politeknik Negeri Batam
Batam, Indonesia
[email protected], [email protected], [email protected]

Abstract—In this paper, we designed an application and implemented robot arm control by hand tracking using a stereo camera. The camera captured and tracked the hand motion and saved the coordinate position of the hand in 3D. Our system was designed to track the pitch, yaw, and roll motion of the hand; besides that, the system could also track forward and backward hand motion. The hand tracking data were then saved and processed in our system and, using TCP/IP communication, sent to the arm robot, where they were transformed into the movement of the arm robot. Our system could move the arm robot by tracking the hand motion. The experimental results showed that our system successfully controlled the arm robot with low motion errors for the pitch, yaw, and roll motions of 7.28°, 4.70°, and 5.37°, respectively.

Keywords—arm robot, stereo camera, hand motion, hand tracking

I. INTRODUCTION

Nowadays, robots have become important over a wide range of applications to assist human work. Robots can operate in manufacturing, in hospitals to perform surgery, and in any place with hazardous conditions or materials. Consequently, we should understand how robots work and how to program them so that they move effectively as we want. One kind of robot often used in manufacturing is the robotic manipulator, a robot that can perform several tasks, such as picking and placing objects, and can move similarly to a human arm.

Hand tracking is one solution to the human need for human-machine interaction (HMI) that is faster, more reliable, and functions like a human body organ. Using HMI, hand tracking can be applied to control machines such as an arm robot. Therefore, hand tracking can replace the function of the mouse to control the motion of the arm robot.

Several technologies can be used to track hand motion; one of them is the Leap Motion controller. These technologies give researchers the opportunity to develop HMI using hand tracking, and many systems based on the Leap Motion controller have already been developed. Weichert et al. [1] analyzed the accuracy and robustness of the Leap Motion controller. Real-time robot control using Leap Motion was developed in [2]: a hand tracking system with accuracy down to the millimeter order, in which the tracked hand position is used to calculate the joint angles of the robot. Another approach developed home environment interaction via a service robot and the Leap Motion controller [3]; it gives the user a solution to control a robotic manipulator and a mobile robot with gesture tracking. Mehmood et al. [4] proposed the analysis of end-effector position and orientation for a robotic planar arm, which can specify the movement of the robot to achieve a desired position. Gochev et al. [5] developed motion planning for robotic manipulators with independent wrist joints. Sato et al. [6] proposed intelligent control of an arm robot equipped with a CMOS camera using a new type of image processing, applying techniques such as pattern recognition, color identification, and object tracking to obtain the robot position.

Our research focuses on developing robot arm control by hand tracking using a stereo camera, namely the Leap Motion controller. We tracked the hand motion using Leap Motion and translated the coordinates of the hand motion to control the arm robot. The communication between the controller and the arm robot was established over LAN; therefore, the arm robot can be controlled remotely.

This paper is organized as follows. Section II presents the method: the hand motion tracking problems, the robot manipulator, and the stereo camera used in the research. The experimental setup and results are presented in section III.

II. METHOD

A. Hand Motion Tracking

HSV color filtering is one of the color filtering methods used to distinguish one color from other colors. We can use the filter to distinguish a target whose color differs from the environment; we used the HSV color filter to detect an opponent robot with a certain designated color.

Many researchers have established studies of computer-vision-based hand motion tracking that support HMI. These studies still face many problems in estimating the motion of the hand, as analyzed by Erol et al. [7]. Some of the difficulties are:

a. Dimension problem; the human hand is an articulated object that has more than 20 DOF (degrees of freedom). Because of the interdependence among finger joints, the effective DOF of the hand is not more than 20.
b. Self-occlusion; the motion of the hand can be occluded by other parts of the hand itself.
c. Processing speed; hand tracking produces a large amount of data that should be processed by a high-speed computer.
d. Uncontrolled environment; the environmental conditions cannot be predicted, especially when the background and lighting change.
e. The speed of hand motion; the hand is able to move very fast. The speed of hand movement sometimes
978-1-5386-8066-7/18/$31.00 ©2018 IEEE


cannot be detected by the camera.

B. Robot Manipulator

A robot manipulator is a kind of robot that can perform several tasks, such as picking and placing objects, and can move similarly to a human arm. A robot manipulator is built from a combination of links and joints, or axes, connected by links. The axes are the movable components of the robotic manipulator that cause relative motion between adjoining links. The robotic arm manipulator is constructed from mechanical joints of five principal types; two of the joints are linear and the other three are rotary types. The linear types have non-rotational relative motion between adjacent links, and the rotary types have relative motion involving rotation between links.

Fig. 1 shows one of the robot manipulators used in our system.

Fig. 1. Robot Manipulator

C. Stereo Camera

In our system, we used the Leap Motion controller as a stereo camera. Leap Motion is a device that consists of two monochromatic IR cameras and three infrared LEDs. This device can observe a roughly hemispherical area up to a distance of about 1 meter. The IR cameras generate almost 200 frames per second, sent through a USB cable to the host computer, where they are analyzed by the Leap Motion software. The software can synthesize 3D position data by comparing the 2D frames generated by the two cameras. Fig. 2 shows the coordinate system of Leap Motion.

Fig. 2. The coordinate position of Leap Motion

Fig. 3 shows how the Leap sensor tracks our hand when it is placed above the device. It is a screenshot of the Leap Motion visualizer tool, which is used to monitor the hand movements detected by the sensor.

The sensor provides the coordinate position of our hand and fingers represented in 3D coordinates, i.e. (X, Y, Z), as shown in Fig. 2. After obtaining the coordinate position of the hand from the sensor, we can transfer it to the robot arm with some adjustments, because the arm coordinate system is not the same as our hand coordinate system.

Fig. 3. Hand Tracking by Leap Motion

III. EXPERIMENTAL RESULT

Our system was implemented as shown in Fig. 4. First of all, the hand gesture is detected and tracked by Leap Motion. The tracking result is displayed on the visual manipulator. The hand tracking data are then sent to the robot manipulator through LAN communication: the system sends the coordinates of the tracked hand motion and transfers them to the robot manipulator. The visual manipulator displays the hand tracking position and the robot manipulator position in pitch, yaw, and roll.

Gesture tracking → Leap Motion → LAN → Robot Manipulator (a visual manipulator display is attached on both the tracking and the robot side)

Fig. 4. Diagram Block of the System

To get better accuracy, the data of the arm robot are taken and compared with the real data. The data are taken based on the change of motion angle for each motion: pitch, yaw, and roll. Each data point is taken three times, and we calculate the average of each. The results are shown in Table I, Table II, and Table III for each motion, respectively. Table I shows the pitch motion tracked by Leap Motion; we got an average error of 7.28°. Table II shows the yaw motion tracked by Leap Motion; we got an average error of 4.70°. Table III shows the roll motion tracked by Leap Motion; we got an average error of 5.37°.
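The per-angle averaging just described can be sketched as follows. This is a minimal illustration in our own words (the function names are not from the paper) of how each table row's average reading and its absolute error against the target angle could be computed:

```python
def row_stats(target_angle, readings):
    """Average the three tracked readings for one target angle and
    report the absolute error of that average against the target."""
    average = sum(readings) / len(readings)
    error = abs(target_angle - average)
    return round(average, 2), round(error, 2)

def table_average_error(rows):
    """Mean of the per-row errors, as reported at the bottom of each table.
    `rows` maps a target angle to its list of three readings."""
    errors = [row_stats(angle, reads)[1] for angle, reads in rows.items()]
    return sum(errors) / len(errors)

# First row of Table I (pitch): target 90 deg, readings 70, 77, 72
print(row_stats(90, [70, 77, 72]))  # (73.0, 17.0)
```

Applying `row_stats` to every row of a table and averaging the errors yields the "Average Error" figure reported at the bottom of Tables I to III.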


TABLE I. PITCH MOTION

Motion Angle  Pitch 1  Pitch 2  Pitch 3  Ave      Error
90°           70°      77°      72°      73.00°   17.00°
80°           69°      72°      69°      70.00°   10.00°
70°           65°      60°      58°      61.00°   9.00°
60°           49°      50°      45°      48.00°   12.00°
50°           41°      43°      42°      42.00°   8.00°
40°           24°      28°      25°      25.67°   14.33°
30°           9°       26°      26°      20.33°   9.67°
20°           10°      13°      25°      16.00°   4.00°
10°           -2°      10°      15°      7.67°    2.33°
0°            -3°      0°       5°       0.67°    0.67°
-10°          -15°     -12°     -8°      -11.67°  1.67°
-20°          -22°     -17°     -16°     -18.33°  1.67°
-30°          -33°     -15°     -19°     -22.33°  7.67°
-40°          -39°     -28°     -23°     -30.00°  10.00°
-50°          -47°     -46°     -38°     -43.67°  6.33°
-60°          -53°     -65°     -52°     -56.67°  3.33°
-70°          -60°     -69°     -60°     -63.00°  7.00°
-80°          -72°     -81°     -70°     -74.33°  5.67°
-90°          -82°     -75°     -89°     -82.00°  8.00°
Average Error                                     7.28°

TABLE III. ROLL MOTION

Motion Angle  Roll 1   Roll 2   Roll 3   Ave       Error
90°           86°      94°      88°      89.33°    0.67°
80°           75°      81°      77°      77.67°    2.33°
70°           64°      71°      69°      68.00°    2.00°
60°           52°      40°      58°      50.00°    10.00°
50°           46°      38°      43°      42.33°    7.67°
40°           40°      42°      44°      42.00°    2.00°
30°           34°      36°      39°      36.33°    6.33°
20°           24°      29°      30°      27.67°    7.67°
10°           18°      23°      23°      21.33°    11.33°
0°            2°       4°       0°       2.00°     2.00°
-10°          -12°     -15°     -12°     -13.00°   3.00°
-20°          -22°     -26°     -17°     -21.67°   1.67°
-30°          -25°     -28°     -24°     -25.67°   4.33°
-40°          -38°     -31°     -27°     -32.00°   8.00°
-50°          -49°     -49°     -44°     -47.33°   2.67°
-60°          -58°     -56°     -55°     -56.33°   3.67°
-70°          -59°     -71°     -65°     -65.00°   5.00°
-80°          -84°     -89°     -73°     -82.00°   2.00°
-90°          -110°    -109°    -110°    -109.67°  19.67°
Average Error                                      5.37°

TABLE II. YAW MOTION

Motion Angle  Yaw 1   Yaw 2   Yaw 3   Ave      Error
-90°          -82°    -87°    -85°    -84.67°  5.33°
-80°          -79°    -77°    -75°    -77.00°  3.00°
-70°          -72°    -68°    -73°    -71.00°  1.00°
-60°          -58°    -57°    -51°    -55.33°  4.67°
-50°          -43°    -44°    -39°    -42.00°  8.00°
-40°          -33°    -38°    -30°    -33.67°  6.33°
-30°          -22°    -33°    -33°    -29.33°  0.67°
-20°          -17°    -20°    -21°    -19.33°  0.67°
-10°          -8°     -8°     -10°    -8.67°   1.33°
0°            4°      0°      6°      3.33°    3.33°
10°           11°     16°     15°     14.00°   4.00°
20°           22°     21°     22°     21.67°   1.67°
30°           24°     28°     25°     25.67°   4.33°
40°           33°     35°     32°     33.33°   6.67°
50°           46°     43°     46°     45.00°   5.00°
60°           57°     52°     50°     53.00°   7.00°
70°           59°     67°     57°     61.00°   9.00°
80°           71°     78°     66°     71.67°   8.33°
90°           79°     89°     75°     81.00°   9.00°
Average Error                                  4.70°

TABLE IV. FORWARD MOTION

Distance  Data read  Error
10 mm     14 mm      4 mm
20 mm     25 mm      5 mm
30 mm     33 mm      3 mm
40 mm     42 mm      2 mm
50 mm     56 mm      6 mm
60 mm     61 mm      1 mm
70 mm     68 mm      2 mm
80 mm     76 mm      4 mm
90 mm     81 mm      9 mm
100 mm    84 mm      6 mm
110 mm    108 mm     2 mm
120 mm    117 mm     3 mm
130 mm    121 mm     9 mm
140 mm    133 mm     7 mm
150 mm    143 mm     7 mm
Average Error        4.66 mm

The forward and backward motions were tracked by Leap Motion with average errors of 4.66 mm and 5.55 mm, respectively, as shown in Table IV and Table V.
TABLE V. BACKWARD MOTION

Distance  Data read  Error
10 mm     9 mm       1 mm
20 mm     18 mm      2 mm
30 mm     27 mm      3 mm
40 mm     34 mm      6 mm
50 mm     40 mm      10 mm
60 mm     55 mm      5 mm
70 mm     65 mm      5 mm
80 mm     71 mm      9 mm
90 mm     80 mm      10 mm
100 mm    94 mm      6 mm
Average Error        5.55 mm

TABLE IX. COMPARISON DATA BETWEEN CAMERA AND ARM ROBOT FOR FORWARD AND BACKWARD MOTION

Position        Camera  Arm Robot  Formula
Forward (Max)   -130    30         y = -0.1685x + 7.6346
Middle          0       6.9
Backward (Min)  225     -30

Fig. 5, Fig. 6 and Fig. 7 show the result of hand tracking by Leap Motion, which successfully moves the arm robot to the same position. The position of the hand and the arm robot can be observed on the monitor, so we can compare both data together.
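The linear formula reported in Table IX can be recovered from the three camera/arm calibration points by an ordinary least-squares fit. The following sketch is our own illustration (the paper does not show its fitting procedure) and reproduces the reported coefficients:

```python
def least_squares_line(points):
    """Fit y = a*x + b through (x, y) pairs by ordinary least squares."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, _ in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Camera reading -> arm robot position pairs from Table IX
calibration = [(-130, 30.0), (0, 6.9), (225, -30.0)]
a, b = least_squares_line(calibration)
print(round(a, 4), round(b, 4))  # -0.1685 7.6346
```

Fitting the three (camera, arm) pairs yields a ≈ -0.1685 and b ≈ 7.6346, matching the formula y = -0.1685x + 7.6346 in the table.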

The data from Leap Motion could not be sent directly to the arm robot; therefore we also need the motion data from the arm robot. Tables VI to IX show the comparison between the Leap Motion data and the arm robot data. According to these data, we could interpolate the Leap Motion data before transferring them to the arm robot. For example, for the yaw motion we obtained that the motion of Leap Motion is the opposite of the motion of the arm robot, and similarly for the other motions.
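The paper sends the interpolated coordinates to the arm robot over TCP/IP but does not specify a wire format; the sketch below assumes a simple newline-terminated CSV frame (our own convention, as are the function names) and demonstrates it over a loopback connection standing in for the controller-to-arm LAN link:

```python
import socket
import threading

def encode_pose(pitch, yaw, roll):
    """Frame one pose sample as a newline-terminated CSV line
    (a hypothetical wire format; the paper does not define one)."""
    return f"{pitch:.2f},{yaw:.2f},{roll:.2f}\n".encode("ascii")

def decode_pose(line):
    """Parse a CSV pose line back into (pitch, yaw, roll) floats."""
    pitch, yaw, roll = (float(v) for v in line.decode("ascii").strip().split(","))
    return pitch, yaw, roll

def serve_once(server_sock, received):
    """Accept one connection and read a single pose line from it."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(conn.makefile("rb").readline())

# Loopback demo: the "arm robot" listens, the "controller" sends one pose
server = socket.create_server(("127.0.0.1", 0))
received = []
t = threading.Thread(target=serve_once, args=(server, received))
t.start()
with socket.create_connection(server.getsockname()) as client:
    client.sendall(encode_pose(73.0, -84.67, 89.33))
t.join()
server.close()
print(decode_pose(received[0]))  # (73.0, -84.67, 89.33)
```

A line-delimited text frame keeps the receiver simple: the arm-side controller can read one pose per line and apply the interpolation formulas of Tables VI to IX before commanding the joints.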

Fig. 5. Hand Tracking by Leap Motion

TABLE VI. COMPARISON DATA BETWEEN CAMERA AND ARM ROBOT FOR YAW MOTION

Target Angle  Camera  Arm Robot  Interpolation
0             -90     90         y = -x
90            0       0
180           90      -90

TABLE VII. COMPARISON DATA BETWEEN CAMERA AND ARM ROBOT FOR PITCH MOTION

Target Angle  Camera  Arm Robot  Interpolation
0             90      63         y = -0.0913x + 86.929
90            0       83
180           72      100

TABLE VIII. COMPARISON DATA BETWEEN CAMERA AND ARM ROBOT FOR ROLL MOTION

Target Angle  Camera  Arm Robot  Interpolation
0             100     90         y = -0.9x + 180
90            0       180
180           -100    270

Fig. 6. Comparison of Hand Tracking Position by Leap Motion and arm robot position
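The interpolation columns of Tables VI to VIII translate directly into mapping functions from the camera reading to the arm robot command. A minimal sketch (the function names are ours, not the paper's):

```python
def yaw_to_arm(camera_yaw):
    """Table VI: the arm's yaw is the opposite of the camera's (y = -x)."""
    return -camera_yaw

def pitch_to_arm(camera_pitch):
    """Table VII: y = -0.0913x + 86.929."""
    return -0.0913 * camera_pitch + 86.929

def roll_to_arm(camera_roll):
    """Table VIII: y = -0.9x + 180."""
    return -0.9 * camera_roll + 180

# Spot-check against the yaw and roll table rows
print(yaw_to_arm(-90))   # 90
print(roll_to_arm(100))  # approx. 90
print(roll_to_arm(-100)) # approx. 270
```

Each tracked sample is passed through the corresponding mapping before being sent to the manipulator, so the two coordinate systems agree.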
Fig. 7. Hand Tracking by Leap Motion
IV. CONCLUSION

Our implementation of arm robot control successfully controlled the arm robot, with average errors for the pitch, yaw, and roll hand motions of 7.28°, 4.70°, and 5.37°, respectively. The experimental results showed that our method could control the arm robot by tracking the hand motion using the stereo camera.

The accuracy and precision of the arm robot motion are still limited, especially when the hand moves at high speed; Leap Motion could not detect the motion in such conditions. This problem remains for future work.

ACKNOWLEDGMENT

This work was supported by Politeknik Negeri Batam in the Madya Research Scheme.

REFERENCES

[1] F. Weichert, D. Bachmann, B. Rudak and D. Fisseler, "Analysis of the accuracy and robustness of the leap motion controller", Sensors, 13(5) (2013) 6380-6393.
[2] V. Trinadh and S. Patel, "Real-time robot control using leap motion", ASEE Northeast Section Conference, (2015).
[3] C. Georgoulas, A. Raza, J. Guttler, T. Linner and T. Bock, "Home environment interaction via service robots and the leap motion", Proceedings of the 31st International Symposium on Automation and Robotics in Construction and Mining (ISARC), Sydney, Australia, (2014).
[4] N. Mehmood, F. Ijaz, Z. Murtaza and A. Shah, "Analysis of end-effector position and orientation for 2P-3R planar pneumatic robotic arm", Robotics and Emerging Allied Technologies in Engineering, (2014) 47-50.
[5] K. Gochev, V. Narayanan, B. Cohen and A. Likhachev, "Motion planning for robotic manipulators with independent wrist joints", IEEE Conference on Robotics and Automation, (2014) 461-468.
[6] Y. Sato, "Study of intelligent control of an arm robot equipped with CMOS camera using a new type of image processing", Proceedings of the 54th Meeting of the International Society for the Systems Sciences, (2010).
[7] A. Erol, G. Bebis, M. Nicolescu and R. Boyle, "Vision-based hand pose estimation", Computer Vision and Image Understanding, 108 (2007) 52-73.
