2004 IFToMM: Dataglove Based Grasp Planning for Multi-Fingered Robot Hand
Yuru Zhang et al.
Abstract: We developed a dataglove-based grasp planning system for grasp simulation, visualization, and analysis of motion mapping from the human hand to a robot hand. Mapping in the joint coordinate system during preliminary grasp is achieved. The system can effectively analyze motion mapping from human hand to robot hand in a virtual environment. It is also useful for building virtual prototypes of different hand designs and determining how a design affects grasp ability. This paper describes the working principle, the hardware and software architecture, and the grasp strategy. Furthermore, we developed a classifier for human hand grasps based on grasp features represented by a Gaussian Mixture Model (GMM). A taxonomy of robot hand grasps is obtained through joint coordinate mapping from the human hand to the robot hand. Our aim is to control robotic finger motions through human grasp experience.

Keywords: Grasp planning, Motion mapping, Gaussian mixture model

1 Introduction
In the field of robot hand programming, grasp planning involves determining the hand placement relative to the object of interest as well as the posture of the hand assumed in grasping the object. It is computationally intensive due to both the large search space and the incorporation of geometric and task-oriented constraints. Globally optimal grasp planning using comprehensive or exhaustive search is not feasible, especially for a robot hand with a high number of joints. Telemanipulation entails tracking human finger motions to control robotic finger motions, with the intention of providing human-like versatility and complexity in robot hand performance. However, teleoperation is less suitable for controlling robots performing repetitive tasks, as it requires the full-time attention of a human operator.

Napier [1] performed the seminal work in analyzing the grasping movements of the human hand. More recently, techniques using human motion measurements as teaching data for robots have been proposed in teaching-by-showing methods [2-3]. Task programming based on the observation of human operation is viable for humanoid robots because it is not necessary to describe the details of motions and forces explicitly for the robot to accomplish the task. Direct teaching presents one difficulty: the operator is under stress because any mistake is immediately reflected in the robot motions, which may result in a fatal accident. Robot teaching in a VR environment can overcome this problem. Takahashi et al. [4] presented a robot teaching method using a dataglove to measure finger motion of the human in a VR environment. Kaiser et al. [5] analyzed human skill. For a dexterous hand, it is desirable to control it by tracking compatible motions of the human hand. The operator's movements in the VR space are interpreted as task-level operations. Takahashi and Ogata investigated the transfer of gross human motion in the VR environment to robot motion [6].

Our aim is to develop a friendly task-teaching system for multi-fingered robotic hands with the ability to learn human motion. The system will provide solutions for mapping human skill to robot hands and thus enhance the usage and productivity of robots. This paper presents a teaching method for grasp planning of a multi-fingered robot hand based on a dataglove. The position and orientation of the robot hand are determined online by the operator. In the system, grasp planning is made more tractable by using cues extracted from human execution of grasps. We use pattern recognition technology to design a classifier that analyzes human hand grasps and realizes a taxonomy of grasp. This research provides a natural and intuitive means for a human to control a robot hand by wearing an instrumented glove. Data measured during the human execution of a task produce commands for the robot system to replicate the observed task, first in the VR environment and then in the physical environment.

The remainder of this paper is laid out as follows. Section 2 describes the system implementation. The grasping simulator is presented in section 3, grasp taxonomy based on GMM is shown in section 4, and experiments and results are shown in section 5. Finally, section 6 summarizes discussion and future work.

2 System Implementation
Figures 1 and 2 show the hardware and software architectures of the grasp planning system.
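As a concrete illustration of the data-collection and pre-processing stage of the software architecture, the sketch below converts raw glove sensor readings to joint angles and smooths them over a short window. The linear calibration constants, sensor resolution, and window size are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Hypothetical linear calibration for raw 8-bit glove sensor readings.
# Real glove calibration also uses per-sensor gains and offsets; the
# values below are illustrative only.
NUM_SENSORS = 18

def calibrate(raw, gains, offsets):
    """Convert raw sensor counts to joint angles in radians."""
    raw = np.asarray(raw, dtype=float)
    return gains * (raw - offsets)

def preprocess(samples):
    """Average a short window of calibrated samples to suppress jitter."""
    return np.mean(samples, axis=0)

gains = np.full(NUM_SENSORS, 0.01)      # rad per count (assumed)
offsets = np.full(NUM_SENSORS, 128.0)   # neutral-pose reading (assumed)

window = [calibrate(np.random.randint(0, 256, NUM_SENSORS), gains, offsets)
          for _ in range(5)]
features = preprocess(window)
print(features.shape)  # (18,)
```

The smoothed 18-dimensional vector is what the later classification stage would consume as one observation.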
[Figure: software architecture and GMM classifier. Glove data collected via the CyberTouch user interface are pre-processed; training data drive GMM parameter estimation of the basic grasp models; classification computes the probability P(X | λ_V) of an observation X under each grasp model λ_V and outputs the most likely grasp.]
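The classification step (computing P(X | λ) for each grasp model and selecting the most likely) can be sketched as follows. The diagonal-covariance form, the class labels, and the toy model parameters are assumptions for illustration; the paper's actual models are trained from glove data.

```python
import numpy as np

def log_gmm_likelihood(x, weights, means, variances):
    """Log P(x | lambda) for a diagonal-covariance Gaussian mixture."""
    x = np.asarray(x, dtype=float)
    # Per-component log density, summed over feature dimensions.
    log_comp = (np.log(weights)
                - 0.5 * np.sum(np.log(2 * np.pi * variances), axis=1)
                - 0.5 * np.sum((x - means) ** 2 / variances, axis=1))
    m = np.max(log_comp)
    return m + np.log(np.sum(np.exp(log_comp - m)))  # log-sum-exp

def classify(x, models, labels):
    """Pick the grasp model lambda_i maximizing P(x | lambda_i)."""
    scores = [log_gmm_likelihood(x, *model) for model in models]
    return labels[int(np.argmax(scores))]

rng = np.random.default_rng(0)
dim, mixtures = 15, 5                     # 15-D features, 5 mixtures (as in Sec. 5)
labels = ["hook", "power", "cylindrical", "lateral", "chuck"]
models = []
for i in range(len(labels)):
    weights = np.full(mixtures, 1.0 / mixtures)
    means = rng.normal(i, 0.1, size=(mixtures, dim))  # toy class-specific means
    variances = np.full((mixtures, dim), 0.5)
    models.append((weights, means, variances))

x = rng.normal(2, 0.1, size=dim)          # synthetic sample near class index 2
print(classify(x, models, labels))        # "cylindrical"
```

The log-sum-exp form keeps the likelihood computation numerically stable when component densities are very small.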
5 Experimentation and Results
Feature extraction focuses on the changes of joint angles. The grasp gesture input device is a right-handed CyberGlove with 18 sensors, sampled over a 38400-baud serial link. Via the CyberGlove we obtain readings from all 18 sensors at each sample time. Since the BH3 hand has a total of 9 degrees of freedom, we use the 9 sensor readings from the thumb, index, and middle fingers of the human hand to map human hand motion to the BH3 hand. The data used are from the following sensors: thumb rotation, thumb TMJ (trapeziometacarpal joint), thumb IJ (outer thumb joint), thumb abduction (angle between thumb and index finger), index PIJ (joint second from the fingertip), index DIJ (joint closest to the fingertip), middle PIJ, middle DIJ, and middle-index abduction (angle between middle and index fingers) [15], as shown in Figure 8. Five types of grasp are recognized: hook grip, power grasp, cylindrical grasp, lateral pinch, and chuck grip. We define corresponding grasps of the BH3 that are posed like the human hand during grasping; the hook grips of the BH3 and of the human hand are shown in Figures 9 and 10, respectively. For each grasp type, 30 observations are obtained and stored in a single hand grasp database, of which 20 are training samples and the remaining are testing samples. The GMM parameters are initialized by a modified K-means algorithm, a general method for vector quantization. The number of GMM mixtures is 5, the training samples are 15-dimensional vectors, and there are 5 grasp types. According to the results, the GMM classifier can classify the five types of grasp with an accuracy of 88%.

Figure 8 Anatomical definitions of the human hand
Since the robot hand is very likely to be kinematically dissimilar to the human hand, controlling its degrees of freedom is not very intuitive, and mapping human hand motion to robot hand motion requires further research.
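A minimal sketch of the joint-to-joint mapping described above, selecting the 9 relevant glove sensors for the 9 BH3 degrees of freedom. The sensor ordering and the BH3 joint limits below are hypothetical; the paper does not specify them.

```python
import numpy as np

# Indices (assumed) of the 9 CyberGlove sensors used for the BH3 hand:
# thumb rotation, thumb TMJ, thumb IJ, thumb abduction,
# index PIJ, index DIJ, middle PIJ, middle DIJ, middle-index abduction.
BH3_SENSOR_IDX = [0, 1, 2, 3, 5, 6, 9, 10, 11]   # hypothetical glove layout

# Hypothetical BH3 joint limits in radians (lower, upper) for each mapped joint.
BH3_LIMITS = np.array([[-0.5, 1.5]] * 9)

def map_to_bh3(glove_angles):
    """Select the 9 relevant human joint angles and clamp to robot limits.

    A direct joint-to-joint mapping; per-joint scale factors would be
    needed for a kinematically dissimilar hand, but are omitted here.
    """
    q = np.asarray(glove_angles, dtype=float)[BH3_SENSOR_IDX]
    return np.clip(q, BH3_LIMITS[:, 0], BH3_LIMITS[:, 1])

glove = np.linspace(-1.0, 2.0, 18)   # fake calibrated 18-sensor frame
q_bh3 = map_to_bh3(glove)
print(q_bh3.shape)  # (9,)
```

Clamping to joint limits is one simple way to keep a direct mapping safe when the human pose exceeds the robot's range of motion.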
Figure 9 Hook grip of the BH3
Figure 10 Hook grip of the human hand

6 Discussion and Future Works
Although the system is still in an initial phase, we believe it is a major step toward implementing both human telemanipulation and autonomous manipulation. The work presented here develops a procedure intended to further this research. It reduces the complexity of planning grasp tasks by taking advantage of human intelligence, learning, and experience, combined with the strength, endurance, and speed of the robot hand.

In summary, we feel the system has the following advantages:
(1) It provides a novel visualization environment for the user to understand pre-grasps of the human hand and the robot hand at the same time.
(2) It provides a grasping taxonomy for the robot hand based on GMM.
(3) As an integrated graphical simulation platform, the grasp simulator is a virtual environment with low cost and high quality. In addition, it has the flexibility to integrate with many PC-based applications.
(4) By employing the grasp database, we can plan robot grasps online or offline and replicate the same grasp exactly.
(5) The simulator can be used to find the most effective motion mapping method for preliminary grasp planning by comparing human hand motion and robot hand motion in the virtual environment under different mapping methods.

Acknowledgements
This research is sponsored by the Natural Science Foundation of China (Grant No. 59985001) and the Doctoral Grant of the Education Ministry of China (Grant No. 2000000605).

References
1. Napier, J. R., The prehensile movements of the human hand, J. of Bone and Joint Surgery, 38(4): 902-913, 1956
2. Kuniyoshi, T., Inaba, M., Inoue, H., Teaching by showing: generating robot programs by visual observation of human performance, In: Proc. 20th Int. Symp. Industrial Robots, 119-126, 1989
3. Kang, S. B., Ikeuchi, K., Toward automatic robot instruction from perception: temporal segmentation of tasks from human hand motion, IEEE Trans. on Robotics and Automation, 11(5): 670-681, 1995
4. Takahashi, T., Ogata, H., Robotic assembly operation based on task-level teaching in virtual reality, In: Proc. of ICRA, Nice, France, 1083-1088, 1992
5. Kaiser, M., Dillmann, R., Building elementary robot skills from human demonstration, In: Proc. of ICRA, Minneapolis, MN, USA, 2700-2705, 1996
6. Takahashi, T., Ogata, H., Muto, S., A method for analyzing human assembly operations for use in automatically generating robot commands, In: Proc. of ICRA, Atlanta, GA, USA, 695-700, 1993
7. Kang, S. B., Ikeuchi, K., Toward automatic robot instruction from perception: mapping human grasps to manipulator grasps, IEEE Trans. on Robotics and Automation, 13(1): 81-95, 1997
8. Griffin, W. B., et al., Calibration and mapping of a human hand for dexterous telemanipulation, In: Proc. 2000 ASME IMECE DSC-Symposium on Haptic Interfaces, 1-8, 2000
9. Kawasaki, H., et al., Virtual teaching based on hand manipulability for multi-fingered robots, In: Proc. of ICRA, Seoul, Korea, 1388-1393, 2001
10. Wren, D., Fisher, R. B., Dexterous hand grasping strategies using preshapes and digit trajectories, In: Proc. of ICSMC, 910-915, 1995
11. Cutkosky, M. R., On grasp choice, grasp models, and the design of hands for manufacturing tasks, IEEE Trans. on Robotics and Automation, 5(3): 269-279, 1989
12. Reynolds, D. A., Speaker identification and verification using Gaussian mixture speaker models, Speech Communication, 17: 91-108, 1995
13. Jiyong, M., Wen, G., The supervised Gaussian mixture model, J. of Comput. Sci. & Technol., 13(5): 471-474, 1998
14. Jiangqin, W., et al., A hierarchical DGMM recognizer for Chinese sign language recognition, J. of Software, 11(11): 1430-1439, 2000
15. CyberGlove User's Manual, Virtual Technologies Inc., 1997