Haptic Interaction Between Human and Virtual iCub Robot
Using Novint Falcon with CHAI3D and MATLAB
Pierre Renon 2, Chenguang Yang 2, Hongbin Ma 1 and Rongxin Cui 3

1. Key Laboratory of Intelligent Control and Decision of Complex Systems, Beijing Institute of Technology, Beijing 100081, P.R. China. E-mail: [email protected]
2. School of Computing and Mathematics, University of Plymouth, Plymouth PL4 8AA, United Kingdom. E-mail: [email protected]
3. School of Marine Engineering, Northwestern Polytechnical University, Xi'an, P.R. China. E-mail: [email protected]

Abstract: This paper describes our preliminary investigation of haptic interaction between a human and the virtual iCub robot. With haptic interaction, the user is able to have a tactile sensation of the robot's environment, and hence to manipulate a robot remotely, for instance from the user's office to perform some tasks at home, or in a patient operation carried out by a surgeon several miles away. In this work, haptic interaction has been implemented using a 3D joystick, the Novint Falcon, connected to a virtual robot created by the iCub Simulator, which works with the YARP interface and simulates the real iCub robot developed by Italian scientists. To this end, two approaches, one based on CHAI3D and one based on MATLAB, are introduced and implemented in this paper, and some ideas to improve human-robot interaction performance are also discussed.

Key Words: haptic interaction, virtual robot, iCub Simulator, CHAI3D, YARP, Novint Falcon

1 Introduction

1.1 Background

Nowadays, the presence of more or less evolved robotic systems is common in our daily life. Since this field of study started in the last century, their evolution has been incomparable. In domains such as the automobile industry or mobile phone manufacturing, their use has become essential. Robotic systems have been designed with various devices, such as robotic arms, mostly used in industry, to help people carry out useful work. Robots are artificial systems with capacities that allow them to perceive and act in the surrounding environment. Because they are used in critical areas where human life can be endangered, a close interaction between humans and robots has to be defined. Humans and robots can perform tasks together, and their relation has become more important than basic remote control to accomplish a task. Human/robot interaction is a large domain, and one line of research is haptic interaction, which has attracted more and more interest in the research community.

1.2 Related work

The concept of haptic interaction is to feel the robot's environment through the sense of touch. Haptic interaction can lead to many applications in different domains, such as rehabilitation or body enhancement [1][2][3], tele-manipulation of devices, or the transfer of knowledge through robots [4]. Existing research already provides some results for possible applications of this sort of interaction.

Studies of this kind of interaction mostly use virtual environments. These provide a suitable setting for tests and studies, decreasing the dangers of direct human/robot interaction. This virtual interpretation of reality has become an important step in the process of integrating haptic applications into daily life.

Tactile perception has to be understood in order to create mathematical models of haptic interaction. The definition of shapes and textures and the definition and representation of surfaces are some of the properties required of the virtual objects in the model [5].
The sense of touch with a virtual surface involves a tangential pressure on that surface, while receiving in parallel feedback which varies according to the duration of the movement (pressure) and the surface properties. Nowadays, static simulation where the displayed scene is not complex gives good results. However, there are still limitations when the 3D environment is too complex and dynamic distortions have to be computed, such as in the interaction with a soft ball [6][7][8], where the computation frequency decreases, which degrades the realism of the simulation. Research is still being carried out to resolve this problem and to improve the haptic interactive simulations currently possible. Besides, previous research has seldom included a virtual representation of a robot, only objects and cursors.

1.3 Motivations

Therefore, experiments on a possible haptic interaction with a virtual robot, including the communication between joystick and virtual robot, its remote control, the configuration of the haptic environment, and the interaction itself, will provide a new way to study haptic interaction. This kind of experimentation could also improve the understanding of human intelligence and the robot combined together. Specifically, this work is motivated by the following issues:

1) What virtual robot is capable of conducting human-robot interaction, and how can a virtual robot be made to work?

2) What haptic device is suitable for interacting with the virtual robot, and how can it be connected to the virtual robot?

3) How should experiments be designed to understand the interactions between the user and the virtual robot?

2 Problem analysis

2.1 Tools selection

Regarding the haptic interaction between human and virtual robot, several things have to be taken into account. A haptic interaction needs to be realized between two devices, so this kind of interaction needs a specific tool to transfer force feedback to the user. Several haptic devices are available nowadays, providing more or less the same capacities.

The haptic device called the Falcon 3D joystick [9], designed by Novint Technologies Inc., provides a force feedback of up to about 1 kgf, allowing a good illusion of tactile rendering. Moreover, its use in many applications such as video games ensures realistic communication capacities and accurate force feedback, giving the illusion of tactile perception during a simulation. Compared with other commercial haptic devices like the Force Dimension Omega 3/6, the Novint Falcon is more affordable and offers a high performance-to-cost ratio.

Fig. 1: Novint Falcon 3D joystick (source: http://www.novint.com)

The other part of the interaction is the virtual robot. The iCub Simulator [10], a simulation of the real humanoid iCub [11] and part of the European project RobotCub, offers many advantages. First, the iCub Simulator is an open-source virtual robot, and it also provides an environment with which the simulated iCub can already interact. Moreover, its design follows the real version of the robot, so results obtained on the simulator could be directly tested on the real robot.

Fig. 2: iCub Simulator 3D interface (screen snapshot)

Therefore, using the 3D joystick, the user should be able to control the real iCub robot or its virtual counterpart, the iCub Simulator, with realistic force feedback sent to the user through the motors inside the joystick.
However, in the literature, to the best knowledge of the authors, there is no report on controlling the iCub robot with the Novint Falcon. In this work, we aim to resolve the problem of controlling a virtual iCub robot with the Novint Falcon haptic device by exploring two different approaches: one is based on CHAI3D, a powerful open-source haptics C/C++ library; the other is based on MATLAB with the aid of HAPTIK, an open-source haptics library/toolbox for use with MATLAB. The former approach has the advantages of real-time performance, fine portability, and extendability with C/C++ programming; however, it may suffer from the difficulties of learning the C/C++ language and of implementing complex scientific algorithms which involve many matrix/vector operations or specific mathematical procedures. The latter approach has the advantage of convenience for algorithmic research and integration with MATLAB; however, it is limited by the restrictions of MATLAB.

2.2 Communication architecture

The Novint Falcon communicates, through the computer, with a program based on libraries which analyzes its data and sends it to the iCub Simulator. The iCub Simulator's data can then be received in order to compute the forces sent to the joystick's motors, allowing tactile perception. Fig. 3 illustrates the working environment required for this communication. The libraries used and the development platform are discussed in detail later.

Fig. 3: Communication design

The Falcon joystick is commonly installed on Windows. The iCub Simulator is also available on Windows or on Linux, with source files downloadable. Note that the Falcon itself does not know anything about the iCub Simulator, hence the biggest problem in controlling the virtual iCub robot via the Falcon joystick is to resolve the communication between the joystick and the iCub Simulator. Fortunately, the iCub robot and simulator are built on top of an open-source framework, the so-called YARP (Yet Another Robot Platform), which allows data and command exchange among applications through the YARP application programming interface.

Once the communication is established, data received from the joystick have to be analyzed and translated in order to be understandable by the simulator. The data from the motors of the Falcon provide only a 3D position and button values. A loop then has to be written which allows moving the robot arm with the joystick in real time. Several parameters need to be taken into account, such as the fact that the Novint Falcon has its own coordinate system and so does the iCub robot, and the way of moving the arm so as to keep maximum fluidity and to allow the iCub robot to reach as large an area as possible. Some tests finally have to be done in order to match the human movement and the iCub movement correctly.

The communication channel from the joystick to the simulator also had to be extended to receive data from the simulator. To create the haptic interaction, specific data from the virtual robot are needed in order to understand its position and its behaviour in its environment. The application of force feedback through the motors then allows the creation of the haptic interaction. The final result has to be as realistic as possible.
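To make the data flow of Fig. 3 concrete, the sketch below outlines the real-time loop just described in C++. It is only an architectural sketch: every helper function (readFalconPosition, sendArmTarget, and so on) is a hypothetical placeholder for the CHAI3D and YARP calls detailed in Sections 4 and 5, the workspace offsets and scale factors are purely illustrative, and the spring-like force is only one simple choice of feedback model.

#include <array>

using Vec3 = std::array<double, 3>;

// Hypothetical placeholders for the CHAI3D / YARP calls detailed in Sections 4-5.
Vec3 readFalconPosition()         { return {0.0, 0.0, 0.0}; }  // joystick position (m)
void sendArmTarget(const Vec3&)   { /* position command sent over YARP */ }
bool handTouchesSurface()         { return false; }            // OR of the 12 hand sensor values
double penetrationDepth()         { return 0.0; }              // depth of contact (m)
void setFalconForce(const Vec3&)  { /* force applied by the Falcon motors */ }

// Offsets and scaling that map the Falcon workspace onto the iCub hand
// workspace (cf. Fig. 6); the numbers are purely illustrative.
Vec3 mapToIcubWorkspace(const Vec3& p)
{
    return { -0.30 + 1.5 * p[2],    // simulator x from joystick depth
             -0.10 + 1.5 * p[0],    // simulator y from joystick x
              0.20 + 1.5 * p[1] };  // simulator z from joystick y
}

int main()
{
    const double k = 200.0;                      // assumed contact stiffness (N/m)
    for (;;) {                                   // real-time loop of Fig. 3
        Vec3 target = mapToIcubWorkspace(readFalconPosition());
        sendArmTarget(target);                   // joystick -> simulator

        Vec3 force{0.0, 0.0, 0.0};
        if (handTouchesSurface())                // simulator -> joystick
            force[2] = k * penetrationDepth();   // simple penalty-based feedback
        setFalconForce(force);
    }
}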
3 Falcon and iCub structure

3.1 The Novint Falcon joystick

The Falcon device is a three-dimensional joystick using a USB interface, with commands sent from the computer by the firmware to provide perception. The motors used are three Mabuchi RS-555PH-15280, with motion monitored by a coaxial 4-state encoder with 320 lines per revolution. The reachable workspace provides a suitable working area in which the robot is able to move its arm. The Falcon device can be tested with the provided test software, as shown in Fig. 4.

Fig. 4: Testing Novint Falcon (screen snapshot)

3.2 The iCub Simulator

The iCub Simulator, mostly coded in C++, is a 3D representation of the real iCub robot allowing movement of the joints, a vision system, and pressure sensors on the hands. Its design, using data from the real robot, provides a virtual clone of the real robot. The real and virtual iCub robots have 53 degrees of freedom, of which 12 are for the legs, 3 for the torso, 32 for the arms, and 6 for the head.

The iCub Simulator uses several external open-source libraries which simplify the complex programming jobs involved in simulating the iCub robot. Here is an incomplete list of the libraries it depends on:

- ODE (Open Dynamics Engine): used to simulate the body and collision detection.
- ACE (Adaptive Communication Environment): used to implement high-performance and real-time communication services and applications across a range of OS platforms.
- GLUT (OpenGL Utility Toolkit): used to simplify programming with OpenGL 3D graphics rendering and the user interface.
- GTKMM: used to create user interfaces for GTK+ and GNOME.
- IPOpt (Interior Point Optimizer): used to solve optimization problems.
- OpenCV: used for real-time computer vision.
- QT3: generally used in the development of software with a user interface.
- SDL (Simple DirectMedia Layer): used to provide low-level access to audio, keyboard, mouse, joystick, 3D hardware via OpenGL, and the 2D video framebuffer.

In the virtual arm, including the hand, 16 degrees of freedom drive the left arm, which is the arm used during the experiments.

3.3 YARP configuration

The YARP software is contained in the simulator architecture and is also coded in C++. This open-source tool supports real-time applications and simplifies interfacing with devices. The architecture of the iCub Simulator is depicted in Fig. 5, which is borrowed from [13], an introduction to the iCub Simulator written by its developers. All applications based on YARP can communicate with one another, hence YARP plays a foundational role in the iCub system.

Fig. 5: Architecture of the iCub Simulator [13]

On Windows, YARP and the iCub Simulator were simply installed thanks to precompiled libraries containing all the packages needed to run the simulator effectively. YARP and iCub can also be successfully installed on Linux. The choice of operating system is not as important as it looks, due to the cross-platform nature of YARP, which makes it feasible to remotely control a virtual iCub robot installed on one computer (say A) from any YARP application running on another computer (say B), even if A and B run different operating systems. The small tests shipped with the iCub Simulator were successful, and everything could be executed to launch the iCub Simulator. We also successfully tested the iCub Simulator on Ubuntu Linux and andLinux, the latter being a native Linux environment working cooperatively with Windows.

When the Falcon device and the iCub Simulator are installed on the same operating system of one computer, no special configuration of YARP is needed. To make our studies more practical and useful, it would be desirable to install the Falcon device on one computer (say F) while running the virtual iCub robot on another, remote computer (say R). To make it possible to remotely control R via F, it is necessary to configure both F and R with the command-line command yarp conf <ip>, where <ip> denotes the IP address of computer R. Note that F and R should be able to connect to each other (use the ping command to test the connection), ideally on the same Ethernet network.
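Before any control code is written, it is worth verifying that the computer holding the Falcon can actually reach the simulator over YARP. The short C++ check below is a minimal sketch of such a test; it assumes the simulator is running with its default port names (here /icubSim/left_arm/state:o), which may differ in other configurations.

#include <yarp/os/Network.h>
#include <cstdio>

int main()
{
    yarp::os::Network yarp;                          // initialise the YARP network
    if (!yarp::os::Network::checkNetwork()) {        // is a name server reachable?
        std::printf("No YARP name server found; run 'yarp conf <ip>' first.\n");
        return 1;
    }
    // Assumed default port published by the iCub Simulator for the left arm.
    if (yarp::os::Network::exists("/icubSim/left_arm/state:o"))
        std::printf("iCub Simulator left arm is reachable.\n");
    else
        std::printf("iCub Simulator does not seem to be running.\n");
    return 0;
}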
4 Implementation with CHAI3D

4.1 The CHAI3D library

The haptic interaction needs to compute forces from the data received from the virtual environment before sending them to the motors. A set of open-source C++ library functions called CHAI3D [14] allows the use of several haptic devices, including the Novint Falcon, in the Windows (32-bit) version. Moreover, it offers haptic functions such as the computation of continuous forces sent to the device, simplifying the low-level part of the haptic interaction.

Finally, the CHAI3D library can be downloaded with examples which can be compiled and executed with Visual Studio 2010. These examples give a good idea of the capabilities of the Novint Falcon and of CHAI3D: they run smoothly and the haptic interaction is very realistic. Even though some bugs were noticed, such as button IDs changing during example execution, the CHAI3D library is excellent for starting haptic development thanks to its easy-to-use programming interface and easy-to-understand example code.

4.2 Communication

The communication had to be established between the CHAI3D library, which simplifies the use of the joystick and the haptic perception, and the iCub Simulator. Note that CHAI3D itself does not provide any functions to connect to and operate the iCub Simulator. To resolve this problem, with the help of YARP, a program based on both the CHAI3D library and the YARP library can send commands to the iCub Simulator. Visual Studio was the development environment for the experiments, and the code was written in C++ because it is the language used by the iCub Simulator and by the CHAI3D library.

The core of the code is a real-time loop receiving joystick positions and button values. Commands are sent to the iCub Simulator through YARP while the real-time loop receives data. The communication thus became possible, and the correct information had to be transferred.

4.3 iCub arm control

To remotely control the robot, the first required development was the arm control. Offsets were applied to define the workspace of the virtual arm. They provided a suitable workspace in which remote control of the iCub hand was possible. Then the mapping between the user's hand and the simulator's hand was chosen: like a mirror, the simulator hand reproduced the movements of the user's own hand. The Cartesian coordinate systems used by the iCub Simulator and the Novint Falcon joystick are shown in Fig. 6.

Fig. 6: Cartesian world of joystick and simulator

The control of the arm was performed by sending positions via a command through the YARP interface to the iCub Simulator. The simulator then moved its joints according to its kinematics in order to reach the desired position.
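A minimal C++ sketch of this arm control loop is given below. The device calls follow the CHAI3D 2.x examples shipped with the library (names differ in later versions), the YARP remote_controlboard options mirror those used later in Listing 2, and falconToJointAngles is a hypothetical placeholder for the offset and kinematic mapping discussed above, with purely illustrative numbers.

#include "chai3d.h"
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <vector>

// Hypothetical mapping from the Falcon position to the left-arm joint angles
// (degrees) expected by the simulator; gains and offsets are illustrative only.
std::vector<double> falconToJointAngles(const cVector3d& p, int nAxes)
{
    std::vector<double> q(nAxes, 0.0);
    if (nAxes < 4) return q;
    q[0] = -45.0 + 100.0 * p.y;   // shoulder pitch
    q[1] =  40.0 + 100.0 * p.x;   // shoulder roll
    q[3] =  90.0 - 100.0 * p.z;   // elbow
    return q;
}

int main()
{
    // 1. Open the haptic device through CHAI3D.
    cHapticDeviceHandler handler;
    cGenericHapticDevice* falcon = nullptr;
    handler.getDevice(falcon, 0);
    if (falcon == nullptr) return 1;
    falcon->open();

    // 2. Open the simulator's left-arm motor interface through YARP.
    yarp::os::Network yarp;
    yarp::os::Property options;
    options.put("device", "remote_controlboard");
    options.put("remote", "/icubSim/left_arm");
    options.put("local",  "/falcon/left_arm");
    yarp::dev::PolyDriver driver(options);
    yarp::dev::IPositionControl* ipos = nullptr;
    if (!driver.isValid() || !driver.view(ipos)) return 1;

    int nAxes = 0;
    ipos->getAxes(&nAxes);

    // 3. Real-time loop: the simulator hand mirrors the user's hand.
    for (;;) {
        cVector3d p;
        falcon->getPosition(p);                       // joystick position (m)
        std::vector<double> q = falconToJointAngles(p, nAxes);
        ipos->positionMove(q.data());                 // joint targets in degrees
    }
}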
Fig. 7: The configured simulator interface in our simulations (screen snapshot)

4.4 Simulations implementation

The first idea was to use a function called magnet, which allows the robot to glue a shape to the hand or, more precisely, to keep a constant distance between the simulator hand and the shape. This function can be used if the iCub hand is not in running mode, which means that the hand sensors are off. This solution allowed the application of a small force feedback when the object was grabbed. To be a bit more realistic, the distance between the shape and the simulator hand should be small, so that the object appears to be grasped.

Fig. 8: Representation of the magnet simulation (screen snapshot)

A second solution was tried with the sensors. Using the hand sensors, which provide 12 boolean values indicating when the hand touches something, the robot had to detect a table placed in front of it. Then a reaction force, equal and opposite to the force applied by the human on the table, would be fed back through the joystick so that the table could be felt.

Fig. 9: Simulation using hand sensor values (screen snapshot)

4.5 Reducing time delay in arm control

With the approaches mentioned above, the virtual iCub robot can be successfully controlled by the Novint Falcon joystick, yet with some time delay if the joystick moves fast. This time delay may be caused by hardware issues or software issues: the former may be restricted by the performance of the computer or of the Novint Falcon device, while the latter may include implementation details of our program or design defects of CHAI3D.

Setting aside real-time interaction, the last solution tried was to program buffers which save the robot hand positions, so that the simulator can reach the points at its own pace. A better accuracy of movement, with fewer software glitches, was therefore expected. To realize this solution, two threads were implemented: one running the real-time loop that receives the Novint Falcon joystick data, and the other sending the commands to the simulator via YARP. Tests were carried out, and movements performed with the Novint Falcon joystick were then executed by the iCub Simulator at its own speed.
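One way to organise the two threads described above is sketched below: the haptic thread samples the Falcon and pushes positions into a shared buffer, while the command thread drains the buffer and sends targets to the simulator at its own pace. readFalconPosition and sendArmTarget are again hypothetical placeholders for the CHAI3D and YARP calls; the buffer, timing values, and stop condition are illustrative.

#include <array>
#include <atomic>
#include <chrono>
#include <mutex>
#include <queue>
#include <thread>

using Vec3 = std::array<double, 3>;

Vec3 readFalconPosition()        { return {0.0, 0.0, 0.0}; }  // placeholder: CHAI3D read
void sendArmTarget(const Vec3&)  { /* placeholder: YARP position command */ }

std::queue<Vec3> buffer;          // saved hand positions
std::mutex bufferMutex;
std::atomic<bool> running{true};

void hapticThread()               // real-time loop: never blocks on YARP
{
    while (running) {
        Vec3 p = readFalconPosition();
        {
            std::lock_guard<std::mutex> lock(bufferMutex);
            buffer.push(p);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(1));  // ~1 kHz sampling
    }
}

void commandThread()              // drains the buffer at the simulator's pace
{
    while (running) {
        Vec3 target;
        bool haveTarget = false;
        {
            std::lock_guard<std::mutex> lock(bufferMutex);
            if (!buffer.empty()) {
                target = buffer.front();
                buffer.pop();
                haveTarget = true;
            }
        }
        if (haveTarget)
            sendArmTarget(target);   // potentially slow YARP call, outside the lock
        else
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
    }
}

int main()
{
    std::thread t1(hapticThread), t2(commandThread);
    std::this_thread::sleep_for(std::chrono::seconds(100));  // experiment duration
    running = false;
    t1.join();
    t2.join();
}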
5 Implementation with MATLAB

Besides the CHAI3D solution, we also tried another solution based on MATLAB. To enable the use of haptic devices in MATLAB, we rely on an open-source library, HAPTIK, which also provides MATLAB bindings for scripting.

5.1 The HAPTIK library

The HAPTIK library is an open-source lightweight library with a component-based architecture that acts as a hardware abstraction layer to provide consistent access to haptic devices. HAPTIK was developed with the aim of providing easy but powerful low-level access to devices from different vendors [15]. Prior to HAPTIK, when focusing on hardware access, existing libraries presented some drawbacks and failed to satisfy all of the following requirements:

- Available devices are enumerated by the library, and applications can easily let the user choose one of them.
- Loading of hardware-specific plugins is performed transparently at runtime, allowing the same executable to run on systems with different hardware configurations, different driver versions, or even without any device.
- The fallback loading scheme allows the same executable to use the most recent hardware and driver versions.

These requirements motivated the development of HAPTIK, which overcomes such limitations and achieves many advantages for end users as well as developers.

HAPTIK has a very simple API (Application Programming Interface); a small amount of code is enough to start using any haptic device. Device operations such as enumeration, automatic default selection, device information querying, and auto-recalibration are already built in and ready to use. HAPTIK is developed in the C++ programming language, but it can be used in MATLAB and Simulink since it provides MATLAB bindings. Unlike CHAI3D, HAPTIK does not contain graphic primitives, physics-related algorithms, or complex class hierarchies. Instead, it exposes a set of interfaces that hide the differences between devices, thus making applications device-independent. The library is built on a highly flexible infrastructure of dynamically loadable plugins.

With the HAPTIK library, it is convenient to obtain the 3D position of the Novint Falcon joystick via the MATLAB code shown in Listing 1.

Listing 1: MATLAB code for reading the position of the Falcon joystick

function XYZ = callFalcon(a)
    global h;
    h = haptikdevice;
    pos = read_position(h);
    XYZ = pos;
end

5.2 YARP for MATLAB

To interact with the iCub Simulator via MATLAB, it is necessary to call YARP within MATLAB. Fortunately, YARP also provides a MATLAB binding through Java with the help of SWIG (Simplified Wrapper and Interface Generator), an open-source software tool used to connect programs or libraries written in C or C++ with scripting languages such as Python and other languages like Java. According to the instructions given at http://wiki.icub.org/yarpdoc/yarp_swig.html#yarp_swig_matlab, after the steps of installing SWIG, compiling the YARP binding (a dynamic loading library jyarp.dll), compiling the Java classes, and configuring the MATLAB paths, it is ready to call YARP in MATLAB by running the LoadYarp command. The code shown in Listing 2 loads YARP in MATLAB, initializes the iCub Simulator, and sets the positions of the left arm through the YARP interface.

Listing 2: MATLAB code for using YARP to initialize the iCub Simulator

LoadYarp;  % imports YARP and connects to the YARP network
options = yarp.Property;
options.put('device', 'remote_controlboard');
options.put('remote', '/icubSim/left_arm');
options.put('local', '/matlab/left');
robotDeviceLeft = yarp.PolyDriver(options);
if isequal(robotDeviceLeft.isValid, 1)
    disp('[success] robot available');
else
    disp('[warning] robot NOT available, does it exist?');
end
posLeft = robotDeviceLeft.viewIPositionControl;
encLeft = robotDeviceLeft.viewIEncoders();
nDOF = encLeft.getAxes();
theta = [45 40 60 90 45 65 10 45 10 0 0 0 0 0 0 0]*180/pi;
home = yarp.DVector(nDOF);
encodersLeft = yarp.DVector(nDOF);
for k = 0:15
    home.set(k, theta(k+1));
end
posLeft.positionMove(home);

5.3 Robotics Toolbox

To control the arms of the virtual iCub robot from MATLAB, we also use another open-source toolbox, the Robotics Toolbox [16], developed by Peter Corke. This toolbox provides many functions that are useful for the study and simulation of classical arm-type robotics, for example kinematics, dynamics, and trajectory generation. The Toolbox is based on a very general method of representing the kinematics and dynamics of serial-link manipulators.
With the Robotics Toolbox, we have created the iCub left arm and right arm as manipulators. For easy implementation of control algorithms on the virtual iCub robot, a Simulink model, icubSIM.mdl, was created for simulations of the iCub Simulator through the haptic device. Listing 3 shows some of the code that creates the iCub arms with the Robotics Toolbox.

Listing 3: Creating robot arms with the Robotics Toolbox

% Create iCub arms
%               Theta   d        a       alpha  sigma
Left(1) = Link([0       0.10774  0       pi/2   0], 'standard');
Left(2) = Link([0       0        0       pi/2   0], 'standard');
Left(3) = Link([0       0.15228  0       pi/2   0], 'standard');
Left(4) = Link([0       0        0.015   pi/2   0], 'standard');
Left(5) = Link([0       0.1373   0       pi/2   0], 'standard');
Left(6) = Link([0       0        0       pi/2   0], 'standard');
Left(7) = Link([0       0.016    0.0625  0      0], 'standard');

% ml, Il and rl hold the link masses, inertias and centres of mass, defined elsewhere.
for i = 1:7
    Left(i).m = ml(i);
    Left(i).I = Il(:,:,i);
    Left(i).G = 1;
    Left(i).r = rl(:,:,i);
    Left(i).Jm = 0e-6;
end

Left(1).qlim = [-0.38 1.47];
Left(2).qlim = [-0.68 0.68];
Left(3).qlim = [-1.03 1.03];
Left(4).qlim = [-1.66 0.09];
Left(5).qlim = [0.00 2.81];
Left(6).qlim = [-0.65 1.75];
Left(7).qlim = [0.10 1.85];

robotArm1 = SerialLink(Left, 'name', 'Left', 'base', ...
    transl(0.0170686, 0.0060472, 0.1753)*trotx(pi/2)*troty(15*pi/180));

5.4 Simulations

With the HAPTIK toolbox and the Robotics Toolbox installed and YARP loaded in MATLAB, we can use the Novint Falcon device to control the virtual iCub robot. The MATLAB code shown in Listing 4 outlines the main loop of the simulations, where the functions pos2ang and yarp_set_pos should be further implemented according to the forward and inverse kinematics of the robot arm model. Function pos2ang converts the end-point position to link angles, and function yarp_set_pos sets the position of the virtual iCub robot arm through the YARP interface. In this main loop, robotArm1 was created with the Robotics Toolbox, and robotDeviceLeft was initialized by the code in Listing 2.

Listing 4: Main loop of the simulation

mdl_icubsim;                         % load the iCub robot model
h = haptikdevice;                    % get a handle to the haptic joystick
pos = read_position(h);              % read the haptic position
tic;                                 % start the timer
while toc < 100                      % run this loop for 100 seconds
    pos = read_position(h);          % read the haptic position
    qpos = pos2ang(pos, robotArm1);  % convert the 3D position to link angles
    robotArm1.plot(qpos);            % plot the position read
    drawnow();                       % refresh the plot now
    yarp_set_pos(robotDeviceLeft, pos);  % set the position of the left arm via YARP
end;
close(h);                            % close the haptic device
clear h                              % clear the variable

6 Conclusion

In the experiments of this work, communication and arm control via the Novint Falcon joystick using the YARP and CHAI3D libraries have been demonstrated. Even if the software solutions implemented need to be improved further, they allowed a better understanding of haptic interaction with a virtual robot and highlighted relevant problems which could be solved in future work.
Haptic interaction is an interesting research direction for combining human intelligence and robots, and the link between the Novint Falcon 3D joystick and the virtual iCub robot illustrates one possible line of research. Based on the techniques presented in this paper, more work can be done to introduce human intelligence by programming the adaptation law of the virtual robot, for the purpose of integrating the haptic device and the remote robot.

Acknowledgments

This work was supported in part by the European Commission funded Marie Curie International Incoming Fellowship H2R Project (FP7-PEOPLE-2010-IIF-275078), the National Natural Science Foundation of China (NSFC) under Grants 61004059 and 61203074, NSFC-RS Joint Project grants (61211130359 and IE111175), the Program for New Century Excellent Talents in University (NCET-09-0045), the Beijing Outstanding Talents Programme (Type D), and the Graduate Teaching/Innovation Funding of Beijing Institute of Technology.

References

[1] Tavakoli, M., Patel, R. V. and Moallem, M. (2005), Haptic interaction in robot-assisted endoscopic surgery: a sensorized end-effector, January 15.
[2] Wang, D., Li, J. and Li, C. (2009), An Adaptive Haptic Interaction Architecture for Knee Rehabilitation Robot, International Conference on Mechatronics and Automation, August 9-12.
[3] Gupta, A. and O'Malley, M. K. (2006), Design of a Haptic Arm Exoskeleton for Training and Rehabilitation, IEEE/ASME Transactions on Mechatronics, Vol. 11, No. 3.
[4] Medina, J. R., Lawitzky, M., Mortl, A., Lee, D. and Hirche, S. (2011), An Experience-Driven Robotic Assistant Acquiring Human Knowledge to Improve Haptic Cooperation, Institute of Automatic Control Engineering.
[5] Salisbury, K., Brock, D., Massie, T., Swarup, N. and Zilles, C. (1995), Haptic Rendering: Programming Touch Interaction with Virtual Objects, Proceedings of the ACM Symposium on Interactive 3D Graphics, Monterey, California, April 9-12, 1995.
[6] Adams, R. and Hannaford, B. (1999), Stable Haptic Interaction with Virtual Environments, IEEE Transactions on Robotics and Automation, Vol. 15, No. 3, June.
[7] Ruspini, D., Kolarov, K. and Khatib, O. (1997), Haptic Interaction in Virtual Environments, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'97), Grenoble, France, September.
[8] Mendoza, C. and Laugier, C. (2001), Realistic Haptic Rendering for Highly Deformable Virtual Objects, Proceedings of the IEEE Virtual Reality Conference (VR 2001), Yokohama, Japan, March.
[9] Novint Web Site (2012), company specialized in 3D haptic devices, http://www.novint.com/index.php
[10] iCub Simulator Web Site (2012), wiki for the iCub Simulator specifications, from installation to use, http://www.eris.liralab.it/wiki/
[11] iCub Robot Web Site (2012), official web site of the iCub projects around Europe, http://www.iCub.org/
[12] Martin, S. and Hillier, N. (2009), Characterisation of the Novint Falcon Haptic Device for Application as a Robot Manipulator, Australasian Conference on Robotics and Automation (ACRA), December 2-4, Sydney.
[13] Tikhanoff, V., Cangelosi, A., Fitzpatrick, P., Metta, G., Natale, L. and Nori, F. (2008), An Open-Source Simulator for Cognitive Robotics Research: The Prototype of the iCub Humanoid Robot Simulator.
[14] CHAI3D Web Site (2012), set of open-source libraries providing an environment for haptic real-time interactive simulations, http://www.chai3d.org/documentation.html
[15] de Pascale, M., de Pascale, G., Prattichizzo, D. and Barbagli, F. (2004), The HAPTIK library, a component based architecture for haptic devices access, in Proc. EuroHaptics 2004, Munich, Germany, June 2004.
[16] Corke, P. (2012), Robotics Toolbox, http://petercorke.com/Robotics Toolbox.html
[17] SWIG Web Site (2012), official web site of SWIG, http://swig.org/