
The PyBullet module-based approach to control the collaborative YuMi robot

Roman Michalík, Aleš Janota
Department of Control & Information Systems, FEEIT, University of Žilina, Žilina, Slovakia
[email protected], https://orcid.org/0000-0003-2132-1295

Abstract—Collaborative robots, which are able to share a workspace with humans, have opened a new dimension of automation across multiple industries. The paper deals with the problem of selecting appropriate software for simulating robot control, creating a model for these simulations, and implementing them. The authors briefly analyze and discuss the problem and its potential solutions, and present the applied approach in the form of a Python program using the PyBullet module.

Keywords—artificial intelligence, robots, robot programming, simulation, modeling

I. INTRODUCTION

The automation of a wide variety of work tasks is an essential aspect of the factories of the future. However, many tasks are too complex or too specific to automate, so it will still be inevitable to create collaborative environments shared by humans and robots. In that case, humans and robots will have to work side by side, sharing the same work tasks. Collaborative robots (or cobots) appear to be a key and essential prerequisite for this concept, requiring safe interaction of both kinds of entities. The technical solution must guarantee the safety of persons moving near the robots. To make human-machine collaboration fruitful and effective, we need to replace physical fence-based protection with more advanced and sophisticated approaches. Limiting the robot's strength, range and speed of motion, or limiting other physical parameters of the robot, is one such measure, largely at the expense of the efficiency of the operations performed. A more promising way is to equip the robot with the ability to monitor events happening around it, including people and their activities. Thanks to that, the robot can change its behavior to eliminate or reduce the risk of dangerous collisions with humans and/or other machines to tolerable levels. Safety means freedom from injury; however, non-injuring but subjectively painful contacts are not forbidden. The safety of human-machine collaboration has been a significant topic of many research projects: e.g. [1] presents the results of two experiments measuring the level of workers' confidence in safety measures developed and implemented in the EU when working with robots; [2] discusses an approach to safety analysis of safety-related systems (with emphasis given to common-cause failures). Many studies have been based on practical experiments with the single- or dual-arm ABB YuMi robot (YuMi stands for "You and Me"). The authors of [3] describe an experiment of playing a game (a card magic trick) between a human and an untrained robot; [4] discusses the safety aspect in the context of current standards, concerning heavy manufacturing tasks. No less important than safety aspects are security issues, especially with regard to the interconnected digital world: e.g. [5] presents the results of an experimental security analysis of an industrial robot controller, and [6] analyses several cyber-physical threats to improve the safety and security of robotic platforms.

The programming of industrial collaborative robots is a complex task and requires a high level of expertise. Therefore, there is an effort to make this task easier and more accessible for people with no or limited programming skills: e.g. [7] presents a concept for a highly usable interactive device aimed at programming the workflow of the YuMi robot, together with a literature review and a case study. Unlike [7], which deals mostly with the industrial view, [8] explores the collaborative robot market and the multidisciplinary characteristics of robot design. The currently available support tools and programming paradigms do not yet allow full exploitation of the robot's properties, and the programmer/operator still needs a certain level of knowledge about the internal features of the robot system; [9] indicates this when programming the YuMi robot with standard equipment, without any external sensors. On the contrary, [10] addresses the subject of using sensory information in real time to control robots. This approach supports the key idea: instead of building the factory around the robots, the robots should collaborate with humans in their human environment. The importance of suitable development environments for the modelling, programming and simulation of robots is constantly increasing, since a robot is typically a complex of joints, links, actuators, sensors, a controller and other structural parts [11]. The chapter [12] emphasizes the application of robots with respect to Industry 4.0 and highlights the essential role of robot simulation. Simulation brings many advantages and key benefits – it provides an unlimited amount of training time and allows collecting huge amounts of data, making the process faster, cheaper and simpler than in real systems. Combining data from multiple robotic domains can provide rich and diverse training data for transfer learning algorithms; e.g. [13] presents DESK (Dexterous Surgical SKills), a set of surgical robotic skills collected using three robotic platforms, including the YuMi robot. Generally, there is no problem when robots work in precisely defined and stable spaces. However, the ability of the robot to adapt to different variations, e.g. uncertainty or small differences in the positions of working objects, is more valuable [14]. One of the complex tasks is controlling the robot's orientation. In many cases, it is assumed that the robot configuration space is a real coordinate space (usually joint space or Cartesian position space) [15], [16]. What we need is legibility and predictability of robot moves. If we assume that humans move optimally with respect to a certain objective, this objective function can be found by developing an inverse optimal control approach. In [17] the human arm is modelled as a seven-degrees-of-freedom manipulator, similar to a robot arm model of the YuMi. Reference [18] investigates how the YuMi robot could fit into the production concept, with one approach based on integrated vision and another based on the transition from digital twins to product assembly.

978-1-7281-7542-3/20/$31.00 ©2020 IEEE

Authorized licensed use limited to: Carleton University. Downloaded on October 05,2020 at 01:31:05 UTC from IEEE Xplore. Restrictions apply.
Several approaches try to train the robot as effectively as possible and create an acceptable model of its behavior: e.g. a method of learning from demonstration based on a Discrete Hidden Markov Model (DHMM) is demonstrated in [19] and applied to a mining inspection task. Reference [20] presents results of the control of a Ball & Plate model using the YuMi. A part of the robotic research community uses its own or publicly available modules, such as the "yumipy" module from the Berkeley AutoLab [21], which is a Python interface for commanding the YuMi robot.

Within the context of the ideas stated above, the aim of this paper is to propose a possible approach in the form of a Python program with an implemented model of the YuMi robot using the PyBullet module. The proposed approach develops a 3D model of the YuMi robot in an empty graphical environment, which we can control and from which we can query several variables. These variables could be used in the future to implement artificial intelligence (AI) methods.

II. PROBLEM DEFINITION

Although the development of collaborative robots is still in its beginnings, cobots are now successfully integrated into a range of processes. Simulators are important in robotics research as tools for testing. Testing the performance of the cobot while modifying only specific conditions is one of the benefits of simulation software. In recent years, we have witnessed a need for the development of easy-to-use software tools for the simulation of robotic models. This need has been motivated by modern robotic applications in which robots are required to operate autonomously in close interaction with humans – collaborative robots. A growing number of software tools for simulation have been designed due to these needs. Some software tools have been developed by companies for their own robots; others are freely available open-source applications or modules. Software from companies usually does not include all the options users need, e.g. monitoring of various parameters, simulating in other than real time, or simple implementation of AI, which can be confusing, especially for new users. In addition, easy-to-use simulation examples and installation and usage instructions are often missing. On the other hand, there are many free applications or open-source modules, and the user does not know which to choose. Various articles are devoted to selecting applications according to user requirements. Reference [22] provides an overview and comparison of various selected applications from around the world. These applications mostly use programming languages like C/C++, Java and Python. Choosing an application with a programming language that users prefer is one of many problems that bother them. One of the main difficulties is creating a robot model. Simple models are often required, but the simpler the model, the more inaccurate the solution that describes the behavior of the physical system. The possibility of creating a model and its connection with the Python module is the subject of the next chapter.

III. PROBLEM SOLUTION

As mentioned above, the basic problem is to create the right model. Users can create their own model in any of the modelling programs as long as the robot is composed of simple parts. In our case, modelling the YuMi robot in such a program would be difficult without similar previous experience at such a scale. Therefore, we decided to use the pre-created model of the YuMi robot implemented in [23]. This model consists of meshes and the main URDF (Unified Robot Description Format) file, in which the individual connections between parts of the YuMi robot are defined. The 3D model of the YuMi robot body is shown in Fig. 1 and the corresponding part of the URDF code is shown in Fig. 2.

Fig. 1. The 3D model of the YuMi robot body.

Fig. 2. Corresponding URDF code of the YuMi robot body.

This URDF file contains all links of the YuMi robot, that is, the interconnections of the individual joints between themselves and the base, and the interconnections of the gripper and its two fingers. The file is 862 lines of code and contains links for both arms. For simplicity, we removed one arm, reducing the code to 464 lines.

Our approach is based on the use of PyBullet [24]. PyBullet is an easy-to-use Python module for physics simulation for robotics, games, visual effects and machine learning, with a focus on transferring the simulation to the real robot. In addition to URDF files, it supports SDF (Simulation Description Format), MJCF (Multi-Joint dynamics with Contact) and other formats that we can load. PyBullet provides a variety of simulations, e.g. forward and inverse dynamics, kinematics and collision detection, and also includes robotic examples such as a simulated quadruped, humanoids running using TensorFlow inference, and KUKA arms grasping an object.
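To give an idea of what the description format looks like, a minimal URDF fragment of the kind such a file is built from might look as follows; this is an illustrative sketch only, and the link, joint, mesh and limit values below are hypothetical, not copied from the actual yumi.urdf:

```xml
<robot name="yumi_single_arm">
  <!-- a link with a mesh used for visualization -->
  <link name="base_link">
    <visual>
      <geometry>
        <mesh filename="meshes/body.stl"/>
      </geometry>
    </visual>
  </link>
  <link name="link_1"/>
  <!-- a revolute joint connecting the base to the first arm link -->
  <joint name="joint_1" type="revolute">
    <parent link="base_link"/>
    <child link="link_1"/>
    <origin xyz="0 0 0.1" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
    <limit lower="-2.9" upper="2.9" effort="36" velocity="3.1"/>
  </joint>
</robot>
```

A real robot description repeats such joint/link pairs for every degree of freedom, which is why the full two-arm file grows to 862 lines.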

Our proposed approach consists of the 3D YuMi robot model with an example of its use in Python via the PyBullet module. Our real YuMi robot is shown in Fig. 3 and the corresponding one-arm model in PyBullet is shown in Fig. 4.

Fig. 3. Our YuMi robot, for which the model was created.

Fig. 4. Model of the YuMi robot with one arm using PyBullet.

In order to implement this model using PyBullet, we must first set some additional simulation parameters. For programming we used Spyder, a Python environment written in Python with advanced editing, analysis, debugging and visualization features.

The very first thing to do before programming is to import PyBullet. To do this, we need to download the PyBullet module using the pip install pybullet command. After downloading, we can add the import pybullet command. If necessary, we can add additional modules, such as time or math. After importing the PyBullet module, the next step is connecting to the physics simulation. PyBullet is designed as a client-server API, with a client sending commands and a server returning the status. PyBullet has several built-in physics servers:

• DIRECT creates a new physics engine and communicates with it directly;
• GUI creates a physics engine with a graphical frontend and communicates with it;
• SHARED_MEMORY connects to an existing physics engine process on the same machine and communicates with it over shared memory;
• UDP and TCP connect to an existing physics server over UDP or TCP networking.

Our approach was to connect to shared memory and, if that fails, to connect to the GUI. By default, no gravitational force is enabled. To enable it, we used the command setGravity(0,0,-9.8) to simulate the gravity of the Earth, with a gravitational acceleration of 9.8 m/s2 along the Z world axis. To load the model, we used the command loadURDF("yumi.urdf", useFixedBase = True). This command loads the physics model of the YuMi robot from the URDF file mentioned before and forces its base to be static. After these basic settings, we can work with the model using additional commands to control movement, query several state variables, apply force or torque, and get additional information about mass, friction, position and other properties. In our case, we used the getNumJoints(yumi) command with a print command to display and verify the correct number of robot joints. It is also possible to use commands like getJointInfo, getJointState and getLinkState to list information about a selected joint or link, e.g. the position of a joint, as we used in our project. To control the robot, we can use the command setJointMotorControl2, which controls the robot by setting a desired control mode for one or more joint motors and by setting a velocity, force or position. For example: setJointMotorControl2(bodyUniqueId = yumi, jointIndex = 1, controlMode = pybullet.POSITION_CONTROL, targetPosition = 2, force = maxForce). This command makes joint 1 of the YuMi robot move to the target position 2. The other control modes are velocity and torque modes. To perform all the desired actions, we can include the commands in a loop, and to advance the simulation we must add the stepSimulation command. The default length of one simulation step is 1/240 second, which can be changed with the command setTimeStep, or we can use the real-time clock with the command setRealTimeSimulation.

PyBullet provides many commands to control the robot and retrieve various information about its parts or other needed values. We used selected commands to set up the environment, implement the model, verify its correctness and determine the positions of the joints. We are also planning to use this program with an implementation of AI in the future, specifically the TensorFlow platform for machine learning. Our attention is focused on the use of reinforcement learning, which makes it possible for the robot to learn through trial and error using direct sensor inputs [25].

Fig. 5. Reinforcement learning: sequential decision (the agent receives states and rewards, judged by a critic in the environment, and sends back actions) [25]

Similarly to other methods of supervised learning, in reinforcement learning the system also gets feedback; however, its character is a little different. There are no required outputs available for the presented inputs. The agent-environment interaction is envisioned as the interaction between an agent (a controller) and the environment (the controlled system), with a specialized reward signal coming from a "critic" in the environment based on how well (or not) the system is doing [25].
IV. CONCLUSIONS

The presented solution represents one of the possible approaches to modelling robots in Python. The proposed approach consists of the 3D YuMi robot model with an example of its use in Python via the PyBullet module. This approach could be used in the future to easily implement artificial intelligence. The PyBullet module can be used in a wide range of applications; it has great community support and is easy to present using a graphical environment. We consider AI to be an efficient method of teaching an industrial robot new skills. Similarly to other AI domains, our future interests in the field of cooperative robotics are directed at self-control, i.e. the aim to teach the robot system directly from raw data to ensure the robot's ability to adapt to new tasks and new circumstances. With AI, it is easier for an operator to track the robot's learning than it previously was with conventional training methods.

ACKNOWLEDGMENT

The paper has been elaborated with the support of the KEGA grant agency within the project "KEGA 014-ZU-4/2018 Broadening the content in a field of study with respect to the current requirements of the industry as regards artificial methods and IT."

REFERENCES

[1] I. Maurtua, A. Ibarguren, J. Kildal, L. Susperregi, and B. Sierra, "Human-robot collaboration in industrial applications: Safety, interaction and trust," Int. J. of Advanced Robotic Systems, pp. 1-10, July-August 2017.
[2] J. Ilavský, K. Rástočný, and J. Ždánsky, "Common-cause failures as major issue in safety of Control Systems," Advances in Electrical and Electronic Engineering, vol. 11, no. 2, special issue, pp. 86-93, 2013.
[3] A. Zahavi, F. A. Afrange, S. N. Haeri, U. Ajeevan, and D. Ch. Liyanage, ABB YuMi© high-speed pick and place game in action. In 29th DAAAM Int. Symp. on Intelligent Manufacturing & Automation, Zadar, Croatia, 21-28 October 2018, pp. 1216-1221.
[4] T. Koukkari, Collaborative robotics: Human-robot collaboration in heavy manufacturing tasks. Seinäjoki University of Applied Sciences, URN:NBN:fi:amk-2017053111382, 2016.
[5] D. Quarta, M. Pogliani, M. Polino, F. Maggi, A. M. Zanchettin, and S. Zanero, An Experimental Security Analysis of an Industrial Robot Controller. In IEEE Symposium on Security and Privacy, 2017.
[6] K. M. A. Yousef, A. AlMajali, S. A. Ghalyon, W. Dweik, and B. J. Mohd, "Analyzing Cyber-Physical Threats on Robotic Platforms," Sensors 18(5):1643, 2018.
[7] T. Pentikäinen and S. Richard, How to make collaborative robot programming easier: Workflow visualization on a tablet device. MSc. Thesis project UPTEC IT 16 011, Department of Information Technology, Uppsala Universitet, 61 p., June 2016.
[8] D. Baratta, Industrial Collaborative Robot Design. A guideline for future design activity. AIDE@AI*IA, pp. 2-8, 2015.
[9] M. Stenmark et al., The GiftWrapper: Programming a Dual-Arm Robot with Lead-Through. Paper presented at Human-Robot Interfaces for Enhanced Physical Interactions, Stockholm, Sweden, May 16, 2016.
[10] T. T. Andersen, Sensor based real-time control of robots. PhD Thesis, Technical University of Denmark, Department of Electrical Engineering, 2015.
[11] S. B. Niku, Introduction to Robotics: Analysis, Control, Applications. 2nd Ed., New Jersey, USA: John Wiley & Sons, Inc., 2011.
[12] Y. Rizal, Computer Simulation of Human-Robot Collaboration in the Context of Industry Revolution 4.0. In: Future of Robotics – Becoming Human with Humanoid or Emotional Intelligence (open access peer-reviewed chapter, online first), IntechOpen, pp. 1-22, 2019.
[13] N. Madapana et al., DESK: A Robotic Activity Dataset for Dexterous Surgical Skills Transfer to Medical Robots. arXiv:1903.00959 [cs.RO], 2019.
[14] M. Karlsson, A. Robertsson, and R. Johansson, Temporally Coupled Dynamical Movement Primitives in Cartesian Space. arXiv:1905.11176 [cs.RO], 2019.
[15] M. Prada, A. Remazeilles, A. Koene, and S. Endo, Implementation and experimental validation of Dynamic Movement Primitives for object handover. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), pp. 2146-2153, Chicago, Illinois, Sept. 14-18, 2014.
[16] C. T. Landi, F. Ferraguti, C. Fantuzzi, and C. Secchi, A Passivity-Based Strategy for Coaching in Human-Robot Interaction. IEEE Int. Conf. on Robotics and Automation (ICRA), Brisbane, Australia, May 21-25, 2018.
[17] E. Hoogerwerf, Inverse Optimal Control for robot arm motions: Improving human-robot collaboration by incorporating human characteristics into optimal motion generation. MSc. Thesis, TU Delft, November 2018.
[18] V. Kuliaev, Implementation of Assembly Operations with YuMi Robot. MSc. Thesis, Aalto University, School of Electrical Engineering, 2019.
[19] F. Ge, W. Moore, M. Antolovich, and J. Gao, Robot Learning By a Mining Tunnel Inspection Robot. 9th IEEE Int. Conf. on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 200-204, Daejeon, Korea, Nov. 26-28, 2012.
[20] Ľ. Špaček and J. Vojtěšek, "Ball & Plate Model on ABB YuMi Robot," Advances in Intelligent Systems and Computing, vol. 986, R. Silhavy, Ed., pp. 283-291, Springer Verlag, 2019.
[21] Berkeley AutoLab, yumipy: Python control interface for interacting with the ABB YuMi Robot, https://github.com/BerkeleyAutomation/yumipy.
[22] A. Hentout, A. Maoudj, and B. Bouzouia, A Survey of Development Frameworks for Robotics. 8th Int. Conf. on Modelling, Identification and Control (ICMIC), pp. 67-72, Algiers, Algeria, Nov. 15-17, 2016.
[23] KTH Royal Institute of Technology, ROS packages for the ABB YuMi (IRB14000) robot, https://github.com/kth-ros-pkg/yumi.
[24] E. Coumans and Y. Bai, PyBullet, a Python module for physics simulation for games, robotics and machine learning, 2016-2019, https://pybullet.org.
[25] S. Singh, A. Barto, and N. Chentanez, Intrinsically Motivated Reinforcement Learning. In 18th Annual Conference on Neural Information Processing Systems (NIPS), 2004.

