The Robotics API: An Object-Oriented Framework for Modeling Industrial Robotics Applications

Abstract— During the last two decades, software development has evolved continuously into an engineering discipline with systematic use of methods and tools to model and implement software. For example, object-oriented analysis and design structures software models according to real-life objects of the problem domain and their relations. However, the industrial robotics domain is still dominated by old-style, imperative robot programming languages, making software development difficult and expensive. For this reason, we introduce the object-oriented Robotics Application Programming Interface (Robotics API) for developing software for industrial robotic applications. The Robotics API offers an abstract, extensible domain model and provides common functionality, which can be easily used by application developers. The advantages of the Robotics API are illustrated with an application example.

I. INTRODUCTION

Today, industrial robots are still programmed with textual, proprietary robot programming languages. These languages are provided by robot manufacturers for their products and are bound to their robot controllers. They are derived from early imperative languages like ALGOL or Pascal and have in common that they offer robotics-specific data types, allow the specification of motions, and process I/O operations for the communication with external systems (e.g. tools, sensors, or PLCs). Examples are the KUKA Robot Language or RAPID from ABB. Due to these low-level languages, programming an industrial robot is a difficult task requiring considerable technical expertise and time. Hence, industrial robots are usually equipped and programmed to perform only one particular task for a considerable time. This might be acceptable for mass production as in automotive industries, but for small and medium enterprises with rapidly changing products in small batches, the introduction of industrial robots is an expensive and risky decision.

Furthermore, robot programming languages are strongly limited compared to general-purpose languages. For example, there is no built-in support for graphical user interfaces, and external connectivity is limited, which makes e.g. connecting to databases or accessing web services difficult. Hence, developing software for configuring and supervising industrial robotic cells is a very complex and error-prone task. The examples from [1] and [2] illustrate the efforts necessary for developing custom applications for commercially available robot controllers. Considering the future challenges of industrial robotics [3] like flexible manufacturing systems, human-friendly task description or cooperating robots, existing approaches for developing applications for robotics are reaching their limits.

To overcome these limitations, there have been several academic approaches providing robot-specific libraries for general-purpose languages. Examples are RIPE [4], MRROC+ [5] and the Robotic Platform [6]. Today, there is a trend in robotics towards component-based software engineering [7], as the robotics domain, especially for experimental robotics, is considered too diverse and too inhomogeneous to develop one stable reference domain model [8]. Examples for component-based robotics frameworks are Player [9] and OROCOS [10]. However, these libraries and frameworks use a low level of abstraction, i.e. developers still need profound knowledge in robotics and often in real-time programming.

From our point of view, the industrial robotics domain can greatly profit from a rich object-oriented framework. Such a framework should provide the common concepts of this domain in the form of class structures. Existing approaches like SIMOO-RT [11] and the above-mentioned Robotic Platform [6] do not model concepts beyond different manipulators and rather simple movements. Therefore, we developed the Robotics Application Programming Interface (Robotics API) as the main element of a three-tiered software architecture [12] and present it in this paper. Its main contribution is the introduction of a comprehensive model of the industrial robotics domain that comprises modeling devices, action descriptions as well as interesting parts of the physical world.

The paper is structured as follows: Sect. II describes why and how object-orientation can be applied to industrial robotics and motivates the chosen architecture. In Sect. III, the structure of the Robotics API including its domain model is presented in detail. Subsequently, the advantages of this object-oriented framework are illustrated with an industrial application example in Sect. IV. The low-level execution of Robotics API commands with regard to real-time constraints is described in Sect. V. Finally, conclusions are drawn in Sect. VI.

The authors are with the Institute for Software and Systems Engineering, University of Augsburg, D-86135 Augsburg, Germany. E-mail of corresponding author: [email protected]

This work presents results of the research project SoftRobot which is funded by the European Union and the Bavarian government within the High-Tech-Offensive Bayern. The project is carried out together with KUKA Roboter GmbH and MRK-Systeme GmbH.

II. OBJECT-ORIENTATION IN INDUSTRIAL ROBOTICS

Nowadays, business software systems are usually constructed using techniques such as object-oriented analysis, design and programming. Elaborate software design processes like the Unified Process [13] exist, as well as
[Fig. 1: class diagram of the Robotics API domain model, including the command model with the classes Command, Action and Motion (durationInMs attribute).]
exactly one SpatialObject. The World is a special SpatialObject defining the notion of a globally unique world concept. It knows a distinguished Frame WorldOrigin, which is a globally unique world frame.
• A PhysicalObject is a concrete object in the world and is a specialization of SpatialObject. Attributes like mass, size etc. can be defined for it.

When it comes to the type and number of devices that shall be controlled, classical robot controls can handle two kinds: (1) A single robot that can be controlled natively and (2) external (from the view of the robot control) periphery like tools, sensors and complete systems that the robot shall interact with. Periphery is usually connected to the robot control via field buses which can be directly read and written in most robot control languages. Languages like KRL provide additional commands for the control of certain periphery that abstract from the I/O level by introducing control commands on a logical level.

The Robotics API provides basic classes to support all kinds of devices and to control them. The relevant part of the class structure is shown in the lower right part of Fig. 1:
• Device is the basic class for all kinds of mechanical devices that can be controlled, such as robots, grippers or any other kinematic structures. Any such Device must be controllable by the Robot Control Core (see Sec. II). Each Device can have a PhysicalObject attached that describes its physical properties. ComposedDevice is intended to be the base class for devices that are compositions of other devices, like e.g. a robot on a mobile platform. It is possible to control each single device of this composed device separately, or treat the composition as a whole, depending on the use case.
• A Joint represents a mechanical joint, which is the most basic element of any robot. The Robotics API supports PrismaticJoints and RevoluteJoints.
• A Manipulator is an example of a common concrete device. It consists of several Joints, which are themselves Devices that can be controlled separately. This makes a Manipulator the simplest kind of ComposedDevice. One example for a concrete Manipulator is a Robot.

In object-oriented design and programming, objects usually carry their state with them as attributes, as well as methods operating on that state and representing the functionality of the respective object. Thus, one would expect Robotics API devices to contain several methods for performing usual actions (e.g. Robot.MoveLinear(), Gripper.Close()). However, the Robotics API models commands that shall be executed by devices as separate objects, which corresponds to the 'Command pattern' defined in [16]. This kind of modeling allows the flexible combination of multiple commands into a bigger, complex command. This realizes the core idea of encapsulating real-time critical action sequences. Sec. V explains this mechanism in more detail. Of course, convenience methods can be implemented that serve as shortcuts for commonly used functionality and rely on the Command pattern structure in their implementations, like the aforementioned movement methods in the class Robot.

Below, the classes forming the Robotics API's command model are explained:
• An Action represents some sort of operation that can be executed by certain devices (e.g. a Motion by a Manipulator).
• A Command specifies a Device and an Action the Device should execute.
• Actions contain a set of Events. An Event can occur when an Action is executed and describes that a certain state has been reached. Events are specific for each Action: e.g., a Motion has a MotionStartEvent and a MotionEndEvent.
• A Trigger can be attached to any Event. A Trigger starts a new Command when the respective Event has occurred, with a specified temporal delay. An arbitrary number of Triggers can be defined for each Event.
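To illustrate how these classes fit together, the following sketch shows a much-simplified version of the command model in C#. The class skeletons and member signatures are our own illustration of the concepts above; they are not taken from the actual Robotics API.

    using System.Collections.Generic;

    // Simplified sketch of the command model (illustration only; signatures are assumed).
    public abstract class Device { }                    // controllable by the Robot Control Core

    public abstract class Action {
        // events this action can raise during execution (e.g. MotionStartEvent, MotionEndEvent)
        public IList<Event> Events { get; } = new List<Event>();
    }

    public abstract class Event {
        // triggers registered for this event; each one starts another command when the event occurs
        public IList<Trigger> Triggers { get; } = new List<Trigger>();
    }

    public class Trigger {
        public Event Event { get; }
        public Command CommandToStart { get; }
        public System.TimeSpan Delay { get; }

        public Trigger(Event ev, Command commandToStart, System.TimeSpan delay = default) {
            Event = ev; CommandToStart = commandToStart; Delay = delay;
            ev.Triggers.Add(this);                       // a Trigger is attached to exactly one Event
        }
    }

    public class Command {
        public Device Device { get; }
        public Action Action { get; }

        public Command(Device device, Action action) { Device = device; Action = action; }

        // hands the complete command graph (including triggered commands) to the
        // real-time execution layer, cf. Sec. V
        public void Execute() { }
    }

This structure mirrors the Command pattern: devices carry no motion methods of their own; instead, commands, actions and triggers form an explicit object graph that can later be transformed and executed atomically.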
The Robotics API is an open, extensible framework for high-level programming of industrial robot applications. By providing basic concepts for world modeling, devices and action specifications, it promotes reuse of logic common to such applications. Compared to the manipulator-level robot programming languages used in today's robot controls, the Robotics API also facilitates the notion of real-world objects like workpieces that are to be manipulated.

IV. APPLICATION EXAMPLE

For evaluating the usefulness of the Robotics API as a basic framework for industrial applications, we created a draft implementation of a welding application as a proof of concept, based on the manual of the KUKA ArcTech Digital add-on technology package [17]. Welding is a typical use case for industrial robots and comprises many challenging aspects of programming robots:
• Configuring and controlling a robot and a welding tool
• Specifying points and welding lines that are, in general, specific to a type of workpiece
• Defining the welding work flow, including movements and synchronized tool actions

During implementation, we extended the Robotics API by classes that are common for welding tasks. In Fig. 2, those classes are shown in dark color, together with those classes of the Robotics API (in light color) that they relate to. The diagram is structured similarly to Fig. 1.

… stored in the IODevices can be used for mapping the high-level actions to input and output signals on the robot control.

The class WeldingRobot is a ComposedDevice, too, which aggregates the used Robot and the WeldingTorch mounted at its flange. The WeldingRobot has the ability of moving the robot arm (with the correct center of motion, i.e. the tip of the WeldingTorch) and of executing a complex Weld() operation. This operation takes a WeldingLine as a parameter as well as specific WeldingParameters. Those parameters specify characteristic details of the welding operation like the timeout for the ignition of the welding arc. The WeldingLine consists of a sequence of motion definitions, which specify the welding seam that the WeldingRobot shall follow. The class WeldedWorkpiece is a subclass of Workpiece and knows a set of WeldingLines, so that large parts of a welding program can be reused when the WeldedWorkpiece is exchanged.

The most interesting part of our application is the implementation of the WeldingRobot.Weld() method. This operation performs a complete welding operation of a given WeldingLine. It consists of the following parts: (1) Perform the ignition movement to the start point of the welding seam and turn on the shielding gas flow. (2) After the defined gas preflow time, initiate the arc ignition. (3) As soon as the arc has been established, start the first motion along the seam. (4) Perform all necessary motions along the seam in a continuous manner. (5) As soon as the end of the last motion segment has been reached, turn off the arc. (6) Wait for the defined crater time and the defined gas postflow time and after that, turn off the shielding gas flow. (7) Finally, perform the final movement away from the workpiece.

Most parts of this operation have to be executed within defined timing windows. In particular, all movements have to be executed without any delay as soon as the arc has been established, otherwise the workpiece will be damaged. Though some other steps (e.g. waiting for the gas postflow) perhaps do not require hard real-time guarantees, we chose to implement all steps as one complex Robotics API Command. Listing 1 shows the first lines of code of the Weld() method.

    // start the ignition movement
    Command ignitionMovementCmd =
        new Command(Robot, line.IgnitionMovement);

    // start gas depending on status of ignition movement
    Command startGasCmd =
        new Command(WeldingTorch, new GasOn());
    Trigger startGasTrigger = new Trigger(
        line.IgnitionMovement.OnMotionEnded,
        startGasCmd);

    ignitionMovementCmd.Execute();

Listing 1. Excerpt of the WeldingRobot.Weld() method.

The statements that are shown create a command for letting the Robot do the ignition movement, and attach triggers to the command, which start the gas preflow and initiate the ignition. The last line of the listing shows the end of the Weld() method, where the ignition movement is actually executed. This leads to the execution of all the commands that are connected to the ignition movement command via events and triggers. The execution of all those commands is performed as one atomic step and with real-time guarantees considering the timing specifications.

Having defined those basic classes, every welding application can just be implemented as a series of calls of the Weld() method (corresponding to the WeldingLines defined on the WeldedWorkpiece) with adequate transfer movements in between that move the robot from one welding line to the next. Programmers of welding applications do not have to deal with any real-time aspects of their applications, as those are encapsulated inside the WeldingRobot's implementation.
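Building on these classes, a complete welding application could then be reduced to a loop like the following sketch. The identifiers weldingRobot, workpiece and weldingParams, the PreIgnitionPosition property and the PTP() transfer motion are assumptions made for this illustration (suggested by Fig. 2), not code taken from the paper.

    // Sketch of a welding application as a series of Weld() calls (illustration only).
    foreach (WeldingLine line in workpiece.WeldingLines) {
        // transfer movement to the start of the next welding line;
        // no real-time guarantees are needed for this step
        weldingRobot.PTP(line.PreIgnitionPosition);

        // the Weld() operation encapsulates all real-time critical welding steps
        weldingRobot.Weld(line, weldingParams);
    }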
[Fig. 2: class diagram of the welding application. Dark (welding-specific) classes include WeldingRobot (+Weld(WeldingLine, WeldingParameters), +LIN(Frame), +PTP(Frame), +CIRC(Frame, Frame)), WeldingTorch, WeldingTorchObject, the WeldingTorchActions GasOn, GasOff, ArcOn and ArcOff with GasFlowingEvent and ArcStandingEvent, WeldingLine with ignition, seam and final movements, WeldingParameters (gas preflow, ignition, crater and gas postflow times) and WeldedWorkpiece. They relate to Robotics API classes of the command, world and device models (Action, Motion, PathMotion, PointToPointMotion, Event, Device, ComposedDevice, IODevice, Robot, Frame, PhysicalObject, Workpiece).]
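Based on the classes visible in Fig. 2, the welding-specific extensions could be sketched roughly as follows. The sketch builds on the command-model classes from the earlier sketch (Device, Action, Event); the stubbed domain types, base classes and signatures are assumptions for illustration only.

    // assumed framework/domain types from Figs. 1 and 2, stubbed so the sketch is self-contained
    public class ComposedDevice : Device { }
    public class Robot : Device { }
    public class WeldingTorch : Device { }
    public class Frame { }
    public class WeldingLine { }
    public class WeldingParameters { }
    public abstract class WeldingTorchAction : Action { }
    public class GasFlowingEvent : Event { }
    public class ArcStandingEvent : Event { }

    // welding-specific tool actions with their events (illustration only)
    public class GasOn : WeldingTorchAction {
        public GasFlowingEvent OnGasFlowing { get; } = new GasFlowingEvent();
        public GasOn() { Events.Add(OnGasFlowing); }
    }
    public class ArcOn : WeldingTorchAction {
        public ArcStandingEvent OnArcStanding { get; } = new ArcStandingEvent();
        public ArcOn() { Events.Add(OnArcStanding); }
    }

    // composed device aggregating the robot arm and the mounted welding torch
    public class WeldingRobot : ComposedDevice {
        public Robot Robot { get; }
        public WeldingTorch Torch { get; }

        public WeldingRobot(Robot robot, WeldingTorch torch) { Robot = robot; Torch = torch; }

        // encapsulates the complete, real-time critical welding operation of one seam (cf. Listing 1)
        public void Weld(WeldingLine line, WeldingParameters parameters) { }

        // convenience motions with the torch tip as the center of motion
        public void PTP(Frame target) { }
        public void LIN(Frame target) { }
        public void CIRC(Frame auxiliary, Frame target) { }
    }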
The implementation of all the classes specific for welding functionality (shown in dark color in Fig. 2) took only about 100 lines of code, while all other parts (Robot, Command, Trigger etc.) could be re-used directly from the Robotics API.

V. EXECUTION OF ROBOTICS API COMMANDS

Using the Robotics API, complex and domain-specific commands can be specified. However, to run these commands on real robots, they are converted into an executable form. Our approach uses dataflow specifications, which are expressed with RPI and describe the communication among various basic modules representing hardware, calculation and control algorithms.

These RPI commands can be sent to a compatible robot controller, where they are executed respecting hard real-time constraints. The controller also has to return status information about its executed commands and devices to the Robotics API layer. This way, the running application can be synchronized to the execution on the controller, and can always work on up-to-date state information about the existing devices.

The transformation of Robotics API commands into RPI …
• To enable control flow and conditional execution, the modules representing actions and devices have an input port active controlling whether the module shall be evaluated. These ports are connected to a module called trigger which can be controlled over its on and off inputs. This way, a trigger activating an action for a device can be transformed into a module evaluating the event condition and switching on the trigger for the action and device. Of course, this evaluation module has to be able to access status information about other running actions, provided as additional output ports by action modules.

Fig. 3 gives an example of a Robotics API command structure. It consists of the initial movement in a welding application (action ignitionMovement) executed by a certain device (robot), and enables the gas flow (action gasOn) of the attached weldingTorch (device) once the initial motion is completed (startGasTrigger triggered by a motionEnded event).

[Fig. 3: object diagram of this example command: ignitionMovement : Motion as the action of ignitionMovementCmd : Command, targeting robot : Robot.]
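To make this mapping more tangible, the following toy model sketches how modules, their active ports and a trigger could be wired for the command of Fig. 3 (cf. the generated net in Fig. 4). It is a deliberately simplified illustration in C#; the module and port names are borrowed from Fig. 4, but none of this reflects the actual RPI interfaces.

    using System.Collections.Generic;

    // Toy model of a generated dataflow net (illustration only; not the actual RPI interfaces).
    class Module {
        public string Name;
        public bool Active;                              // 'active' port: module is only evaluated while true
        public Dictionary<string, double> Outputs = new Dictionary<string, double>();  // status output ports
        public Module(string name) { Name = name; }
    }

    class TriggerModule {                                // 'trigger' module with on/off inputs
        private readonly List<Module> targets = new List<Module>();
        public TriggerModule(params Module[] modules) { targets.AddRange(modules); }
        public void On()  { foreach (var m in targets) m.Active = true;  }
        public void Off() { foreach (var m in targets) m.Active = false; }
    }

    class EventCheck {                                   // evaluates an event condition on another module's outputs
        private readonly Module source; private readonly string port; private readonly TriggerModule trigger;
        public EventCheck(Module source, string port, TriggerModule trigger) {
            this.source = source; this.port = port; this.trigger = trigger;
        }
        public void Evaluate() {
            if (source.Outputs[port] >= 1.0) trigger.On();   // e.g. motionEnded: motion progress reached 100%
        }
    }

    class NetExample {
        static void Build() {
            var ignitionMovement = new Module("Action: ignitionMovement") { Active = true };
            var robot            = new Module("Device: robot")            { Active = true };
            var gasOn            = new Module("Action: gasOn");
            var weldingTorch     = new Module("Device: weldingTorch");

            var startGas  = new TriggerModule(gasOn, weldingTorch);
            var motionEnd = new EventCheck(ignitionMovement, "progress", startGas);
            // a real-time executor would evaluate all active modules cyclically and call motionEnd.Evaluate()
        }
    }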
[Fig. 4. Generated RPI net: modules Action: ignitionMovement (trajectory generator with position and progress outputs), Device: robot, Event: motionEnded (check module), a trigger module, Action: gasOn (digital output with a binary value) and Device: weldingTorch (io control), each evaluated via an active input port.]

VI. CONCLUSION & FUTURE WORK

… advantages in software development for robotics. In order to prove its universal validity and the improvements in software quality, we are applying our approach to a set of more complex examples. Concerning the Robotics API, next steps include the introduction of sensors, extensions of the world model (e.g. including moving frames) as well as sophisticated error handling concepts. Moreover, we are currently extending our approach to program real-time cooperation tasks like load-sharing motions or rendezvous operations. The Robotics API was designed to support such advanced tasks as well, and we plan to verify that using the lightweight robots.