
The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems
October 18-22, 2010, Taipei, Taiwan

The Robotics API: An Object-Oriented Framework for Modeling Industrial Robotics Applications

Andreas Angerer, Alwin Hoffmann, Andreas Schierl, Michael Vistein and Wolfgang Reif

The authors are with the Institute for Software and Systems Engineering, University of Augsburg, D-86135 Augsburg, Germany. E-mail of corresponding author: [email protected]. This work presents results of the research project SoftRobot, which is funded by the European Union and the Bavarian government within the High-Tech-Offensive Bayern. The project is carried out together with KUKA Roboter GmbH and MRK-Systeme GmbH.

Abstract— During the last two decades, software development has evolved continuously into an engineering discipline with systematic use of methods and tools to model and implement software. For example, object-oriented analysis and design structures software models according to real-life objects of the problem domain and their relations. However, the industrial robotics domain is still dominated by old-style, imperative robot programming languages, making software development difficult and expensive. For this reason, we introduce the object-oriented Robotics Application Programming Interface (Robotics API) for developing software for industrial robotic applications. The Robotics API offers an abstract, extensible domain model and provides common functionality which can be easily used by application developers. The advantages of the Robotics API are illustrated with an application example.

I. INTRODUCTION

Today, industrial robots are still programmed with textual, proprietary robot programming languages. These languages are provided by robot manufacturers for their products and are bound to their robot controllers. They are derived from early imperative languages like ALGOL or Pascal and have in common that they offer robotics-specific data types, allow the specification of motions, and process I/O operations for the communication with external systems (e.g. tools, sensors, or PLCs). Examples are the KUKA Robot Language or RAPID from ABB. Due to these low-level languages, programming an industrial robot is a difficult task requiring considerable technical expertise and time. Hence, industrial robots are usually equipped and programmed to perform only one particular task for a considerable time. This might be acceptable for mass production as in the automotive industry, but for small and medium enterprises with rapidly changing products in small batches, the introduction of industrial robots is an expensive and risky decision.

Furthermore, robot programming languages are strongly limited compared to general-purpose languages. For example, there is no built-in support for graphical user interfaces, and external connectivity is limited, which makes e.g. connecting to databases or accessing web services difficult. Hence, developing software for configuring and supervising industrial robotic cells is a very complex and error-prone task. The examples from [1] and [2] illustrate the efforts necessary for developing custom applications for commercially available robot controllers. Considering the future challenges of industrial robotics [3] like flexible manufacturing systems, human-friendly task description or cooperating robots, existing approaches for developing applications for robotics are reaching their limits.

To overcome these limitations, there have been several academic approaches providing robot-specific libraries for general-purpose languages. Examples are RIPE [4], MRROC++ [5] and the Robotic Platform [6]. Today, there is a trend in robotics towards component-based software engineering [7], as the robotics domain, especially experimental robotics, is considered too diverse and too inhomogeneous to develop one stable reference domain model [8]. Examples for component-based robotics frameworks are Player [9] and OROCOS [10]. However, these libraries and frameworks use a low level of abstraction, i.e. developers still need profound knowledge in robotics and often in real-time programming.

From our point of view, the industrial robotics domain can greatly profit from a rich object-oriented framework. Such a framework should provide the common concepts of this domain in the form of class structures. Existing approaches like SIMOO-RT [11] and the above-mentioned Robotic Platform [6] do not model concepts beyond different manipulators and rather simple movements. Therefore, we developed the Robotics Application Programming Interface (Robotics API) as the main element of a three-tiered software architecture [12] and present it in this paper. Its main contribution is the introduction of a comprehensive model of the industrial robotics domain that comprises modeling devices, action descriptions as well as relevant parts of the physical world.

The paper is structured as follows: Sect. II describes why and how object-orientation can be applied to industrial robotics and motivates the chosen architecture. In Sect. III, the structure of the Robotics API including its domain model is presented in detail. Subsequently, the advantages of this object-oriented framework are illustrated with an industrial application example in Sect. IV. The low-level execution of Robotics API commands with regard to real-time constraints is described in Sect. V. Finally, conclusions are drawn in Sect. VI.
II. OBJECT-ORIENTATION IN INDUSTRIAL ROBOTICS

Nowadays, business software systems are usually constructed using techniques such as object-oriented analysis, design and programming. Elaborate software design processes like the Unified Process [13] exist, as well as methods and guidelines for constructing object-oriented software architectures and solving recurring design problems by appropriate patterns [14]. In object-oriented design, real-world items are often modeled directly as software objects, which dramatically helps in understanding complex software architectures. The object-orientation paradigm has a variety of features to support reuse of existing code. By using concepts like inheritance, it is possible to create large but still generic libraries that can simplify the development of new applications for a certain domain. In that way, cost and effort of software development can be greatly reduced.

A number of object-oriented frameworks exist for the robotics domain (e.g. [4], [5], [6], [11]). Some of these mainly focus on communication aspects between distributed system parts, whereas others also provide a basic class model covering important robotics concepts. However, an elaborate class model cannot be found in any of those frameworks. Brugali and Scandurra [7] even argue that it is difficult to define an object-oriented framework that remains stable during its evolution with regard to its class structure. For this reason, they propose constructing robotics software systems out of functional components with defined interfaces (Component-Based Software Engineering), which they argue to be well suited for robotics. While this may fit the extensive domain of experimental robotics, we believe that an elaborate object-oriented framework can be a valuable basis for application development in the comparatively narrow domain of industrial robotics. Applications in this domain are usually built upon common, stable functions like high-level motion instructions that are provided by the basic robot control system. Thus, the required abstraction level is higher than the level a coarse-grained component architecture provides. An adequate object-oriented framework architecture can provide a high abstraction level for reusing functionality that is common to today's industrial robot use cases and, furthermore, can even cover future trends by reusing, extending and composing existing classes. Aside from that, a large number of existing object-oriented class libraries exist for modern languages like C# or Java. They provide advanced functionality like image processing or complex user interface elements, which can be directly used with an object-oriented robotics framework.

A fact that complicates the design of an object-oriented programming framework for robotics – and robot programming in general – is the need for real-time hardware control. Especially in the domain of industrial robotics, deterministic execution of developed and once-tested programs is of utmost importance. This special design requirement for today's industrial robot controls, in particular for meeting safety criteria and quality standards, resulted in the development of proprietary programming languages that are interpretable with certain timing guarantees. However, an analysis of a wide range of industrial tasks for which robots are employed provides an interesting result: those actions that require hard real-time guarantees comprise a closed set of basic primitives, whereas most of the workflow inside the respective applications is not hard real-time critical. This led to the conclusion that most of the workflow of such applications can be programmed in a standard environment without regarding real-time aspects, whereas only real-time critical control flows have to be executed in a specially designed runtime environment.

We developed a novel architectural approach that allows a tight integration of high-level robot programs and real-time critical execution of low-level control flows, which is presented in detail in [12]. Following this approach, real-time critical commands can be specified using the object-oriented Robotics API and are then dynamically translated into commands for the Realtime Primitives Interface (RPI) [15]. An RPI-compatible Robot Control Core (RCC) executes those commands, which consist of a combination of certain, predefined (yet extendable) calculation modules and a specification of the data flow among them. Implementing only the RCC with real-time aspects in mind is sufficient to allow the atomic execution of Robotics API commands under real-time conditions. Existing frameworks for real-time robot control can be used to implement the RCC; in [15], we used OROCOS as the basis for our prototypical RCC. In this paper, we focus mainly on the design of the Robotics API and how this programming framework supports the development of applications for robotics, but we also outline the process of dynamic RPI command generation on the basis of an application example.

III. STRUCTURE OF THE ROBOTICS API

Robotic applications usually model some part of the physical world. In the simplest case, they just define points in space that are necessary for robot movements in the application. Depending on the concrete application or application class, more information about the robot's environment is represented. For example, in assembly applications, the notion of different workpieces that have to be assembled can be helpful. However, such complex models of reality are predominantly supported by offline programming tools, whereas standard robot controls usually only support the definition of points.

The scope of the Robotics API comprises both use cases, as it supports basic robot programs as well as complex, domain-specific applications. Therefore, the definition of points in space as well as of arbitrary physical objects is possible. Fig. 1 shows an overview of the basic class structure inside the Robotics API, which consists of a total of about 70 classes.

Fig. 1. Robotics API: basic class structure (class diagram relating the world model classes Frame, Relation, TransformationMatrix, SpatialObject, PhysicalObject and Workpiece, the device model classes Device, ComposedDevice, Joint, PrismaticJoint, RevoluteJoint, Manipulator, LinearUnit and Robot, and the command model classes Command, Action, Motion, Event and Trigger).

The lower left part of this diagram contains those classes that support modeling geometric relations:
• A Frame is a spatial point with a unique name. Each Frame can have an arbitrary number of relations directing to it or originating from it.
• A Relation connects a pair of Frames and defines a geometric relation between them, expressed mathematically by a homogeneous TransformationMatrix.
• A SpatialObject aggregates a number of Frames that logically belong together. A SpatialObject can be seen as the 'scope' of a Frame, as each Frame is assigned to exactly one SpatialObject. The World is a special SpatialObject defining the notion of a globally unique world concept. It knows a distinguished Frame, the WorldOrigin, which is a globally unique world frame.
• A PhysicalObject is a concrete object in the world and is a specialization of SpatialObject. Attributes like mass, size etc. can be defined for it.
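To make this geometric model more concrete, the following minimal C# sketch shows one way such classes might fit together. Only the class and association names (Frame, Relation, TransformationMatrix, SpatialObject, World, PhysicalObject) are taken from Fig. 1 and the bullets above; all members, signatures and the TransformationMatrix stub are simplifying assumptions, not the actual Robotics API.

using System.Collections.Generic;

// Hypothetical sketch of the geometric part of the domain model (not the real API).
public class TransformationMatrix { /* homogeneous 4x4 matrix, details omitted */ }

public class Frame                                      // a named spatial point
{
    public string Name { get; }
    public SpatialObject Owner { get; }                 // each Frame belongs to exactly one SpatialObject
    public List<Relation> Relations { get; } = new List<Relation>();

    public Frame(string name, SpatialObject owner)
    {
        Name = name;
        Owner = owner;
        owner.Frames.Add(this);
    }
}

public class Relation                                   // geometric relation between a pair of Frames
{
    public Frame From { get; }
    public Frame To { get; }
    public TransformationMatrix Transformation { get; }

    public Relation(Frame from, Frame to, TransformationMatrix transformation)
    {
        From = from;
        To = to;
        Transformation = transformation;
        from.Relations.Add(this);                       // each Frame knows the Relations attached to it
        to.Relations.Add(this);
    }
}

public class SpatialObject                              // groups Frames that logically belong together
{
    public List<Frame> Frames { get; } = new List<Frame>();
}

public class World : SpatialObject                      // globally unique world with its origin Frame
{
    public Frame WorldOrigin { get; }
    public World() { WorldOrigin = new Frame("WorldOrigin", this); }
}

public class PhysicalObject : SpatialObject             // tangible object with physical attributes
{
    public double Mass { get; set; }
}

In such a structure, a point taught on a workpiece would simply be a Frame owned by the workpiece's PhysicalObject and connected to the WorldOrigin by a chain of Relations.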
When it comes to the type and number of devices that shall be controlled, classical robot controls can handle two kinds: (1) a single robot that can be controlled natively, and (2) external (from the view of the robot control) periphery like tools, sensors and complete systems that the robot shall interact with. Periphery is usually connected to the robot control via field buses, which can be directly read and written in most robot control languages. Languages like KRL provide additional commands for the control of certain periphery that abstract from the I/O level by introducing control commands on a logical level.

The Robotics API provides basic classes to support all kinds of devices and to control them. The relevant part of the class structure is shown in the lower right part of Fig. 1:
• Device is the base class for all kinds of mechanical devices that can be controlled, such as robots, grippers or any other kinematic structures. Any such Device must be controllable by the Robot Control Core (see Sec. II). Each Device can have a PhysicalObject attached that describes its physical properties. ComposedDevice is intended to be the base class for devices that are compositions of other devices, e.g. a robot on a mobile platform. It is possible to control each single device of such a composed device separately, or to treat the composition as a whole, depending on the use case.
• A Joint represents a mechanical joint, which is the most basic element of any robot. The Robotics API supports PrismaticJoints and RevoluteJoints.
• A Manipulator is an example of a common concrete device. It consists of several Joints, which are themselves Devices that can be controlled separately. This makes a Manipulator the simplest kind of ComposedDevice. One example for a concrete Manipulator is a Robot.
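Continuing in the same style, the device side of Fig. 1 might be sketched as follows. The class names (Device, ComposedDevice, Joint, PrismaticJoint, RevoluteJoint, Manipulator, Robot) come from the figure and the bullets above, while the members shown are simplified or assumed rather than taken from the actual Robotics API.

using System.Collections.Generic;

// Hypothetical sketch of the device model; PhysicalObject refers to the earlier sketch.
public abstract class Device
{
    public PhysicalObject Body { get; set; }            // optional physical representation of the device
}

public abstract class ComposedDevice : Device           // a device built from other controllable devices
{
    public List<Device> Parts { get; } = new List<Device>();
}

public abstract class Joint : Device                     // most basic element of any robot
{
    public double HomePosition { get; set; }
}

public class PrismaticJoint : Joint { }                  // translational joint
public class RevoluteJoint : Joint { }                   // rotational joint

public class Manipulator : ComposedDevice                // simplest ComposedDevice: a chain of Joints
{
    public IReadOnlyList<Joint> Joints { get; }

    public Manipulator(IEnumerable<Joint> joints)
    {
        var list = new List<Joint>(joints);
        Joints = list;
        Parts.AddRange(list);                            // each Joint is itself a separately controllable Device
    }
}

public class Robot : Manipulator                          // one concrete kind of Manipulator
{
    public Robot(IEnumerable<Joint> joints) : base(joints) { }
}

A robot on a mobile platform, as mentioned above, would then be a further ComposedDevice whose Parts contain the Robot and the platform.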
In object-oriented design and programming, objects usually carry their state with them as attributes, as well as methods operating on that state and representing the functionality of the respective object. Thus, one would expect that Robotics API devices contain several methods for performing usual actions (e.g. Robot.MoveLinear(), Gripper.Close()). However, the Robotics API models commands that shall be executed by devices as separate objects, which corresponds to the 'Command pattern' defined in [16]. This kind of modeling allows the flexible combination of multiple commands into a bigger, complex command and thus realizes the core idea of encapsulating real-time critical action sequences. Sec. V explains this mechanism in more detail. Of course, convenience methods can be implemented that serve as shortcuts for commonly used functionality and rely on the Command pattern structure in their implementations, like the aforementioned movement methods in the class Robot.

Below, the classes forming the Robotics API's command model are explained:
• An Action represents some sort of operation that can be executed by certain devices (e.g. a Motion by a Manipulator).
• A Command specifies a Device and an Action the Device should execute.
• Actions contain a set of Events. An Event can occur when an Action is executed and describes that a certain state has been reached. Events are specific for each Action: e.g., a Motion has a MotionStartEvent and a MotionEndEvent.
• A Trigger can be attached to any Event. A Trigger starts a new Command when the respective Event has occurred, with a specified temporal delay. An arbitrary number of Triggers can be defined for each Event.
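The command model can be pictured in the same way. The class names and constructor shapes below are chosen to match Fig. 1 and Listing 1 in Sect. IV (Command(device, action), Trigger(event, command, delay), AddTrigger(), Execute()); the bodies, the Device reference and the placeholder Execute() implementation are assumptions, since the actual translation into executable commands is described in Sect. V.

using System.Collections.Generic;

// Hypothetical sketch of the command model (names from Fig. 1 and Listing 1, members assumed).
public abstract class Event { }                          // a state that can occur while an Action executes

public abstract class Action { }

public class MotionStartEvent : Event { }
public class MotionEndEvent : Event { }

public class Motion : Action                              // e.g. executed by a Manipulator
{
    public int DurationInMs { get; set; }
    public Event OnMotionStarted { get; } = new MotionStartEvent();
    public Event OnMotionEnded { get; } = new MotionEndEvent();
}

public class Trigger                                       // starts another Command when an Event has occurred
{
    public Event Event { get; }
    public Command StartedCommand { get; }
    public int DelayInMs { get; }

    public Trigger(Event evt, Command startedCommand, int delayInMs = 0)
    {
        Event = evt;
        StartedCommand = startedCommand;
        DelayInMs = delayInMs;
    }
}

public class Command                                        // binds an Action to the Device that shall execute it
{
    public Device Device { get; }                           // Device as in the earlier sketch
    public Action Action { get; }
    public List<Trigger> Triggers { get; } = new List<Trigger>();

    public Command(Device device, Action action)
    {
        Device = device;
        Action = action;
    }

    public void AddTrigger(Trigger trigger) { Triggers.Add(trigger); }

    public void Execute()
    {
        // In the real framework, the command and all commands reachable via its Triggers
        // are translated into an RPI net and executed atomically on the RCC (see Sect. V);
        // this sketch only marks the call site.
        throw new System.NotImplementedException();
    }
}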
The Robotics API is an open, extendable framework for high-level programming of industrial robot applications. By providing basic concepts for world modeling, devices and action specifications, it promotes reuse of logic common to such applications. Compared to the manipulator-level robot programming languages used in today's robot controls, the Robotics API also facilitates the notion of real-world objects like workpieces that are to be manipulated.

IV. APPLICATION EXAMPLE

For evaluating the usefulness of the Robotics API as a basic framework for industrial applications, we created a draft implementation of a welding application as a proof of concept, based on the manual of the KUKA ArcTech Digital add-on technology package [17]. Welding is a typical use case for industrial robots and comprises many challenging aspects of programming robots:
• Configuring and controlling a robot and a welding tool
• Specifying points and welding lines that are, in general, specific to a type of workpiece
• Defining the welding workflow, including movements and synchronized tool actions

During implementation, we extended the Robotics API by classes that are common for welding tasks. In Fig. 2, those classes are shown in dark color, together with those classes of the Robotics API (in light color) that they relate to. The diagram is structured similarly to Fig. 1.

For the welding application, the device model of the Robotics API had to be extended. Two new Device subclasses were introduced: WeldingTorch and WeldingRobot. WeldingTorch represents the tool that is used for welding and is modeled as a ComposedDevice consisting of multiple instances of IODevice. An IODevice is a generic representation of an input or output port of the robot control that can be read or written. In that way, the WeldingTorch class provides a high-level interface for configuring and controlling the device, while the information about the I/O configuration stored in the IODevices can be used for mapping the high-level actions to input and output signals on the robot control. The class WeldingRobot is a ComposedDevice, too, which aggregates the used Robot and the WeldingTorch mounted at its flange. The WeldingRobot has the ability to move the robot arm (with the correct center of motion, i.e. the tip of the WeldingTorch) and to execute a complex Weld() operation. This operation takes a WeldingLine as a parameter as well as specific WeldingParameters. Those parameters specify characteristic details of the welding operation like the timeout for the ignition of the welding arc. The WeldingLine consists of a sequence of motion definitions, which specify the welding seam that the WeldingRobot shall follow. The class WeldedWorkpiece is a subclass of Workpiece and knows a set of WeldingLines, so that large parts of a welding program can be reused when the WeldedWorkpiece is exchanged.

The most interesting part of our application is the implementation of the WeldingRobot.Weld() method. This operation performs a complete welding operation of a given WeldingLine and consists of the following parts: (1) Perform the ignition movement to the start point of the welding seam and turn on the shielding gas flow. (2) After the defined gas preflow time, initiate the arc ignition. (3) As soon as the arc has been established, start the first motion along the seam. (4) Perform all necessary motions along the seam in a continuous manner. (5) As soon as the end of the last motion segment has been reached, turn off the arc. (6) Wait for the defined crater time and the defined gas postflow time and, after that, turn off the shielding gas flow. (7) Finally, perform the final movement away from the workpiece.

Most parts of this operation have to be executed within defined timing windows. In particular, all movements have to be executed without any delay as soon as the arc has been established, otherwise the workpiece will be damaged. Though some other steps (e.g. waiting for the gas postflow) may not require hard real-time guarantees, we chose to implement all steps as one complex Robotics API Command. Listing 1 shows the first lines of code of the Weld() method.

// start the ignition movement
Command ignitionMovementCmd =
    new Command(Robot, line.IgnitionMovement);

// start gas depending on status of ignition movement
Command startGasCmd =
    new Command(WeldingTorch, new GasOn());

Trigger startGasTrigger = new Trigger(
    line.IgnitionMovement.OnMotionEnded,
    startGasCmd);

ignitionMovementCmd.AddTrigger(startGasTrigger);

// initialize the welding arc after the gas preflow time
Command startIgnition =
    new Command(WeldingTorch, new ArcOn());

Trigger startIgnitionTrigger = new Trigger(
    line.IgnitionMovement.OnMotionEnded,
    startIgnition,
    parameters.GasPreflowTime.Milliseconds);

ignitionMovementCmd.AddTrigger(startIgnitionTrigger);

... // code for steps 4-7 is omitted

ignitionMovementCmd.Execute();

Listing 1. Excerpt of the WeldingRobot.Weld() method.

The statements shown create a command for letting the Robot do the ignition movement and attach triggers to the command, which start the gas preflow and initiate the ignition. The last line of the listing shows the end of the Weld() method, where the ignition movement is actually executed. This leads to the execution of all the commands that are connected to the ignition movement command via events and triggers. The execution of all those commands is performed as one atomic step and with real-time guarantees considering the timing specifications.

Having defined those basic classes, every welding application can just be implemented as a series of calls of the Weld() method (corresponding to the WeldingLines defined on the WeldedWorkpiece) with adequate transfer movements in between that move the robot from one welding line to the next, as sketched below. Programmers of welding applications do not have to deal with any real-time aspects of their applications, as those are encapsulated inside the WeldingRobot's implementation.
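As a usage illustration of that remark, the following method sketch (a fragment in the style of Listing 1) chains Weld() calls over the WeldingLines of a workpiece. WeldingRobot, WeldedWorkpiece, WeldingLine, WeldingParameters and the PTP() transfer operation correspond to the classes and operations described above and shown in Fig. 2; the WeldingLines and ApproachFrame property names and the method itself are assumptions made for this example.

// Hypothetical sketch: one Weld() call per seam, with a point-to-point transfer
// motion in between. Weld() encapsulates all real-time critical steps of one seam.
void WeldAllSeams(WeldingRobot weldingRobot, WeldedWorkpiece workpiece, WeldingParameters parameters)
{
    foreach (WeldingLine line in workpiece.WeldingLines)
    {
        weldingRobot.PTP(line.ApproachFrame);   // transfer movement to the next seam (assumed property)
        weldingRobot.Weld(line, parameters);    // gas preflow, ignition, seam motions, postflow
    }
}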
Fig. 2. Structure of a welding application on top of the Robotics API (class diagram showing the welding-specific classes WeldingTorch, WeldingRobot with its Weld(), LIN(), PTP() and CIRC() operations, WeldingTorchAction with GasOn, GasOff, ArcOn and ArcOff, WeldingLine, WeldingParameters with gas preflow, ignition, crater and gas postflow times, WeldedWorkpiece and IODevice in dark color on top of the Robotics API classes from Fig. 1 in light color).

The implementation of all the classes specific to welding functionality (shown in dark color in Fig. 2) took only about 100 lines of code, while all other parts (Robot, Command, Trigger etc.) could be re-used directly from the Robotics API.

V. EXECUTION OF ROBOTICS API COMMANDS

Using the Robotics API, complex and domain-specific commands can be specified. However, to run these commands on real robots, they are converted into an executable form. Our approach uses dataflow specifications, which are expressed with RPI and describe the communication among various basic modules representing hardware, calculation and control algorithms.

These RPI commands can be sent to a compatible robot controller, where they are executed respecting hard real-time constraints. The controller also has to return status information about its executed commands and devices to the Robotics API layer. This way, the running application can be synchronized to the execution on the controller and can always work on up-to-date state information about the existing devices.

The transformation of Robotics API commands into RPI commands is specific to the particular Actions, Devices and Events used, but follows a general pattern:
• Devices are turned into consumer modules featuring dataflow inputs specific to the type of actions the devices support. For example, a robot object in the Robotics API can be mapped into a module accepting Cartesian position values as an input. This module's implementation controls the physical robot, following the trajectory received on the input port.
• Actions are represented by modules producing data that will be processed by the devices. Motion actions thus become trajectory generator modules which calculate the desired position of the robot end effector at each interpolation cycle. Motion overlays or other action modifiers cause additional modules to be added which accept data produced by the primary action and calculate the corresponding overlay or modification.
• To enable control flow and conditional execution, the modules representing actions and devices have an input port, active, controlling whether the module shall be evaluated. These ports are connected to a module called trigger, which can be controlled over its on and off inputs. This way, a trigger activating an action for a device can be transformed into a module evaluating the event condition and switching on the trigger for the action and device. Of course, this evaluation module has to be able to access status information about other running actions, provided as additional output ports by action modules.

Fig. 3 gives an example of a Robotics API command structure. It consists of the initial movement in a welding application (action ignitionMovement) executed by a certain device (robot), and enables the gas flow (action gasOn) of the attached weldingTorch (device) once the initial motion is completed (startGasTrigger triggered by a motionEnded event).

Fig. 3. Robotics API Command (object diagram: ignitionMovementCmd : Command has the action ignitionMovement : Motion and targets robot : Robot; its motionEnded : Event triggers startGasTrigger : Trigger, which starts startGasCmd : Command with action gasOn : GasOn targeting weldingTorch : WeldingTorch).

The generated RPI command is given in Fig. 4. The top part shows the representation of the ignitionMovementCmd command, consisting of a trajectory generator as an implementation of the motion and a robot module representing the controlled robot. The lower part implements the startGasCmd command by sending a binary value to the digital output the torch is attached to. The check and trigger modules in the lower left part check the progress of the trajectory generator and enable control of the digital output once the motion is completed.

Fig. 4. Generated RPI net (dataflow diagram: the trajectory generator module for the ignitionMovement Action feeds position values to the robot Device module; a check module observes the generator's progress for the motionEnded Event and, via a trigger module, activates the digital output and io control modules that implement the gasOn Action on the weldingTorch Device).
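To make this mapping more tangible, the following C# sketch rebuilds the structure of Fig. 4 with a deliberately simplified module type. RpiModule, its string-keyed input ports and the module names are hypothetical stand-ins for the predefined RPI calculation modules, not the actual RPI interface.

using System.Collections.Generic;

// Hypothetical, simplified picture of the RPI net of Fig. 4: a trajectory generator
// feeds the robot module, while a check/trigger pair gates the digital output that
// switches on the gas once the ignition motion has completed.
public class RpiModule
{
    public string Name { get; }
    public Dictionary<string, RpiModule> InputPorts { get; } = new Dictionary<string, RpiModule>();
    public RpiModule(string name) { Name = name; }
}

public static class RpiNetSketch
{
    public static List<RpiModule> BuildIgnitionNet()
    {
        var trajectory = new RpiModule("trajectoryGenerator"); // implements the ignitionMovement Action
        var robot = new RpiModule("robot");                    // consumer module for the robot Device
        robot.InputPorts["position"] = trajectory;             // generated set-points drive the robot

        var check = new RpiModule("check");                    // evaluates the motionEnded Event
        check.InputPorts["progress"] = trajectory;             // observes the trajectory generator's progress

        var trigger = new RpiModule("trigger");                // latches activation once 'on' fires
        trigger.InputPorts["on"] = check;

        var output = new RpiModule("digitalOutput");           // io control for the torch's gasOn Action
        output.InputPorts["active"] = trigger;                 // gated by the trigger's activation

        return new List<RpiModule> { trajectory, robot, check, trigger, output };
    }
}

The essential point is the last wiring step: the welding-torch side only becomes active when the check module reports that the motion has ended, which is how the Robotics API Trigger from Fig. 3 appears at the dataflow level.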
VI. CONCLUSION & FUTURE WORK

In this paper, we have proposed the Robotics Application Programming Interface for developing software for industrial robots. It offers an object-oriented model for the (industrial) robotics domain and provides common functionality which can be easily used by developers. Main concepts are objects for robots, tools, frames or actions (e.g. motions or tool actions). The Robotics API is embedded into a software architecture and relies on a real-time capable robot control core. Actions which need precise timings (e.g. motions or switching actions) are encapsulated inside a command structure and will be executed atomically by the robot control. Developers can extend the Robotics API in order to create application-specific functionality or to add support for new devices. The welding example from Sect. IV is such an extension and introduces e.g. the composed device WeldingRobot with its (configuration) properties and actions.

Due to the high-level programming model and the tight but hidden integration of low-level command execution, application developers are able to focus on solving the application-specific problems and, as far as possible, do not need profound knowledge in robot control and real-time programming. Extensions facilitate the reuse of application-specific functionality and promote a separation of concerns: domain experts model and implement extensions while application developers use them. Furthermore, the Robotics API allows robotic applications to be developed using standard technologies and non-real-time environments.

The current implementation of our prototypical Robotics API is created as a class library based on Microsoft's C#, which is an object-oriented language on top of the .NET framework. With its built-in memory management, the .NET framework runtime and its languages are very robust against many common programming errors. Furthermore, the development environment Visual Studio provides extensive support for developing, modifying and testing applications. For the realization of the welding example (see Sect. IV), we also used C# and Visual Studio. This allowed a fast and clean implementation of this application. From our point of view, our proposed approach will improve productivity as well as quality in the development of robotics software [18] and can leverage the use of industrial robots in small and medium enterprises.

The approach has been successfully applied to program and control two KUKA lightweight robots, showing its advantages in software development for robotics. In order to prove its universal validity and the improvements in software quality, we are applying our approach to a set of more complex examples. Concerning the Robotics API, next steps include the introduction of sensors, extensions of the world model (e.g. including moving frames) as well as sophisticated error handling concepts. Moreover, we are currently extending our approach to program real-time cooperation tasks like load-sharing motions or rendezvous operations. The Robotics API was designed to support such advanced tasks as well, and we plan to verify that using the lightweight robots.

REFERENCES

[1] J. N. Pires, G. Veiga, and R. Araújo, "Programming by demonstration in the coworker scenario for SMEs," Industrial Robot, vol. 36, no. 1, pp. 73–83, 2009.
[2] J. G. Ge and X. G. Yin, "An object oriented robot programming approach in robot served plastic injection molding application," in Robotic Welding, Intelligence & Automation, ser. Lect. Notes in Control & Information Sciences, vol. 362. Springer, 2007, pp. 91–97.
[3] M. Hägele, T. Skordas, S. Sagert, R. Bischoff, T. Brogårdh, and M. Dresselhaus, "Industrial robot automation," European Robotics Network, White Paper, Jul. 2005.
[4] D. J. Miller and R. C. Lennox, "An object-oriented environment for robot system architectures," in Proc. 1990 IEEE Intl. Conf. on Robotics and Automation, Cincinnati, Ohio, USA, May 1990, pp. 352–361.
[5] C. Zieliński, "Object-oriented robot programming," Robotica, vol. 15, no. 1, pp. 41–48, 1997.
[6] M. S. Loffler, V. Chitrakaran, and D. M. Dawson, "Design and implementation of the Robotic Platform," Journal of Intelligent and Robotic Systems, vol. 39, pp. 105–129, 2004.
[7] D. Brugali and P. Scandurra, "Component-based robotic engineering (Part I)," IEEE Robotics & Automation Magazine, vol. 16, no. 4, pp. 84–96, Dec. 2009.
[8] D. Brugali, A. Agah, B. MacDonald, I. A. Nesnas, and W. D. Smart, "Trends in robot software domain engineering," in Software Engineering for Experimental Robotics, ser. Springer Tracts in Advanced Robotics, vol. 30, D. Brugali, Ed. Springer, Apr. 2007.
[9] T. Collett, B. MacDonald, and B. Gerkey, "Player 2.0: Toward a practical robot programming framework," in Proc. 2005 Australasian Conf. on Robotics and Automation, Sydney, Australia, Dec. 2005.
[10] H. Bruyninckx, "Open robot control software: the OROCOS project," in Proc. 2001 IEEE Intl. Conf. on Robotics and Automation, Seoul, Korea, May 2001, pp. 2523–2528.
[11] L. B. Becker and C. E. Pereira, "SIMOO-RT – An object oriented framework for the development of real-time industrial automation systems," IEEE Transactions on Robotics and Automation, vol. 18, no. 4, pp. 421–430, Aug. 2002.
[12] A. Hoffmann, A. Angerer, F. Ortmeier, M. Vistein, and W. Reif, "Hiding real-time: A new approach for the software development of industrial robots," in Proc. 2009 IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, St. Louis, USA, Oct. 2009, pp. 2108–2113.
[13] C. Larman, Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development, 3rd ed. Prentice Hall, 2004.
[14] F. Buschmann, R. Meunier, H. Rohnert, P. Sommerlad, and M. Stal, Pattern-Oriented Software Architecture. Wiley, 1996.
[15] M. Vistein, A. Angerer, A. Hoffmann, A. Schierl, and W. Reif, "Interfacing industrial robots using realtime primitives," in Proc. 2010 IEEE Intl. Conf. on Automation and Logistics, Hong Kong, China, Aug. 2010.
[16] E. Gamma, R. Helm, R. Johnson, and J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley, 1994.
[17] KUKA.CR.ArcTech Digital 2.0, KUKA Robot Group, 2008.
[18] A. Hoffmann, A. Angerer, A. Schierl, M. Vistein, and W. Reif, "Towards object-oriented software development for industrial robots," in Proc. 7th Intl. Conf. on Informatics in Control, Automation and Robotics, Funchal, Madeira, Portugal, Jun. 2010.
