Proposal for Natural Human-Robot Interaction with Robotic Arms
All content following this page was uploaded by Fernando De la Rosa on 15 November 2015.
Abstract—The study of the ways in which humans interact with robots is a multidisciplinary field with contributions from electronics, robotics, human-computer interaction, ergonomics and even the social sciences. The robotics industry is mainly focused on the development of conventional technologies that improve efficiency and reduce the amount of repetitive work. To achieve this, enterprises must train their technical staff to accompany the robot when performing tasks, during configuration and technical programming for proper operation. Taking the latter into account, the development and creation of unconventional interfaces for interaction between humans and robots is critical, because they allow for natural control over a robot, which in turn can generate wide acceptance and massive use across a wide range of possible tasks. This paper presents the challenges in the design, implementation and testing of a hand-based interface to control two robotic arms, and the benefits of this technology at the intersection of robotics and human interaction.

Keywords—Robotic arm, Human-Robot Interaction, gesture-based commands, hand-based interface.

I. INTRODUCTION

Currently, the use, implementation and programming of robots in small-business and domestic environments have been associated with an unfavorable cost-efficiency ratio. This factor is regarded as the main obstacle to popularizing and promoting the advantages of robots among the general public. Machines that can be controlled using gestures, or that enable a natural interaction in order to perform and repeat tasks of daily life without the intervention of a professional or the requirement of technical knowledge, can be the differentiating factor that allows the massive entry of robots into small and medium businesses as well as domestic environments. Other proposed research directions include the use of gestures as guidelines for autonomous devices; in that case, however, the perception of the environment and the planning are delegated to a human [1].

This paper presents the process of design, implementation and testing, as well as the benefits, of a hand-based interface that enables controlling two robotic arms. All test iterations were designed with the aim of improving the precision of the system and reducing the discomfort that any user may experience.

II. PROBLEM OF INTEREST

The robotics industry is preparing itself for a new business opportunity made possible by new affordable technologies: the domestic market. However, the fact that electronic agents cannot be used universally prevents the industry from promoting and selling them to the average consumer. Fortunately, the challenge of producing a robot that adapts to an untested environment is no longer an issue; for example, the Roomba robot vacuum cleaner manufactured by the American company iRobot [2] has often been compared to a pet due to the low level of maintenance it requires. This example also shows how unaware users are of the equipment they acquire, making the Roomba the most remarkable success story in the industry by sales and the sole example in this market.

A proposition has been made to simply offer the public nonspecific-use robots at marginal prices in order to see what consumers would do with them. At first glance, this looks like a move from a desperate industry trying to gain market share without caring about solving a real problem; in other words, a capitalist move for a capitalist society. However, it is a deeper, well-thought-out step. After seeing the evolution of the MakerBot Replicator 3D printer, which was initially conceived without a target market and, as shown in an architectural publication [3], created for the sole purpose of designing structures but ended up having unintended uses, it is evident that the eye of the consumer has the real potential to exploit the capabilities of such a product. This can be seen as an opportunity that pushes companies to revisit one of the biggest challenges of the industry, Human-Robot Interaction (HRI). If average consumers have to find a way to use an advanced piece of electronics in their daily lives, the interaction between them must be seamless.

The implementation of new interfaces for interaction between humans and robots poses several unique challenges to the systems being developed. The problem addressed in this paper is to develop an unconventional interface to control two robotic arms, enabling the user to perform tasks near the arms' workspace or, for security reasons, from a remote workspace.
The developed system must adhere to certain restrictions besides working properly and differentiating itself. The first constraint is economic in nature and is one of the necessary requirements for a project that differentiates itself from those currently on the market: the budget of the whole device should not exceed US$500. The OWI robotic arm provides the second limitation. The budgeted hardware moves using DC motors, which are operated by pulses of current in one direction for each degree of freedom. These motors, unlike servo or stepper motors, have no way of reporting the arm's position. For this reason, the system is command-based.

III. RELATED WORK

Regarding the natural interface, the proposed project focuses on the ease of configuration and the intuitive operation of such systems (without the need for writing code). The Danish company Universal Robots [4] and Rethink Robotics [5] have proposed approaches based on this premise. These two companies focus their efforts on the ability of their robotic arms to maintain a state of neutral gravity. The operator can guide the system and point out which locations should be reached sequentially, without learning about its inner workings, which makes it an interesting contribution to assembly-line technology. The mentioned technologies describe a promising future and provide improvements for the use of robotic arms in medium-sized industries; nevertheless, they consider neither the cost-benefit limitation nor the implementation of an implicitly natural interaction.

The Leap Motion device has been used in several other HRI projects. For instance, Pomodoro is a platform that integrates the Leap Motion to control a mobile robot with the goal of rehabilitating children with reduced hand mobility in a didactic manner [6]. This work exemplifies the range of applications made possible by a natural interface between a human and a robot. It also highlights the importance of studying the factors that improve this experience and make it viable for the general public. Another sensor that has been a disruptive technology in commercial and research fields is the Microsoft Kinect [7], with which the human body is meant to be the controller. This sensor provides a high level of precision that boosts its capabilities as an input device in fields such as virtual reality, augmented reality and, clearly, HRI.

IV. PROPOSED NATURAL INTERFACE

The following section describes the materials, concepts and architecture used to build the system of two robotic arms controlled through a natural interface, with remote-control capabilities over a WEB protocol.

A. Components

The movement of the user's hands and fingers is captured in real time, at a frequency of 120 fps, through a single Leap Motion device [8]. A computer then processes this information. If the user makes a gesture or places a hand over a limit, a command is triggered and sent to one of the two Arduino boards that serve as controllers of the robotic arms (each Arduino is associated with a specific arm). Each Arduino is attached to an Adafruit Motor-Shield [9] that sends electrical signals to an OWI robotic arm, which can thus be moved according to the user's movements (Fig. 1).

Fig. 1. (Top left) Adafruit Motor-Shield, (top right) an exploded view of the Leap Motion peripheral, (bottom) an assembled OWI robotic arm.

B. Hand position tracking

A software component was developed on the Processing platform, which uses the Java language, to track the user's hand position and gestures and to determine whether the hand is the right or the left one. The Leap Motion interface provides the real-time position along the three conventional axes as well as the other required information mentioned above (Fig. 2). This information is analyzed to decide whether it activates a trigger that sends a command to a robotic arm, depending on which hand is being used. The triggers are conditional statements that fire when a hand crosses a limit.

Fig. 2. Real-time position visualizer provided by the Leap Motion API.

The Leap Motion input interface enables the interaction between the user and the robotic arms by capturing the motions and gestures made by the user's hands. The developed software receives the information about these motions and gestures in real time and then processes it according to the commands presented in the function below (Fig. 3). These commands depend on the hand being used (right or left, which the Leap Motion can detect on its own) as well as on the position of the hand along the X, Y and Z axes, with (0, 0, 0) defined as the center of the space. Additionally, hand gestures made by the user are taken into account. If any of these triggers is detected, the corresponding command is sent to the Arduino board, which is responsible for sending electrical signals through the Adafruit Motor-Shield to move a robotic arm according to the user's motions and/or gestures. All of these factors determine the first letter of the command (R for right or L for left) and a number corresponding to a selected ASCII character.

Fig. 3. Commands resulting from the processing of the information retrieved by the Leap Motion interface.

C. System architecture

This section describes the overall system architecture that was developed and the integration of its various software and hardware components. The two architectures that facilitate user interaction with the robotic arms are also explained. These architectures allow the user to interact naturally with the robotic arms, either locally or remotely.

Local Architecture

The local architecture configuration allows the user to perform tasks while having physical access to the robotic arms. The physical components of Fig. 4 are needed for this configuration.

The two robotic arms are integrated with a single Leap Motion interface, and the user controls the robots through the motions and gestures of his/her hands. The necessary adjustments were made so that the Leap Motion interface recognizes each hand; that is, the Leap Motion recognizes the movements of both hands, and these are processed so that the arms move in an equivalent way. This architecture may be simplified to allow control of a single robotic arm with one of the user's hands.

Remote Architecture

A remote communication platform for handling the robotic arms was developed, so the user can interact with the robotic arms from anywhere in the world through an Internet connection. This platform consists of a Client-Server architecture that uses the UDP communication protocol, which allows for fast communication due to the small size of the data packets (Fig. 5).

First, the connection between the client and the server is established: the user enters the IP address of the server, and the IP address of the client is also configured on the server side. The client software is responsible for capturing the information received by the interface and building datagrams, or data packets, that are sent over the preset connection to the server software, where the arms are located. The server software receives the datagrams sent by the client software, processes them, and sends the information to the Arduino Uno, which also processes it, thus enabling the arms to follow the movements executed by the user from the client software.
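The trigger logic of Section IV.B (axis limits that, when crossed, produce a hand-prefixed command) can be sketched as a small pure function in Java, the language of the host-side software. The 50 mm threshold, the command code characters, and the class and method names below are illustrative assumptions for this sketch, not the values used in the actual system.

```java
// Illustrative sketch of the Section IV.B trigger logic: the palm position,
// relative to the (0, 0, 0) center of the interaction space, is compared
// against axis limits; crossing a limit yields a command string made of an
// "R"/"L" hand prefix plus an ASCII action code. Threshold and codes are
// assumed values, not the authors' actual ones.
public class CommandMapper {
    static final double LIMIT = 50.0; // hypothetical axis limit, in mm

    // isRight: which hand the sensor reported; x, y, z: palm position.
    static String mapToCommand(boolean isRight, double x, double y, double z) {
        String prefix = isRight ? "R" : "L";
        if (y > LIMIT)  return prefix + "U"; // raise the arm
        if (y < -LIMIT) return prefix + "D"; // lower the arm
        if (x > LIMIT)  return prefix + "O"; // rotate base one way
        if (x < -LIMIT) return prefix + "I"; // rotate base the other way
        return null; // no trigger crossed: no command is sent
    }

    public static void main(String[] args) {
        System.out.println(mapToCommand(true, 0, 80, 0));   // prints "RU"
        System.out.println(mapToCommand(false, -60, 0, 0)); // prints "LI"
    }
}
```

Because the OWI arms give no position feedback, a function of this shape is the natural unit of the command-based design: the host decides purely from the hand pose which pulse command to emit, and the Arduino simply relays it.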
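The remote client-server exchange described above can be sketched with Java's standard `DatagramSocket` API. This is a minimal sketch, not the authors' implementation: the command string format, buffer size, and class name are assumptions, and real deployment would add the server-to-Arduino serial forwarding.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Minimal sketch of the UDP exchange: the client wraps a hand-tracking
// command in a small datagram; the server unpacks it and would then
// forward it to the Arduino Uno driving the arms.
public class UdpCommandSketch {

    // Client side: encode a command string and send it as one datagram.
    static void sendCommand(DatagramSocket socket, String command,
                            InetAddress serverAddr, int serverPort) throws Exception {
        byte[] payload = command.getBytes(StandardCharsets.US_ASCII);
        socket.send(new DatagramPacket(payload, payload.length, serverAddr, serverPort));
    }

    // Server side: block until a datagram arrives and return its command.
    static String receiveCommand(DatagramSocket socket) throws Exception {
        byte[] buffer = new byte[16]; // commands are only a few bytes long
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
        socket.receive(packet);
        return new String(packet.getData(), 0, packet.getLength(), StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) throws Exception {
        // Demonstrate the round trip on the loopback interface.
        try (DatagramSocket server = new DatagramSocket(0); // any free port
             DatagramSocket client = new DatagramSocket()) {
            InetAddress loopback = InetAddress.getLoopbackAddress();
            sendCommand(client, "R65", loopback, server.getLocalPort());
            System.out.println("server got: " + receiveCommand(server));
        }
    }
}
```

UDP fits this use case because each command is a tiny, self-contained packet and a lost or late command is simply superseded by the next hand pose, so the latency cost of TCP retransmission and ordering buys little here.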