
International Journal For Technological Research In Engineering

Volume 1, Issue 8, April-2014

ISSN (Online): 2347 - 4718

REAL-TIME ROBOT CAR CONTROL USING HAND SIGN RECOGNITION
Ankit Multanmal Oswal1, Gagan Shivarama Shetty2, Mridul Anil Hiwarkar3
Sanjeev Kumar Malik4, Mrs. Hemangi Shinde5
1,2,3,4Students, 5Assistant Professor, Department of Electronics and Telecommunication,
AISSMS Institute of Information Technology,
Pune, India.
Abstract: This paper presents a novel algorithm to recognize a set of static hand gestures for Human-Computer Interaction (HCI) and to control a robotic car wirelessly, after successful recognition, using an X-Bee module. Through this method, the user can control or navigate the robot by using gestures of his/her palm. The command signals are generated from these gestures using image processing. These signals are then passed to the robot to navigate it in the specified directions. In short, we have implemented a system through which the user can give commands to a wireless robot using gestures. The gesture recognition application has been created in the MATLAB environment on the Windows 7 operating system. The system has been tested and verified under both incandescent and fluorescent lighting conditions. The experimental results are very encouraging, as the system produces real-time responses and highly accurate recognition of various gestures.
Index Terms: Hand Gesture Recognition, Human Robot
Interaction, Image Processing, MATLAB, Robot Control,
Webcam, Wireless, X-Bee.
I. INTRODUCTION
Recently, the demand for indoor robots has increased tremendously, and with it the opportunities for many people to operate robots. However, for many people it is often difficult to operate a robot using conventional methods such as a remote control ([15], [16], and [17]). The variety of physical shapes and functional commands that each remote control features also raises numerous problems: the difficulty of locating the required remote control, confusion with the button layout, the replacement issue and so on. A simple definition of the term gesture is a suggestive movement of bodily parts, such as the fingers or arms, that conveys some information ([2], [8], [17]). Waving the hand, for example, is a gesture that suggests goodbye. Even though gestures can originate from any bodily movement, they generally originate from movements of the face or hands. Gesture is one of the most natural and expressive ways of communication between human and computer in a real system ([3], [10], [13], and [17]). We naturally use various gestures to express our intentions in everyday life. Hand gestures are one of the most important methods of non-verbal communication for human beings and have been among the most common and natural communication media. Hand gesture recognition research has gained a lot of attention because of its applications in interactive human-machine interfaces and virtual environments ([3], [10], and [19]). We therefore propose a robot operation system using hand gesture recognition. Based on one unified set of hand gestures, this system interprets the user's hand gestures as pre-defined commands to control the robotic car wirelessly using an X-Bee module. Unlike conventional methods for hand gesture recognition that make use of markers, special gloves or other devices, this method does not require any additional devices and keeps the user comfortable; in glove-based systems the user needs to wear burdensome accessories, which are generally connected to a computer ([4], [7], [14], [18]). The proposed bare-handed technique uses only 2D video input. Our hand gesture recognition system was run on a 2.33 GHz Intel Core 2 Duo CPU with 2 GB RAM on the Windows 7 platform using MATLAB R2010a.
II. PROPOSED WORK
The robot control system includes four parts, as shown in Figure 1: a webcam connected to the laptop computer; a gesture recognition system running on the laptop computer; a robotic car and its robot controller; and a pair of wireless communication modules connected to the gesture recognition system and the robot controller respectively. The webcam is used to obtain image data of the human palm and fingers. The image or video acquired as input may be noisy, or the surroundings may be recognized as the hand region, which reduces performance. The acquired data is therefore subjected to enhancement and processed further to make it fit for comparison with the gestures stored in the database. Then the data are processed to recognize the gesture. Each
gesture corresponds to a different robot control command. Then the X-Bee wireless module is used to send these different robot control commands to the robot controller. Accordingly, the robotic car acts according to the different human hand gestures, and thus human-robot interaction is achieved. The gesture recognition system is developed with MATLAB.

Fig. 1 Proposed System Framework
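As an illustration of how these parts could fit together, the following minimal MATLAB sketch (not the authors' code) grabs frames from the webcam, hands each frame to a gesture-recognition routine, and forwards the resulting command code over the serial port attached to the X-Bee module; the helper function recognizeGesture and the port name 'COM4' are assumptions.

% Minimal top-level loop for the framework in Fig. 1 (a sketch, not the
% authors' code). Assumes the Image Acquisition Toolbox ('winvideo'
% adaptor) and a hypothetical recognizeGesture() helper.
vid = videoinput('winvideo', 1);        % webcam attached to the laptop
s   = serial('COM4', 'BaudRate', 9600); % X-Bee module on an assumed COM port
fopen(s);
for k = 1:1000                          % bounded loop instead of an endless one
    frame = getsnapshot(vid);           % acquire one frame
    code  = recognizeGesture(frame);    % e.g. '#11' for Forward, '' if no match
    if ~isempty(code)
        fprintf(s, '%s', code);         % robot controller decodes and acts
    end
    pause(0.2);                         % crude pacing between frames
end
fclose(s); delete(s); clear s;
delete(vid); clear vid;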
III. SYSTEM COMPONENTS
The different System Components are Camera Unit, Gesture
Recognition Unit, Wireless Communication Unit, and Robot
Car Unit. We will take a look at each unit in turn.

A. Camera Unit
A 6-megapixel INTEX USB 2.0 webcam is mounted on a stand that has a perfectly black platform. The user has to perform the different gestures on this platform wearing a white glove on his/her hand.

Fig. 2 Camera Unit

Skin tone and texture change from person to person. This is the only reason the user is supposed to wear a white glove, so that the same database can be used by different users. However, the user is free to create his/her own database at any time and any number of times. Use of a white glove while performing a gesture also enhances the discrimination capability of the gesture recognition unit.
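A minimal MATLAB sketch (not the authors' code) of this acquisition step is given below; it assumes the Image Acquisition Toolbox with the 'winvideo' adaptor on Windows, and the device ID, format string and output file name are illustrative assumptions.

% Sketch: connect to the USB webcam, let the user position the gloved
% hand over the black platform, then grab one frame for recognition.
vid = videoinput('winvideo', 1, 'RGB24_640x480');  % adaptor/format assumed
set(vid, 'ReturnedColorspace', 'rgb');
preview(vid);                    % live view while the gesture is posed
pause(2);                        % give the user time to position the hand
frame = getsnapshot(vid);        % capture the gesture frame
imwrite(frame, 'gesture_frame.png');
closepreview(vid);
delete(vid); clear vid;          % release the camera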
B. Gesture Recognition Unit
The Gesture Recognition Unit comprises a MATLAB program running on a laptop/computer. A gesture recognition system is proposed and developed which can effectively recognize hand gestures with little computation but high accuracy ([1], [5], [11], [12], [14], [20], [25]). The proposed methodology for gesture recognition involves acquisition of live video from a camera for gesture identification; frames are acquired from the live video stream whenever the user desires to capture a gesture. The overall technique for acquiring hand gestures, recognizing them accurately and sending appropriate commands wirelessly to the robot consists of four subparts:
Image acquisition to get hand gestures.
Enhancement of the acquired gesture image.
Determining the gesture by correlation with the pre-stored database gestures.
Generation of instructions corresponding to the matched gesture, for the specified robotic action, and sending them over the serial port.
The video obtained through the webcam is in the RGB color model, and it is assumed that a black background is used while capturing the video. Thus the hand region is represented in white and the background region in black. From the corpus of gestures, the specific gesture of interest can be identified, and on that basis the specific command for execution of the action can be given to the robotic system.
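A minimal MATLAB sketch (not the authors' code) of the enhancement and matching subparts is shown below: the captured frame is converted to grey, smoothed and morphologically cleaned using Image Processing Toolbox functions, and a simple threshold-based feature is compared against the pre-stored database array. The names dbThresh, tol and the file gestureDB.mat are assumptions (a sketch of how that file could be built follows Fig. 3).

% Sketch of enhancement and matching against the pre-stored database.
img  = imread('gesture_frame.png');
grey = rgb2gray(img);                        % grey-scale conversion
grey = medfilt2(grey, [3 3]);                % smoothing
bw   = im2bw(grey, graythresh(grey));        % white glove on black background
bw   = imopen(bw, strel('disk', 3));         % morphological enhancement
thisThresh = sum(bw(:)) / numel(bw);         % scalar feature for this gesture

load('gestureDB.mat', 'dbThresh');           % stored feature values (assumed file)
tol = 0.02;                                  % matching tolerance (assumed)
[err, idx] = min(abs(dbThresh - thisThresh));
if err < tol
    fprintf('Matched gesture %d\n', idx);    % select the instruction set for it
else
    fprintf('No matching gesture found\n');
end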

Fig. 3 Flow diagram for robot control using gesture. The flow is: convert the pictures of the database into grey images; carry out smoothing, normalization and morphological operations to enhance them; calculate the threshold of each image and store the values in an array; connect the video input to MATLAB; wait for the user to make the corresponding gesture; take a snapshot from the video input and convert it into a grey image; carry out smoothing, normalization and morphological operations to enhance it; calculate its threshold; compare the threshold value with the stored values of the array; if a match is found, extract the instruction set corresponding to the matched gesture and pass the instructions to the robot through the serial port for execution of the corresponding action.
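The database-creation half of this flow could look like the following minimal MATLAB sketch (not the authors' code); the folder database/ and the file gestureDB.mat are assumptions.

% Sketch: build the array of threshold values from the stored gesture pictures.
files = dir(fullfile('database', '*.png'));
dbThresh = zeros(1, numel(files));
for k = 1:numel(files)
    img  = imread(fullfile('database', files(k).name));
    grey = medfilt2(rgb2gray(img), [3 3]);                          % grey + smoothing
    bw   = imopen(im2bw(grey, graythresh(grey)), strel('disk', 3)); % enhancement
    dbThresh(k) = sum(bw(:)) / numel(bw);                           % stored feature value
end
save('gestureDB.mat', 'dbThresh');   % database consulted at recognition time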

Fig. 4 Flow diagram for creating the database. The flow is: connect the video input to MATLAB; take snapshots from the video input and store them in the root directory when directed by the user; close all opened inputs and files; clear all allocated memory.

C. Wireless Communication Unit
After gesture recognition, each gesture is encoded and the code is sent to the robot controller wirelessly using the X-Bee S1 module ([22], [23]).

Fig. 5 Wireless Communication Unit (X-Bee S1 module)

D. Robot Car Unit
The Robotic Car Unit comprises a microcontroller (PIC16F877A) that takes decisions depending on the received code ([22], [23], [24]). The different microcontroller interfaces implemented in the Robotic Car Unit are shown in Figure 6. The PIC16F877A works on 5 V while the X-Bee module works on 3.3 V, so we have designed a regulated power supply for this purpose. DC motors are used to physically drive the application as per the received code. The DC motors work on 12 V. To drive a DC motor we need a DC motor driver, the L293D, which is capable of driving two DC motors at a time. To protect the DC motors from the back EMF generated while changing the direction of rotation, the DC motor driver has internal protection; we have also provided back-EMF protection by connecting a four-diode configuration across each DC motor. An LCD is used in the project to visualize the output of the application. We have used a 16x2 LCD, i.e. 16 columns and 2 rows. The LCD can also be used to check the output of the different modules interfaced with the microcontroller. Thus the LCD plays a vital role in viewing the output and in debugging the system module-wise in case of system failure, in order to rectify the problem.

Fig. 6 Block Diagram for Robot Car Unit

Fig. 7 Robot Car Unit

IV. ROBOT CONTROL WITH HAND GESTURE
Once a hand gesture is recognized, an appropriate command is sent to the robot. After the robot receives a command, it performs the pre-defined action and keeps doing so until a new command arrives. Movement commands are written as functions in the robot-specific language. We define a total of seven gestures to direct the operation of the robot. The operations include the following motions: Straight, Reverse, Left, Right, Up, Down and Stop. These gestures with their assigned codes and functions are listed in TABLE I.
TABLE I
USED GESTURES AND THEIR MEANINGS

Sr. No. | Hand Gesture | Code | Description
1.      | (image)      | #11  | Forward
2.      | (image)      | #21  | Reverse
3.      | (image)      | #31  | Left
4.      | (image)      | #41  | Right
5.      | (image)      | #51  | Up
6.      | (image)      | #61  | Down
7.      | (image)      | #71  | Stop
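A minimal MATLAB sketch (not the authors' code) of transmitting the code assigned to a matched gesture over the serial link to the X-Bee module follows; the port name 'COM4' and baud rate 9600 are assumptions.

% Sketch: write the Table I code for the matched gesture to the X-Bee serial port.
codes = {'#11', '#21', '#31', '#41', '#51', '#61', '#71'};  % Forward .. Stop
idx   = 1;                                  % index returned by the matching step
s = serial('COM4', 'BaudRate', 9600);
fopen(s);
fprintf(s, '%s', codes{idx});               % robot controller decodes this code
fclose(s); delete(s); clear s;              % release the port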

V. GESTURE USES
Gesture recognition is useful for processing information from humans that is not conveyed through speech or typing. There are various types of gestures which can be identified by computers.
Sign language recognition: Just as speech recognition can transcribe speech to text, certain types of gesture recognition software can transcribe the symbols represented through sign language into text ([9], [24]).
Socially assistive robotics: By using proper sensors (accelerometers and gyroscopes) worn on the body of a patient and by reading the values from those sensors, robots can assist in patient rehabilitation. The best example is stroke rehabilitation.
Directional indication through pointing: Pointing has a very specific purpose in our society: to reference an object or location based on its position relative to ourselves. The use of gesture recognition to determine where a person is pointing is useful for identifying the context of statements or instructions. This application is of particular interest in the field of robotics.
Control through facial gestures: Controlling a computer through facial gestures is a useful application of gesture recognition for users who may not physically be able to use a mouse or keyboard. Eye tracking in particular may be of use for controlling cursor motion or focusing on elements of a display.
Alternative computer interfaces: Forgoing the traditional keyboard and mouse setup to interact with a computer, strong gesture recognition could allow users to accomplish frequent or common tasks using hand or face gestures to a camera.
Immersive game technology: Gestures can be used to control interactions within video games to try to make the game player's experience more interactive or immersive.
Virtual controllers: For systems where the act of finding or acquiring a physical controller could require too much time, gestures can be used as an alternative control mechanism. Controlling secondary devices in a car or controlling a television set are examples of such usage.
Affective computing: Gesture recognition is used in the process of identifying emotional expression through computer systems.
Remote control: Through the use of gesture recognition, remote control "with the wave of a hand" of various devices is possible. The signal must indicate not only the desired response but also which device is to be controlled.

VI. APPLICATIONS AND FUTURE WORK
Controlling a robot in real time through hand gestures is a novel approach and its applications are myriad. The proliferation of service robots among domestic users and industries in the upcoming years will require such methods extensively. The approach has huge potential once it is further optimized, since its time complexity is currently high, with the help of hardware having better specifications. Use of a more efficient wireless communication technique and a camera mounted on the robot unit would improve the performance of the system to a great extent and can be incorporated in future work.

VII. CONCLUSIONS
A low-cost computer vision system that can be executed on a common PC equipped with a low-power USB webcam was one of the main objectives of our work, and it has been implemented successfully. We have experimented with around 30 hand gesture images and achieved a high average precision; the best classification rate of 97% was obtained under different lighting conditions. The drawback of this method is that the hand should be placed properly with respect to the webcam so that the entire hand region is captured; if the hand is not placed properly, the gesture is not recognized correctly. Gestures in this method involve only one hand, which limits the vocabulary compared with gestures made using both hands.
VIII. ACKNOWLEDGMENT
We would like to thank Prof. H. D. Shinde, for her valuable
guidance and support during the completion of this project.
REFERENCES
[1] R. C. Gonzalez and R. E. Woods, Digital Image Processing, Prentice Hall, Inc., New Jersey, 2002.
[2] J. Davis and M. Shah, "Visual gesture recognition", IEE Proc. Vision, Image and Signal Processing, 141(2):101-106, 1994.
[3] E. Sanchez-Nielsen, L. Anton-Canalis and M. Hernandez-Tejera, "Hand gesture recognition for human machine interaction", Journal of WSCG, Vol. 12, No. 1-3, 2003.
[4] Xiaoming Yin and Xing Zhu, "Hand pose recognition in gesture based human-robot interaction", Proceedings of the IEEE Conference on Industrial Electronics and Applications, Singapore, May 2006.
[5] Asanterabi Malima, Erol Özgür and Müjdat Çetin, "A fast algorithm for vision-based hand gesture recognition for robot control", 14th IEEE Conference on Signal Processing and Communications Applications, Antalya, Turkey, April 2006.
[6] R. R. Igorevich, Pusik Park, Dugki Min, Yunjung Park, Jongchan Choi and Eunmi Choi, "Hand gesture recognition algorithm based on grayscale histogram of the image", 4th International Conference on Application of Information and Communication Technologies (AICT), 2010.
[7] Amornched Jinda-apiraksa, Warong Pongstiensak and Toshiaki Kondo, "A simple shape-based approach to hand gesture recognition", Proceedings of the IEEE International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Pathumthani, Thailand, pp. 851-855, May 2010.
[8] A. F. Bobick and A. D. Wilson, "A state-based technique for the summarization and recognition of gesture", Proceedings of the Fifth International Conference on Computer Vision, 1995.
[9] J. Davis and M. Shah, "Visual gesture recognition", IEE Proceedings - Vision, Image and Signal Processing, 141(2), 1994.
[10] R. H. Liang and M. Ouhyoung, "A real-time continuous gesture recognition system for sign language", Proceedings of the IEEE Second International Conference on Automatic Face and Gesture Recognition, 1998.
[11] Y. Cui and J. J. Weng, "Hand sign recognition from intensity image sequences with complex backgrounds", Proceedings of the IEEE Second International Conference on Automatic Face and Gesture Recognition, 1996.
[12] N. H. Dardas and E. M. Petriu, "Hand gesture detection and recognition using principal component analysis", Proceedings of the IEEE International Conference on Computational Intelligence for Measurement Systems and Applications, Tianjin, September 19-21, 2011.
[13] Yohei Sato and Kokichi Sugihara, "An approach to real-time gesture recognition for human-PC interaction", Technical Report of IEICE, NLC2004-112, 2005.
[14] G. R. S. Murthy and R. S. Jadon, "A review of vision based hand gestures recognition", International Journal of Information Technology and Knowledge Management, Vol. 2, No. 2, pp. 405-410, 2009.
[15] S. Mitra and T. Acharya, "Gesture recognition: A survey", IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, Vol. 37, No. 3, pp. 311-324, 2007, doi: 10.1109/TSMCC.2007.893280.
[16] Rafiqul Zaman Khan and Noor Adnan Ibraheem, "Survey on gesture recognition for hand image postures", International Journal of Computer and Information Science, Vol. 5, No. 3, pp. 110-121, 2012, doi: 10.5539/cis.v5n3p110.
[17] P. Garg, N. Aggarwal and S. Sofat, "Vision based hand gesture recognition", World Academy of Science, Engineering and Technology, Vol. 49, pp. 972-977, 2009.
[18] Luigi Lamberti and Francesco Camastra, "Real-time hand gesture recognition using a color glove", 16th International Conference on Image Analysis and Processing: Part I (ICIAP'11), Springer, pp. 365-373, 2011.
[19] Chan Wah Ng and Surendra Ranganath, "Real-time gesture recognition system and application", Image and Vision Computing, (20): 993-1007, 2002.
[20] A. Tomita and R. J. Ishii, "Hand shape extraction from a sequence of digitized grayscale images", Proceedings of the 20th International Conference on Industrial Electronics, Control and Instrumentation (IECON '94), Vol. 3, pp. 1925-1930, Bologna, Italy, 1994.
[21] Sushmita Mitra and Tinku Acharya, "Gesture recognition: A survey", IEEE Transactions on Systems, Man and Cybernetics - Part C: Applications and Reviews, Vol. 37(3), pp. 56-68, May 2007.
[22] R. W. Jasutkar and Shubhangi J. Moon, "A real time hand gesture recognition technique by using embedded device", International Journal of Advanced Engineering Sciences and Technologies, Vol. 2, Issue 1, pp. 043-046, May 2005.
[23] https://github.com/zk00006/OpenTLD
[24] http://touchless.codeplex.com/releases/view/17986
[25] Wikipedia, the free encyclopedia: gesture recognition based on MATLAB simulation.
