Gesture Controlled Car via Image Processing:
Embedded System Design & Application

Dept. of Electronic Engineering
Sir Syed University of Engineering & Technology
Submitted by
ACKNOWLEDGEMENT
We are grateful to have completed our Embedded Systems project within the time given by our lecturer, Sir Rauf. This project could not have been completed without the effort and cooperation of our group members: Syed Zeeshan Haider, Sarosh Raees, Ali Iqrar and Osama Jamil. We also sincerely thank our Communication Systems lab lecturer, Sir Shahrukh, for his guidance and encouragement in finishing this assignment and for teaching us in this course. Finally, we would like to express our gratitude to our friends and respondents for their support and willingness to spend time with us to make this project possible.
TABLE OF CONTENTS

Acknowledgement
Abstract
1. INTRODUCTION
   1.1 Introduction
   1.2 Theoretical Background
2. PROJECT BLOCK DIAGRAM
   2.1 Block Diagram
3. METHODOLOGY
   3.1 Methodology
4. SYSTEM HARDWARE
5. SYSTEM SOFTWARE
   5.1 Processing
   5.2 Software Code
6. CONCLUSION
   6.1 Conclusion & Future Scope
REFERENCES
APPENDIX-A: Cost Analysis of the Project
ABSTRACT
A gesture controlled car is an F1-style model car that is driven by body gestures instead of a remote control. The system takes a real-time image of a person and processes it on a laptop to extract the gestures being made; these gestures can be fully customised to any person's needs. The recognised gestures are then encoded on the laptop and sent to the car using an RF transmitter; the receiver on the car picks up the transmitted commands and acts accordingly, steering left or right and driving forward or in reverse. This kind of control can be very handy because it requires no remote control and leaves the person controlling the car completely hands free.
INTRODUCTION
1.1 INTRODUCTION
Robotics is an important branch of electronics which deals with the design, construction, operation and application of robots, together with the sensory feedback, control and information processing they require. Such technologies are used in automated machines that work in manufacturing processes or in hazardous and dangerous environments in place of humans. The robotics field aims to build advanced, efficient robots that serve in various practical scenarios, whether commercial, domestic or military, and new research and design technologies in this area are growing rapidly.
Computers are now so tightly integrated with everyday life that new applications and hardware are constantly being introduced. The means of communicating with computers are at present limited to keyboards, mice, light pens, trackballs, keypads and the like. These devices have become familiar, but they inherently limit the speed and naturalness with which we interact with the computer. Gestures and voice, on the other hand, are natural means of interaction and have been attracting much attention for interactive communication with robots in recent years.
Gesture recognition has therefore gained a great deal of interest, and recent developments in this field have provided more natural and easier ways of communicating and interacting with machines. Gesture recognition technology has the potential to enhance the way users interact with machines and to provide an easy, fast, efficient and more user-friendly environment. Gestures allow the user's body to give commands to the system through simple movements, such as driving the car forward or raising and lowering an arm. The project does not require any special equipment, such as a glove or other device attached to the body, to sense the movements; instead, it is based entirely on image processing techniques. The camera reads the full-body movements, which are then processed to detect the different gestures, and this data can then be used to control devices or applications.
PROJECT BLOCK DIAGRAM
2.1 BLOCK DIAGRAM
METHODOLOGY
3.1 METHODOLOGY
We are designing a gesture controlled car which is controlled through image processing. A Kinect camera is set up with a laptop and a transmitter circuit; the Kinect camera detects the operator's body gestures, and the car acts according to the gestures it is given. The car itself consists of motors and an H-bridge driver. The Kinect camera provides the gesture data to the computer, which decodes the data, given in the form of joint angles, into serial command values. These serial values are then transmitted through the transmitter circuit; the receiver on the car picks up the transmitted signal and passes the command to the H-bridge, which drives the motors accordingly. A simplified sketch of the angle-to-command step is shown below.
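As an illustration of this decoding step, the following minimal Processing sketch maps a measured arm angle to a single-character command and writes it to the serial port. The angle thresholds, the command characters and the helper name angleToCommand are assumptions made here for illustration; only the general idea (angle in, serial character out) is taken from the project.

import processing.serial.*;

Serial myPort;    // serial link to the transmitter circuit

void setup() {
  // open the first available serial port at the baud rate used by the car's receiver (assumed)
  myPort = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // in the real sketch this angle comes from the Kinect skeleton data
  float elbowAngle = 90;
  myPort.write(angleToCommand(elbowAngle));    // send the command over the RF link
}

// Hypothetical mapping from an elbow angle (in degrees) to a drive command character.
char angleToCommand(float angle) {
  if (angle <= 45)  return '3';    // arm folded      -> one command
  if (angle <= 150) return '4';    // arm half raised -> another command
  return '0';                      // otherwise: stop
}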
SYSTEM HARDWARE
SYSTEM SOFTWARE
The software used in this project is the Processing IDE and the Arduino IDE. The drivers installed for this project are OpenNI, SimpleOpenNI and the Kinect SDK.
5.1 PROCESSING
Processing is an open-source graphics library and integrated development environment (IDE) built for the electronic arts, new media art and visual design communities, with the purpose of teaching non-programmers the fundamentals of computer programming in a visual context.
Processing uses the Java language, with additional simplifications such as extra classes and aliased mathematical functions and operations. It also provides a graphical user interface that simplifies the compile-and-run stage.
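For readers new to the environment: a Processing program (a "sketch") is built around a setup() function that runs once and a draw() function that runs repeatedly, many times per second. The short example below is not part of the project; it only illustrates this structure by drawing a circle that follows the mouse.

void setup() {
  size(640, 480);            // create the sketch window once
}

void draw() {
  background(0);             // clear each frame
  fill(255, 0, 0);
  ellipse(mouseX, mouseY, 40, 40);    // red circle that follows the mouse
}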
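The remainder of this chapter reproduces the main routines of the project's Processing sketch. The listing picks up inside the draw() function; the opening of such a sketch (library imports, Kinect initialisation and the serial link to the transmitter) would look roughly like the minimal version below, which assumes the SimpleOpenNI 1.96 API and leaves the serial port index and baud rate as assumptions.

import SimpleOpenNI.*;
import processing.serial.*;

SimpleOpenNI kinect;     // interface to the Kinect sensor
Serial myPort;           // serial link to the transmitter circuit

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();                                // depth image
  kinect.enableUser();                                 // user / skeleton tracking
  myPort = new Serial(this, Serial.list()[0], 9600);   // assumed port and baud rate
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);
  // skeleton drawing and gesture measurement continue here, as in the listing that follows
}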
  // If we detect one user, we have to draw it
  if( kinect.isTrackingSkeleton(userId)){
    // Draw the skeleton
    drawSkeleton(userId);
    // Draw the arm angles
    ArmsAngle(userId);
    // Draw the user's centre of mass
    MassUser(userId);
    // Draw the leg angles
    LegsAngle(userId);
  }
  }
}

void drawSkeleton(int userId) {
  stroke(0);
  strokeWeight(5);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_LEFT_HIP);

  // Mark the tracked joints with small red circles
  noStroke();
  fill(255,0,0);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_ELBOW);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_HIP);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_FOOT);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_HAND);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_HAND);
}

// Draw a joint only if it is tracked with enough confidence
void drawJoint(int userId, int jointID) {
  PVector joint = new PVector();
  float confidence = kinect.getJointPositionSkeleton(userId, jointID, joint);
  if(confidence < 0.5){
    return;
  }
  PVector convertedJoint = new PVector();
  kinect.convertRealWorldToProjective(joint, convertedJoint);
  ellipse(convertedJoint.x, convertedJoint.y, 5, 5);
}

void onNewUser(SimpleOpenNI curContext, int userId) {
  kinect.startTrackingSkeleton(userId);
}

void onLostUser(SimpleOpenNI curContext, int userId) {
  println("onLostUser - userId: " + userId);
}

// Draw the user's centre of mass and label it with the user id
void MassUser(int userId) {
  if(kinect.getCoM(userId, com)){
    kinect.convertRealWorldToProjective(com, com2d);
    beginShape(LINES);
    vertex(com2d.x, com2d.y + 5);
    vertex(com2d.x - 5, com2d.y);
    vertex(com2d.x + 5, com2d.y);
    endShape();
    fill(0,255,100);
    text(Integer.toString(userId), com2d.x, com2d.y);
  }
}

public void ArmsAngle(int userId){
  // get the positions of the three joints of our right arm
  PVector rightHand = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, leftElbow);
  PVector leftShoulder = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, leftShoulder);
  // we need the left hip to orient the shoulder angle
  PVector leftHip = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HIP, leftHip);

  // reduce our joint vectors to two dimensions for the right side
  PVector rightHand2D = new PVector(rightHand.x, rightHand.y);
  PVector leftShoulder2D = new PVector(leftShoulder.x, leftShoulder.y);
  PVector leftHip2D = new PVector(leftHip.x, leftHip.y);

  // calculate the left-arm angles against the torso and upper-arm axes
  LeftshoulderAngle = angleOf(leftElbow2D, leftShoulder2D, torsoLOrientation);
  LeftelbowAngle = angleOf(leftHand2D, leftElbow2D, upperArmLOrientation);

  // show the angles on the screen for debugging
  fill(255,0,0);
  scale(1);
  text("Left shoulder: " + int(LeftshoulderAngle) + "\n" + " Left elbow: " + int(LeftelbowAngle), 20, 55);

  // map the left elbow angle to a serial command character
  if( LeftelbowAngle <= 45){
    myPort.write('3');    // send a '3'
    println('3');
  }
  else{
    if( LeftelbowAngle >= 45 && LeftelbowAngle <= 150){
      myPort.write('4');    // send a '4'
      println('4');
    }
  }
  // Arduino serial for arms
  ArduinoSerialArms();
}

void LegsAngle(int userId) {
  // get the positions of the three joints of our right leg
  PVector rightFoot2D = new PVector(rightFoot.x, rightFoot.y);
  PVector rightKnee2D = new PVector(rightKnee.x, rightKnee.y);
  RightLegAngle = angleOf(rightFoot2D, rightKnee2D, RightLegOrientation);

  // get the positions of the three joints of our left leg
  PVector leftFoot = new PVector();

  // show the angles on the screen for debugging
  fill(255,0,0);
  scale(1);

  // Arduino serial for legs
  //ArduinoSerialLegs();
}
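Both ArmsAngle() and LegsAngle() rely on an angleOf() helper that is not shown in the excerpt above. A typical definition, and the one this code appears to assume, measures the angle between the limb joining two joints and a reference axis:

// Angle, in degrees, between the limb from joint 'one' to joint 'two' and a reference axis.
float angleOf(PVector one, PVector two, PVector axis) {
  PVector limb = PVector.sub(two, one);
  return degrees(PVector.angleBetween(limb, axis));
}

The reference axes used above (for example torsoLOrientation and upperArmLOrientation) would typically themselves be computed by subtracting pairs of 2-D joint positions, such as shoulder minus hip or elbow minus shoulder. The listing then continues with the Arduino sketch running on the transmitter circuit, which reads the command characters from the serial port and drives the data lines of the HT12E encoder IC: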
int out3 = 10;    //output for HT12E IC

pinMode(out1, OUTPUT);
pinMode(out2, OUTPUT);
Serial.begin(9600);
}

if (Serial.available() > 0) {

  digitalWrite(out1, HIGH);
  digitalWrite(out2, LOW);
  digitalWrite(out4, LOW);
}
else {    //otherwise
  {
    digitalWrite(out1, LOW);
    digitalWrite(out2, HIGH);
    digitalWrite(out4, HIGH);
  }
  else {
    {
      digitalWrite(out1, HIGH);
      digitalWrite(out2, LOW);
      digitalWrite(out3, LOW);
      digitalWrite(out4, LOW);
    }
    else {
      {
        digitalWrite(out1, LOW);
        digitalWrite(out2, LOW);
        digitalWrite(out3, HIGH);
      }
      else {
        {
          digitalWrite(out1, LOW);
          digitalWrite(out2, LOW);
          digitalWrite(out3, LOW);
        }
      }
    }
  }
}
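The excerpt above preserves the output patterns written to the HT12E data lines but not the tests that choose between them. A minimal, self-contained transmitter sketch in the same spirit might look as follows; the pin numbers, the helper setOutputs() and the mapping from the command characters '1' to '4' onto particular patterns are assumptions made for illustration, not the project's actual assignments.

int out1 = 8;     // HT12E data lines (pin numbers assumed)
int out2 = 9;
int out3 = 10;
int out4 = 11;

void setup() {
  pinMode(out1, OUTPUT);
  pinMode(out2, OUTPUT);
  pinMode(out3, OUTPUT);
  pinMode(out4, OUTPUT);
  Serial.begin(9600);
}

// helper: drive all four encoder data lines at once
void setOutputs(int a, int b, int c, int d) {
  digitalWrite(out1, a);
  digitalWrite(out2, b);
  digitalWrite(out3, c);
  digitalWrite(out4, d);
}

void loop() {
  if (Serial.available() > 0) {
    char val = Serial.read();    // command character from the Processing sketch
    if (val == '1')      setOutputs(HIGH, LOW,  LOW,  LOW);   // e.g. forward
    else if (val == '2') setOutputs(LOW,  HIGH, LOW,  HIGH);  // e.g. reverse
    else if (val == '3') setOutputs(HIGH, LOW,  LOW,  HIGH);  // e.g. left
    else if (val == '4') setOutputs(LOW,  LOW,  HIGH, LOW);   // e.g. right
    else                 setOutputs(LOW,  LOW,  LOW,  LOW);   // anything else: stop
  }
}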
CONCLUSION
6.1 CONCLUSION & FUTURE SCOPE
The gesture controlled car has been designed in such a way that it can cater to the needs of the bomb disposal squad, the military, the police, and also of personnel who handle radioactive materials. It has countless applications and can be used in many different environments and scenarios. For instance, in one setting it can be used by the bomb disposal squad, while in another it can be used for handling mines. Yet another application could be to provide up-to-date information in a hostage situation.
REFERENCES
Books
[2] Enrique Ramos Melgar & Ciriaco Castro Diez, Arduino and Kinect Projects: Design, Build, Blow Their Minds, Apress, 2012.
APPENDIX-A
COST ANALYSIS OF THE PROJECT
S.No Component Name QTY Cost