
Gesture Controlled Car via Image Processing
Embedded System Design & Application

Submitted by

Syed Zeeshan Haider 2016-EE-095


Sarosh Raees 2016-EE-105
Ali Iqrar 2016-EE-113
Osama Jamil 2016-EE-139

Semester Project Report

Department of Electronic Engineering


Sir Syed University of Engineering & Technology, Karachi

ACKNOWLEDGEMENT
We are grateful that we managed to complete our Embedded System project within the time given by our lecturer, Sir Rauf. This project could not have been completed without the effort and cooperation of our group members: Syed Zeeshan Haider, Sarosh Raees, Ali Iqrar & Osama Jamil. We also sincerely thank our Communication Systems lab lecturer, Sir Shahrukh, for his guidance and encouragement in finishing this assignment and for teaching us in this course. Finally, we would like to express our gratitude to our friends and respondents for their support and willingness to spend time with us to help make this project possible.

TABLE OF CONTENTS

Acknowledgment
Table of Contents
Abstract

1. INTRODUCTION
   1.1 Introduction
   1.2 Theoretical Background

2. PROJECT BLOCK DIAGRAM
   2.1 Block Diagram
   2.2 Description of Blocks

3. METHODOLOGY
   3.1 Methodology

4. SYSTEM HARDWARE
   4.1 List of Components

5. SYSTEM SOFTWARE
   5.1 Processing
   5.2 Software Code

6. CONCLUSION & FUTURE RECOMMENDATION

REFERENCES

APPENDIX A: Cost Analysis of the Project

APPENDIX B: Datasheets of Major Components

ABSTRACT
A gesture controlled car is an F1-style model car that can be controlled with body gestures instead of a remote control. The project takes a real-time image of a person and processes it on a laptop to extract the gestures the person is making; these gestures can be fully customized to any person's needs. The decoded gestures are then sent to the car using an RF transmitter; the receiver captures the given gestures and acts accordingly, steering the car left, right, forward, or in reverse. This kind of control can be very handy because it requires no remote control, leaving the person controlling the car completely hands-free.

INTRODUCTION
1.1 INTRODUCTION
Robotics is an important branch of electronics which deals with the operation, design, construction and application of robots, together with the sensory feedback, control and information processing they require. Such technologies are used in automated machines that work in manufacturing processes or in hazardous and dangerous environments in place of humans. The robotics field aims to build advanced, efficient robots that serve in various practical scenarios, whether commercial, domestic or military. New research and design technologies are growing rapidly.

Computers are now so tightly integrated with everyday life that new applications and hardware are constantly being introduced. The means of communicating with computers are currently limited to keyboards, mice, light pens, trackballs, keypads, and so on. These devices have grown familiar but inherently limit the speed and naturalness with which we interact with the computer. Gestures and voice, on the other hand, are natural interface means that have attracted much attention in recent years for interactive communication with robots.

1.2 THEORETICAL BACKGROUND


The most important thing before starting a project is to clearly understand the topic we want to work on. By doing a literature review we can gain the knowledge needed to make sure we fully understand the problem and can complete the project. The topics related to this project are motors, gestures, the Kinect camera, and the chassis.

In recent years, gesture recognition has gained considerable interest, and developments in this field have provided more natural and easier ways of communicating and interacting with machines. Gesture recognition technology has the potential to enhance the way users interact with machines and to provide an easy, fast, efficient and more user-friendly environment. Gestures allow the user's body to give commands to the system through simple body movements, such as raising or lowering an arm to drive the car forward. The project does not require any special equipment, such as a glove or other device attached to the body, to sense the movements. Instead, it is based entirely on image processing techniques. The camera reads full-body movements, which are then further processed to detect different gestures; this data can then be used to control devices or applications.

PROJECT BLOCK DIAGRAM
2.1 BLOCK DIAGRAM

Figure 2.1: Block diagram of the gesture controlled car

2.2 DESCRIPTION OF BLOCKS


The user gives commands to the Kinect camera, which captures the input and sends it to the laptop for decoding. An Arduino connected to the laptop transmits the resulting control signals to the car. The car carries a receiver circuit, a power supply, an H-bridge, and motors. The laptop displays the output of the Kinect camera.

METHODOLOGY
3.1 METHODOLOGY
We are designing a gesture controlled car that is controlled through image processing. The Kinect camera is placed with a laptop and a transmitter circuit, and detects the user's body gestures; the car acts according to the given gestures. The car consists of motors and an H-bridge. The Kinect camera provides the necessary gesture data to the computer, which decodes the gesture data, given in the form of joint angles, into serial values. The serial values are then transmitted using the transmitter circuit; the receiver captures the transmitted signal and passes the command to the H-bridge, which drives the motors accordingly.
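
To make the decoding step concrete, the short excerpt below (taken from the full listing in Section 5.2.1) shows how a joint angle is obtained from two joint positions and a reference axis; the thresholds in the comments are the ones actually used for the right elbow in that listing.

float angleOf(PVector one, PVector two, PVector axis) {
  PVector limb = PVector.sub(two, one);             // vector along the limb
  return degrees(PVector.angleBetween(limb, axis)); // angle against the reference axis
}

// From Section 5.2.1: the right elbow angle, measured against the upper arm,
//   RightelbowAngle = angleOf(rightHand2D, rightElbow2D, upperArmOrientation);
// is then mapped to serial values, e.g. an angle <= 35 degrees sends '1'
// and an angle between 150 and 170 degrees sends '2'.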
SYSTEM HARDWARE

Figure 4.1: Kinect camera

4.1 LIST OF COMPONENTS


 Laptop
 Kinect camera
 F1 chassis
 Motors
 nRF24L01 transmitter module
 nRF24L01 receiver module
 LiPo battery
 H-bridge

SYSTEM SOFTWARE
The software used in this project is the Processing IDE and the Arduino IDE. The drivers installed for this project are OpenNI, SimpleOpenNI, and the Kinect SDK.

5.1 PROCESSING
Processing is an open-source graphics library and integrated development environment (IDE) built for the electronic arts, new media art, and visual design communities, with the purpose of teaching non-programmers the fundamentals of computer programming in a visual context.

Processing uses the Java language, with additional simplifications such as extra classes and aliased mathematical functions and operations. It also provides a graphical user interface that simplifies the compile-and-run stage.
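
As a brief illustration of that idiom, the following is a minimal, self-contained Processing sketch (not part of the project code): setup() runs once to create the window, and draw() runs repeatedly like a render loop, which is the same structure the gesture sketch in Section 5.2.1 follows.

void setup() {
  size(400, 300);                  // create the window once
}

void draw() {
  background(200);                 // clear each frame
  fill(255, 0, 0);
  ellipse(mouseX, mouseY, 20, 20); // draw a circle that follows the cursor
}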

5.2 SOFTWARE CODE

5.2.1 PROCESSING IDE CODE


import SimpleOpenNI.*;
import processing.serial.*;

// Generate a SimpleOpenNI object
SimpleOpenNI kinect;

// Create object from Serial class
Serial myPort;

// Vectors used to calculate the center of mass
// (declarations restored; they were lost in the extracted listing)
PVector com = new PVector();
PVector com2d = new PVector();

// Arm angles
float LeftshoulderAngle = 0;
float LeftelbowAngle = 0;
float RightshoulderAngle = 0;
float RightelbowAngle = 0;

// Leg angles
float RightLegAngle = 0;
float LeftLegAngle = 0;

void settings() {
  size(500, 420);
}

void setup() {
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  //kinect.enableIR();
  kinect.enableUser(); // because of the version this change
  //size(640, 480);
  fill(255, 0, 0);
  //size(kinect.depthWidth()+kinect.irWidth(), kinect.depthHeight());
  kinect.setMirror(false);

  // Open the serial port for Arduino
  String portName = Serial.list()[0]; // change the 0 to a 1 or 2 etc. to match your port
  myPort = new Serial(this, portName, 9600);
}

void draw() {
  kinect.update();
  //image(kinect.depthImage(), 0, 0);
  //image(kinect.irImage(), kinect.depthWidth(), 0);
  image(kinect.userImage(), 0, 0);

  IntVector userList = new IntVector();
  kinect.getUsers(userList);
  if (userList.size() > 0) {
    int userId = userList.get(0);
    // If we detect one user we have to draw it
    if (kinect.isTrackingSkeleton(userId)) {
      drawSkeleton(userId);  // draw the skeleton
      ArmsAngle(userId);     // arm angles -> serial commands
      MassUser(userId);      // draw the user's center of mass
      LegsAngle(userId);     // leg angles (serial output currently disabled)
    }
  }
}

// Draw the skeleton
void drawSkeleton(int userId) {
  stroke(0);
  strokeWeight(5);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_LEFT_HIP);

  noStroke();
  fill(255, 0, 0);
  drawJoint(userId, SimpleOpenNI.SKEL_HEAD);
  drawJoint(userId, SimpleOpenNI.SKEL_NECK);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_ELBOW);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  drawJoint(userId, SimpleOpenNI.SKEL_TORSO);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_HIP);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_KNEE);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_HIP);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_FOOT);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_KNEE);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_FOOT);
  drawJoint(userId, SimpleOpenNI.SKEL_RIGHT_HAND);
  drawJoint(userId, SimpleOpenNI.SKEL_LEFT_HAND);
}

// Draw a small circle at a joint, skipping low-confidence positions
void drawJoint(int userId, int jointID) {
  PVector joint = new PVector();
  float confidence = kinect.getJointPositionSkeleton(userId, jointID, joint);
  if (confidence < 0.5) {
    return;
  }
  PVector convertedJoint = new PVector();
  kinect.convertRealWorldToProjective(joint, convertedJoint);
  ellipse(convertedJoint.x, convertedJoint.y, 5, 5);
}

// Generate the angle between a limb and a reference axis
float angleOf(PVector one, PVector two, PVector axis) {
  PVector limb = PVector.sub(two, one);
  return degrees(PVector.angleBetween(limb, axis));
}

// Calibration not required
void onNewUser(SimpleOpenNI kinect, int userID) {
  println("Start skeleton tracking");
  kinect.startTrackingSkeleton(userID);
}

void onLostUser(SimpleOpenNI curContext, int userId) {
  println("onLostUser - userId: " + userId);
}

// Draw a crosshair and label at the user's center of mass
void MassUser(int userId) {
  if (kinect.getCoM(userId, com)) {
    kinect.convertRealWorldToProjective(com, com2d);
    stroke(100, 255, 240);
    strokeWeight(3);
    beginShape(LINES);
    vertex(com2d.x, com2d.y - 5);
    vertex(com2d.x, com2d.y + 5);
    vertex(com2d.x - 5, com2d.y);
    vertex(com2d.x + 5, com2d.y);
    endShape();
    fill(0, 255, 100);
    text(Integer.toString(userId), com2d.x, com2d.y);
  }
}

public void ArmsAngle(int userId) {
  // get the positions of the three joints of our right arm
  PVector rightHand = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);
  PVector rightElbow = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, rightElbow);
  PVector rightShoulder = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, rightShoulder);

  // we need the right hip to orient the shoulder angle
  PVector rightHip = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HIP, rightHip);

  // get the positions of the three joints of our left arm
  PVector leftHand = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HAND, leftHand);
  PVector leftElbow = new PVector(); // declaration restored; lost in the extracted listing
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, leftElbow);
  PVector leftShoulder = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, leftShoulder);

  // we need the left hip to orient the shoulder angle
  PVector leftHip = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HIP, leftHip);

  // reduce our joint vectors to two dimensions for the right side
  PVector rightHand2D = new PVector(rightHand.x, rightHand.y);
  PVector rightElbow2D = new PVector(rightElbow.x, rightElbow.y);
  PVector rightShoulder2D = new PVector(rightShoulder.x, rightShoulder.y);
  PVector rightHip2D = new PVector(rightHip.x, rightHip.y);

  // calculate the axes against which we want to measure our angles
  PVector torsoOrientation = PVector.sub(rightShoulder2D, rightHip2D);
  PVector upperArmOrientation = PVector.sub(rightElbow2D, rightShoulder2D);

  // reduce our joint vectors to two dimensions for the left side
  PVector leftHand2D = new PVector(leftHand.x, leftHand.y);
  PVector leftElbow2D = new PVector(leftElbow.x, leftElbow.y);
  PVector leftShoulder2D = new PVector(leftShoulder.x, leftShoulder.y);
  PVector leftHip2D = new PVector(leftHip.x, leftHip.y);

  PVector torsoLOrientation = PVector.sub(leftShoulder2D, leftHip2D);
  PVector upperArmLOrientation = PVector.sub(leftElbow2D, leftShoulder2D);

  // calculate the angles between our joints for the right side
  RightshoulderAngle = angleOf(rightElbow2D, rightShoulder2D, torsoOrientation);
  RightelbowAngle = angleOf(rightHand2D, rightElbow2D, upperArmOrientation);

  // show the angles on the screen for debugging
  fill(255, 0, 0);
  scale(1);
  text("Right shoulder: " + int(RightshoulderAngle) + "\n" + " Right elbow: " + int(RightelbowAngle), 20, 20);

  // calculate the angles between our joints for the left side
  LeftshoulderAngle = angleOf(leftElbow2D, leftShoulder2D, torsoLOrientation);
  LeftelbowAngle = angleOf(leftHand2D, leftElbow2D, upperArmLOrientation);
  text("Left shoulder: " + int(LeftshoulderAngle) + "\n" + " Left elbow: " + int(LeftelbowAngle), 20, 55);

  // send the arm commands to the Arduino
  ArduinoSerialArms();
}

// Map elbow angles to single-character serial commands
void ArduinoSerialArms() {
  if (RightelbowAngle <= 35) {
    myPort.write('1'); // right elbow folded: send a 1
    println('1');
  } else if (RightelbowAngle >= 150 && RightelbowAngle <= 170) {
    myPort.write('2'); // right arm extended: send a 2
    println('2');
  } else if (RightelbowAngle >= 35 && RightelbowAngle <= 150) {
    myPort.write('0'); // neutral: send a 0
    println('0');
  } else if (LeftelbowAngle <= 45) {
    myPort.write('3'); // left elbow folded: send a 3
    println('3');
  } else if (LeftelbowAngle >= 45 && LeftelbowAngle <= 150) {
    myPort.write('4'); // send a 4
    println('4');
  } else if (LeftelbowAngle <= 180 && LeftelbowAngle >= 90) {
    myPort.write('0'); // send a 0
    println('0');
  }
}

void LegsAngle(int userId) {
  // get the positions of the three joints of our right leg
  PVector rightFoot = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_FOOT, rightFoot);
  PVector rightKnee = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, rightKnee);
  PVector rightHipL = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HIP, rightHipL);

  // reduce our joint vectors to two dimensions for the right side
  PVector rightFoot2D = new PVector(rightFoot.x, rightFoot.y);
  PVector rightKnee2D = new PVector(rightKnee.x, rightKnee.y);
  PVector rightHip2DLeg = new PVector(rightHipL.x, rightHipL.y);

  // calculate the axis against which we want to measure the angle
  PVector RightLegOrientation = PVector.sub(rightKnee2D, rightHip2DLeg);
  RightLegAngle = angleOf(rightFoot2D, rightKnee2D, RightLegOrientation);

  // show the angle on the screen for debugging
  fill(255, 0, 0);
  scale(1);
  text("Right Knee: " + int(RightLegAngle), 500, 20);

  // get the positions of the three joints of our left leg
  PVector leftFoot = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_FOOT, leftFoot);
  PVector leftKnee = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_KNEE, leftKnee);
  PVector leftHipL = new PVector();
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HIP, leftHipL);

  // reduce our joint vectors to two dimensions for the left side
  PVector leftFoot2D = new PVector(leftFoot.x, leftFoot.y);
  PVector leftKnee2D = new PVector(leftKnee.x, leftKnee.y);
  PVector leftHip2DLeg = new PVector(leftHipL.x, leftHipL.y);

  PVector LeftLegOrientation = PVector.sub(leftKnee2D, leftHip2DLeg);
  LeftLegAngle = angleOf(leftFoot2D, leftKnee2D, LeftLegOrientation);
  text("Left Knee: " + int(LeftLegAngle), 500, 55);

  //ArduinoSerialLegs(); // leg commands are currently disabled
}

// Map knee angles to serial commands (unused in the final build)
void ArduinoSerialLegs() {
  if (RightLegAngle <= 150) {
    myPort.write('4'); // send a 4
    println('4');
  } else {
    myPort.write('5'); // send a 5
    println('5');
  }
  if (LeftLegAngle <= 150) {
    myPort.write('6'); // send a 6
    println('6');
  } else {
    myPort.write('7'); // send a 7
    println('7');
  }
}

5.2.2 ARDUINO IDE CODE

int out1 = 8;  // output 1 to the HT12E encoder IC
int out2 = 9;  // output 2 to the HT12E encoder IC
int out3 = 10; // output 3 to the HT12E encoder IC
int out4 = 11; // output 4 to the HT12E encoder IC

void setup() {
  pinMode(out1, OUTPUT);
  pinMode(out2, OUTPUT);
  pinMode(out3, OUTPUT);
  pinMode(out4, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // read one command character from the laptop and set the
  // corresponding 4-bit pattern on the encoder inputs
  if (Serial.available() > 0) {
    char xval = Serial.read();
    Serial.println(xval);
    if (xval == '1') {
      digitalWrite(out1, HIGH);
      digitalWrite(out2, LOW);
      digitalWrite(out3, HIGH);
      digitalWrite(out4, LOW);
    } else if (xval == '2') {
      digitalWrite(out1, LOW);
      digitalWrite(out2, HIGH);
      digitalWrite(out3, LOW);
      digitalWrite(out4, HIGH);
    } else if (xval == '3') {
      digitalWrite(out1, HIGH);
      digitalWrite(out2, LOW);
      digitalWrite(out3, LOW);
      digitalWrite(out4, LOW);
    } else if (xval == '4') {
      digitalWrite(out1, LOW);
      digitalWrite(out2, LOW);
      digitalWrite(out3, HIGH);
      digitalWrite(out4, LOW);
    } else if (xval == '0') {
      // stop: all outputs low
      digitalWrite(out1, LOW);
      digitalWrite(out2, LOW);
      digitalWrite(out3, LOW);
      digitalWrite(out4, LOW);
    }
  }
}
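
As a usage note, the transmitter side can be desk-tested without the Kinect. The minimal Processing sketch below is not part of the original report; it assumes the Arduino from Section 5.2.2 is connected on the first listed serial port, and sends the same single-character commands from the keyboard so the RF link and H-bridge wiring can be verified independently of gesture detection.

import processing.serial.*;

Serial testPort;

void setup() {
  String portName = Serial.list()[0]; // adjust the index to match your port
  testPort = new Serial(this, portName, 9600);
}

void draw() {
  // nothing to draw; we only react to key presses
}

void keyPressed() {
  // '0' = stop, '1'..'4' = the drive commands from Section 5.2.1
  if (key >= '0' && key <= '4') {
    testPort.write(key);
    println("sent: " + key);
  }
}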

CONCLUSION
6.1 CONCLUSION & FUTURE SCOPE
The gesture controlled car has been designed so that it can cater to the needs of bomb disposal squads, the military, the police, and personnel who handle radioactive materials. It has countless applications and can be used in many different environments and scenarios. For instance, in one setting it can be used by a bomb disposal squad, while in another it can be used for handling mines. Yet another application is providing up-to-date information in a hostage situation.

REFERENCES
Books

[1] Making Things See by Greg Borenstein

[2] Arduino and Kinect Projects: Design, Build, Blow Their Minds by Enrique Ramos Melgar & Ciriaco Castro Diez

APPENDIX-A
COST ANALYSIS OF THE PROJECT

S.No   Component Name   Qty   Cost
1      Kinect Camera    1     3500
2      Motors           2     300
3      H-Bridge         1     150
4      Battery          1     2000
5      Arduino          1     500
       Total Cost             6650

