FINAL PPT

The Hand Gesture Control Robot Car project combines IoT and gesture recognition to enable intuitive control of a robot car through hand movements, utilizing technologies like Arduino Uno and NodeMCU. The system captures user gestures via an accelerometer and communicates wirelessly for remote operation, enhancing accessibility and functionality. A literature survey highlights various approaches to gesture recognition, including vision-based systems and sensory gloves, suggesting future applications in military and broader navigation contexts.


INTRODUCTION:

The Hand Gesture Control Robot Car project is a system designed to provide intuitive control of
a robot car using hand gestures. It integrates the Internet of Things (IoT) with gesture
recognition to build an interactive and user-friendly robot control system. The robot's movement
is determined by the gestures the user makes, which are captured by an accelerometer and
processed by a microcontroller. The captured data is communicated wirelessly, allowing the user
to control the robot from a distance. The project brings together multiple technologies,
including the Arduino Uno, NodeMCU, and accelerometers, providing a versatile and flexible
solution for robot control. The IoT aspect enables wireless interaction between the user and the
robot, adding a layer of functionality and accessibility to the control mechanism.
LITERATURE SURVEY:

A review of the hand gesture recognition system


[1] N. Mohamed et al. (2021) (IEEE): This paper's emphasis on vision-based gesture recognition
was motivated by the fact that most people own smartphones with built-in cameras. As smartphone
technology has improved, so has the ability of these cameras to capture high-quality images.
Integrating a vision-based gesture recognition system into the existing ecosystem of smartphone
cameras would therefore be an effective way to bring the system into real-life applications,
which explains why most existing works used digital cameras as the primary method of data
collection.

A systematic review on systems-based sensory gloves for sign language pattern
recognition

[2] Z. B. Zainol et al. (2022) (IEEE): This paper puts forward several proposals by the
researchers to address current and expected problems in the field. The major advantage of the
sensory-based approach is that the gloves acquire data directly from the sensors (bend, hand
orientation, hand rotation, etc.). Furthermore, this approach is not affected by environmental
conditions, such as the performer's location and the background, which enables the generation of
more accurate data. On the other hand, the sensor-based approach can be cumbersome and bulky,
because several boards and sensors must be worn to capture the signs accurately.
LITERATURE SURVEY (continued)
Hand Gesture Controlled Robot
[3] Kathiravan Sekar et al. (IJERT) (2022): This paper introduces a hand-gesture-based control
interface for a car robot, using a 3-axis accelerometer to record user gestures. The data is
transmitted wirelessly via an RF module to a microcontroller, which classifies the gestures into
navigation commands. The system achieved a success rate of 92.2% in simulations.
The authors suggest that a future wireless gesture-sensing robot could serve military
applications as a robotic vehicle handled by a soldier to avoid casualties. Their system shows
that interacting with machines through gestures is feasible, and the set of detected gestures
could be extended to more commands by implementing a more complex model of an advanced vehicle,
usable not only in limited spaces but also in broader areas such as roads.
BLOCK DIAGRAM:

Transmitter:
1. Start transmitter
2. Initialize Arduino
3. Set up LCD display
4. Initialize buttons
5. Read gesture input
6. Valid gesture? If yes, process the gesture data and transmit the command over the
wireless link; if no, display "No Input" on the LCD

Receiver:
1. Start receiver
2. Initialize ESP32-CAM
3. Set up motor driver
4. Receive the command over the wireless link
5. Decode the command
6. Process motor control and drive the DC motors
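The transmitter-side decision in the flow above ("valid gesture" → "transmit command") can be
sketched as a small classification routine. This is a plain C++ sketch with hypothetical
threshold values, not the project's actual firmware; on the real transmitter the raw readings
would come from the MPU6050 and the returned character would be sent over the wireless link.

```cpp
#include <cstdint>

// Hypothetical tilt threshold in raw MPU6050 counts (assumption: the
// slides do not specify the firmware's actual cut-off values).
const int16_t kThreshold = 5000;

// Map a raw accelerometer reading to a single-character drive command:
// 'F' forward, 'B' backward, 'L' left, 'R' right, 'S' stop.
char classifyGesture(int16_t ax, int16_t ay) {
    if (ay >  kThreshold) return 'F';  // palm tilted forward
    if (ay < -kThreshold) return 'B';  // palm tilted backward
    if (ax >  kThreshold) return 'R';  // palm tilted right
    if (ax < -kThreshold) return 'L';  // palm tilted left
    return 'S';                        // hand level: stop
}
```

A level hand falls inside the dead zone around zero and maps to 'S', which keeps the car from
creeping when the glove is held still.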
CIRCUIT DIAGRAM (TRANSMITTER):
1. ATmega32 chip
2. MPU6050 accelerometer
3. Bluetooth module
4. Capacitor
5. Battery
CIRCUIT DIAGRAM (RECEIVER):
1. Arduino
2. DC motors
3. Battery holder
4. Charging port
5. Motor shield
COMPONENTS:
1. ESP32: The ESP32 is a low-power, dual-core microcontroller with built-in
WiFi and Bluetooth, making it ideal for IoT and wireless communication
projects. It is often used in applications such as smart home systems, robotics,
and wearables.
Processor: ESP32-D0WDQ6 dual-core 32-bit LX6 microprocessor
Clock Speed: Up to 240 MHz
RAM: 520 KB SRAM

2. NANO BOARDS: The Arduino Nano is a small, breadboard-friendly
microcontroller board based on the ATmega328 chip. It is often used in
compact projects where space is limited.
Microcontroller: ATmega328P (or ATmega168 in some older versions)
Architecture: 8-bit AVR
Operating Voltage: 5V
Input Voltage: 7-12V (via VIN pin)
Clock Speed: 16 MHz

3. BLUETOOTH MODULE (HC-05): The HC-05 Bluetooth module is used for wireless
communication with other Bluetooth-enabled devices. It is commonly used
for simple communication between a microcontroller (such as an Arduino or ESP32)
and a smartphone or other Bluetooth device.
COMPONENTS (continue)
4. MPU6050 ACCELEROMETER: The MPU6050 is a MEMS-based
accelerometer and gyroscope used to measure motion and orientation.
It is commonly used in robotics, wearable devices, and navigation systems.
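To interpret the MPU6050's raw output, each axis reading is divided by the datasheet
sensitivity (16384 LSB per g at the default ±2g full-scale range) and the resulting g values
can be turned into a tilt angle. A minimal sketch in plain C++ using one common pitch
convention (the function names are illustrative, not from the project code):

```cpp
#include <cmath>

// MPU6050 datasheet sensitivity at the default +/-2g full-scale range.
const double kLsbPerG = 16384.0;
const double kPi = 3.14159265358979323846;

// Convert a raw 16-bit axis reading to acceleration in g.
double rawToG(int raw) { return raw / kLsbPerG; }

// Pitch angle in degrees from the three axis readings (in g), using the
// common convention pitch = atan2(ay, sqrt(ax^2 + az^2)).
double pitchDegrees(double ax, double ay, double az) {
    return std::atan2(ay, std::sqrt(ax * ax + az * az)) * 180.0 / kPi;
}
```

With the sensor flat (az = 1g) the pitch is 0 degrees; tipping the hand so gravity falls fully
on the Y axis gives 90 degrees, which is what the gesture thresholds are detecting.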

5. BATTERY: This is a standard 12V battery that powers the system. Its 1 A rating
implies the system can draw 1 A at 12 V before depleting the charge.

6. CAR CHASSIS: The car chassis is the frame or structure of the robot, typically made of
plastic or metal, on which all components such as motors, sensors, and batteries are mounted.
COMPONENTS (continue)
7. BATTERY HOLDER: A component that holds the batteries and connects them to the circuit. It
allows easy battery replacement and provides secure contacts.

8. ARDUINO MOTOR SHIELD: An add-on board that allows the Arduino to control DC
motors, stepper motors, or servos. It simplifies motor control by providing the necessary motor
drivers.
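On the receiver, each command character maps to a pair of wheel directions for the
differential-drive chassis. A minimal sketch in plain C++, assuming the single-character command
set 'F'/'B'/'L'/'R'/'S' (an assumption, since the slides do not specify the wire protocol); on
the real robot each direction value would become digitalWrite() calls on the motor shield's
direction pins.

```cpp
// Per-wheel drive direction: -1 reverse, 0 stop, +1 forward.
struct MotorState {
    int left;
    int right;
};

// Translate a received drive command into wheel directions. Turning
// spins the wheels in opposite directions (a skid-steer turn in place).
MotorState decodeCommand(char cmd) {
    switch (cmd) {
        case 'F': return { 1,  1};
        case 'B': return {-1, -1};
        case 'L': return {-1,  1};
        case 'R': return { 1, -1};
        default:  return { 0,  0};  // 'S' or unknown byte: stop safely
    }
}
```

Treating any unrecognized byte as "stop" is a deliberate safety choice: a corrupted wireless
frame halts the car instead of driving it in an unintended direction.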

9. HAND GLOVE: A wearable component integrated with sensors (e.g., flex sensors,
accelerometers) to detect hand movement, commonly used in gesture-based control systems.

10. ARDUINO IDE: Used for programming and uploading code to microcontrollers such as the Arduino
Nano and ESP32.
Features: Supports C/C++ for writing and uploading code. Comes with built-in libraries for
working with sensors (e.g., the MPU6050 and the HC-05 Bluetooth module). Open-source and widely
supported by the maker community.
BUDGET FILE:

COMPONENT               QUANTITY   COST (INR)   PURPOSE
ESP32                   1          950          Controller
Nano boards             2          650          Microcontroller
Bluetooth module        2          750          Communication
MPU6050 accelerometer   1          280          Sensing
Battery (12V)           1          750          Power
Car chassis             8          800          Structure
Batteries (3.7V)        5          150          Energy
Charging module         3          200          Recharging
12V 1A adaptor          1          150          Converter
Arduino motor shield    1          230          Control
Wires                   Few        100          Connection
Miscellaneous           NA         200          Extras
Switches                3          50           Control
Hand glove              1          100          Wearable
TOTAL BUDGET                       6000/-
REFERENCES:
[1] N. Mohamed, M. B. Mustafa, and N. Jomhari, ‘‘A review of the hand gesture recognition
system: Current progress and future directions,’’ IEEE Access, vol. 9, pp. 157422–157436, 2021,
doi: 10.1109/ACCESS.2021.3129650.

[2] Z. R. Saeed, Z. B. Zainol, B. B. Zaidan, and A. H. Alamoodi, ‘‘A systematic review on
systems-based sensory gloves for sign language pattern recognition: An update from 2017 to
2022,’’ IEEE Access, vol. 10, pp. 123358–123377, 2022, doi: 10.1109/ACCESS.2022.3219430.

[3] R. T. Johari, R. Ramli, Z. Zulkoffli, and N. Saibani, ‘‘A systematic literature review on
vision-based hand gesture for sign language translation,’’ Jurnal Kejuruteraan, vol. 35, no. 2,
pp. 287–302, Mar. 2023, doi: 10.17576/jkukm-2023-35(2)-03.

THANK YOU
