FINAL PPT
The Hand Gesture Control Robot Car project is a system designed to provide intuitive control of a
robot car using hand gestures. It integrates the Internet of Things (IoT) with gesture recognition to
build an interactive and user-friendly robot control system. The robot's movement is determined by
the user's gestures, which are captured by an accelerometer and processed by a microcontroller. The
captured data is communicated wirelessly, allowing the user to control the robot from a distance.
The project combines multiple technologies, including the Arduino Uno, NodeMCU, and
accelerometers, providing a versatile and flexible solution for robot control. By incorporating IoT,
the system allows wireless interaction between the user and the robot, adding a layer of
functionality and accessibility to the control mechanism.
LITERATURE SURVEY:
A systematic review on systems-based sensory gloves for sign language pattern recognition
[2] Z. B. Zainol et al. (2022, IEEE Access): This paper reviews sensory-glove systems for sign
language pattern recognition and forwards several proposals to address current and expected
problems in the field. The major advantage of a sensor-based approach is that the gloves acquire
data directly from the sensors (bend, hand orientation, hand rotation, etc.). Furthermore, this
approach is not affected by environmental conditions, such as the performer's location or the
background, which enables the generation of more accurate data. However, a sensor-based
approach can be cumbersome and bulky, because several boards and sensors must be worn to
capture the signs accurately.
LITERATURE SURVEY (continued)
Hand Gesture Controlled Robot
[3] Kathiravan Sekar et al. (2022, IJERT): This paper introduces a hand-gesture-based control
interface for a car robot, using a 3-axis accelerometer to record user gestures. The data is transmitted
wirelessly via an RF module to a microcontroller, which classifies the gestures into navigation
commands. The system achieved a success rate of 92.2% in simulations.
In the future, a wireless robot could be designed that senses hand gestures using wireless
technologies. It could be used in military applications as a robotic vehicle handled by a soldier to
avoid casualties. The system shows that interaction with machines through gestures is feasible, and
the set of detected gestures could be extended to more commands by implementing a more complex
model of an advanced vehicle, operating not only in a limited space but also in broader areas such
as roads.
BLOCK DIAGRAM:
Transmitter: Start Transmitter → Initialize Arduino → Initialize Buttons → Read Gesture Input →
Process Gesture Data → Transmit Command (via wireless communication).
Receiver: Start Receiver → Setup Motor Driver → Receive Command → Decode Command → if the
gesture is valid: Process Motor Control → Control DC Motors; otherwise: Display "No Input" on
the LCD.
CIRCUIT DIAGRAM (TRANSMITTER):
1. ATmega32 chip
2. MPU6050 accelerometer
3. Bluetooth module
4. Capacitor
5. Battery
CIRCUIT DIAGRAM (RECEIVER):
1. Arduino
2. Motor shield
COMPONENTS:
1. ESP32: The ESP32 is a low-power, dual-core microcontroller with built-in
WiFi and Bluetooth, making it ideal for IoT and wireless communication
projects. It is often used in applications like smart home systems, robotics,
and wearables.
Processor: ESP32-D0WDQ6 dual-core 32-bit LX6 microprocessor
Clock Speed: Up to 240 MHz
RAM: 520 KB SRAM
5. BATTERY: A standard 12 V battery that powers the system. A 1 Ah rating implies the system can
draw 1 A at 12 V for roughly one hour before the charge is depleted.
6. CAR CHASSIS: The car chassis is the frame or structure of the robot, typically made of plastic or
metal, on which all components such as motors, sensors, and batteries are mounted.
COMPONENTS (continued)
7. BATTERY HOLDER: A component that holds the batteries and connects them to the circuit. It
allows easy replacement of batteries and provides secure contacts.
8. ARDUINO MOTOR SHIELD: An add-on board that allows the Arduino to control DC motors,
stepper motors, or servos. It simplifies motor control by providing the necessary motor drivers.
9. HAND GLOVE: A wearable component integrated with sensors (e.g., flex sensors,
accelerometers) to detect hand movement, often used in gesture-based control systems.
10. ARDUINO IDE: Used for programming and uploading code to microcontrollers such as the
Arduino Nano and ESP32.
Features: Supports C/C++ for writing and uploading code. Comes with built-in libraries for working
with sensors and modules (e.g., the MPU6050 and the HC-05 Bluetooth module). Open-source and
widely supported by the maker community.
BUDGET FILE:
COMPONENT                QUANTITY   COST (INR)   PURPOSE
ESP32                    1          950          Controller
Nano boards              2          650          Microcontroller
Bluetooth module         2          750          Communication
MPU6050 accelerometer    1          280          Sensing
REFERENCES:
[2] Z. R. Saeed, Z. B. Zainol, B. B. Zaidan, and A. H. Alamoodi, ‘‘A systematic review on
systems-based sensory gloves for sign language pattern recognition: An update from 2017 to
2022,’’ IEEE Access, vol. 10, pp. 123358–123377, 2022, doi: 10.1109/ACCESS.2022.3219430.
[3] R. T. Johari, R. Ramli, Z. Zulkoffli, and N. Saibani, ‘‘A systematic literature review on vision-
based hand gesture for sign language translation,’’ Jurnal Kejuruteraan, vol. 35, no. 2, pp. 287–302,
Mar. 2023, doi: 10.17576/jkukm-2023-35(2)-03.
THANK YOU