
C. BYREGOWDA INSTITUTE OF TECHNOLOGY

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
An ISO 9001:2015 certified Institute
THORDEVANDAHALLI, SRINIVASPUR ROAD, KOLAR-563101
2024-2025

MINIPROJECT SYNOPSIS

Name of Guide: Prof. Sumarani H

Name of Student       Contact Number   Email-id
• Nandini N           8792699216       [email protected]
• Nandushree B. M     7483074162       [email protected]
• Monisha J           7483439384       [email protected]
• Monika H N          9606459440       [email protected]

Title: “Gesture Controlled Virtual Mouse”

ABSTRACT:

This mini project focuses on the development of a gesture-controlled virtual mouse system,
which aims to replace traditional physical mice with intuitive hand movements detected
through a camera-based system. The project utilizes computer vision techniques, specifically
hand gesture recognition algorithms, to interpret gestures captured by a webcam. These
gestures are mapped to corresponding mouse movements and actions, allowing users to
navigate and interact with computer interfaces in a natural and hands-free manner. The
system’s design includes modules for gesture detection, feature extraction, gesture
classification, and translation into mouse commands. The implementation leverages the Python
programming language and libraries such as OpenCV for real-time image processing and
gesture analysis. The effectiveness of the gesture-controlled virtual mouse system is evaluated
through usability tests, comparing its performance with conventional mouse devices in terms
of accuracy, responsiveness, and user satisfaction. This mini project aims to contribute to the
field of human-computer interaction by demonstrating a practical application of gesture
recognition technology in improving user interface accessibility and ergonomics.
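As a rough, hedged illustration of the module structure described above, the following Python sketch wires gesture detection, feature extraction/classification, and command translation into a single webcam capture loop. The helper functions detect_hand, classify_gesture, and execute_command are hypothetical placeholders (not part of any existing codebase); only OpenCV (cv2) is assumed for frame capture.

```python
import cv2

def detect_hand(frame):
    """Hypothetical placeholder: locate the hand region and extract its features."""
    ...

def classify_gesture(features):
    """Hypothetical placeholder: map extracted features to a gesture label."""
    ...

def execute_command(gesture):
    """Hypothetical placeholder: translate a gesture label into a mouse action."""
    ...

def main():
    cap = cv2.VideoCapture(0)                  # open the default webcam
    while cap.isOpened():
        ok, frame = cap.read()                 # grab one frame
        if not ok:
            break
        features = detect_hand(frame)          # gesture detection + feature extraction
        gesture = classify_gesture(features)   # gesture classification
        execute_command(gesture)               # translation into a mouse command
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```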

INTRODUCTION:

In the realm of human-computer interaction, the traditional mouse and keyboard setup
has long been the standard for navigating digital interfaces. However, advancements in
technology have spurred the development of alternative input methods aimed at improving
user experience and accessibility. One such innovation is gesture recognition, which enables
users to interact with computers through natural hand movements rather than physical
peripherals. The concept of a gesture-controlled virtual mouse presents an intriguing solution
to this evolution in interface design. By harnessing computer vision techniques and machine
learning algorithms, this project endeavors to create a system where users can manipulate an
onscreen cursor and perform mouse actions simply by gesturing with their hands in front of a
camera.

This approach not only eliminates the need for physical contact with input devices but also
offers potential ergonomic benefits by reducing repetitive strain injuries associated with
traditional mouse usage. Key components of the project include exploring and implementing
computer vision libraries like OpenCV for image processing, as well as leveraging machine
learning models for gesture recognition. Python will serve as the primary programming
language due to its versatility and robust libraries suited for both image processing and machine
learning tasks. The significance of this project lies in its potential to enhance user interaction
with computing devices, particularly in scenarios where traditional input methods may be
impractical or inaccessible. By enabling users to control digital interfaces through intuitive
gestures, the gesture-controlled virtual mouse system aims to redefine how we interact with
computers, paving the way for more natural and ergonomic computing experiences. In the
following sections, we will delve into the methodology, implementation details, and evaluation of
the gesture-controlled virtual mouse system, aiming to demonstrate its feasibility, effectiveness,
and potential applications in real-world computing environments.

OBJECTIVE:
• Algorithm Development: Design and implement robust algorithms for real-time
detection, tracking, and recognition of hand gestures using computer vision techniques
such as contour analysis, feature extraction, or deep learning models (a contour-based
detection sketch follows this list).

• Gesture Classification and Mapping: Develop methods to classify detected gestures
and map them to corresponding mouse actions, including cursor movement, left and
right clicks, and scrolling, ensuring accuracy and responsiveness.
• User Interface Design: Design an intuitive user interface (UI) that visually represents
detected gestures and provides feedback to the user in real-time, enhancing usability and
interaction experience.
• Performance Optimization: Optimize the system for real-time performance,
minimizing latency between gesture recognition and execution of mouse actions. This
may involve algorithm optimization, parallel processing techniques, or hardware
acceleration.
• Accuracy and Robustness Evaluation: Conduct comprehensive testing to evaluate
the accuracy, robustness, and reliability of the gesture recognition system across
various conditions (e.g., different lighting, hand orientations). Compare its
performance against traditional mouse devices in terms of accuracy and user
preference.
• User Experience Assessment: Gather user feedback through usability testing to assess
the system’s ease of use, ergonomics, and overall user satisfaction. Use feedback to
iteratively improve the system’s design and functionality.

• Documentation and Presentation: Document the development process, including
design choices, implementation details, challenges faced, and solutions adopted.
Prepare a comprehensive report and presentation summarizing objectives,
methodologies, results, and conclusions.
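As a small example of the contour-analysis approach mentioned in the Algorithm Development objective, the sketch below segments a hand by skin colour in HSV space and takes the centroid of the largest contour as a crude pointer position. The HSV thresholds are illustrative guesses and would need calibration for real lighting conditions; OpenCV 4.x and NumPy are assumed.

```python
import cv2
import numpy as np

# Illustrative skin-colour range in HSV; real systems calibrate this per user and lighting.
LOWER_SKIN = np.array([0, 30, 60], dtype=np.uint8)
UPPER_SKIN = np.array([20, 150, 255], dtype=np.uint8)

def hand_centroid(frame):
    """Return (x, y) of the largest skin-coloured contour, or None if no hand is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)   # assume the hand is the largest skin-coloured blob
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # contour centroid
```

A deep-learning detector would replace this colour-threshold step but plug into the same interface: frame in, hand position out.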

EXISTING SYSTEM
There are several existing systems and technologies for gesture-controlled virtual mice,
each using a different approach, such as camera-based systems, wearable devices, gesture
gloves, the Myo armband, ultrasound-based systems, infrared (IR) sensors, machine
learning and AI methods, and mobile device applications. Each of these technologies has its
strengths and limitations, such as accuracy, ease of use, and adaptability to different
environments. The choice of system often depends on the specific application requirements and
user preferences for gesture-based interaction.

PROPOSED SYSTEM
Creating a proposed system for a gesture-controlled virtual mouse involves designing a
framework that integrates hardware and software components to effectively interpret and
respond to user gestures. Here’s a structured outline for such a system:

Hardware Components:
• Camera or Sensor Module

• Microcontroller or Processor

• Power Supply

• Communication interfaces

Software Techniques:
• Gesture Recognition Algorithm

• Tracking and Positioning Logic

• Gesture-to-Action Mapping (see the sketch after this list)

• User Interface Integration
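To make the gesture-to-action mapping idea concrete, here is a minimal sketch that maps gesture labels to mouse actions. It assumes the third-party pyautogui library for injecting OS-level mouse events; the gesture names themselves are arbitrary illustrations, not a fixed design.

```python
import pyautogui

# Illustrative mapping from gesture labels to mouse actions.
GESTURE_ACTIONS = {
    "pinch":       lambda: pyautogui.click(),       # thumb-index pinch       -> left click
    "two_fingers": lambda: pyautogui.rightClick(),  # two fingers up          -> right click
    "fist_up":     lambda: pyautogui.scroll(40),    # closed fist moving up   -> scroll up
    "fist_down":   lambda: pyautogui.scroll(-40),   # closed fist moving down -> scroll down
}

def execute(gesture, cursor_xy=None):
    """Move the cursor and/or trigger the action associated with a recognised gesture."""
    if cursor_xy is not None:
        pyautogui.moveTo(*cursor_xy)        # cursor movement follows the tracked hand
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action()
```

In a full system this table would also handle debouncing, so that a pinch held over several frames does not fire repeated clicks.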


System Workflow:
• Initialization and Calibration (see the coordinate-mapping sketch after this list)

• Gesture Detection and Recognition

• Action Execution

• Feedback and Adjustment
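One common way to realise the Initialization and Calibration and Feedback and Adjustment steps is to map an "active region" of the camera frame onto the full screen and smooth the cursor path so small detection jitter does not shake the pointer. The sketch below follows that pattern; the frame and screen sizes, margin, and smoothing factor are illustrative assumptions, not prescribed values.

```python
import numpy as np

FRAME_W, FRAME_H = 640, 480        # assumed camera resolution
SCREEN_W, SCREEN_H = 1920, 1080    # assumed screen resolution
MARGIN = 100                       # active-region inset in camera pixels
SMOOTHING = 0.2                    # 0..1; lower = smoother but laggier cursor

_prev = None

def to_screen(hand_x, hand_y):
    """Map a hand position inside the active camera region to smoothed screen coordinates."""
    global _prev
    # Linear interpolation from the active camera region to the full screen.
    sx = np.interp(hand_x, [MARGIN, FRAME_W - MARGIN], [0, SCREEN_W])
    sy = np.interp(hand_y, [MARGIN, FRAME_H - MARGIN], [0, SCREEN_H])
    if _prev is None:
        _prev = (sx, sy)
    # Exponential smoothing: blend the new reading with the previous cursor position.
    sx = _prev[0] + SMOOTHING * (sx - _prev[0])
    sy = _prev[1] + SMOOTHING * (sy - _prev[1])
    _prev = (sx, sy)
    return int(sx), int(sy)
```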

Software Components:

• Gesture recognition software

• Hand and gesture tracking (see the landmark-tracking sketch below)

• Machine learning models

• Input mapping and translation

• Python (64-bit) runtime and supporting libraries
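For the hand and gesture tracking component, one widely used option (an assumption here, since this synopsis does not commit to a specific library) is MediaPipe Hands, which returns 21 normalised landmarks per detected hand. The sketch below extracts the index fingertip (landmark 8) from a single frame.

```python
import cv2
import mediapipe as mp

# One hand is enough for cursor control; confidence threshold is an illustrative choice.
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

def index_fingertip(frame_bgr):
    """Return the index fingertip position in pixel coordinates, or None if no hand is seen."""
    h, w, _ = frame_bgr.shape
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)   # MediaPipe expects RGB input
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    tip = result.multi_hand_landmarks[0].landmark[8]   # landmark 8 = index fingertip
    return int(tip.x * w), int(tip.y * h)              # landmarks are normalised to [0, 1]
```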

Implementation Considerations:
• Performance Optimization: Optimize algorithms and code for real-time responsiveness
and accuracy (a frame-downscaling and frame-rate measurement sketch follows this list).

• Robustness: Account for variations in lighting conditions, background noise, and
different user gestures.

• User Experience: Design an intuitive user interface and ensure ergonomic considerations
for prolonged use.
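As a hedged illustration of the Performance Optimization point, the snippet below measures end-to-end frames per second and shows the common trick of downscaling frames before detection to reduce per-frame latency. The 0.5 scale factor is an arbitrary example, and process_frame is a hypothetical placeholder for the detection-classification-action pipeline.

```python
import time
import cv2

def process_frame(frame):
    """Hypothetical placeholder for detection + classification + action execution."""
    ...

cap = cv2.VideoCapture(0)
prev = time.time()
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Downscale before heavy processing: fewer pixels, lower per-frame latency.
    small = cv2.resize(frame, (0, 0), fx=0.5, fy=0.5)
    process_frame(small)
    now = time.time()
    fps = 1.0 / max(now - prev, 1e-6)     # instantaneous frames per second
    prev = now
    cv2.imshow("preview", small)
    print(f"{fps:.1f} FPS", end="\r")
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```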

Potential Enhancements:
• Multi-Gesture Support: Recognize and differentiate between multiple gestures for complex
interactions.

• Voice Commands Integration: Combine gesture control with voice commands for
enhanced usability.
• Integration with VR/AR Systems: Extend functionality to virtual or augmented reality
environments for immersive interaction.

METHODOLOGY
This methodology for a gesture-controlled virtual mouse involves several key steps to
ensure clarity, effectiveness, and successful implementation.

• Define Objectives: Clearly state goals like improving usability or enabling hands-free
interaction.

• Research: Study existing gesture recognition technologies and virtual mouse
implementations.

• Design: Create system architecture and select appropriate hardware and software
components.

• Development: Implement gesture recognition algorithms and virtual mouse control
software.

• Implementation: Build and test a prototype, iterating based on feedback.

• Evaluation: Assess performance, usability, and gather feedback for refinement.

• Documentation: Document the process and outcomes for a final report and
presentation.
BLOCK DIAGRAM

FIGURE: HAND DETECTION

REFERENCES
• W. Schwartz, A. Kembhavi, D. Harwood, and L. Davis. Human Detection Using Partial
Least Squares Analysis. 2009.

• Križnar V., Leskovšek M., Batagelj B. Use of Computer Vision Based Hand Tracking in
Educational Environments. 44th International Convention on Information, Communication
and Electronic Technology (MIPRO), 27 September - 01 October 2021, Opatija, Croatia.
IEEE; 2021.

• Katona, J. A Review of Human–Computer Interaction and Virtual Reality Research Fields
in Cognitive InfoCommunications. Appl. Sci. 2021, 11, 2646.
https://doi.org/10.3390/app11062646

• Hyung-Jeong Yang, et al. Real-time virtual mouse system using RGB-D images and
fingertip detection. Multimedia Tools and Applications, 2020.
https://doi.org/10.1007/s11042-020-10156-5

• J. Kumara Swamy, et al. AI Based Virtual Mouse with Hand Gesture and AI Voice Assistant
Using Computer Vision and Neural Networks. Journal for Research in Applied Science and
Engineering Technology, 2023.

• Likitha R, et al. AI Virtual Mouse System Using Hand Gestures and Voice Assistant.
International Journal of Engineering Research and Applications (www.ijera.com),
ISSN: 2248-9622, Vol. 12, Issue 12, December 2022, pp. 132-138.

• E. Sankar, B. Nitish Bharadwaj, A. V. Vignesh (Dept. of CSE, Sri Chandrasekharendra
Saraswathi Viswa Mahavidyalaya (SCSVMV) University). Virtual Mouse Using Hand Gesture.
2023. https://www.researchgate.net/publication/372165002

Signature of Guide Signature of Coordinator Signature of HOD
