A PROJECT REPORT
Submitted by
of
BACHELOR OF TECHNOLOGY
in
Guided By
YEAR 2024-25
APPENDIX 2
CERTIFICATE
Dr. H. H. Shinde
Principal
Jawaharlal Nehru Engineering College
MGM University Chhatrapati Sambhajinagar (M.S.)
APPENDIX 3
CONTENTS
List of Abbreviations
List of Figures
List of Tables
Abstract
1. INTRODUCTION
1.1 Introduction
1.2 Necessity
2. LITERATURE SURVEY
3. PROBLEM DEFINITION AND SRS
Problem Statement
Objectives
Major Inputs
Major Outputs
Major Constraints
Hardware Resources Required
Area of Project
3.2 Software Requirements Specification
3.2.1 Introduction
Purpose
Scope of the Project
Intended Audience & Reading Suggestions
3.2.2 Data Model and Description
4.2 Diagrams
5. PERFORMANCE ANALYSIS
5.1 Different Modules and Their Working, Output Screens
5.2 Analysis
5.3 Testing
6. CONCLUSIONS
6.1 Conclusions
6.2 Future Scope
References
Acknowledgement
List of Abbreviations
List of Figures

Fig. No.   Figure Name                   Page No.
3.2.1      Data Flow                     8
3.2.2      MediaPipe Hand Landmarks      8
4.2.1      Use Case Diagram              10
4.2.2      System Architecture           10
5.1.1      Pointer Gesture               13
5.1.2      Eraser Gesture                14
5.1.3      Next Slide Gesture            14
5.1.4      Previous Slide Gesture        14
5.1.5      Gesture Writing               15
5.1.6      Recognition of Gestures       15
5.1.7      Next Slide Demo               16
List of Tables
ABSTRACT
In today's digital world, a slideshow presentation is an effective and attractive way for speakers to convey information and persuade an audience. Slides can be controlled with devices such as a mouse, keyboard, or laser pointer, but each of these requires prior familiarity with the device. Gesture recognition has gained importance in recent years and is used to control applications such as media players, robot control, and gaming. Many hand gesture recognition systems rely on gloves, markers, and similar aids; however, such gloves or markers increase the cost of the system. This work proposes an artificial-intelligence-based hand gesture detection method: users can move the presentation slides forward and backward simply by making hand gestures. Hand gestures make the interaction simple and convenient and require no additional device. The suggested method helps speakers deliver a productive presentation with natural, improved communication with the computer. In particular, the proposed system is more effective than a laser pointer, since the hand is more visible and can therefore better hold the audience's attention.
The project encompasses various stages, including data collection, preprocessing, model training, and evaluation, culminating in the deployment of a functional gesture recognition system. Through rigorous testing and evaluation, the system's accuracy, speed, and robustness to variations in gestures and environmental factors are assessed to ensure its effectiveness and reliability in real-world scenarios.
1. INTRODUCTION
1.1 INTRODUCTION
In today's fast-paced and increasingly digital world, effective communication is essential for
success in various domains, ranging from business presentations to educational seminars.
Traditional methods of delivering presentations often rely on static slides and conventional input
devices, limiting the presenter's ability to engage with the audience and convey information
dynamically. The emergence of gesture-based interaction technology has paved the way for a
paradigm shift in presentation delivery, offering presenters a more intuitive and immersive
means of controlling and interacting with their slides.
A comprehensive literature survey forms the foundation of the project, exploring existing
research and technologies in the fields of gesture recognition, computer vision, and presentation
control. Insights gleaned from the literature survey inform the design and implementation of the
Gesture-Controlled PowerPoint Tool, ensuring alignment with state-of-the-art techniques and
best practices.
1.2 NECESSITY
Conventional presentation tools depend on physical input devices such as keyboards, mice, and laser pointers. Individuals with physical challenges may find these devices less accessible, creating a barrier to effective communication. There is therefore a pressing need for a more inclusive and intuitive system that accommodates diverse user requirements while maintaining simplicity and efficiency.
2. LITERATURE SURVEY
In recent years, the intersection of human-computer interaction (HCI) and gesture recognition
has witnessed substantial progress, leading to diverse applications across various domains.
Lawrence and Ashleigh (2019) conducted a comprehensive study focusing on the impact of HCI
within the educational context of the University of Southampton. Their findings not only
highlighted the positive influence of HCI on literacy and efficacy but also underscored its
potential to transform educational environments [1]. Ren et al. (2013) made significant strides by
developing a robust hand gesture recognition system that harnessed the power of Kinect sensors.
Their system boasted impressive accuracy and speed, showcasing the feasibility of gesture
recognition technologies in practical applications [2]. Additionally, Dhall, Vashisth, and
Aggarwal (2020) delved into the realm of automated hand gesture recognition, leveraging deep
convolutional neural networks. Their research not only advanced theoretical understanding but
also provided valuable insights into deploying such systems in real-world scenarios, thereby
bridging the gap between theory and practice [3]. Meanwhile, Talele, Patil, and Barse (2019)
introduced an innovative real-time object detection approach using TensorFlow and OpenCV,
tailored specifically for mobile technology domains. This demonstrated the versatility and
adaptability of gesture recognition technologies in addressing contemporary technological
challenges [4]. Moreover, AlSaedi and Al Asadi (2020) proposed an economical hand gesture
recognition system, highlighting the potential for achieving remarkable recognition rates with
minimal hardware requirements. This research represents a significant step towards
democratizing access to gesture recognition technologies, making them more accessible and
practical for a wider range of applications [5]. Collectively, these selected studies not only
contribute substantially to the advancement of HCI and gesture recognition technologies but also
underscore their diverse applications across various domains, ranging from education to mobile
technology.
3. PROBLEM DEFINITION AND SRS
Problem Statement
The current presentation tools rely on physical input devices such as keyboards, mice, or laser
pointers, which can hinder dynamic interaction between the presenter and the audience. These
methods are not inclusive for individuals with physical challenges and limit the natural flow of
communication. The Gesture-Controlled PowerPoint Tool aims to resolve these issues by
enabling hands-free slide navigation through intuitive hand gestures.
Objectives
• To develop a gesture-based tool for seamless navigation of PowerPoint slides.
• To enhance presentation interactivity and engagement using natural hand gestures.
• To eliminate reliance on traditional input devices, making the tool more inclusive.
• To integrate a web-based feature for slide-to-JPG conversion, ensuring compatibility.
• To maintain affordability and simplicity without requiring expensive hardware.
Major Inputs
1. Live video stream from a webcam for gesture recognition.
2. PowerPoint slides converted into JPG format for processing.
3. Predefined gestures recognized by the system (e.g., swipe left/right).
Major Outputs
1. Navigation commands (Next slide, Previous slide).
2. Dynamic feedback displayed during gesture recognition.
3. A seamless presentation flow controlled by gestures.
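The mapping from recognized gestures to these navigation commands can be sketched in a few lines. This is an illustrative sketch, not the report's actual code: the gesture labels and the suggestion of the pyautogui library for sending keystrokes to the slideshow window are assumptions.

```python
# Hypothetical sketch: turning recognized gesture labels into slide
# navigation commands. Gesture names are illustrative assumptions.

GESTURE_TO_KEY = {
    "swipe_right": "right",  # advance to the next slide
    "swipe_left": "left",    # return to the previous slide
}

def gesture_to_key(gesture):
    """Return the keyboard key for a recognized gesture, or None."""
    return GESTURE_TO_KEY.get(gesture)

# In the running tool, the key would be forwarded to the slideshow
# window, e.g. pyautogui.press(gesture_to_key("swipe_right")).
```

Keeping the mapping in a plain dictionary makes it easy to add or rename gestures without touching the recognition code.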
Major Constraints
1. Lighting Conditions: The system’s accuracy may vary under poor or inconsistent lighting.
2. Hardware Dependency: Requires a functioning webcam with sufficient resolution for
hand tracking.
3. Gesture Complexity: Limited to simple gestures to ensure system responsiveness.
4. Processing Speed: Real-time processing is necessary to minimize latency.
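The real-time processing constraint above can be checked with a simple per-frame latency measurement. The helper below is a hedged sketch; the function name and the 30 fps budget mentioned in the comment are assumptions for illustration.

```python
import time

def average_latency(process_frame, frames):
    """Return the mean wall-clock seconds spent processing each frame."""
    start = time.perf_counter()
    for frame in frames:
        process_frame(frame)
    return (time.perf_counter() - start) / len(frames)

# Example: average_latency(pipeline, recorded_frames) should stay well
# under the ~33 ms/frame budget of a 30 fps webcam feed.
```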
Area of Project
• HCI: Leveraging natural hand gestures to interact with systems.
• Computer Vision: Using image processing techniques to detect and interpret gestures.
• Presentation Technology: Revolutionizing slide navigation tools to enhance user
experience.
3.2 Software Requirements Specification
3.2.1 Introduction
Purpose
The purpose of this document is to define the functional, performance, and user interface
requirements for the Gesture-Controlled PowerPoint Tool. This tool allows presenters to
navigate slides using hand gestures, eliminating the need for conventional input devices and
enhancing inclusivity and interactivity.
Input Module:
Captures video input from a webcam and sends it for processing.
Key technology: OpenCV for real-time image capture.
Processing Module:
Detects and interprets hand gestures using the MediaPipe framework.
Converts input gestures into meaningful navigation commands.
Feedback Module:
Provides real-time visual feedback on recognized gestures to the user.
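The three modules above could be wired together roughly as follows. This is a minimal sketch under stated assumptions: the swipe threshold, the function names, and the choice to track the wrist landmark are illustrative, not taken from the report's implementation.

```python
# Minimal sketch of the capture -> process -> feedback pipeline.
# detect_swipe works on normalized (0-1) x-coordinates from MediaPipe.

def detect_swipe(prev_x, cur_x, threshold=0.15):
    """Classify horizontal wrist motion as a left/right swipe, or None."""
    dx = cur_x - prev_x
    if dx > threshold:
        return "swipe_right"
    if dx < -threshold:
        return "swipe_left"
    return None

def run_pipeline():
    # Imported here so the pure logic above stays testable without a webcam.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)  # Input Module: live webcam stream
    prev_x = None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Processing Module: MediaPipe expects RGB frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Landmark 0 is the wrist in MediaPipe's 21-point hand model.
            x = results.multi_hand_landmarks[0].landmark[0].x
            if prev_x is not None:
                gesture = detect_swipe(prev_x, x)
                if gesture:
                    print("Recognized:", gesture)  # would trigger a slide change
            prev_x = x
        # Feedback Module: show the camera view while gesturing.
        cv2.imshow("Gesture feedback", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

# Calling run_pipeline() opens the webcam and prints recognized swipes.
```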
4.2 Diagrams
5. PERFORMANCE ANALYSIS
5.1 Output Screens
Fig. 5.1.1 Pointer Gesture
The system stands out for its impressive responsiveness, delivering real-time gesture recognition
with almost no noticeable delay. This makes it ideal for presentations, ensuring smooth and
natural interactions. With gesture-based controls, users can navigate their slides without the need
for a mouse or keyboard, keeping their focus on the audience and the content.
The accuracy of gesture recognition is a key feature of the tool. Testing shows a success rate of
85% to 95%, depending on factors like lighting and the clarity of hand movements. To improve
reliability, the system uses smart algorithms that adapt to slight variations in gestures, making it
work well in different environments and conditions.
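One simple way to realize the adaptive behaviour described above is to require a gesture to win a majority vote over the last few frames before acting on it. The window size and vote threshold below are illustrative assumptions, not the report's tuned values.

```python
from collections import Counter, deque

class GestureSmoother:
    """Suppress jitter by acting only on gestures seen in most recent frames."""

    def __init__(self, window=5, min_votes=3):
        self.history = deque(maxlen=window)
        self.min_votes = min_votes

    def update(self, label):
        """Feed one per-frame prediction; return a stable label or None."""
        self.history.append(label)
        counts = Counter(l for l in self.history if l is not None)
        if not counts:
            return None
        best, votes = counts.most_common(1)[0]
        return best if votes >= self.min_votes else None
```

A scheme like this trades a few frames of latency for far fewer spurious slide changes under poor lighting.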
Accessibility is one of the biggest strengths of this tool. It has been designed to include everyone,
especially those with physical disabilities. By removing the need for traditional input devices, the
tool ensures that people with limited mobility can still use it easily. This thoughtful design makes
the system accessible to a wider range of users.
Another advantage is its efficient use of resources. The tool runs well on standard hardware and does not require specialized or expensive peripherals. This makes it easy to set up and affordable for individual users and organizations alike. By leveraging existing devices, it ensures flexibility and widespread usability.
Users have shared highly positive feedback about the tool. Many have praised its user-friendly
interface, which is straightforward and easy to understand. The system is quick to learn, even for
first-time users. Its interactive features and smooth performance have been especially appreciated
for making presentations more engaging and effortless.
Overall, the system combines advanced technology with a simple, practical design. It’s a reliable
and accessible tool that makes giving presentations easier and more enjoyable for everyone.
5.3 Testing
Testing of Gesture-Controlled PowerPoint Tool
Objective of Testing
The purpose of testing the Gesture-Controlled PowerPoint Tool is to evaluate its functionality,
usability, and performance to ensure that it meets the intended design goals and user
requirements. This section details the testing methodology, results, and conclusions.
Testing Methodology
Testing was conducted in two phases: Functional Testing and Usability Testing.
Functional Testing:
This phase tested whether the system functions correctly according to the specified requirements,
such as gesture recognition, PowerPoint control features, and system response time. Various
gestures (e.g., swipe left, swipe right, and gestures for transitions) were tested for accuracy and
responsiveness.
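The functional checks described above can be expressed as small automated tests. The classify_gesture function below is a hypothetical stand-in for the tool's recognizer and its threshold is an assumption; the point is the pytest-style shape of the checks, not the exact logic.

```python
def classify_gesture(dx, threshold=0.15):
    """Toy recognizer: map horizontal displacement to a navigation command."""
    if dx > threshold:
        return "next_slide"
    if dx < -threshold:
        return "previous_slide"
    return None

def test_swipe_right_advances():
    assert classify_gesture(0.3) == "next_slide"

def test_swipe_left_goes_back():
    assert classify_gesture(-0.3) == "previous_slide"

def test_small_motion_is_ignored():
    assert classify_gesture(0.05) is None
```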
Usability Testing:
In this phase, the focus was on the ease of use and user experience. Participants (including team
members and external users) were asked to interact with the tool during a PowerPoint
presentation. Feedback was collected on aspects like comfort, intuitiveness, and any difficulties
experienced during the interaction.
Conclusion:
The testing phase revealed that the Gesture-Controlled PowerPoint Tool works well in most
conditions and provides an innovative way of interacting with presentations. However, there are
areas for improvement, particularly in gesture recognition under varied lighting conditions and
response times. With further fine-tuning and user feedback, the tool has the potential to enhance
presentation delivery.
6. CONCLUSIONS
6.1 Conclusions
The Gesture-Controlled PowerPoint Tool makes presentations easier and more interactive by
letting users navigate slides without using a mouse or keyboard. This hands-free approach
adds a fun and engaging element, making it perfect for keeping the audience’s attention. It’s
also designed with accessibility in mind, making sure people with physical challenges can
use it comfortably, so everyone gets a fair chance to benefit from it.
We built this tool to be affordable by using open-source software and regular hardware, so
anyone can set it up without spending too much. It’s practical and fits right into today’s
needs, whether you’re using it in a classroom, office, or business meeting. By combining
simple tech with modern trends, the tool works well in a variety of settings and adds real
value for its users.
Adding cloud integration would take things even further. Users could store their
presentations online and access them from anywhere, eliminating the hassle of carrying files
around. This upgrade would make the tool perfect for classrooms, conference halls, and even
remote meetings. It would offer a flexible, convenient solution that adapts to modern
presentation needs while keeping everything smooth and interactive.
REFERENCES
[1] D. Jadhav and L. M. R. J. Lobo, "Hand Gesture Recognition System to Control Slide Show Navigation," IJAIEM, vol. 3, no. 4, 2014.
[2] M. Harika, A. Setijadi P., H. Hindersah, and Bong-Kee Sin, "Finger-Pointing Gesture Analysis for Slide Presentation," Journal of Korea Multimedia Society, vol. 19, no. 8, August 2016.
[3] A. K. H. AlSaedi and A. H. H. Al Asadi, "A New Hand Gestures Recognition System," Indonesian Journal of Electrical Engineering and Computer Science, vol. 18, 2020.
[4] D. O. Lawrence and M. J. Ashleigh, "Impact of Human-Computer Interaction (HCI) on Users in Higher Educational System: Southampton University as a Case Study," vol. 6, no. 3, pp. 1-12, September 2019.
[5] I. Dhall, S. Vashisth, and G. Aggarwal, "Automated Hand Gesture Recognition using a Deep Convolutional Neural Network," 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence), 2020.
ACKNOWLEDGEMENT
We extend our sincere gratitude to the Principal, Vice Principal, and the Head of the
Department of Computer Science at JNEC College, MGM University, for their unwavering
support and encouragement throughout the duration of our project, Gesture-Controlled
PowerPoint Tool. Their guidance and motivation played a pivotal role in fostering our
development and enabling us to accomplish this milestone.
A heartfelt thank you to our project mentor, whose advice and feedback helped us tackle
challenges and improve our work. We are also grateful to our faculty members, whose
knowledge and insights guided us throughout the process and contributed to the success of
this project.
Lastly, we want to thank our team members for their dedication and teamwork. This project
was a journey of learning, problem-solving, and creativity, and it wouldn’t have been
possible without everyone’s effort and enthusiasm.