
SMART ATTENDANCE SYSTEM WITH HAND

CONTROLLED PRESENTATION

A PROJECT REPORT

Submitted by

VINOJI.V (961618104038)

ARJUN.M (961618104012)

RIKSON RAJU (961618104027)

in partial fulfillment for the award of the degree

of

BACHELOR OF ENGINEERING

in

COMPUTER SCIENCE AND ENGINEERING

MARTHANDAM COLLEGE OF ENGINEERING AND TECHNOLOGY


KUTTAKUZHI, KANYAKUMARI DISTRICT - 629 177

ANNA UNIVERSITY: CHENNAI 600 025

JULY 2022
ANNA UNIVERSITY: CHENNAI 600 025

BONAFIDE CERTIFICATE

Certified that this project report “SMART ATTENDANCE SYSTEM WITH

HAND CONTROLLED PRESENTATION” is the bonafide work of
“VINOJI.V (961618104038), ARJUN.M (961618104012), RIKSON RAJU
(961618104027)” who have carried out the project work under my supervision.

SIGNATURE SIGNATURE
Mrs. ANCHANA B.S., M.E.                         Mr. J. PRAKASH, M.E.
HEAD OF THE DEPARTMENT SUPERVISOR
Computer Science & Engineering Computer Science & Engineering
Marthandam College of Marthandam College of
Engineering and Technology Engineering and Technology
Kuttakuzhi-629177 Kuttakuzhi-629177

Submitted for Project Viva-Voce held on………………………

INTERNAL EXAMINER EXTERNAL EXAMINER


ABSTRACT

Hand recognition makes it possible to control a PowerPoint presentation through hand gestures, so the user does not need a keyboard, mouse or laser pointer to control the presentation. The system does not use traditional aids for hand gesture recognition such as gloves, markers, rings, pens or any other devices. A camera records a live video stream from which snapshots are taken through the interface, and the system is trained for each finger-count gesture. The system also aims to build a class attendance system based on the concept of face recognition. Students' faces are stored in a database; during live streaming video of the classroom the faces are detected, matched against the database, and the attendance is displayed. Machine learning methods implemented in Python are used for both face recognition and gesture recognition. The concept of an attendance system based on face recognition technology is proposed, and research on a face recognition attendance system based on real-time video processing is carried out. Such a system can quickly complete student check-in, eliminate the cumbersome roll-call procedure, greatly improve classroom efficiency, and play an important role in guiding the development of time and attendance systems.

ii
ACKNOWLEDGEMENT

It gives us pleasure to acknowledge our indebtedness to all those

who have helped us in completing this project.

First and foremost, we would like to thank God Almighty for the blessings
He has showered upon us to complete this project successfully.

We wish to convey our immense gratitude and thanks to our respected

Chairman Prof. Dr. T. James Wilson, B.E., M.B.A., M.I.Mar.Tech., Ph.D., and
Vice Chairman Er. F. Prince Vino, B.E., for providing us all the facilities for the
successful completion of the project.

We are grateful to Dr. C. Sudhahar, M.E., Ph.D., Principal of our

institution, for arranging the facilities for the successful completion of our project.

We are deeply indebted to Mrs. Anchana B.S., M.E., Head of the Computer

Science and Engineering Department, who gave us splendid help, valuable
suggestions and support in bringing out this dissertation work successfully.

We would like to express our sincere thanks to Mr. J. Prakash, M.E.,

Assistant Professor, Department of Computer Science and Engineering, for his
valuable guidance, encouragement and immense help in making the project a success.
We extend our sincere thanks to all the staff members of our Computer Science
department. VINOJI.V
ARJUN.M
RIKSON RAJU

iii
TABLE OF CONTENTS

CHAPTER TITLE PAGE


NO. NO.
ABSTRACT ii

ACKNOWLEDGEMENT iii

TABLE OF CONTENTS iv

LIST OF FIGURES vi

LIST OF ABBREVIATIONS vii

1 INTRODUCTION 1

2 LITERATURE REVIEW 4
3 SYSTEM ANALYSIS 7
3.1 EXISTING SYSTEM 8

3.2 PROPOSED SYSTEM 8

3.3 SYSTEM REQUIREMENTS 9

3.4 LANGUAGE DESCRIPTION 9

4 SYSTEM DESIGN 12

4.1 SYSTEM ARCHITECTURE 12

4.2 ACTIVITY DIAGRAM 13

5 IMPLEMENTATION 14

5.1 MODULES 14

5.1.1 Hand Gesture Controlled Presentation 14

iv
5.1.2 Smart Attendance System 15

6 CONCLUSION AND FUTURE ENHANCEMENT 16

REFERENCES 18

APPENDICES 20

A1. SCREEN SHOTS 20

A2. SOURCE CODE 23

v
LIST OF FIGURES

FIGURE NO. TITLE PAGE NO.

4.1.1 System Architecture 12


4.2.1 Activity Diagram 13
A1.1 UI page 20

A1.2 Hand Tracking 21

A1.3 Draw over slide 22

A1.4 Smart Attendance system 22

A1.5 Student Details 23

A1.6 Attendance Details 23

vi
LIST OF ABBREVIATIONS

RFID - Radio Frequency Identification

ID - Identity Card

RS 232 - Recommended Standard 232

RGB - Red Green Blue

CNN - Convolutional Neural Network

NUMPY - Numerical Python

SCIPY - Scientific Python

YOLO - You Only Look Once

COCO - Common Objects In Context

HSI - Hue, Saturation, and Intensity

vii
CHAPTER 1
INTRODUCTION
1.1 OVERVIEW
In this era of Internet explosion, computer technology has entered many areas of
people's lives and work. The occasions where people come into contact with computers
are gradually expanding, and the frequency with which people use computers keeps
increasing. As an important identity label for distinguishing different individuals, face
recognition and hand recognition technology has gradually entered people's lives. Face
recognition combines artificial intelligence and computer vision; because of its
challenging innovation and broad application prospects, it has become one of the most
actively studied topics in the field.

There has been an increasing demand for a more active and interesting viewing
experience, and interactive projection technology has been considered as a solution to
this need. For example, if you can flip pages with a gesture while making a presentation,
write a sentence without any manual tools, or control appliances with gestures, then
presentations become more immersive and attractive to the audience.

An interactive projection system also helps people to produce more attractive
artistic exhibits, such as interactive walls and floors. For example, images of the
current fabrication process can be processed into binary images, from which various
features and shapes are analyzed to check the current status. This project deals with
the development of an interactive projection technology that provides a more active
and interesting viewing experience by recognizing the user's gestures in real time;
the presenter can even continue the presentation while walking among the audience.
Using this method, a few interactive applications are developed. The system also
provides a means to control electronic devices using hand gestures, so it can act as a
remote control for the consumer electronic devices present in a house. The detected
and recognized hand gestures are used as command signals for controlling devices.
The product can also be used to display a text or an image, which allows disabled
users to express their needs: all the user has to do is wear the product and show a
pre-defined gesture. A single product thus helps in the presentation field, in home
automation, and as an aid for the disabled. The basic purpose of this system is to
provide a means to control electronic devices (capable of infrared communication)
using hand gestures.

In recent years, face recognition application systems have developed rapidly as a
computer security technology around the world, and the technology has received more
and more attention. Face recognition has many typical applications in public safety, the
civil economy and home entertainment. Recording the attendance of personnel has
become a basic requirement of most enterprises, yet attendance systems often introduce
unnecessary errors. Taking the current fingerprint attendance system as an example,
studies have found an error rate of about 5%, and fingerprints sometimes fail to register,
which seriously affects the efficiency of attendance and is likely to cause congestion at
large attendance sites. Card-based attendance systems suffer from employees swiping
cards for someone else, making real-time attendance difficult to achieve. Compared with
these two systems, a face recognition system offers higher accuracy and stability, because
a face provides more distinguishing points, and it is far less prone to congestion. Although
China's research on face recognition technology started late, researchers have caught up
and some leading figures have established their own positions in the field of face
recognition. With the advent of the big-data era and the commercial value of face
recognition technology, the prospects of this research are very bright and the market
demand is great. Future attendance systems will continue to innovate in form, greatly
improving attendance rates and the reliability of face recognition technology; the topic is
worthy of further exploration and realization.

3
CHAPTER 2
LITERATURE REVIEW
2.1 A Counterpart Approach to Attendance and Feedback System using
Machine Learning Techniques :

In this paper, two technologies, a student attendance system and a feedback system,
are implemented with a machine learning approach. The system automatically records
the student's performance and maintains records such as attendance and feedback on
subjects like Science, English, etc. The attendance of a student is registered by
recognizing the face; on recognition, the attendance details and the student's marks
are returned as feedback.

N.Sudhakar Reddy, M.V.Sumanth, S.Suresh Babu, "A Counterpart Approach to


Attendance and Feedback System using Machine Learning Techniques",Journal of
Emerging Technologies and Innovative Research (JETIR), Volume 5, Issue 12, Dec
2018

2.2 Automated Attendance System Using Face Recognition

The Automated Attendance System using Face Recognition is based on face detection
and recognition algorithms that automatically detect a student's face when he/she
enters the class and mark the attendance by recognizing the student. The Viola-Jones
algorithm is used for face detection, which detects human faces with a cascade
classifier; the PCA algorithm is used for feature selection and SVM for classification.
Compared with traditional attendance marking, this system saves time and also helps
to monitor the students.

4
Akshara Jadhav, Akshay Jadhav, Tushar Ladhe, Krishna Yeolekar, "Automated
Attendance System Using Face Recognition", International Research Journal of
Engineering and Technology (IRJET), Volume 4, Issue 1, Jan 2017.

2.3 Student Attendance System Using Iris Detection

In this proposed system the student is requested to stand in front of the camera so
that the iris can be detected and recognized and the attendance marked. Algorithms
such as grayscale conversion, the six-segment rectangular filter and skin pixel
detection are used to detect the iris. The approach prevents proxy issues and
maintains student attendance effectively, but it is time-consuming, since each student
or staff member must wait until the previous person has finished.

Prajakta Lad, Sonali More, Simran Parkhe, Priyanka Nikam, Dipalee Chaudhari, "
Student Attendance System Using Iris Detection", IJARIIE-ISSN(O)-2395-4396,
Vol-3 Issue-2 2017.

2.4 Face Based Recognition System

Facial recognition technology can be used to record attendance through a
high-resolution digital camera that detects and recognizes the faces of the students,
after which the machine compares each recognized face with the student face images
stored in the database. Once a student's face matches a stored image, the attendance
is marked in the attendance database for further calculation. If the captured image
does not match any student face in the database, the image is stored as a new entry.
In this system there is a possibility that the camera does not capture an image
properly or misses some of the students.

Yohei KAWAGUCHI, Tetsuo SHOJI, Weijane LIN, Koh KAKUSHO, Michihiko


MINOH, "Face Recognition-based Lecture Attendance System", Oct 2019.

2.5 Gesture-Based Interaction

For Human–Computer Interaction (HCI), hand gestures have a wide range of
applications: they can speed up interaction with the computer, provide a
user-friendly and aesthetic experience that attracts users, allow remote non-physical
contact with the device for user comfort and security, and monitor dynamic and
virtual environments in a much simpler way.

Andrea A., 2019. Advantages and drawbacks of gesture-based interaction. Computer

Science—Miscellaneous 369514: 1–11.

2.6 Hand gesture recognition for human computer interaction.

The computer is capable of recognizing the various users and environmental factors
that occur and exist around it. Hand gesture recognition is a type of perceptual
computing user interface used in HCI to enable computers to capture and interpret
hand gestures and execute commands based on the interpreted gesture.

Meenakshi P, Pawan Singh M., 2018. Hand gesture recognition for human computer
interaction. In: International Conference on Image Information Processing.
Piscataway: IEEE, 1–7.

6
CHAPTER 3
SYSTEM ANALYSIS

3.1 EXISTING SYSTEM

The existing setup consists of the generic mouse and trackpad for controlling the
monitor, with no hand gesture system available; remote control of the monitor screen
using hand gestures is not possible. Even though there have been many attempts at
implementation, the scope has largely been restricted to the virtual mouse. The
existing virtual mouse control systems provide only simple mouse operations through
hand recognition, such as pointer control, left click, right click and drag, and no
further use is made of hand recognition. Although a number of systems exist for hand
recognition, most use static hand recognition, which simply recognizes the shape
made by the hand and assigns an action to each shape; this limits the number of
possible actions and causes a large amount of confusion.

In the fingerprint-based attendance system, a portable fingerprint device needs to be
configured with the students' fingerprints in advance. Later, either during or before
the lecture hours, the student records a fingerprint on the configured device to
confirm attendance for the day. The problem with this approach is that collecting
fingerprints during lecture time may distract the students.

In the RFID-based system, the student needs to carry a Radio Frequency Identity
Card and place it on the card reader to record their presence for the day. The system
can connect over RS-232 and record the attendance into the saved database. However,
fraudulent access is possible: students may use another student's ID to register
presence when that student is absent, or otherwise misuse the card.

7
3.2 PROPOSED SYSTEM

This system is based on a gesture recognition method proposed for interactive

projection systems, which respond to the increasing demand for a more active and
interesting viewing experience. Using this system you can flip pages with a gesture
while making a presentation, write a sentence without any manual tools, and control
appliances with gestures, making presentations more immersive and attractive to the
audience. The system provides a means to control electronic devices using hand
gestures: the detected and recognized gestures are used as command signals for
controlling the devices. It is a great technological advance that makes presentations
more attractive.

The task of the proposed attendance component is to capture the face of each student
and store it in the database. The face of the student needs to be captured in such a
way that all facial features are detected, regardless of the seating position and posture
of the student. There is no need for the teacher to take attendance manually, because
the system records a video and, through further processing steps, recognizes the faces
and updates the attendance database.

8
3.3 SYSTEM REQUIREMENTS

3.3.1 Hardware Requirements

Processor : Intel Core i3, 1.19 GHz
RAM       : 4 GB
Hard Disk : 1 TB
Mouse     : Optical Mouse
Keyboard  : 104-key Standard
Monitor   : SVGA
GPU       : 4 GB
Webcam    : HD
3.3.2 Software Requirements
Python
Anaconda
OpenCV
3.4 LANGUAGE DESCRIPTION

About the Chosen Software

1. Python is an interpreted, high-level, general-purpose programming language.
   Python's design philosophy emphasizes code readability with its notable use of
   significant indentation. Its language constructs and object-oriented approach aim
   to help programmers write clear, logical code for small and large-scale projects.

2. Guido van Rossum began working on Python in the late 1980s as a successor to
   the ABC programming language, and first released it in 1991 as Python 0.9.0.
   Python 2.0 was released in 2000 and introduced new features such as list
   comprehensions and a cycle-detecting garbage collection system.

3. Python 3.0, released in 2008, was a major revision that is not completely
   backward-compatible, and much Python 2 code does not run unmodified on
   Python 3. Python 2 was discontinued with version 2.7.18 in 2020.

4. Python consistently ranks as one of the most popular programming languages.

COMMON USES OF PYTHON

1. Data visualisation
2. Programming applications
3. Web development
4. AI and machine learning
5. Data analytics
6. Game development
7. Language development
8. Finance

CHARACTERISTICS OF PYTHON

1. Its ease of use. For those who are new to coding and programming, Python
can be an excellent first step. It’s relatively easy to learn, making it a great
way to start building your programming knowledge.
2. Its simple syntax. Python is relatively easy to read and understand, as its
syntax is more like English. Its straightforward layout means that you can
work out what each line of code is doing.

10
3. Its thriving community. As it’s an open-source language, anyone can use
Python to code. What’s more, there is a community that supports and develops
the ecosystem, adding their own contributions and libraries.
4. Its versatility. As we'll explore in more detail, there are many uses for
Python. Whether you're interested in data visualisation, artificial intelligence
or web development, you can find a use for the language.

OpenCV-Python

1. OpenCV-Python is a library of Python bindings designed to solve computer
   vision problems.

2. Python is a general-purpose programming language started by Guido van
   Rossum that became very popular very quickly, mainly because of its simplicity
   and code readability. It enables the programmer to express ideas in fewer lines
   of code without reducing readability.

3. Compared to languages like C/C++, Python is slower. That said, Python can be
   easily extended with C/C++, which allows us to write computationally intensive
   code in C/C++ and create Python wrappers that can be used as Python modules.
   This gives us two advantages: first, the code is as fast as the original C/C++ code
   (since it is the actual C++ code working in the background), and second, it is
   easier to code in Python than in C/C++. OpenCV-Python is a Python wrapper
   around the original OpenCV C++ implementation.
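A minimal illustration of these bindings is given below; the image file name is hypothetical, and the sketch simply loads an image and converts it to grayscale, with the heavy computation done by the underlying C++ library while the Python script stays short.

import cv2

# Read an image from disk (path is only an example) as a NumPy array in BGR order.
img = cv2.imread("sample.jpg")
if img is None:
    raise FileNotFoundError("sample.jpg not found")

# Convert to grayscale and display both images until a key is pressed.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
cv2.imshow("Original", img)
cv2.imshow("Grayscale", gray)
cv2.waitKey(0)
cv2.destroyAllWindows()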

11
CHAPTER 4
SYSTEM DESIGN

4.1 SYSTEM ARCHITECTURE
This gives a high-level view of the system, showing the main components, the
services they provide, and how they communicate within the system.

Figure 4.1.1:System Architecture

12
4.2 ACTIVITY DIAGRAM

Figure 4.2.1:Activity diagram for attendance

13
CHAPTER 5
IMPLEMENTATION

5.1 MODULES

The project can be divided into two parts:

1. Hand Gesture Controlled Presentation


2. Smart Attendance System

5.1.1 Hand Gesture Controlled Presentation

1. Image Acquisition
2. Segmentation of hand region
3. Finger count recognition
4. Motion recognition
1. Image Acquisition
The user makes gestures by positioning the hand parallel to the webcam. Images are
captured continuously and then given as input for segmentation.
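A minimal sketch of this acquisition step is shown below; it opens the default webcam with OpenCV and streams frames that the later stages can consume. The mirror flip is an assumption for presenter comfort, not a requirement of the method.

import cv2

cap = cv2.VideoCapture(0)        # default webcam
while True:
    ok, frame = cap.read()       # one frame per iteration, handed on to segmentation
    if not ok:
        break
    frame = cv2.flip(frame, 1)   # mirror view (optional)
    cv2.imshow("Acquired frame", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()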
2. Segmentation of Hand Region
The captured images are analysed using segmentation. A hand detection algorithm is
used to separate the hand region from the input image, since the background may
contain many other objects besides the hand. The video obtained through the webcam
is in the RGB colour model; it is converted to the HSI colour model because the
regions belonging to the hand can be identified more easily in HSI. The rules for hand
segmentation are then applied and, after the hand is recognized, the image is
converted into a binary image in which hand regions are white and all non-hand
regions are black. The largest connected white region is taken as the hand region.
This segmented hand region is the region of interest, and the recognition of the
gestures depends on it.
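A compact sketch of this segmentation step follows. OpenCV exposes the HSV colour space, a close relative of the HSI model described above, so the sketch thresholds skin colour in HSV; the threshold values are illustrative assumptions that depend on lighting and skin tone.

import cv2
import numpy as np

def segment_hand(frame_bgr):
    # Convert BGR to HSV (stand-in for the HSI conversion described above).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative skin-colour bounds; tune for the actual environment.
    lower = np.array([0, 30, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)            # binary image: hand-like pixels white
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # Keep only the largest connected white region, taken as the hand.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if num <= 1:
        return np.zeros_like(mask)
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return np.where(labels == largest, 255, 0).astype(np.uint8)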

3. Finger Count and Motion Recognition

After segmentation, the binary image is passed to the distance transform method,
which counts the number of active fingers or detects the motion of the hand. First,
the centroid of the palm is calculated by considering each pixel and computing its
distance from the nearest boundary; the pixel that is farthest from every boundary is
chosen as the centroid. Using this centroid, the active fingers are counted, and hand
motion is detected from the movement of the centroid across a set of continuously
captured images. The slide show is controlled accordingly: after a gesture is
recognized, the PowerPoint presentation moves to the next slide, the previous slide
or a particular numbered slide.
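The sketch below locates the palm centroid exactly as described, via the distance transform; the finger count itself uses a common convexity-defect heuristic as a stand-in, since the report does not spell out its exact counting rule. Tracking how the returned centre moves across successive frames gives the motion cue used to change slides.

import cv2
import numpy as np

def palm_centre(binary_hand):
    # Distance of every white pixel from the nearest boundary; the maximum is the palm centre.
    dist = cv2.distanceTransform(binary_hand, cv2.DIST_L2, 5)
    _, max_val, _, max_loc = cv2.minMaxLoc(dist)
    return max_loc, max_val            # (x, y) centroid and approximate palm radius

def count_fingers(binary_hand, palm_radius):
    # Rough finger count from convexity defects (illustrative heuristic, not the report's rule).
    contours, _ = cv2.findContours(binary_hand, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Each sufficiently deep defect is a valley between two raised fingers.
    valleys = sum(1 for k in range(defects.shape[0])
                  if defects[k, 0, 3] / 256.0 > 0.5 * palm_radius)
    return min(valleys + 1, 5)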

5.1.2 Smart Attendance System

1. Capture Video
2. Separate the Video into Frames
3. Face Detection
4. Face Recognition
5. Post-Processing
1. Capture Video
The camera is fixed at a specific distance inside the classroom to capture video of
the frontal images of all the students in the class.
2. Separate the Video into Frames
The captured video needs to be converted into frames (for example, one frame per
second) for easier detection and recognition of the students.
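A short sketch of this conversion is given below; the video file name is hypothetical and the sampling rate of one frame per second is an assumption for illustration.

import cv2

cap = cv2.VideoCapture("classroom.mp4")          # recorded classroom video (example path)
fps = int(cap.get(cv2.CAP_PROP_FPS)) or 25       # fall back if FPS metadata is missing

frames = []
index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if index % fps == 0:                         # keep roughly one frame per second
        frames.append(frame)
    index += 1
cap.release()
print("Extracted", len(frames), "frames for face detection")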

15
3. Face Detection
Face detection is the process in which the input image is searched to find any face;
after a face is found, image processing cleans up the facial image for easier
recognition. A CNN algorithm can be implemented to detect the faces.
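The report notes that a CNN can be used for detection; the appendix code instead relies on OpenCV's Haar cascade, and the sketch below follows that approach (the cascade XML file ships with OpenCV).

import cv2

cascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")

def detect_faces(frame):
    # Returns the cropped grayscale face regions together with their bounding boxes.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    return [(gray[y:y + h, x:x + w], (x, y, w, h)) for (x, y, w, h) in boxes]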
4. Face Recognition
After the face has been detected and processed, it is compared with the faces present
in the students' database to update the attendance of the students.
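A minimal sketch of the matching step is shown below; it assumes the LBPH model trained by the appendix code (opencv-contrib is required) and uses the same confidence threshold of 50, where a lower LBPH confidence means a better match.

import cv2

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.read("TrainingImageLabel/Trainner.yml")   # model file produced by the training step

def recognize(face_gray, threshold=50):
    # Returns the student's serial number, or None when no stored face matches well enough.
    serial, confidence = recognizer.predict(face_gray)
    return serial if confidence < threshold else None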
5. Post-Processing
The post-processing step updates the names of the recognized students in an Excel
sheet. The sheet can be maintained on a weekly or monthly basis to record the
students' attendance, and the attendance record can be sent to the students' parents
or guardians to report on their performance.
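A small sketch of this step follows; the folder and column names are illustrative (the appendix code writes a dated CSV in the same spirit), and pandas can export the same table to an Excel sheet if openpyxl is installed.

import pandas as pd
from datetime import datetime

def update_attendance(records, folder="Attendance"):
    # records: list of (student_id, name) tuples recognised in this session.
    if not records:
        return None
    now = datetime.now()
    rows = [{"Id": sid, "Name": name,
             "Date": now.strftime("%d-%m-%Y"),
             "Time": now.strftime("%H:%M:%S")} for sid, name in records]
    df = pd.DataFrame(rows).drop_duplicates(subset="Id")    # one entry per student per day
    path = folder + "/Attendance_" + now.strftime("%d-%m-%Y") + ".csv"
    df.to_csv(path, index=False)
    # df.to_excel(path.replace(".csv", ".xlsx"), index=False)   # optional weekly/monthly sheet
    return path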

16
CHAPTER 6

CONCLUSION AND FUTURE ENHANCEMENT

Thus, the aim of this project is to capture video of the students, convert it into
frames, relate it with the database to confirm their presence or absence, and mark
attendance for each student in order to maintain the record. The Automated
Classroom Attendance System helps to increase accuracy and speed, ultimately
achieving high-precision real-time attendance to meet the need for automatic
classroom evaluation.

Both static and dynamic gestures are used to control the slide show. Any finger can
be used to denote a gesture; there is no restriction to a specific finger for a particular
gesture, since active fingers are counted using the distance transform method. The
drawbacks of previously implemented methods, such as the circular profiling method
using a single hand, are overcome because the distance transform method can use
both hands. The approach does not require a training phase to identify a hand gesture
and hence does not require storing images in a database to recognize gestures. The
use of hand gestures can be extended to control real-time applications such as paint
programs, PDF readers, robot arm control, drone control, etc.
The automated attendance system can also be deployed in larger areas such as a
seminar hall, where it helps in sensing the presence of many people. Poor lighting
conditions in the classroom may sometimes affect image quality, which indirectly
degrades system performance; this can be overcome at a later stage by improving the
quality of the video or by using suitable enhancement algorithms.

17
REFERENCES

1. Ahmed Elgammal, Vinay Shet, Yaser Yacoob, Larry S. Davis (2003).


Learning Dynamics for Exemplar-based Gesture Recognition. In Proceedings
of the Fifteenth International Conference on Computer Vision and Pattern
Recognition, pages 578. IEEE.

2. Akshara Jadhav, Akshay Jadhav, Tushar Ladhe, Krishna Yeolekar, (2017).


"Automated Attendance System Using Face Recognition", International
Research Journal of Engineering and Technology (IRJET).

3. Asanterabi Malima, Erol Ozgur (2006). A fast algorithm for vision-based
hand gesture recognition for robot control. In International Conference on
Signal Processing and Communications Applications, pages 1–4. IEEE.

4. B Prabhavathi, V Tanuja, V Madhu Viswanatham and M Rajashekhara


Babu(2017) "A smart technique for attendance system to recognize faces
through parallelism", IOP Conf. Series: Materials Science and Engineering
263.

5. Dan Wang, Rong Fu, Zuying Luo,(2017) "Classroom Attendance Auto-


management Based on Deep Learning",Advances in Social Science,
Education and Humanities Research, volume 123,ICESAME .

6. Ginu Thomas.(2006) Review of Various Hand Gesture Recognition


Techniques. VSRD-IJEECE, Vol. 1 (7), 2011, 374-383

18
7. Lars Bretzner, Ivan Laptev, Tony Lindeberg (2002). Hand Gesture
Recognition using Multi-Scale Color Features, Hierarchical Models and
Particle Filtering. In Proceedings of the Fifth International Conference on
Automatic Face and Gesture Recognition, pages 423–428. IEEE.

8. N.Sudhakar Reddy, M.V.Sumanth, S.Suresh Babu (2008)"A Counterpart


Approach to Attendance and Feedback System using Machine Learning
Techniques",Journal of Emerging Technologies and Innovative Research
(JETIR), Volume 5, Issue 12.

9. Prajakta Lad, Sonali More, Simran Parkhe, Priyanka Nikam, Dipalee

Chaudhari (2009). "Student Attendance System Using Iris Detection",
IJARIIE, ISSN(O)-2395-4396, Vol-3, Issue-2.

10. Ruize Xu, Shengli Zhou, Wen J. Li (2012). MEMS Accelerometer Based
Nonspecific-User Hand Gesture Recognition. IEEE, Vol. 12, 1166–1173.

11. Sheng-Yu Peng, Wattanachote K., Hwei-Jen Lin and Kuan-Ching Li (2011).
A Real-Time Hand Gesture Recognition System for Daily Information
Retrieval from the Internet. In 4th International Conference on Ubi-Media
Computing (UMedia), pages 146–151. IEEE.

12. Siddharth Swarup Rautaray and Anupam Agrawal (2010). A Vision based
Hand Gesture Interface for Controlling VLC Media Player. International
Journal of Computer Applications. Vol: 10, 0975-8887

13. Yikai Fang, Jian Cheng, Kongqiao Wang, Hanqing Lu (2020). Hand
Gesture Recognition Using Fast Multi-scale Analysis. In Fourth International
Conference on Image and Graphics, pages 694–698. IEEE.

19
A. APPENDICES

A1. SCREENSHOTS

Figure A1.1:UI page

20
Figure A1.2:Hand Tracking

Figure A1.3:Using gesture draw over slide

Figure A1.4:Smart attendance system

21
Figure A1.5:Student details in database

Figure A1.6:Attendance Details

22
A2.SOURCE CODE

Ui Code:

import os
import tkinter.font as font
from tkinter import *
from tkinter import ttk, filedialog
from tkinter.filedialog import askopenfile
from PIL import Image, ImageTk

root = Tk()
root.geometry("1280x720")
img = ImageTk.PhotoImage(Image.open("v.png"))
panel = Label(root, image=img)
panel.pack(side="bottom", fill="both", expand="yes")

DataFrame = Frame(root, bd=15, relief=RIDGE, padx=20, bg="black")
DataFrame.place(x=0, y=10, width=1280, height=200)
root.lblsearch = Label(DataFrame, font=("arial", 40, "bold"),
                       text="HAND GESTURE CONTROLLED PRESENTATION",
                       padx=2, bg="magenta", fg="black")
root.lblsearch.place(relx=0.5, rely=0.5, anchor=CENTER)

def run_program():
    # Launch the gesture-controlled presentation script in a separate process
    os.system('python roi.py')

rightframe = Frame(root)
rightframe.place(anchor=CENTER)

buttonFont = font.Font(family='Helvetica', size=12, weight='bold')

B = Button(root, text="PRESENT POWER POINT", bd=15, relief=RIDGE, padx=20,
           height=5, width=20, command=run_program, bg="black", fg="red",
           font=buttonFont)
B.place(relx=0.5, rely=0.5, anchor=CENTER)

# Let us create a label for button event

root.mainloop()

Gesture control

from cvzone.HandTrackingModule import HandDetector
import cv2
import os
import numpy as np

# Parameters
width, height = 1280, 720
gestureThreshold = 300
folderPath = "Presentation"

# Camera Setup
cap = cv2.VideoCapture(0)
cap.set(3, width)   # CAP_PROP_FRAME_WIDTH
cap.set(4, height)  # CAP_PROP_FRAME_HEIGHT

# Hand Detector
detectorHand = HandDetector(detectionCon=0.8, maxHands=1)

# Variables
imgList = []
buttondelay = 30
buttonPressed = False
buttoncounter = 0
drawMode = False
imgNumber = 0
delayCounter = 0
annotations = [[]]
annotationNumber = -1
annotationStart = False
hs, ws = int(120 * 1), int(213 * 1)  # width and height of small image

# Get list of presentation images
pathImages = sorted(os.listdir(folderPath), key=len)
print(pathImages)

while True:
    # Get image frame
    success, img = cap.read()
    img = cv2.flip(img, 1)
    pathFullImage = os.path.join(folderPath, pathImages[imgNumber])
    imgCurrent = cv2.imread(pathFullImage)

    # Find the hand and its landmarks
    hands, img = detectorHand.findHands(img)  # with draw
    # Draw Gesture Threshold line
    cv2.line(img, (0, gestureThreshold), (width, gestureThreshold), (0, 255, 0), 10)

    if hands and buttonPressed is False:  # If hand is detected
        hand = hands[0]
        cx, cy = hand["center"]
        lmList = hand["lmList"]  # List of 21 Landmark points
        fingers = detectorHand.fingersUp(hand)  # List of which fingers are up

        # Constrain values for easier drawing (map index-finger tip to slide coordinates)
        xVal = int(np.interp(lmList[8][0], [width // 2, width], [0, width]))
        yVal = int(np.interp(lmList[8][1], [150, height - 150], [0, height]))
        indexFinger = xVal, yVal

        if cy <= gestureThreshold:  # If hand is at the height of the face
            if fingers == [1, 0, 0, 0, 0]:
                print("Left")
                if imgNumber > 0:
                    buttonPressed = True
                    imgNumber -= 1
                    annotations = [[]]
                    annotationNumber = -1
                    annotationStart = False
            if fingers == [0, 0, 0, 0, 1]:
                print("Right")
                if imgNumber < len(pathImages) - 1:
                    buttonPressed = True
                    imgNumber += 1
                    annotations = [[]]
                    annotationNumber = -1
                    annotationStart = False

        if fingers == [0, 1, 1, 0, 0]:
            cv2.circle(imgCurrent, indexFinger, 12, (0, 0, 255), cv2.FILLED)

        if fingers == [0, 1, 0, 0, 0]:
            if annotationStart is False:
                annotationStart = True
                annotationNumber += 1
                annotations.append([])
            print(annotationNumber)
            annotations[annotationNumber].append(indexFinger)
            cv2.circle(imgCurrent, indexFinger, 12, (0, 0, 255), cv2.FILLED)
        else:
            annotationStart = False

        if fingers == [0, 1, 1, 1, 0]:
            if annotations:
                annotations.pop(-1)
                annotationNumber -= 1
                buttonPressed = True

    else:
        annotationStart = False

    if buttonPressed:
        buttoncounter += 1
        if buttoncounter > buttondelay:
            buttoncounter = 0
            buttonPressed = False

    for i, annotation in enumerate(annotations):
        for j in range(len(annotation)):
            if j != 0:
                cv2.line(imgCurrent, annotation[j - 1], annotation[j], (0, 0, 200), 12)

    imgSmall = cv2.resize(img, (ws, hs))
    h, w, _ = imgCurrent.shape
    imgCurrent[0:hs, w - ws: w] = imgSmall

    cv2.imshow("Slides", imgCurrent)
    cv2.imshow("Image", img)

    key = cv2.waitKey(1)
    if key == ord('q'):
        break

# After the loop release the cap object
cap.release()
# Destroy all the windows
cv2.destroyAllWindows()

Attendance system

import csv
import datetime
import time
import tkinter as tk
import tkinter.simpledialog as tsd
from tkinter import messagebox as mess
from tkinter import ttk

import cv2
import numpy as np
import os
import pandas as pd
from PIL import Image

############################################# FUNCTIONS #############################################

def assure_path_exists(path):
    dir = os.path.dirname(path)
    if not os.path.exists(dir):
        os.makedirs(dir)

####################################################################################

def tick():
    time_string = time.strftime('%H:%M:%S')
    clock.config(text=time_string)
    clock.after(200, tick)

####################################################################################

def contact():
    mess._show(title='Contact us', message="Please contact us on : '[email protected]' ")

####################################################################################

def check_haarcascadefile():
    exists = os.path.isfile("haarcascade_frontalface_default.xml")
    if exists:
        pass
    else:
        mess._show(title='Some file missing', message='Please contact us for help')
        window.destroy()

####################################################################################

def save_pass():
    assure_path_exists("TrainingImageLabel/")
    exists1 = os.path.isfile("TrainingImageLabel\psd.txt")
    if exists1:
        tf = open("TrainingImageLabel\psd.txt", "r")
        key = tf.read()
    else:
        master.destroy()
        new_pas = tsd.askstring('Old Password not found', 'Please enter a new password below', show='*')
        if new_pas == None:
            mess._show(title='No Password Entered', message='Password not set!! Please try again')
        else:
            tf = open("TrainingImageLabel\psd.txt", "w")
            tf.write(new_pas)
            mess._show(title='Password Registered', message='New password was registered successfully!!')
        return
    op = (old.get())
    newp = (new.get())
    nnewp = (nnew.get())
    if (op == key):
        if (newp == nnewp):
            txf = open("TrainingImageLabel\psd.txt", "w")
            txf.write(newp)
        else:
            mess._show(title='Error', message='Confirm new password again!!!')
            return
    else:
        mess._show(title='Wrong Password', message='Please enter correct old password.')
        return
    mess._show(title='Password Changed', message='Password changed successfully!!')
    master.destroy()

####################################################################################

def change_pass():
    global master
    master = tk.Tk()
    master.geometry("400x160")
    master.resizable(False, False)
    master.title("Change Password")
    master.configure(background="white")
    lbl4 = tk.Label(master, text=' Enter Old Password', bg='white', font=('times', 12, ' bold '))
    lbl4.place(x=10, y=10)
    global old
    old = tk.Entry(master, width=25, fg="black", relief='solid', font=('times', 12, ' bold '), show='*')
    old.place(x=180, y=10)
    lbl5 = tk.Label(master, text=' Enter New Password', bg='white', font=('times', 12, ' bold '))
    lbl5.place(x=10, y=45)
    global new
    new = tk.Entry(master, width=25, fg="black", relief='solid', font=('times', 12, ' bold '), show='*')
    new.place(x=180, y=45)
    lbl6 = tk.Label(master, text='Confirm New Password', bg='white', font=('times', 12, ' bold '))
    lbl6.place(x=10, y=80)
    global nnew
    nnew = tk.Entry(master, width=25, fg="black", relief='solid', font=('times', 12, ' bold '), show='*')
    nnew.place(x=180, y=80)
    cancel = tk.Button(master, text="Cancel", command=master.destroy, fg="black", bg="red", height=1, width=25, activebackground="white", font=('times', 10, ' bold '))
    cancel.place(x=200, y=120)
    save1 = tk.Button(master, text="Save", command=save_pass, fg="black", bg="#3ece48", height=1, width=25, activebackground="white", font=('times', 10, ' bold '))
    save1.place(x=10, y=120)
    master.mainloop()

####################################################################################

def psw():
    assure_path_exists("TrainingImageLabel/")
    exists1 = os.path.isfile("TrainingImageLabel\psd.txt")
    if exists1:
        tf = open("TrainingImageLabel\psd.txt", "r")
        key = tf.read()
    else:
        new_pas = tsd.askstring('Old Password not found', 'Please enter a new password below', show='*')
        if new_pas == None:
            mess._show(title='No Password Entered', message='Password not set!! Please try again')
        else:
            tf = open("TrainingImageLabel\psd.txt", "w")
            tf.write(new_pas)
            mess._show(title='Password Registered', message='New password was registered successfully!!')
        return
    password = tsd.askstring('Password', 'Enter Password', show='*')
    if (password == key):
        TrainImages()
    elif (password == None):
        pass
    else:
        mess._show(title='Wrong Password', message='You have entered wrong password')

####################################################################################

def clear():
    txt.delete(0, 'end')
    res = "1)Take Images >>> 2)Save Profile"
    message1.configure(text=res)


def clear2():
    txt2.delete(0, 'end')
    res = "1)Take Images >>> 2)Save Profile"
    message1.configure(text=res)

####################################################################################

def TakeImages():
    check_haarcascadefile()
    columns = ['SERIAL NO.', '', 'ID', '', 'NAME']
    assure_path_exists("StudentDetails/")
    assure_path_exists("TrainingImage/")
    serial = 0
    exists = os.path.isfile("StudentDetails\StudentDetails.csv")
    if exists:
        with open("StudentDetails\StudentDetails.csv", 'r') as csvFile1:
            reader1 = csv.reader(csvFile1)
            for l in reader1:
                serial = serial + 1
        serial = (serial // 2)
        csvFile1.close()
    else:
        with open("StudentDetails\StudentDetails.csv", 'a+') as csvFile1:
            writer = csv.writer(csvFile1)
            writer.writerow(columns)
            serial = 1
        csvFile1.close()
    Id = (txt.get())
    name = (txt2.get())
    if ((name.isalpha()) or (' ' in name)):
        cam = cv2.VideoCapture(0)
        harcascadePath = "haarcascade_frontalface_default.xml"
        detector = cv2.CascadeClassifier(harcascadePath)
        sampleNum = 0
        while (True):
            ret, img = cam.read()
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, 1.3, 5)
            for (x, y, w, h) in faces:
                cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)
                # incrementing sample number
                sampleNum = sampleNum + 1
                # saving the captured face in the dataset folder TrainingImage
                cv2.imwrite("TrainingImage\ " + name + "." + str(serial) + "." + Id + '.' + str(sampleNum) + ".jpg",
                            gray[y:y + h, x:x + w])
                # display the frame
                cv2.imshow('Taking Images', img)
            # wait for 100 milliseconds
            if cv2.waitKey(100) & 0xFF == ord('q'):
                break
            # break if the sample number is more than 100
            elif sampleNum > 100:
                break
        cam.release()
        cv2.destroyAllWindows()
        res = "Images Taken for ID : " + Id
        row = [serial, '', Id, '', name]
        with open('StudentDetails\StudentDetails.csv', 'a+') as csvFile:
            writer = csv.writer(csvFile)
            writer.writerow(row)
        csvFile.close()
        message1.configure(text=res)
    else:
        if (name.isalpha() == False):
            res = "Enter Correct name"
            message.configure(text=res)

####################################################################################

def TrainImages():
    check_haarcascadefile()
    assure_path_exists("TrainingImageLabel/")
    recognizer = cv2.face.LBPHFaceRecognizer_create()
    harcascadePath = "haarcascade_frontalface_default.xml"
    detector = cv2.CascadeClassifier(harcascadePath)
    faces, ID = getImagesAndLabels("TrainingImage")
    try:
        recognizer.train(faces, np.array(ID))
    except:
        mess._show(title='No Registrations', message='Please Register someone first!!!')
        return
    recognizer.save("TrainingImageLabel\Trainner.yml")
    res = "Profile Saved Successfully"
    message1.configure(text=res)
    message.configure(text='Total Registrations till now : ' + str(ID[0]))

####################################################################################

def getImagesAndLabels(path):
    # get the path of all the files in the folder
    imagePaths = [os.path.join(path, f) for f in os.listdir(path)]
    # create empty face list
    faces = []
    # create empty ID list
    Ids = []
    # now looping through all the image paths and loading the Ids and the images
    for imagePath in imagePaths:
        # loading the image and converting it to gray scale
        pilImage = Image.open(imagePath).convert('L')
        # Now we are converting the PIL image into numpy array
        imageNp = np.array(pilImage, 'uint8')
        # getting the Id from the image
        ID = int(os.path.split(imagePath)[-1].split(".")[1])
        # extract the face from the training image sample
        faces.append(imageNp)
        Ids.append(ID)
    return faces, Ids

####################################################################################

def TrackImages():
    check_haarcascadefile()
    assure_path_exists("Attendance/")
    assure_path_exists("StudentDetails/")
    for k in tv.get_children():
        tv.delete(k)
    msg = ''
    i = 0
    j = 0
    recognizer = cv2.face.LBPHFaceRecognizer_create()  # cv2.createLBPHFaceRecognizer()
    exists3 = os.path.isfile("TrainingImageLabel\Trainner.yml")
    if exists3:
        recognizer.read("TrainingImageLabel\Trainner.yml")
    else:
        mess._show(title='Data Missing', message='Please click on Save Profile to reset data!!')
        return
    harcascadePath = "haarcascade_frontalface_default.xml"
    faceCascade = cv2.CascadeClassifier(harcascadePath)

    cam = cv2.VideoCapture(0)
    font = cv2.FONT_HERSHEY_SIMPLEX
    col_names = ['Id', '', 'Name', '', 'Date', '', 'Time']
    exists1 = os.path.isfile("StudentDetails\StudentDetails.csv")
    if exists1:
        df = pd.read_csv("StudentDetails\StudentDetails.csv")
    else:
        mess._show(title='Details Missing', message='Students details are missing, please check!')
        cam.release()
        cv2.destroyAllWindows()
        window.destroy()
    while True:
        ret, im = cam.read()
        gray = cv2.cvtColor(im, cv2.COLOR_BGR2GRAY)
        faces = faceCascade.detectMultiScale(gray, 1.2, 5)
        for (x, y, w, h) in faces:
            cv2.rectangle(im, (x, y), (x + w, y + h), (225, 0, 0), 2)
            serial, conf = recognizer.predict(gray[y:y + h, x:x + w])
            if (conf < 50):
                ts = time.time()
                date = datetime.datetime.fromtimestamp(ts).strftime('%d-%m-%Y')
                timeStamp = datetime.datetime.fromtimestamp(ts).strftime('%H:%M:%S')
                aa = df.loc[df['SERIAL NO.'] == serial]['NAME'].values
                ID = df.loc[df['SERIAL NO.'] == serial]['ID'].values
                ID = str(ID)
                ID = ID[1:-1]
                bb = str(aa)
                bb = bb[2:-2]
                attendance = [str(ID), '', bb, '', str(date), '', str(timeStamp)]
            else:
                Id = 'Unknown'
                bb = str(Id)
            cv2.putText(im, str(bb), (x, y + h), font, 1, (255, 255, 255), 2)
        cv2.imshow('Taking Attendance', im)
        if (cv2.waitKey(1) == ord('q')):
            break
    ts = time.time()
    date = datetime.datetime.fromtimestamp(ts).strftime('%d-%m-%Y')
    exists = os.path.isfile("Attendance\Attendance_" + date + ".csv")
    if exists:
        with open("Attendance\Attendance_" + date + ".csv", 'a+') as csvFile1:
            writer = csv.writer(csvFile1)
            writer.writerow(attendance)
        csvFile1.close()
    else:
        with open("Attendance\Attendance_" + date + ".csv", 'a+') as csvFile1:
            writer = csv.writer(csvFile1)
            writer.writerow(col_names)
            writer.writerow(attendance)
        csvFile1.close()
    with open("Attendance\Attendance_" + date + ".csv", 'r') as csvFile1:
        reader1 = csv.reader(csvFile1)
        for lines in reader1:
            i = i + 1
            if (i > 1):
                if (i % 2 != 0):
                    iidd = str(lines[0]) + ' '
                    tv.insert('', 0, text=iidd, values=(str(lines[2]), str(lines[4]), str(lines[6])))
    csvFile1.close()
    cam.release()
    cv2.destroyAllWindows()

######################################## USED STUFFS ############################################

global key

key = ''

ts = time.time()
date = datetime.datetime.fromtimestamp(ts).strftime('%d-%m-%Y')
day,month,year=date.split("-")

mont={'01':'January',
'02':'February',
'03':'March',
'04':'April',
'05':'May',
'06':'June',
'07':'July',
'08':'August',
'09':'September',
'10':'October',
'11':'November',
'12':'December'
}

######################################## GUI FRONT-END ###########################################

window = tk.Tk()
window.geometry("1280x720")
window.resizable(True,False)
window.title("Attendance System")
window.configure(background='#a3c1ad')

frame1 = tk.Frame(window, bg="#36454f")


frame1.place(relx=0.11, rely=0.17, relwidth=0.39, relheight=0.80)

frame2 = tk.Frame(window, bg="#36454f")


frame2.place(relx=0.51, rely=0.17, relwidth=0.38, relheight=0.80)

message3 = tk.Label(window, text=" Face Recognition Based Attendance System ", fg="white", bg="#262523", width=55, height=1, font=('exo', 29, ' bold '))
message3.place(x=10, y=10)

frame3 = tk.Frame(window, bg="#51D29D")

frame3.place(relx=0.52, rely=0.09, relwidth=0.09, relheight=0.07)

frame4 = tk.Frame(window, bg="#51D29D")


frame4.place(relx=0.36, rely=0.09, relwidth=0.16, relheight=0.07)

datef = tk.Label(frame4, text = day+"-"+mont[month]+"-"+year+" | ",


fg="#51D29D",bg="#262523" ,width=59 ,height=1,font=('times', 19, ' bold '))
datef.pack(fill='both',expand=1)

clock = tk.Label(frame3,fg="#51D29D",bg="#262523" ,width=55


,height=1,font=('exo', 22, ' bold '))
clock.pack(fill='both',expand=1)
tick()

head2 = tk.Label(frame2, text=" For New Registrations ", fg="black", bg="#5d8aa8", font=('times', 17, ' bold '))
head2.grid(row=0,column=0)

head1 = tk.Label(frame1, text=" For Already Registered ", fg="black", bg="#5d8aa8", font=('times', 17, ' bold '))
head1.place(x=0,y=0)

lbl = tk.Label(frame2, text="Enter ID",width=20 ,height=1 ,fg="black"


,bg="#5d8aa8" ,font=('times', 17, ' bold ') )
lbl.place(x=80, y=55)

txt = tk.Entry(frame2,width=32 ,fg="black",font=('exo', 15, ' bold '))


txt.place(x=30, y=88)

lbl2 = tk.Label(frame2, text="Enter Name",width=20 ,fg="black" ,bg="#5d8aa8"


,font=('times', 17, ' bold '))
lbl2.place(x=80, y=140)

txt2 = tk.Entry(frame2,width=32 ,fg="black",font=('exo', 15, ' bold ') )


txt2.place(x=30, y=173)

message1 = tk.Label(frame2, text="1)Take Images >>> 2)Save Profile"


,bg="#5d8aa8" ,fg="black" ,width=39 ,height=1, activebackground = "yellow"
,font=('times', 15, ' bold '))
message1.place(x=7, y=230)

message = tk.Label(frame2, text="" ,bg="white" ,fg="black" ,width=39,height=1,
activebackground = "yellow" ,font=('exo', 16, ' bold '))
message.place(x=7, y=450)

lbl3 = tk.Label(frame1, text="Attendance",width=20 ,fg="black" ,bg="#5d8aa8"


,height=1 ,font=('exo', 17, ' bold '))
lbl3.place(x=100, y=115)

res = 0
exists = os.path.isfile("StudentDetails\StudentDetails.csv")
if exists:
    with open("StudentDetails\StudentDetails.csv", 'r') as csvFile1:
        reader1 = csv.reader(csvFile1)
        for l in reader1:
            res = res + 1
    res = (res // 2) - 1
    csvFile1.close()
else:
    res = 0
message.configure(text='Total Registrations till now : ' + str(res))

##################### MENUBAR #################################

menubar = tk.Menu(window,relief='ridge')
filemenu = tk.Menu(menubar,tearoff=0)
filemenu.add_command(label='Change Password', command = change_pass)
filemenu.add_command(label='Contact Us', command = contact)
filemenu.add_command(label='Exit',command = window.destroy)
menubar.add_cascade(label='Help',font=('times', 29, ' bold '),menu=filemenu)

################## TREEVIEW ATTENDANCE TABLE ####################

tv= ttk.Treeview(frame1,height =13,columns = ('name','date','time'))


tv.column('#0',width=82)
tv.column('name',width=130)
tv.column('date',width=133)
tv.column('time',width=133)
tv.grid(row=2,column=0,padx=(0,0),pady=(150,0),columnspan=4)

tv.heading('#0',text ='ID')
tv.heading('name',text ='NAME')
tv.heading('date',text ='DATE')
tv.heading('time',text ='TIME')

###################### SCROLLBAR ################################

scroll=ttk.Scrollbar(frame1,orient='vertical',command=tv.yview)
scroll.grid(row=2,column=4,padx=(0,100),pady=(150,0),sticky='ns')
tv.configure(yscrollcommand=scroll.set)

###################### BUTTONS ##################################

clearButton = tk.Button(frame2, text="Clear", command=clear ,fg="black"


,bg="#d40000" ,width=11 ,activebackground = "white" ,font=('exo', 11, ' bold '))
clearButton.place(x=335, y=86)
clearButton2 = tk.Button(frame2, text="Clear", command=clear2 ,fg="black"
,bg="#d40000" ,width=11 , activebackground = "white" ,font=('exo', 11, ' bold '))
clearButton2.place(x=335, y=172)
takeImg = tk.Button(frame2, text="Take Images", command=TakeImages
,fg="white" ,bg="#d40000" ,width=34 ,height=1, activebackground = "white"
,font=('exo', 15, ' bold '))
takeImg.place(x=30, y=300)
trainImg = tk.Button(frame2, text="Save Profile", command=psw ,fg="white"
,bg="#d40000" ,width=34 ,height=1, activebackground = "white" ,font=('exo', 15,
' bold '))
trainImg.place(x=30, y=380)
trackImg = tk.Button(frame1, text="Take Attendance", command=TrackImages
,fg="black" ,bg="#ffcc00" ,width=35 ,height=1, activebackground = "white"
,font=('exo', 15, ' bold '))
trackImg.place(x=30,y=50)
quitWindow = tk.Button(frame1, text="Quit", command=window.destroy
,fg="white" ,bg="#d40000" ,width=35 ,height=1, activebackground = "white"
,font=('times', 15, ' bold '))
quitWindow.place(x=30, y=450)

##################### END ######################################

window.configure(menu=menubar)
window.mainloop()

40
41
