


GIS SCIENCE JOURNAL ISSN NO : 1869-9391

Virtual Mouse Control Using Hand Class Gesture


Vijay Kumar Sharma, Vimal Kumar, Md. Iqbal, Sachin Tawara, Vishal Jayaswal
Department of Computer Science and Engineering
MIET, Meerut

Abstract - This paper proposes a way to control the position of the cursor with the bare hands, without using any electronic device, while operations like clicking and dragging of objects are performed with different hand gestures. The proposed system requires only a webcam as an input device. The software required to implement the proposed system is OpenCV together with Python. The output of the camera is displayed on the system's screen so that it can be further calibrated by the user. The Python dependencies used for implementing this system are NumPy, math, wx and mouse.

Index Terms - OpenCV; NumPy; Calibration; Gesture.
I. INTRODUCTION

Hand gestures have been used for communication in human society for generations [1]. The shaking of hands and the thumbs-up and thumbs-down signs have always existed in our environment. Gestures are believed to be the easiest way of interacting with anyone, so why not apply them to the machines we use? In this work we demonstrate real-time gesture control. The initial setup includes a low-cost USB web camera that provides the input to the system. The complete process is divided into four steps: frame capturing, image processing, region extraction and feature matching. Taken to the extreme, the system can even be regarded as hardware-based, because it uses a camera for tracking the hands [5].

The aims and objectives of this research work include:

● For most laptops the touchpad is not the most comfortable and convenient pointing device.
● The main objective of pre-processing is to represent the data in such a way that it can be easily interpreted and processed by the system.
● Reducing the cost of hardware [2].

The work focuses on extracting features of the human hand and then matching those features to recognize the movement of the hand.

The essential features of the project are:

● User friendly.
● Portable.
● Handles simple operations such as left-click, dragging and minimizing.
● No extra hardware [3].

II. EXISTING SYSTEM

The existing system consists of a mouse, either wired or wireless, that controls the cursor; now we can instead use hand gestures to operate the system. The existing virtual mouse control systems perform simple mouse operations using colored fingertips for detection, captured by a web-cam: the colored fingers act as objects whose color (red, green or blue) the web-cam senses in order to drive the system. The proposed system, in contrast, can perform basic mouse operations such as minimize, drag, scroll up, scroll down, left-click and right-click using hand gestures without any colored finger, because a skin-color recognition system is more flexible than the existing one. The existing systems use static hand recognition, such as fingertip identification, hand shape or the number of fingers, to define actions explicitly, which makes the system more complex to understand and difficult to use.

III. PROPOSED SYSTEM

The system works by identifying the color of the hand and deciding the position of the cursor accordingly, but there are conditions and scenarios that make it difficult for the algorithm to run in a real environment, for the following reasons, as shown in Fig. 1:

● Noise in the environment.
● Lighting conditions in the environment.
● Different textures of skin.
● Background objects of the same color as skin.

Fig. 1 Input Processing

It therefore becomes very important that the color-determining algorithm works accurately. The proposed system can work for a skin tone of any color and can work accurately in any lighting condition. For clicking, the user needs to create a 15-degree angle between two fingers. The proposed system can easily replace the traditional mouse, as well as algorithms that require colored tapes for controlling the mouse. This paper can be a pioneer in its field and a source of further research in the corresponding area. The project can be developed at "zero cost" and can easily be integrated with the existing system.

IV. APPLICATION OF PROPOSED WORK


This work can easily replace the traditional mouse system that has been in existence for decades. With this algorithm the user can control the mouse without the fuss of any other hardware device; this is achieved using hand-gesture recognition with input from a web-cam.

V. METHODOLOGY

The following steps are included in developing the algorithm:

(i) The first step is to capture the image using the camera.
(ii) The camera then extracts and recognizes the human hand from the input image.
(iii) The position of the human hand is stored in the system using the regular coordinate system.
(iv) A second frame is then captured, and the position of the hand in this frame is also stored in the system.
(v) The two hand positions are compared, and the cursor is moved accordingly.
(vi) For clicking, the angle between two fingers of the hand is measured, and if the angle is less than 15 degrees the system responds with a left-click. In this way the complete working of the mouse can be done with bare hands.

With this paper we aim to create totally cost-free hand-recognition software for laptops and PCs with the help of web-cam support. The project emphasizes creating software that can be used to move the cursor with the hands and to perform operations like clicking.

A. Activating Camera

The first step is to activate the camera so that input can be provided to the system. For this to happen we need to assign the camera resource to a variable; the command used for this purpose is cam = cv2.VideoCapture(0). This command activates the camera connected to the system so that input can be taken from it, as shown in Fig. 2.

Fig 2 Activating the USB web-cam
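As a rough illustration of this step, the snippet below opens the default camera with cv2.VideoCapture(0) and reads frames in a loop; the loop structure, the mirror flip and the 'q' exit key are our own additions, not details taken from the paper.

import cv2

# Assign the webcam resource to a variable, as described above.
cam = cv2.VideoCapture(0)

while True:
    ok, frame = cam.read()        # grab one frame from the webcam
    if not ok:
        break                     # camera unplugged or stream ended
    frame = cv2.flip(frame, 1)    # mirror the image so hand motion feels natural
    cv2.imshow('frame', frame)    # show the raw input so the user can calibrate
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break                     # press 'q' to stop capturing

cam.release()
cv2.destroyAllWindows()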

B. Skin Color Extraction

To identify the skin color and separate it from the other colors of the background, we use a mask together with kernel functions. The mask function identifies the skin color using RGB values ranging from [92, 56, 54] to [255, 223, 196]; the open and close kernels are then used to remove noise from the input.

The open kernel and close kernel work on a simple principle: if a pixelated noise bit is greater than the stored value it is removed by the masking effect, so that only the correct input is passed on to the system for processing, as shown in Fig. 3.

Fig 3. Extracting the skin color from input image
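A minimal sketch of this masking and open/close cleanup, assuming the [92, 56, 54] to [255, 223, 196] bounds quoted above (reordered to BGR, since OpenCV reads frames in BGR order) and an illustrative 5x5 kernel that is not a value given in the paper:

import cv2
import numpy as np

# Skin-color bounds from the text, given as RGB and reordered to BGR for OpenCV.
LOWER_SKIN = np.array([54, 56, 92], dtype=np.uint8)     # B, G, R
UPPER_SKIN = np.array([196, 223, 255], dtype=np.uint8)

def skin_mask(frame):
    # 255 where the pixel color falls inside the skin range, 0 elsewhere.
    mask = cv2.inRange(frame, LOWER_SKIN, UPPER_SKIN)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # open: remove small noise specks
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # close: fill small holes in the hand
    return mask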


C. Cursor Movement

For moving the cursor, the first step is to find the middle of the hand, which can be determined using the following commands.

Algorithm

1. var_leftmost → min_argument[tuple(hull[hull[:, :, 0].argmin()][0])]
2. var_rightmost → max_argument[tuple(my_con[my_con[:, :, 0].argmax()][0])]
3. var_topmost → tuple_assignment(hull[hull[:, :, 1].argmin()][0])
4. var_bottommost → tuple_assignment(my_con[my_con[:, :, 1].argmax()][0])
5. var_Temp → bottommost[0] + 30
6. cv2.line(roi, topmost, (topmost[0], h-280), (0, 242, 225), 2)
7. cv2.line(roi, leftmost, (topmost[0], bottommost[1]-80), (0, 242, 225), 2)

This part of the code is responsible for finding the middle point of the hand; the coordinates of the midpoint are then used for moving the cursor in different directions, depending on the movement of the corresponding user.
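The listing above is pseudocode. One way to realise it, under our own assumptions about how the contour and hull are obtained, and using the 'mouse' package named in the abstract for the actual pointer move, is sketched below; the helper names are not taken from the paper.

import cv2
import numpy as np
import mouse  # the 'mouse' dependency listed in the abstract

def hand_midpoint(mask):
    # Find the largest skin-colored contour and take it to be the hand.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x return signature
    if not contours:
        return None
    my_con = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(my_con)

    # Extreme points of the hand, as in steps 1-4 of the algorithm above.
    leftmost = tuple(hull[hull[:, :, 0].argmin()][0])
    rightmost = tuple(my_con[my_con[:, :, 0].argmax()][0])
    topmost = tuple(hull[hull[:, :, 1].argmin()][0])
    bottommost = tuple(my_con[my_con[:, :, 1].argmax()][0])

    # Midpoint of the extremes, used as the cursor reference point.
    return (leftmost[0] + rightmost[0]) // 2, (topmost[1] + bottommost[1]) // 2

def move_cursor(midpoint, frame_size, screen_size):
    # Scale camera coordinates to screen coordinates and move the system cursor.
    (fw, fh), (sw, sh) = frame_size, screen_size
    mouse.move(int(midpoint[0] * sw / fw), int(midpoint[1] * sh / fh))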

Fig 4. Performing the clicking operation by creating an angle of less than 15 degrees

D. Displaying Output

A window pops up on the user's screen displaying the user's hands and the guide lines controlling the cursor; the output is shown with the command cv2.imshow('frame', frame). Besides the camera input window, the user is also given additional information, such as whether additional and more appropriate sources of light are needed in the background, which helps the user set up the surroundings correctly, as shown in Fig. 5.

Fig 5 Displaying the Output
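A small sketch of this display step; the lighting hint and the second mask window are our own illustration around the cv2.imshow call quoted above, not features described in the paper.

import cv2

def show_output(frame, mask, lighting_ok):
    # Overlay a simple hint telling the user whether the background lighting is adequate.
    hint = 'Lighting OK' if lighting_ok else 'Add more light in the background'
    cv2.putText(frame, hint, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                0.8, (0, 242, 225), 2)
    cv2.imshow('frame', frame)   # camera view with guide lines and hint
    cv2.imshow('mask', mask)     # skin mask, useful while calibrating

Here lighting_ok could be derived, for example, from the mean brightness of the captured frame.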

E. Flow chart

[1] Capture the frame from the web-cam and process it; after processing, convert the image between the RGB and HSV formats.
[2] Create a filter which builds a mask of the skin color.
[3] If the input provided by the user through the web-cam is skin-colored, calculate the midpoint of the image; otherwise keep processing the frames provided by the webcam.
[4] If the angle between the two points is less than 15 degrees, perform a left-click operation; otherwise move the cursor in the direction of the input image (a sketch of this angle test follows below).
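The paper does not spell out how the angle between the two fingers is computed, so the sketch below is only one plausible reading: measure the angle that two fingertip points make at a base point of the hand (for instance the bottommost contour point), and issue a left-click when it drops below 15 degrees. The helper names and the choice of points are assumptions, not details from the paper.

import math
import mouse

def finger_angle(tip_a, tip_b, base):
    # Angle in degrees between the two fingertips as seen from the base point.
    v1 = (tip_a[0] - base[0], tip_a[1] - base[1])
    v2 = (tip_b[0] - base[0], tip_b[1] - base[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    if norm == 0:
        return 180.0
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def maybe_left_click(tip_a, tip_b, base, threshold_deg=15):
    # Left-click when the two fingers close to less than the threshold angle.
    if finger_angle(tip_a, tip_b, base) < threshold_deg:
        mouse.click(button='left')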


Fig 6 Flow Chart of Proposed Methodology

VI. RESULTS AND EVALUATION

The motive of this paper was to make the machine more interactive and responsive towards human behavior. The sole aim was to build a technology that is affordable, portable and works with any standard operating system.

The proposed system controls the pointer of the mouse by detecting the human hand and moving the pointer in the direction of the hand. The system controls simple functions of the mouse such as left-clicking, dragging and cursor movement.

The method detects the skin of the human hand and tracks it continuously to move the cursor; when the angle between the fingers of the hand is less than 15 degrees, the system performs a left-click.

TABLE I: Comparison of Existing and Proposed System

Feature | Existing System | Proposed System
Stability | The existing system is poor on the stability front. | The proposed system is very stable compared to its predecessor.
Environment | The existing system is highly dependent on the environment in which it is used. | The proposed system has very little dependency on environmental factors.
Complexity | The existing systems are very complex and require powerful processors. | The proposed system is very simple and requires only basic processors.
Practicality | The existing system is not a practically viable option in the real world. | The proposed system is a practically viable option in the real world.
Future Perspective | The existing system does not integrate well with existing technology. | The proposed system can integrate well with other technologies.

REFERENCES

[1] Amardip Ghodichor, Binitha Chirakattu, "Virtual Mouse using Hand Gesture and Color Detection", Volume 128, No. 11, October 2015.
[2] Chhoriya P., Paliwal G., Badhan P., "Image Processing Based Color Detection", International Journal of Emerging Technology and Advanced Engineering, Volume 3, Issue 4, 2013, pp. 410-415.
[3] Rhitivij Parasher, Preksha Pareek, "Event Triggering Using Hand Gesture Using OpenCV", Volume 02, February 2016, pp. 15673-15676.
[4] Ahemad Siddique, Abhishek Kommera, Divya Varma, "Simulation of Mouse using Image Processing Via Convex Hull Method", Vol. 4, Issue 3, March 2016.
[5] Student, Department of Information Technology, PSG College of Technology, Coimbatore, Tamilnadu, India, "Virtual Mouse Using Hand Gesture Recognition", Volume 5, Issue VII, July 2017.
[6] Kalyani Pendke, Prasanna Khuje, Smita Narnaware, Shweta Thool, Sachin Nimje, International Journal of Computer Science and Mobile Computing (IJCSMC), Vol. 4, Issue 3, March 2015.
[7] Abhilash S S, Lisho Thomas, Naveen Wilson, Chaithanya C, "Virtual Mouse Using Hand Gesture", Volume 05, Issue 04, April 2018.
[8] Abdul Khaliq and A. Shahid Khan, "Virtual Mouse Implementation Using Color Pointer Detection", International Journal of Electrical, Electronics & Computer Science Engineering, Volume 2, Issue 4, August 2015, pp. 63-66.
[9] Erdem, E., Yardimci, Y., Atalay, V., Cetin, A. E., "Computer Vision Based Mouse", Acoustics, Speech, and Signal Processing, Proceedings (ICASSP), IEEE International Conference, 2002.
[10] Chu-Feng Lien, "Portable Vision-Based HCI - A Real-time Hand Mouse System on Handheld Devices", National Taiwan University, Computer Science and Information Engineering Department.


[11] Hojoon Park, "A Method for Controlling the Mouse Movement using a Real Time Camera", Brown University, Providence, RI, USA, Department of Computer Science, 2008.
[12] Asanterabi Malima, Erol Ozgur, and Mujdat Cetin, "A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control".
[13] "… using Hand Gesture Recognition", International Journal of Engineering Sciences & Research Technology, ISSN 2277-9655, March 2014.
[14] Shany Jophin, Sheethal M.S, Priya Philip, T M Bhruguram, "Gesture Based Interface Using Motion and Image Comparison", International Journal of Advanced Information Technology (IJAIT), Vol. 2, No. 3, June 2012.
[15] Abhik Banerjee, Abhirup Ghosh, Koustuvmoni Bharadwaj, Hemanta Saik, "Mouse Control using a Web Camera based on Colour Detection", International Journal of Computer Trends and Technology (IJCTT), Volume 9, Number 1, ISSN 2231-2803, March 2014.
[16] S. Sadhana Rao, "Sixth Sense Technology", Proceedings of the International Conference on Communication and Computational Intelligence, 2010, pp. 336-339.
[17] Game P. M., Mahajan A. R., "A Gestural User Interface to Interact with Computer System", International Journal on Science and Technology (IJSAT), Volume II, Issue I, Jan.-Mar. 2011, pp. 018-027.
[18] Abhik Banerjee, Abhirup Ghosh, Koustuvmoni Bharadwaj, "Mouse Control using a Web Camera based on Color Detection", IJCTT, Vol. 9, March 2014.
[19] Angel, Neethu P. S., "Real Time Static & Dynamic Hand Gesture Recognition", International Journal of Scientific & Engineering Research, Volume 4, Issue 3, March 2013.
[20] Q. Y. Zhang, F. Chen and X. W. Liu, "Hand Gesture Detection and Segmentation Based on Difference Background Image with Complex Background", Proceedings of the 2008 International Conference on Embedded Software and Systems, Sichuan, 29-31 July 2008, pp. 338-343.
[21] A. Erdem, E. Yardimci, Y. Atalay, V. Cetin, A. E., "Computer Vision Based Mouse", Acoustics, Speech, and Signal Processing, Proceedings (ICASSP), IEEE International Conference, 2002.
[22] Hojoon Park, "A Method for Controlling the Mouse Movement using a Real Time Camera", Brown University, Providence, RI, USA, Department of Computer Science, 2008.
[23] Chu-Feng Lien, "Portable Vision-Based HCI - A Real-time Hand Mouse System on Handheld Devices", National Taiwan University, Computer Science and Information Engineering Department.
[24] Kamran Niyazi, Vikram Kumar, Swapnil Mahe, Swapnil Vyawahare, "Mouse Simulation Using Two Coloured Tapes", Department of Computer Science, University of Pune, India, International Journal of Information Sciences and Techniques (IJIST), Vol. 2, No. 2, March 2012.
[25] K. N. Shah, K. R. Rathod and S. J. Agravat, "A Survey on Human Computer Interaction Mechanism Using Finger Tracking".


