
AI VIRTUAL MOUSE

By RIHANA BEGAM . R - 2202639

Submitted in partial fulfillment of the requirements for the award of the


degree of

MASTER OF SCIENCE IN

COMPUTER SCIENCE

DEPARTMENT OF COMPUTER SCIENCE


KANCHI MAMUNIVAR GOVERNMENT INSTITUTE FOR
POSTGRADUATE STUDIES AND RESEARCH

(AUTONOMOUS-REACCREDITED WITH “B++” GRADE BY NAAC)

PUDUCHERRY – 605 008

JUNE 2021
Government of Puducherry
Kanchi Mamunivar Government Institute for Postgraduate Studies and
Research (Autonomous-Reaccredited with “B++” Grade by NAAC)
(College with Potential for
Excellence) Lawspet,

Puducherry-605008

BONAFIDE CERTIFICATE

This is to certify that the project work entitled "AI VIRTUAL MOUSE" is a bonafide record of project work done by RIHANA BEGAM. R [Reg. No. 2202639] in partial fulfillment of the requirements for the award of the degree of Master of Science in Computer Science of Kanchi Mamunivar Government Institute for Postgraduate Studies and Research (Autonomous), Affiliated to Pondicherry University, Puducherry.
This work has not been submitted for the award of any other degree to the best of our
knowledge.

Head of the Department Project Guide

Submitted for the project viva-voce Examination held ______________

INTERNAL EXAMINER EXTERNAL EXAMINER


Government of Puducherry
Kanchi Mamunivar Government Institute for Postgraduate Studies and
Research (Autonomous-Reaccredited with “B++” Grade by NAAC)
(College with Potential for
Excellence) Lawspet,
Puducherry-605008

CERTIFICATE

Dr. N. KANNATHASAN, M.Sc., M.Phil., Ph.D.,
Department of Computer Science,
KMGIPSR,
Puducherry.

This is to certify that this project work entitled "AI VIRTUAL MOUSE" is done by RIHANA BEGAM. R [Reg. No.2192628] under my supervision and guidance in partial fulfillment of the requirements for the award of the degree of Master of Science in Computer Science of Kanchi Mamunivar Government Institute for Postgraduate Studies and Research (Autonomous), Affiliated to Pondicherry University, Puducherry.
This work has not been submitted elsewhere for the award of any other degree
to the best of my knowledge.

Project Guide
ACKNOWLEDGEMENT
The satisfaction of completing my project would not be complete until I thank all the people who have helped me through it.

I take this opportunity to express my profound gratitude and deep regards to my guide Dr. N. KANNATHASAN, M.Sc., M.Phil., Ph.D., for her exemplary guidance, monitoring and constant encouragement throughout the course of this project. The blessing, help and guidance given by her shall carry me a long way in the journey of life on which I am about to embark.

I would like to thank Dr. M. SELVARAJ, M.Com., M.Phil., B.Ed., Ph.D., Director, Kanchi Mamunivar Government Institute for Postgraduate Studies and Research, Puducherry, for providing the confidence and all the necessary support in undertaking the project. At this juncture, I would like to thank Mrs. V.K. MIXYMOL, MCA, M.Tech., Head of the Department of Computer Science, for her valuable help and support.

I would like to thank all my teachers, Mrs. V.K. MIXYMOL, Dr. P. DINADAYALAN, Mrs. N. VIMALA, Dr. C. MANJU, and Mr. S. RAJABADHAR, for the timely help and suggestions given by them that rendered a perfect shape to this work.

I would like to thank all my family members for their direct and indirect support extended to me for my academic and social progress.

I also thank all my classmates who have helped me at different times during the study of my course. Those who find their comments and suggestions incorporated in this report should not think me churlish for leaving them unnamed.

RIHANA BEGAM .R
TABLE OF CONTENTS
CHAPTER TITLE
ACKNOWLEDGEMENT
ABSTRACT
OBJECTIVES
1 INTRODUCTION
1.1 MECHANICAL MOUSE
1.2 OPTICAL MOUSE
2 LITERATURE REVIEW
2.1 MOUSE SIMULATION USING RGB
2.2 VIRTUAL MOUSE USING A WEBCAM
3 ANALYSIS AND REQUIREMENTS
3.1 HARDWARE REQUIREMENTS
3.2 SOFTWARE REQUIREMENTS
3.3 SYSTEM REQUIREMENTS
4 DESIGN
4.1 SYSTEM DESIGN
4.2 ARCHITECTURE DESIGN
4.3 DATA DESIGN
5 IMPLEMENTATION
5.1 AT USER SIDE
5.2 IMPLEMENTATION ISSUES AND CHALLENGES
6 TESTING
6.1 USABILITY TESTING
6.2 PERFORMANCE IN VARIOUS ENVIRONMENTS
7 TOOLS AND TECHNOLOGY
8 PROJECT SCOPE
8.1 PROBLEM STATEMENT
8.2 MOTIVATION OF VIRTUAL MOUSE
8.3 CONVENIENCE
9 CONCLUSION
9.1 FUTURE SCOPE OF THE PROJECT
9.2 CONCLUSION
10 APPENDICES
10.1 AI VIRTUAL MOUSE
10.2 HAND TRACKING MODULE
10.3 OUTPUT
AI VIRTUAL MOUSE
ABSTRACT

This project promotes an approach to Human-Computer Interaction (HCI) in which cursor movement can be controlled using a real-time camera. It is an alternative to the current methods, which involve manual input through buttons or changing the position of a physical computer mouse. Instead, it utilizes a camera and computer vision technology to control various mouse events and is capable of performing every task that a physical computer mouse can. The Virtual Mouse colour recognition program constantly acquires real-time images, which undergo a series of filtration and conversion steps. Whenever this process is complete, the program applies image processing techniques to obtain the coordinates of the targeted colours' positions from the converted frames. After that, it compares the colours present within the frames against a list of colour combinations, where different combinations correspond to different mouse functions. If the current colour combination matches one in the list, the program executes the corresponding mouse function, which is translated into an actual mouse action on the user's machine.

Hand gesture recognition plays a key role in human-computer interaction. Just as biometric authentication has become a common technological advancement, seen frequently in our smartphones, hand gesture recognition is a modern form of human-computer interaction: we can control our system by showing our hands in front of a webcam, and it can be useful for all kinds of people. This module is presented based on that idea.

OBJECTIVES
The purpose of this project is to develop a Virtual Mouse application that targets a few aspects of significant development. For starters, this project aims to eliminate the need for a physical mouse while still allowing interaction with the computer system through a webcam, using various image processing techniques. In addition, this project aims to develop a Virtual Mouse application that is operational on all kinds of surfaces and environments.

The following describes the overall objectives of this project:


To design the application to operate with the help of a webcam. The Virtual Mouse application will be operational with the help of a webcam, as the webcam is responsible for capturing images in real time. The application will not work if no webcam is detected.

To design a virtual input that can operate on any surface. The Virtual Mouse application will be operational on all surfaces and indoor environments, as long as the users face the webcam while performing the motion gestures.

To program the camera to continuously capture images, which will be analysed using various image processing techniques. As stated above, the Virtual Mouse application will continuously capture images in real time, where the images undergo a series of processes; this includes HSV conversion, binary image conversion, salt-and-pepper noise filtering, and more.

To convert hand gestures/motion into mouse input mapped to a particular screen position. The Virtual Mouse application will be programmed to detect the position of a defined colour, which will be set as the position of the mouse pointer. Furthermore, a combination of different colours may trigger different types of mouse events, such as right/left clicks, scroll up/down, and more.
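The HSV thresholding step named in these objectives can be sketched as a small NumPy-only function. This is a hedged illustration of what OpenCV's `cv2.inRange` does in practice; the toy frame and the green-ish colour range below are made-up values, not the project's actual calibration:

```python
import numpy as np

def hsv_mask(hsv_frame, lower, upper):
    """Return a binary mask: 255 where every HSV channel of a pixel
    falls inside [lower, upper], 0 elsewhere."""
    lower = np.array(lower, dtype=np.uint8)
    upper = np.array(upper, dtype=np.uint8)
    in_range = np.all((hsv_frame >= lower) & (hsv_frame <= upper), axis=-1)
    return in_range.astype(np.uint8) * 255

# A tiny 2x2 "frame" in HSV; two pixels fall inside the target range.
frame = np.array([[[60, 200, 200], [0, 0, 0]],
                  [[59, 180, 150], [120, 50, 50]]], dtype=np.uint8)
mask = hsv_mask(frame, (50, 100, 100), (70, 255, 255))
print(mask.tolist())  # [[255, 0], [255, 0]]
```

On real frames, the resulting binary mask is what the later filtering and position-finding stages consume.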

CHAPTER -1
1. INTRODUCTION

A computer mouse is an input device that helps to point at and interact with whatever is being pointed at. There are many types of mouse in the current trend. The mechanical mouse consists of a single rubber ball which can rotate in any direction, and the movement of the pointer is determined by the motion of that ball. The mechanical mouse was later replaced by the optical mouse, which uses an LED sensor to detect the movement of the pointer. Years later, the laser mouse was introduced to improve accuracy and to overcome the drawbacks of the optical mouse. As technology advanced drastically, the wireless mouse was introduced to enable hassle-free movement and further improve accuracy. Yet no matter how much the accuracy of the mouse increases, there will always be limitations, because the mouse is a hardware input device: problems such as mouse clicks not functioning properly can occur, and, like any other physical object, a mouse has a limited lifespan after which it must be replaced. As technology advances, everything becomes virtualized. Speech recognition, for example, is used to recognize and translate spoken language into text, and could replace keyboards in the future. Similarly, eye tracking, which controls the mouse pointer with the help of our eyes, could replace the mouse in the future.

1.1 Mechanical Mouse
Known as the trackball mouse and commonly used in the 1990s, the ball within the mouse is supported by two rotating rollers that detect the movement of the ball itself. One roller detects forward/backward motion while the other detects left/right motion. The ball is made of steel covered with a layer of hard rubber, so that detection is more precise. The common functions include left/right buttons and a scroll wheel. However, due to the constant friction between the mouse ball and the rollers, the mouse is prone to degradation: over time, usage causes the rollers to degrade, leaving them unable to detect motion properly and rendering the mouse useless. The switches in the mouse buttons are no different, as long-term usage may cause the mechanics within to loosen and stop registering clicks until the mouse is disassembled and repaired.

1.2 Optical and Laser Mouse

A mouse commonly used these days. The motion detection of the optical mouse relies on Light Emitting Diodes (LEDs) to detect movement relative to the underlying surface, while the laser mouse is an optical mouse that uses coherent laser light. Compared to its predecessor, the mechanical mouse, the optical mouse no longer relies on rollers to determine its movement; instead it uses an imaging array of photodiodes. The purpose of this design is to eliminate the degradation that plagued its predecessor, giving it more durability while offering better resolution and precision. However, there are still some downsides: even though the optical mouse is functional on most opaque diffuse surfaces, it is unable to detect motion on polished surfaces. Furthermore, long-term usage without proper cleaning or maintenance may lead to dust particles trapped near the LEDs, which causes both optical and laser mice to have surface detection difficulties. Beyond that, it is still prone to degradation of the button switches, which again causes the mouse to function improperly until it is disassembled and repaired.

CHAPTER – 2
2. LITERATURE REVIEW
As modern human-computer interaction technology becomes important in our everyday lives, mice of all kinds of shapes and sizes have been invented, from the casual office mouse to the hard-core gaming mouse. However, there are limitations to this hardware, as it is not as environmentally friendly as it seems. For example, the physical mouse requires a flat surface to operate, not to mention a certain area to fully utilize the functions offered. Furthermore, some of this hardware is completely useless for interacting with computers remotely due to cable length limitations, rendering it inaccessible.

2.1 Mouse Simulation Using RGB


Thus, a colour-tracking mouse simulation was proposed. The system tracks two RGB colours on the user's fingers by utilizing computer vision technology. One RGB colour is used for controlling the movement of the cursor, while the other acts as the agent for the click events of the mouse.

[Figure: block diagram — camera → image frame → mouse events (cursor movement, left click, right click, double click)]

To detect the colours, the system first needs to process the captured image by separating the hand pixels from the non-hand pixels. This can be done with a background subtraction scheme that segments the hand-movement information from the non-changing background scene. To implement this, the system captures a pair of images representing the static workplace from the camera's view. When the subtraction process is complete, the system undergoes another process that separates the RGB pixels to calculate the probability and differentiate the RGB values, determining which parts are skin and which are not. After that process is completed, the system starts detecting the defined colour in the image: the image's RGB pixels are converted into the HSV colour plane in order to eliminate variation in shades of similar colours. The resulting image is converted to a binary image and undergoes a filtering process to reduce the noise within it.
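The background subtraction scheme just described can be illustrated with a minimal NumPy sketch. The per-channel difference threshold of 25 and the toy 2x2 frames are assumptions for illustration only; a real system would apply this to full webcam frames:

```python
import numpy as np

def subtract_background(frame, background, threshold=25):
    """Mark pixels whose colour differs from the static background
    by more than `threshold` in any channel (255 = moving, 0 = static)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff.max(axis=-1) > threshold).astype(np.uint8) * 255

background = np.zeros((2, 2, 3), dtype=np.uint8)   # the static workplace
frame = background.copy()
frame[0, 0] = (200, 180, 90)                       # a "hand" pixel appears
moving = subtract_background(frame, background)
print(moving.tolist())  # [[255, 0], [0, 0]]
```

The resulting foreground mask restricts all later colour detection to pixels where the hand actually moved.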

2.2 Virtual Mouse Using a Webcam


The system requires three fingers with three colour pointers to simulate the click events. The proposed system is capable of detecting the pointers by referring to the defined colour information, tracking the motion of the pointers, moving the cursor according to the position of the pointer, and simulating single and double left and/or right click events of the mouse.

CHAPTER – 3

3. ANALYSIS AND REQUIREMENTS


3.1 Hardware Requirements
Computer Desktop or Laptop.

The computer desktop or laptop will be utilized to run the visual software in order to display what the webcam has captured. A notebook, i.e. a small, lightweight and inexpensive laptop computer, is proposed to increase mobility.

The system will be using:


➢ Processor : Core2Duo
➢ Main Memory : 4GB RAM
➢ Hard Disk : 320GB
➢ Display : 14” Monitor
Webcam
The webcam is utilized for image capture: it continuously takes images so that the program can process them and determine pixel positions.

3.2 Software Requirements

Python Language.
The Virtual Mouse application will be developed in Python with the aid of an integrated development environment (IDE) used for developing computer programs, known as Microsoft Visual Studio.

OpenCV Library.
OpenCV is also included in the making of this program. OpenCV (Open Source Computer Vision) is a library of programming functions for real-time computer vision. OpenCV provides utilities that can read image pixel values.

OS : Windows 10 Ultimate 64-bit
Language : Python
Tools Used : OpenCV, MediaPipe, AutoPy, NumPy

3.3 SYSTEM REQUIREMENTS

Functional Requirements
The system will provide a good user interface through which the user can interact with the system. The virtual mouse enables the user to give mouse input through hand gestures.

Non-Functional Requirements
Non-functional requirements define system properties and constraints. They arise from user needs, budget constraints, organizational policies, or external factors such as safety regulations, privacy regulations, and so on.
Non-functional requirements are:
Security
Reliability
Maintainability
Portability
Extensibility
Reusability
Application Affinity/Compatibility
Resource Utilization

CHAPTER – 4
4. DESIGN
4.1 SYSTEM DESIGN

4.2 ARCHITECTURE DESIGN


ACTIVITY DIAGRAM

SEQUENCE DIAGRAM
State Diagram

4.3 Data Design

Entity Relationship Diagram


CHAPTER – 5

5. IMPLEMENTATION
5.1 At User Side

Hand Tracking:
The system performs finger-movement gesture detection in the computer's window using the camera, handling the whole system through the movement of a single finger. Using finger-detection methods for instant camera access, together with a user-friendly interface, makes it more easily accessible. The system is used to implement a motion-tracking mouse, a signature input device and an application selector.
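In a landmark-based implementation (for example with MediaPipe, which the tools list names and which reports fingertip positions normalised to the 0..1 range), the detected fingertip still has to be mapped onto screen coordinates. The helper below is a hypothetical sketch of that mapping; the margin value is an assumption, not a value from this project:

```python
import numpy as np

def to_screen(x_norm, y_norm, screen_w, screen_h, margin=0.1):
    """Map a normalised fingertip position (0..1 in camera space) to
    screen pixels. `margin` trims the camera edges so the fingertip can
    reach the screen corners without leaving the camera's field of view."""
    x = np.interp(x_norm, (margin, 1 - margin), (0, screen_w))
    y = np.interp(y_norm, (margin, 1 - margin), (0, screen_h))
    return int(x), int(y)

# A fingertip at the camera centre lands at the screen centre.
print(to_screen(0.5, 0.5, 1920, 1080))  # (960, 540)
```

A library such as AutoPy could then move the cursor to the returned pixel coordinates each frame.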

Image Processing:
The program starts off by capturing real-time images via a webcam, where it waits for the user's colour input. The acquired image is compressed to a reasonable size to reduce the load of processing the pixels within the captured frame.

Draw:
Here, colours and paint are used for drawing pictures. The program literally connects the dots using (x, y) coordinates, and with the help of fonts the user can draw effectively and enjoyably.
5.2 Implementation Issues and Challenges
Throughout the development of the application, several implementation issues occurred. The following describes the issues and challenges likely to be encountered throughout the development phase:

The interruption of salt-and-pepper noise within the captured frames:
Salt-and-pepper noise occurs when the captured frame contains regions with the required HSV values that are too small, yet still undergo the full series of processes even though they are not large enough to be considered an input. To overcome this issue, the unwanted HSV pixels within the frame must first be filtered off; this includes pixel areas that are too large or too small. With this method, the likelihood of interruptions from similar pixels is reduced greatly.
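The salt-and-pepper filtering step can be sketched with a small median filter. In practice this would likely be OpenCV's `cv2.medianBlur` or contour-area filtering; this NumPy-only version is an illustration of the idea, with a made-up 5x5 mask:

```python
import numpy as np

def median_filter(binary, k=3):
    """Replace every pixel by the median of its k x k neighbourhood,
    which wipes out isolated salt-and-pepper specks."""
    pad = k // 2
    padded = np.pad(binary, pad, mode="edge")
    out = np.empty_like(binary)
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

noisy = np.zeros((5, 5), dtype=np.uint8)
noisy[2, 2] = 255                       # a lone "salt" speck
clean = median_filter(noisy)
print(int(clean.max()))  # 0 -- the speck is gone
```

A genuine colour blob larger than the filter window survives, which is exactly the behaviour the report asks for: specks are dropped, inputs are kept.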

Performance degradation due to high processing load on low-tier systems:

Since the application must undergo several processes to filter, process and execute the mouse functions in real time, it can be CPU-intensive on most low-tier systems. If the captured frames are too large, the time taken for the application to process an entire frame increases drastically. Therefore, to overcome this issue, the application should process only the essential part of each frame and reduce the redundant filtering processes that could potentially slow it down.

The difficulty of calibrating the brightness and contrast of the frames to get the required HSV values:

The intensity of brightness and contrast matters greatly when it comes to acquiring the required colour pixels. In order for the application to execute all of the mouse functions provided, all of the HSV values required for the specific mouse functions must be satisfied, meaning the overall HSV values must be satisfied under the current brightness and contrast as well. However, calibration can be somewhat tedious, as a certain intensity may satisfy only part of the required HSV values unless the original HSV values are modified. To overcome this issue, the application must first start up with a calibration phase, which allows the users to choose their desired colour pixels before directing them to the main phase.
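The calibration phase described above boils down to deriving per-colour HSV bounds from pixels the user samples. Below is a hedged sketch of that step: the padding values are assumptions, and the upper clip uses OpenCV's HSV convention of hue in 0-179:

```python
import numpy as np

def calibrate_hsv(samples, pad=(10, 40, 40)):
    """Derive lower/upper HSV bounds from pixels sampled during the
    calibration phase, padded so small lighting variations stay in range."""
    samples = np.asarray(samples, dtype=np.int16)
    lo = np.clip(samples.min(axis=0) - pad, 0, None)
    hi = np.clip(samples.max(axis=0) + pad, None, (179, 255, 255))
    return lo.astype(np.uint8), hi.astype(np.uint8)

# Two sampled pixels of the user's chosen colour (illustrative values).
lo, hi = calibrate_hsv([(60, 200, 190), (65, 210, 200)])
print(lo.tolist(), hi.tolist())  # [50, 160, 150] [75, 250, 240]
```

The derived (lo, hi) pair would then feed the thresholding stage for that mouse function during the main phase.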
CHAPTER - 6
6. TESTING

System testing provides the final assurance that the software, once validated, can be combined with all other system elements. System testing verifies whether all elements have been combined properly and that overall system function and performance are achieved.

Characteristics of a Good Test:


• Tests are likely to catch bugs
• No redundancy
• Not too simple or too complex

6.1 USABILITY TESTING

TEST SUITE
The biggest challenge so far has been the integration part, i.e. how to integrate the system with an attractive user interface. After researching this at length, we finally decided to go with PyCharm for Python development. PyCharm is a very popular tool for creating simple and attractive user interfaces.

TEST CASES
➢ To ensure during operation the system will perform as
per specifications.
➢ To make sure that the system meets user’s requirements
during operation.
➢ To verify that the controls incorporated in the system
function as intended.
➢ To see that when correct inputs are fed to the system the
outputs are correct.
➢ To make sure that during operation incorrect inputs are detected and rejected.
INFERENCE

A vision-based virtual mouse interface is described that utilizes a robotic head, visual tracking of the user's head and hand positions, and recognition of user hand signs to control an intelligent kiosk. The user interface supports, among other things, smooth control of the mouse pointer and buttons using hand signs and movements.

6.2 Performance in Various Environments

The following describes the outcome of the program testing in various environments:

1. NORMAL ENVIRONMENT:

All colours are successfully recognized. The three highlighted squares indicate that the targeted colours are identified, compared, and executed accordingly.

2. BRIGHT ENVIRONMENT:

All colours are successfully recognized. The three highlighted squares indicate that the targeted colours are identified, compared, and executed accordingly. However, minimal adjustment of the HSV track-bars was required.

3. BRIGHTER ENVIRONMENT:

No colours can be recognized.

Reason: The intensity of brightness was too high, greatly altering the original RGB values of the targeted colours and causing them to be unrecognizable.

4. DARK ENVIRONMENT:

All colours are successfully recognized. The three highlighted squares indicate that the targeted colours are identified, compared, and executed accordingly. However, minimal adjustment of the HSV track-bars was required.

5. DARKER ENVIRONMENT:

Some colours cannot be recognized.

Reason: The intensity of brightness was too low, greatly altering the original RGB values of the targeted colours and causing them to be unrecognizable.

6. COLOUR CONFLICTS:

Conflicting colour inputs are successfully ignored: the conflicts are detected, causing the program to stop executing unwanted mouse functions until the colour conflicts are no longer within the frame.

7. TILT AND ROTATION:

All colours are successfully recognized. The three highlighted squares indicate that the targeted colours are identified, compared, and executed accordingly.

8. DISTANCE:

All colours are successfully recognized. The three highlighted squares indicate that the targeted colours are identified, compared, and executed accordingly.
CHAPTER - 7
7. TOOLS AND TECHNOLOGY
For this project we'll be using an agile software development methodology approach, with PyCharm as the development tool, in developing the application. The stated approach is an alternative to the traditional model and helps the project respond to iterative work. It promotes adaptive planning, evolutionary development, continuous improvement, and encourages rapid and flexible response.

Workable software is delivered frequently.

Continuous collaboration between the stakeholders and the developers.

Projects are developed around motivated individuals.

Encourage informal meetings.

Operational software is the principal measure of progress.

Sustainable development, able to maintain a constant pace.

Continuous attention to technical excellence and good design

Simplicity

Self-organizing

Regular adaptation to changing circumstances

The reason for choosing this approach is the fact that the Virtual Mouse is still considered to be at the introduction stage, which means it still requires a great deal of extensive research and development before it can actually make it to the market. Therefore, this project requires thorough yet iterative planning and requirements gathering, where the lifecycle is continually revisited to re-evaluate the direction of the project and eliminate ambiguities in the development process, while at the same time welcoming changes in requirements, which promotes adaptability and flexibility. Furthermore, because the Virtual Mouse application is oriented towards serving users, this project requires continuous customer collaboration, as it is essential for gathering the proper requirements in all aspects. This is why this approach, supported by the PyCharm tooling, is the ideal one for developing the project.

The following describes the phases within the agile methodology approach:

Planning

A thorough planning will be conducted in this phase, where the existing system/product, in this case the physical computer mouse, will be reviewed and studied to identify the existing problems; a comparison will be made to determine which problems are more crucial and require improvement. An outline of the objectives and the scope will be identified in order to provide an alternative solution to the problem.

Requirement Analysis

The phase that gathers and interprets the facts, diagnoses problems and recommends improvements to the system. In this phase, the collected problem statements will be extensively studied in order to find a proper solution, or at least improvements, to the proposed system. All proposed solutions will be converted into requirements and documented in a requirement specification.

Designing

The requirement specification from the previous phase will be studied and prioritized to determine which requirements are more important; the requirement with the highest priority will be delivered first. After the study, the system design will be prepared, as it helps in defining the overall system architecture and specifying the hardware and software requirements.
Building

The phase where the actual coding implementation takes place. By referring to the inputs from the system design, the system will be developed based on the prioritized requirements. However, since we are using the agile methodology approach, the developed system will be considered a prototype that will be integrated and tested by the users.

Testing

The phase where the prototype system goes through a series of tests. The prototype system will first undergo integration, where the features from the previous iteration cycle are added to the latest cycle. After integration, the prototype system will be thoroughly tested by the users to determine whether they are satisfied with the latest deliverables; the completion of the project depends on whether they accept it. If the users require additional features or modifications, feedback gathering will be conducted, resulting in further modification of the requirements and features, which will be recorded and documented for the requirement analysis phase of the next iteration.

CHAPTER - 8
8.PROJECT SCOPE
This project proposes a Virtual Mouse that will soon be introduced to replace the physical computer mouse, promoting convenience while still being able to accurately interact with and control the computer system. To do that, the software needs to be fast enough to capture and process every image in order to successfully track the user's gesture. Therefore, this project will develop a software application with the aid of the latest software coding techniques and the open-source computer vision library known as OpenCV.

The scope of the project is as below:

• Real time application.


• User friendly application.
• Removes the requirement of having a physical mouse.

The process of the application starts when the user's gesture is captured in real time by the webcam; the captured image is then processed through segmentation to identify which pixel values equal the values of the defined colour.

After segmentation is completed, the overall image is converted to a binary image where the identified pixels appear white, while the rest are black. The position of the white segment in the image is recorded and set as the position of the mouse pointer, thus simulating the mouse pointer without using a physical computer mouse.
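Recording the position of the white segment amounts to taking the centroid of the binary mask. In OpenCV this is typically done with `cv2.moments`; the NumPy sketch below, with a made-up blob, shows the same computation:

```python
import numpy as np

def mask_centroid(mask):
    """Return the (x, y) centre of the white pixels in a binary mask,
    or None when the mask contains no white segment."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return int(xs.mean()), int(ys.mean())

mask = np.zeros((10, 10), dtype=np.uint8)
mask[4:6, 2:4] = 255                    # a small white blob
print(mask_centroid(mask))  # (2, 4)
```

The returned (x, y) is what gets mapped to the mouse pointer position each frame; the None case lets the caller skip frames where the colour was not found.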

The software application is compatible with the Windows platform. The functionality of the software will be coded in the C++ programming language with the integration of an external image-processing library known as OpenCV.

8.1 Problem Statement


It is no surprise that every technological device has its own limitations, especially when it comes to computer devices. After reviewing various types of physical mouse, the problems were identified and generalized. The following describes the general problems that the current physical mouse suffers from:

• The physical mouse is subject to mechanical wear and tear.
• The physical mouse requires special hardware and a surface to operate.
• The physical mouse is not easily adaptable to different environments, and its performance varies depending on the environment.
• The mouse has limited functions even in present operational environments.
• All wired and wireless mice have their own lifespan.

8.2 Motivation of Virtual Mouse

It is fair to say that the Virtual Mouse will substitute for the traditional physical mouse in the near future, as people are aiming towards a lifestyle where every technological device can be controlled and interacted with remotely without using any peripheral devices such as remotes, keyboards, etc. It doesn't just provide convenience; it is cost-effective as well.

8.3 Convenience
It is known that in order to interact with the computer system, users are required to use an actual physical mouse, which also requires a certain surface area to operate, not to mention that it suffers from cable length limitations. The Virtual Mouse requires none of this, as it only needs a webcam to capture the user's hand position and determine where the pointer should be. For example, the user will be able to remotely control and interact with the computer system just by facing the webcam (or any other image-capturing device) and moving their fingers, thus eliminating the need to manually move a physical mouse while still interacting with the computer system from a few feet away.

CHAPTER – 9
9.CONCLUSION
In conclusion, it is no surprise that the physical mouse will be replaced by a virtual, non-physical mouse in Human-Computer Interaction (HCI), where every mouse movement can be executed with a flick of the fingers, anywhere and anytime, without any environmental restrictions. This project has developed a colour recognition program with the purpose of replacing the generic physical mouse without sacrificing accuracy or efficiency; it is able to recognize colour movements and combinations and translate them into actual mouse functions. Since accuracy and efficiency play an important role in making the program as useful as an actual physical mouse, a few techniques had to be implemented.

First and foremost, the coordinates of the colours responsible for cursor movement are averaged over a collection of recent coordinates. The purpose of this technique is to reduce and stabilise the sensitivity of cursor movement, since slight hand movements might otherwise lead to unwanted cursor movements. In addition, several colour combinations were implemented together with a calculation of the distance between the two colours in each combination, with different distances triggering different mouse functions. This makes the program convenient to control without much disturbance, so that actual mouse functions can be triggered accurately with a minimum of trial and error.
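The stabilisation idea described above can be sketched as an exponential moving average over successive cursor readings. This is a minimal illustration; the smoothening factor of 7 matches the value used in the appendix code but is otherwise an arbitrary tuning choice.

```python
smoothening = 7  # higher values give a steadier but slower cursor

def smooth(prev, target):
    """Move only a fraction of the way toward the new reading each frame,
    damping small jitters in the detected position."""
    return prev + (target - prev) / smoothening

# A jittery sequence of x-readings settles instead of bouncing around:
x = 0.0
for reading in [100, 102, 98, 101, 99]:
    x = smooth(x, reading)
print(round(x, 1))
```

The trade-off is latency: the cursor lags the hand by a few frames, which is exactly what suppresses the unwanted micro-movements.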

Furthermore, to promote efficient and flexible tracking of colours, a calibration phase was implemented. This allows users to choose their own colours for the different mouse functions, as long as the selected colours do not fall within the same or similar RGB ranges (e.g. green, blue, red, pink). Adaptive calibration was implemented as well: the program saves different sets of HSV values captured from different angles, which are then used during the recognition phase.
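The recognition phase described here amounts to thresholding each pixel against the calibrated colour range. The masking that OpenCV's cv2.inRange performs can be sketched in plain NumPy as follows; this is a toy illustration on a synthetic image, and the colour bounds are made-up values, not the project's calibrated ones.

```python
import numpy as np

def in_range(img, lower, upper):
    """Return a binary mask: 255 where every channel of a pixel lies
    inside [lower, upper], 0 elsewhere (what cv2.inRange computes)."""
    lower = np.array(lower)
    upper = np.array(upper)
    inside = ((img >= lower) & (img <= upper)).all(axis=2)
    return (inside * 255).astype(np.uint8)

# Synthetic 2x2 "HSV" image: two pixels fall inside a green-ish range.
img = np.array([[[60, 200, 200], [120,  50,  50]],
                [[ 0,   0,   0], [ 65, 180, 220]]], dtype=np.uint8)
mask = in_range(img, (50, 100, 100), (70, 255, 255))
print(mask)
```

Working in HSV rather than RGB is what makes the adaptive calibration effective: hue stays comparatively stable under lighting changes, so ranges saved from different angles remain usable.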

This model demonstrates that, using computer-vision tools such as OpenCV, masks can be formed that separate colours through colour-variation techniques, and that mouse movement can be implemented with packages such as 'mouse', driving the cursor with the coordinates linked to the detected colour. This makes computer systems easier to use and supports many other applications; OpenCV thus gives users accessible models that simplify everyday interaction.
All the suggestions put forward in the software proposal have been successfully completed, and the final threshold of the application has been crossed. Some errors were spotted during testing and have been corrected, and the system is working effectively. Further work can build on this project by combining the system with artificial neural networks.

9.1 FUTURE SCOPE OF THE PROJECT

The scope for developing these techniques and models is vast. The colour-detection model can be extended to identify a particular colour within a coloured photograph, and the mouse-movement model can be developed to behave like a real mouse, allowing a system to be used without touching its keyboard or mouse at all. The models could also be trained with CNNs for better performance, or built with newer packages such as 'AutopyGUI' that accept commands, identify an input, and perform a function on the system. In this way, if a particular colour is detected, a special function can be performed, or if a specific user input is detected, a specific folder can be opened with ease: a simple gesture can do the job.

9.2 COST EFFECTIVENESS

A quality physical mouse normally costs between 30 ringgit and a hefty 400 ringgit, depending on its functionality and features. Since the Virtual Mouse requires only a webcam, a physical mouse is no longer required, eliminating the need to purchase one: a single webcam is sufficient for users to interact with the computer system. Portable computers such as laptops, which already come with a built-in webcam, can simply run the Virtual Mouse software without any concern about purchasing external peripheral devices.

CHAPTER - 10
10.APPENDICES

10.1 AI VIRTUAL MOUSE

import cv2
import numpy as np
import HandTrackingModule as htm
import time
import autopy

wCam, hCam = 640, 480

frameR = 100       # frame reduction: margin of the active region
smoothening = 7    # smoothing factor for cursor movement

pTime = 0
plocX, plocY = 0, 0
clocX, clocY = 0, 0

cap = cv2.VideoCapture(0)
cap.set(3, wCam)
cap.set(4, hCam)

detector = htm.handDetector(maxHands=1)
wScr, hScr = autopy.screen.size()
# print(wScr, hScr)

while True:
    # Step 1: Find the hand landmarks
    success, img = cap.read()
    img = detector.findHands(img)
    lmList, bbox = detector.findPosition(img)

    if len(lmList) != 0:
        # Step 2: Get the tips of the index and middle fingers
        x1, y1 = lmList[8][1:]
        x2, y2 = lmList[12][1:]

        # Step 3: Check which fingers are up
        fingers = detector.fingersUp()
        cv2.rectangle(img, (frameR, frameR), (wCam - frameR, hCam - frameR),
                      (255, 0, 255), 2)

        # Step 4: Only index finger up: moving mode
        if fingers[1] == 1 and fingers[2] == 0:
            # Step 5: Convert camera coordinates to screen coordinates
            x3 = np.interp(x1, (frameR, wCam - frameR), (0, wScr))
            y3 = np.interp(y1, (frameR, hCam - frameR), (0, hScr))

            # Step 6: Smooth the values
            clocX = plocX + (x3 - plocX) / smoothening
            clocY = plocY + (y3 - plocY) / smoothening

            # Step 7: Move the mouse
            autopy.mouse.move(wScr - clocX, clocY)
            cv2.circle(img, (x1, y1), 15, (255, 0, 255), cv2.FILLED)
            plocX, plocY = clocX, clocY

        # Step 8: Both index and middle fingers up: clicking mode
        if fingers[1] == 1 and fingers[2] == 1:
            # Step 9: Find the distance between the fingertips
            length, img, lineInfo = detector.findDistance(8, 12, img)

            # Step 10: Click the mouse if the distance is short
            if length < 40:
                cv2.circle(img, (lineInfo[4], lineInfo[5]), 15,
                           (0, 255, 0), cv2.FILLED)
                autopy.mouse.click()

    # Step 11: Frame rate
    cTime = time.time()
    fps = 1 / (cTime - pTime)
    pTime = cTime
    cv2.putText(img, str(int(fps)), (28, 58), cv2.FONT_HERSHEY_PLAIN, 3,
                (255, 0, 0), 3)

    # Step 12: Display
    cv2.imshow("Image", img)
    cv2.waitKey(1)

10.2 HAND TRACKING MODULE

import cv2
import mediapipe as mp
import time
import math
import numpy as np

class handDetector():
    def __init__(self, mode=False, maxHands=2, detectionCon=0.5, trackCon=0.5):
        self.mode = mode
        self.maxHands = maxHands
        self.detectionCon = detectionCon
        self.trackCon = trackCon

        self.mpHands = mp.solutions.hands
        self.hands = self.mpHands.Hands(self.mode, self.maxHands,
                                        self.detectionCon, self.trackCon)
        self.mpDraw = mp.solutions.drawing_utils
        self.tipIds = [4, 8, 12, 16, 20]

    def findHands(self, img, draw=True):
        imgRGB = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        self.results = self.hands.process(imgRGB)
        # print(self.results.multi_hand_landmarks)
        if self.results.multi_hand_landmarks:
            for handLms in self.results.multi_hand_landmarks:
                if draw:
                    self.mpDraw.draw_landmarks(img, handLms,
                                               self.mpHands.HAND_CONNECTIONS)
        return img

    def findPosition(self, img, handNo=0, draw=True):
        xList = []
        yList = []
        bbox = []
        self.lmList = []
        if self.results.multi_hand_landmarks:
            myHand = self.results.multi_hand_landmarks[handNo]
            for id, lm in enumerate(myHand.landmark):
                # Landmarks are normalised; convert to pixel coordinates
                h, w, c = img.shape
                cx, cy = int(lm.x * w), int(lm.y * h)
                xList.append(cx)
                yList.append(cy)
                self.lmList.append([id, cx, cy])
                if draw:
                    cv2.circle(img, (cx, cy), 5, (255, 0, 255), cv2.FILLED)

            xmin, xmax = min(xList), max(xList)
            ymin, ymax = min(yList), max(yList)
            bbox = xmin, ymin, xmax, ymax
            if draw:
                cv2.rectangle(img, (xmin - 20, ymin - 20),
                              (xmax + 20, ymax + 20), (0, 255, 0), 2)

        return self.lmList, bbox

    def fingersUp(self):
        fingers = []
        # Thumb: compare x-coordinates of the tip and the joint below it
        if self.lmList[self.tipIds[0]][1] > self.lmList[self.tipIds[0] - 1][1]:
            fingers.append(1)
        else:
            fingers.append(0)
        # Other four fingers: tip above the middle joint means "up"
        for id in range(1, 5):
            if self.lmList[self.tipIds[id]][2] < self.lmList[self.tipIds[id] - 2][2]:
                fingers.append(1)
            else:
                fingers.append(0)
        # totalFingers = fingers.count(1)
        return fingers

    def findDistance(self, p1, p2, img, draw=True, r=15, t=3):
        x1, y1 = self.lmList[p1][1:]
        x2, y2 = self.lmList[p2][1:]
        cx, cy = (x1 + x2) // 2, (y1 + y2) // 2
        if draw:
            cv2.line(img, (x1, y1), (x2, y2), (255, 0, 255), t)
            cv2.circle(img, (x1, y1), r, (255, 0, 255), cv2.FILLED)
            cv2.circle(img, (x2, y2), r, (255, 0, 255), cv2.FILLED)
            cv2.circle(img, (cx, cy), r, (0, 0, 255), cv2.FILLED)
        length = math.hypot(x2 - x1, y2 - y1)
        return length, img, [x1, y1, x2, y2, cx, cy]

def main():
    pTime = 0
    cTime = 0
    cap = cv2.VideoCapture(1)
    detector = handDetector()
    while True:
        success, img = cap.read()
        img = detector.findHands(img)
        lmList, bbox = detector.findPosition(img)
        if len(lmList) != 0:
            print(lmList[4])
        cTime = time.time()
        fps = 1 / (cTime - pTime)
        pTime = cTime
        cv2.putText(img, str(int(fps)), (10, 70), cv2.FONT_HERSHEY_PLAIN, 3,
                    (255, 0, 255), 3)
        cv2.imshow("Image", img)
        cv2.waitKey(1)

if __name__ == "__main__":
    main()

10.3 OUTPUT
Fig: 1 HAND TRACKING
Fig: 2 CLICK EVENT
Fig: 3 DOUBLE CLICK EVENT
Fig: 4 SELECT TEXT
Fig: 5 SELECTED TEXT
Fig: 6 SCROLL DOCUMENT
Fig: 7 ZOOM
Fig: 8 ZOOMED IMAGE
Fig: 9 FRAME PER SECOND
