
Project Report

On
Virtual Mouse
Submitted in partial fulfillment of the requirements for the award of the degree of

Bachelor of Engineering
In
Computer Science and Engineering

Title: Virtual Mouse By Eye Tracking

Table of Contents
Certificate
Abstract
Introduction
Objective
Methodology
Tools and Technologies Used
System Design
Working Mechanism
Key Features
Applications and Scope
Limitations
Future Enhancements
Conclusion
References

CERTIFICATE
I hereby certify that the work being presented in the project report entitled “Virtual Mouse By Eye Tracking” by RAJAT BAREJA, in partial fulfillment of the requirements for the award of the degree of B.Tech (CSE), submitted to the Department of Computer Science and Engineering, Advanced Institute of Technology and Management, is an authentic record of my own work carried out during the period from February 2025 to June 2025 under the supervision of Mrs. Ritu.

SIGN:                              SIGN:
Under the Guidance of              Head of Department
Mrs. Ritu                          Mrs. Priyanka
Asst. Professor                    H.O.D.
CSE, A.I.T.M (Palwal)              CSE, A.I.T.M (Palwal)

Abstract:

This project introduces a novel, hands-free method for human-computer interaction through an Eye-Tracking Virtual Mouse. By detecting and interpreting eye movements, the system enables users to control a cursor and perform click actions via blinking, eliminating the need for traditional input devices.

Developed using Python, OpenCV, MediaPipe, and PyAutoGUI, it offers an accessible, efficient solution for users with physical disabilities and for applications in sterile or hands-restricted environments. The project demonstrates the potential for enhanced accessibility, improved user experience, and future integration of AI-based gaze prediction.
INTRODUCTION
The evolution of human-computer interaction has significantly influenced the development of modern computing systems. Traditional input devices such as keyboards and mice have served us well, but they present limitations for users with physical disabilities and in environments where hands-free operation is required.

With increasing demand for accessibility and ergonomic solutions, alternative input methods have gained attention. One such method is eye-tracking technology, which leverages the movement of the user's eyes to interact with digital interfaces. This project, 'Virtual Mouse by Eye Tracking,' is a step towards creating an inclusive, contactless user experience.

This report outlines the design, development, and implementation of a virtual mouse that operates based on real-time eye tracking. The system eliminates the need for traditional physical interaction and opens doors for people with mobility challenges and for applications in sterile or constrained environments such as medical labs or cleanrooms.

Objective
The primary objective of this project is to develop a virtual mouse system that operates through eye tracking. The specific goals include:

- Implementing eye movement tracking to control cursor movement.

- Detecting blinks to simulate mouse click events.

- Designing a user-friendly interface that adapts to different users.

- Ensuring real-time performance and smooth operation of the virtual mouse.

- Exploring the possibilities of integration into assistive technologies and healthcare environments.

Methodology
The project methodology involves a structured development process using agile practices and iterative prototyping. The key stages include:

1. Requirement Analysis – Understanding user needs, technical constraints, and system goals.

2. Research and Feasibility Study – Investigating existing technologies and identifying appropriate tools such as MediaPipe and OpenCV.

3. System Design – Designing the architecture involving webcam input, image processing, and GUI control.

4. Implementation – Writing Python code using libraries for eye detection, image processing, and GUI automation.

5. Testing – Performing functional and usability tests to refine the system's responsiveness and accuracy.

6. Evaluation – Comparing performance with traditional input systems and identifying areas of improvement.
Tools and Technologies Used
The following tools and libraries were used in the implementation of the Eye-Tracking Virtual Mouse system:

- Python: The main programming language, chosen for its flexibility and wide range of libraries.

- OpenCV: Used for image capturing, processing, and handling visual data from the webcam.

- MediaPipe: Google's open-source framework used for facial landmark detection, including eye and blink tracking.

- PyAutoGUI: A library that allows programmatic control of the mouse, including movement and clicks.

- NumPy: Used for numerical operations and efficient array handling in image processing tasks.

- PyCharm: An Integrated Development Environment (IDE) for Python development with features such as debugging and version control.
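
For reference, everything above except PyCharm can be installed from PyPI with a recent Python 3 interpreter; a minimal environment setup, assuming the standard PyPI package names, is:

```
pip install opencv-python mediapipe pyautogui numpy
```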

System Design
The system design of the Eye-Tracking Virtual Mouse follows a modular architecture that ensures flexibility, scalability, and real-time performance. The key components of the architecture are:

1. Data Acquisition Module – This module uses a webcam to capture real-time video frames of the user's face and eyes. It acts as the input device for the entire system.

2. Image Processing Module – The frames from the camera are processed using OpenCV to extract facial landmarks and identify the region of interest (the eyes). MediaPipe is employed here to perform facial and eye detection with high accuracy.

3. Cursor Mapping Module – The coordinates of the detected eye movements are mapped to screen coordinates using mathematical scaling; a short sketch of this scaling follows the list. PyAutoGUI handles this translation and moves the mouse cursor accordingly.

4. Blink Detection Module – A blink detection algorithm monitors the eye aspect ratio (EAR), computed from six eye landmarks p1...p6 as EAR = (||p2 - p6|| + ||p3 - p5||) / (2 * ||p1 - p4||), to determine whether a blink has occurred. If a blink is detected, a click event is triggered.

5. User Interface – A visual interface may optionally display the real-time tracking status or assist in calibration for different lighting conditions and users.
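
To make the cursor-mapping step concrete, here is a minimal sketch of the linear scaling described in module 3. It relies on the fact that MediaPipe returns landmark coordinates normalized to [0, 1] while PyAutoGUI works in screen pixels; the example coordinates and the smoothing duration are illustrative assumptions, not values from the project's code.

```python
# Minimal sketch of the Cursor Mapping Module: scale a normalized
# landmark (x, y in [0, 1]) to screen pixels and move the cursor.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()  # actual screen resolution

def map_to_screen(norm_x: float, norm_y: float):
    """Linear scaling from normalized camera coordinates to screen pixels."""
    return int(norm_x * SCREEN_W), int(norm_y * SCREEN_H)

# Example: an eye landmark at (0.55, 0.40) in the camera frame.
x, y = map_to_screen(0.55, 0.40)
pyautogui.moveTo(x, y, duration=0.1)  # a small duration smooths the motion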

Working Mechanism
The working of the Eye-Tracking Virtual Mouse is divided into the following stages:

1. **Capture Stage**: The webcam captures continuous video frames of the user. Each frame is passed on to the image processing pipeline.

2. **Detection Stage**: MediaPipe detects the position of the eyes and tracks pupil movement in real time.

3. **Processing Stage**: OpenCV refines the landmark positions and determines the direction in which the eyes are moving.

4. **Cursor Control Stage**: Based on eye direction, the system calculates screen coordinates and moves the cursor using PyAutoGUI.

5. **Click Simulation Stage**: The blink detection module evaluates whether a blink occurs using the eye aspect ratio method. If the ratio falls below a threshold, it simulates a mouse click.

The overall system works continuously and smoothly by executing all of these steps multiple times per second, providing near-instantaneous response to user input.
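
The loop below is a minimal sketch of how these five stages fit together, not the project's actual source code. It assumes MediaPipe's FaceMesh solution with refine_landmarks=True (which adds iris points such as index 473), the commonly cited left-eye landmark set [33, 160, 158, 133, 153, 144], and an assumed EAR threshold of 0.21; all of these values would need per-user calibration.

```python
# Sketch of the capture -> detect -> map -> click loop (assumed values noted).
import cv2
import mediapipe as mp
import pyautogui
from math import dist

SCREEN_W, SCREEN_H = pyautogui.size()
EAR_THRESHOLD = 0.21                      # assumed blink threshold; tune per user
LEFT_EYE = [33, 160, 158, 133, 153, 144]  # commonly used FaceMesh eye indices
RIGHT_IRIS_CENTRE = 473                   # iris point added by refine_landmarks

def ear(p):
    """Eye aspect ratio: (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    return (dist(p[1], p[5]) + dist(p[2], p[4])) / (2 * dist(p[0], p[3]))

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Cursor Control Stage: follow the iris centre, clamped to stay
        # clear of PyAutoGUI's fail-safe screen corner.
        iris = lm[RIGHT_IRIS_CENTRE]
        x = min(max(int(iris.x * SCREEN_W), 1), SCREEN_W - 2)
        y = min(max(int(iris.y * SCREEN_H), 1), SCREEN_H - 2)
        pyautogui.moveTo(x, y)
        # Click Simulation Stage: EAR below the threshold counts as a blink.
        # (Naive: fires on every low-EAR frame; see Limitations.)
        pts = [(lm[i].x, lm[i].y) for i in LEFT_EYE]
        if ear(pts) < EAR_THRESHOLD:
            pyautogui.click()
    cv2.imshow("Eye Tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:       # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```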

Key Features
The Eye-Tracking Virtual Mouse offers several distinctive, user-focused features:

- **Hands-Free Operation**: Enables completely touchless computer control, ideal for sterile or constrained environments.

- **Blink-Based Clicking**: Users can simulate mouse clicks simply by blinking, without the need for hand gestures or button pressing.

- **High Accuracy and Responsiveness**: Real-time eye tracking ensures smooth and natural cursor movement with minimal delay.

- **User Accessibility**: The system is particularly beneficial for users with physical impairments, enabling inclusive digital access.

- **Cost-Effective Implementation**: Utilizes a basic webcam and open-source libraries, making it affordable for widespread use.

Applications and Scope
The scope and applications of this project are vast and cross-disciplinary:

- **Assistive Technology**: Helps individuals with mobility issues or paralysis interact with computers and digital devices.

- **Healthcare Settings**: Useful in sterile environments like operating rooms where hands-free computer interaction is required.

- **Gaming Industry**: Adds a new dimension to gaming by enabling head- and eye-controlled gameplay.

- **Public Kiosks and ATMs**: Enhances hygiene and accessibility in public interfaces by reducing the need for physical contact.

- **Smart Home and Automation**: Allows integration into smart systems where devices are controlled using only gaze and blinks.

The system is scalable and can be tailored to specific industrial or domestic use cases, increasing its utility and adoption potential.

Limitations
Despite its advantages, the Eye-Tracking Virtual Mouse system has a few limitations:

- **Lighting Sensitivity**: Detection accuracy can decrease under poor or fluctuating lighting conditions.

- **User Calibration**: Requires individual calibration to accommodate differences in eye shape, glasses, and camera angles.

- **Blink Ambiguity**: Distinguishing intentional blinks from natural ones can be challenging and may lead to false clicks (a simple debounce mitigation is sketched after this list).

- **Processor Load**: Continuous video processing may demand considerable system resources, affecting older machines.

- **Head Movement Restriction**: Excessive head movement can reduce accuracy and disrupt eye-tracking alignment.
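
One standard mitigation for blink ambiguity, in line with the consecutive-frame idea from the EAR paper cited in the References, is to register a click only when the eye stays closed across several consecutive frames, so that brief natural blinks and detector noise are ignored. A minimal sketch; the threshold and frame count are assumptions to tune:

```python
# Hypothetical debounce for blink clicks: fire only when the eye stays
# closed for CONSEC_FRAMES consecutive frames.
EAR_THRESHOLD = 0.21   # assumed threshold; tune per user
CONSEC_FRAMES = 8      # ~0.27 s at 30 fps; assumed to outlast a natural blink

closed_frames = 0

def update_blink(ear_value: float) -> bool:
    """Return True exactly once per deliberate (long) blink."""
    global closed_frames
    if ear_value < EAR_THRESHOLD:
        closed_frames += 1
        return closed_frames == CONSEC_FRAMES  # fire once, then hold
    closed_frames = 0
    return False
```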

Future Enhancements

The system can be enhanced in several ways to improve accuracy, user experience, and broader applicability:

- **AI-Powered Gaze Prediction**: Integrating machine learning algorithms to predict gaze more accurately under varying conditions.

- **Multimodal Inputs**: Adding voice recognition or gesture control to support a hybrid input model.

- **Cloud-Based Calibration**: Centralized user profiles that store calibration data for a consistent experience across devices.

- **Mobile Compatibility**: Extending support to mobile devices and tablets using their built-in front-facing cameras.

- **Gesture-Based Click Alternatives**: Implementing winks, prolonged stares, or gaze fixation as alternative click actions (a dwell-click sketch follows this list).
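
To illustrate the gaze-fixation alternative, a dwell-click fires when the cursor remains within a small radius for a fixed time. This is a sketch under assumed parameters: the 40-pixel radius and 1.5-second dwell time are illustrative values, not ones specified in this report.

```python
# Hypothetical dwell-click: trigger a click when gaze holds still.
import time
from math import dist

DWELL_RADIUS = 40     # pixels of allowed jitter (assumed)
DWELL_TIME = 1.5      # seconds of fixation required (assumed)

anchor = None         # (x, y, start_time) of the current fixation

def update_dwell(x: float, y: float) -> bool:
    """Return True once the gaze has stayed near one point long enough."""
    global anchor
    now = time.monotonic()
    if anchor is None or dist((x, y), anchor[:2]) > DWELL_RADIUS:
        anchor = (x, y, now)          # gaze moved: restart the timer
        return False
    if now - anchor[2] >= DWELL_TIME:
        anchor = (x, y, now)          # reset so it does not re-fire instantly
        return True
    return False
```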

Conclusion
The Eye-Tracking Virtual Mouse project presents a promising step toward inclusive and accessible technology. By replacing conventional input devices with eye-based interaction, it empowers individuals with disabilities and opens up new possibilities for hands-free computing.

The project successfully demonstrates the feasibility of using low-cost, widely available tools to create a high-impact solution. Its applications in healthcare, gaming, and automation signal a shift towards more natural and adaptive human-computer interaction.

Continued development and refinement of this system, including AI integration and adaptive calibration, will lead to broader adoption and improved functionality. In conclusion, this project not only meets its technical objectives but also contributes meaningfully to the goal of digital inclusivity.

References

1. OpenCV Documentation – https://docs.opencv.org

2. MediaPipe Framework by Google – https://google.github.io/mediapipe

3. PyAutoGUI Documentation – https://pyautogui.readthedocs.io

4. NumPy Library – https://numpy.org/

5. Tereza Soukupova and Jan Cech, 'Real-Time Eye Blink Detection using Facial Landmarks' – the Eye Aspect Ratio (EAR) blink detection method

6. Research Articles on Gaze Tracking – IEEE and ACM Digital Library

7. Python Programming Language – https://www.python.org
