Gaze Detection with Visual Feedback

This project is a Python-based implementation of gaze detection using Mediapipe's Face Mesh solution. The program detects and tracks the position of the eyes and irises in real time using a webcam, providing visual feedback by overlaying markers on the video feed.

Features

  • Real-time detection of facial landmarks using Mediapipe's Face Mesh.
  • Identification of eye and iris positions.
  • Visual feedback with circles drawn around eye and iris landmarks.
  • Visualization of the average iris center for both eyes.
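
The last feature above, the average iris center, reduces to averaging the iris landmark coordinates that Face Mesh returns. Below is a minimal sketch of that computation, assuming the commonly documented iris landmark indices 468-477 (available when refine_landmarks=True); the helper name and structure are illustrative, and the repository's gazeupdate.py may organize this differently.

    import numpy as np

    # Iris landmark indices exposed by Mediapipe Face Mesh when
    # refine_landmarks=True: five points per iris, 468-477 in total.
    IRIS_LANDMARKS = list(range(468, 478))

    def iris_center(face_landmarks, frame_width, frame_height):
        """Return the average center of both irises in pixel coordinates.

        face_landmarks is one entry of results.multi_face_landmarks;
        its .landmark coordinates are normalized to [0, 1].
        """
        points = np.array([
            (face_landmarks.landmark[i].x * frame_width,
             face_landmarks.landmark[i].y * frame_height)
            for i in IRIS_LANDMARKS
        ])
        cx, cy = points.mean(axis=0)
        return int(cx), int(cy)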

Requirements

To run this project, you need the following:

  • Python 3.7 or higher
  • OpenCV
  • Mediapipe
  • NumPy

Installation

  1. Clone the repository:

    git clone https://github.com/Real-J/gaze-detection.git
    cd gaze-detection
  2. Install the required dependencies:

    pip install opencv-python mediapipe numpy
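
To confirm that the dependencies installed correctly, a quick sanity check is to import the three packages and print the versions OpenCV and NumPy report. This snippet is illustrative and not part of the repository:

    # Quick sanity check for the installed dependencies.
    import cv2
    import mediapipe as mp
    import numpy as np

    print("OpenCV:", cv2.__version__)
    print("NumPy:", np.__version__)
    print("Mediapipe imported successfully")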

Usage

  1. Run the Python script:

    python gazeupdate.py
  2. Allow access to your webcam.

  3. The program will display a window showing the live webcam feed with the following visual feedback:

    • Green circles: Eye landmarks
    • Red circles: Iris landmarks
    • Yellow circle: Combined center of the left and right irises
  4. Press the q key to exit the program.

Code Overview

The main script (gazeupdate.py) includes the following components:

  • Initialization: Setting up Mediapipe Face Mesh and OpenCV.
  • Eye and Iris Landmark Detection: Extracting facial landmarks corresponding to the eyes and irises.
  • Visual Feedback: Drawing circles on eye and iris landmarks, and calculating the average iris center.
  • Webcam Loop: Continuously processing frames from the webcam and displaying visual feedback.
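
The components above map onto a short OpenCV/Mediapipe pipeline. The sketch below shows that structure end to end; it is not the repository's gazeupdate.py, and the specific eye landmark indices, circle radii, and colors are assumptions chosen to mirror the visual feedback described in the Usage section.

    import cv2
    import mediapipe as mp
    import numpy as np

    # Initialization: Face Mesh with iris refinement so iris landmarks exist.
    mp_face_mesh = mp.solutions.face_mesh
    face_mesh = mp_face_mesh.FaceMesh(
        max_num_faces=1,
        refine_landmarks=True,          # adds the 10 iris landmarks (468-477)
        min_detection_confidence=0.5,
        min_tracking_confidence=0.5,
    )

    # Illustrative landmark subsets; the actual script may use more points.
    EYE_LANDMARKS = [33, 133, 159, 145, 362, 263, 386, 374]  # corners and lids of both eyes
    IRIS_LANDMARKS = list(range(468, 478))                   # both irises

    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        h, w = frame.shape[:2]
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark

            def to_px(idx):
                return int(landmarks[idx].x * w), int(landmarks[idx].y * h)

            # Green circles: eye landmarks.
            for idx in EYE_LANDMARKS:
                cv2.circle(frame, to_px(idx), 2, (0, 255, 0), -1)

            # Red circles: iris landmarks.
            iris_points = [to_px(idx) for idx in IRIS_LANDMARKS]
            for pt in iris_points:
                cv2.circle(frame, pt, 2, (0, 0, 255), -1)

            # Yellow circle: combined (average) center of the two irises.
            cx, cy = np.mean(iris_points, axis=0)
            cv2.circle(frame, (int(cx), int(cy)), 4, (0, 255, 255), -1)

        cv2.imshow("Gaze Detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()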

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request if you have suggestions or improvements.

License

This project is licensed under the MIT License.

Acknowledgements

  • Mediapipe for the Face Mesh solution.
  • OpenCV for real-time video processing.
  • NumPy for numerical computations.
