This project is a Python-based implementation of gaze detection using Mediapipe's Face Mesh solution. The program detects and tracks the positions of the eyes and irises in real time from a webcam, providing visual feedback by overlaying markers on the video feed (a minimal setup sketch follows the feature list).
- Real-time detection of facial landmarks using Mediapipe's Face Mesh.
- Identification of eye and iris positions.
- Visual feedback with circles drawn around eye and iris landmarks.
- Visualization of the average iris center for both eyes.
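Iris positions are only exposed by Face Mesh when refined landmarks are enabled. A minimal setup sketch (the parameter values are illustrative assumptions, not copied from gazeupdate.py):

```python
import mediapipe as mp

# refine_landmarks=True adds 10 iris landmarks (indices 468-477)
# on top of the 468 base Face Mesh landmarks.
face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
)
```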
To run this project, you need the following:
- Python 3.7 or higher
- OpenCV
- Mediapipe
- NumPy
Clone the repository:
git clone https://github.com/Real-J/gaze-detection.git
cd gaze-detection
Install the required dependencies:
pip install opencv-python mediapipe numpy
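Optionally, verify that the packages import correctly (this check is a suggestion, not part of the original instructions):
python -c "import cv2, mediapipe, numpy; print(cv2.__version__, mediapipe.__version__, numpy.__version__)"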
Run the Python script:
python gazeupdate.py
Allow access to your webcam.
The program will display a window showing the live webcam feed with the following visual feedback (see the drawing sketch after this list):
- Green circles: Eye landmarks
- Red circles: Iris landmarks
- Yellow circle: Combined center of the left and right irises
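These colors correspond to OpenCV BGR tuples. A small, self-contained illustration of how such markers are drawn (coordinates and radii are placeholders, not the script's actual values):

```python
import cv2
import numpy as np

GREEN = (0, 255, 0)     # eye landmarks
RED = (0, 0, 255)       # iris landmarks
YELLOW = (0, 255, 255)  # combined iris center

# Draw filled markers (thickness=-1) on a blank frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.circle(frame, (300, 240), 2, GREEN, -1)
cv2.circle(frame, (320, 240), 2, RED, -1)
cv2.circle(frame, (340, 240), 4, YELLOW, -1)
```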
Press the `q` key to exit the program.
The main script (gazeupdate.py) includes the following components (a condensed sketch follows the list):
- Initialization: Setting up Mediapipe Face Mesh and OpenCV.
- Eye and Iris Landmark Detection: Extracting facial landmarks corresponding to the eyes and irises.
- Visual Feedback: Drawing circles on eye and iris landmarks, and calculating the average iris center.
- Webcam Loop: Continuously processing frames from the webcam and displaying visual feedback.
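A condensed sketch of how those components typically fit together. The landmark index sets below are the commonly used Face Mesh eye/iris indices and are assumptions; the actual script may use different constants, parameters, and window names:

```python
import cv2
import mediapipe as mp
import numpy as np

# Commonly used Face Mesh indices (assumed; gazeupdate.py may differ).
LEFT_IRIS = [474, 475, 476, 477]
RIGHT_IRIS = [469, 470, 471, 472]
LEFT_EYE = [362, 263, 386, 374]    # a few eye-contour points for illustration
RIGHT_EYE = [33, 133, 159, 145]

cap = cv2.VideoCapture(0)
with mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,            # enables the iris landmarks
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]
        # Face Mesh expects RGB input; OpenCV captures BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            pts = np.array([(p.x * w, p.y * h) for p in lm])
            # Green circles on eye landmarks, red circles on iris landmarks.
            for i in LEFT_EYE + RIGHT_EYE:
                cv2.circle(frame, (int(pts[i][0]), int(pts[i][1])), 2, (0, 255, 0), -1)
            for i in LEFT_IRIS + RIGHT_IRIS:
                cv2.circle(frame, (int(pts[i][0]), int(pts[i][1])), 2, (0, 0, 255), -1)
            # Yellow circle at the average center of both irises.
            cx, cy = pts[LEFT_IRIS + RIGHT_IRIS].mean(axis=0).astype(int)
            cv2.circle(frame, (int(cx), int(cy)), 4, (0, 255, 255), -1)
        cv2.imshow("Gaze detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```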
Contributions are welcome! Feel free to open an issue or submit a pull request if you have suggestions or improvements.
This project is licensed under the MIT License.