This work introduces a new eye-controlled mouse technology that enables hands-free human-computer interaction. For people with physical impairments, the system offers a practical and affordable alternative by leveraging Mediapipe's face mesh detection, OpenCV, and PyAutoGUI. It translates eye movements into cursor control in real time and includes fatigue monitoring, blink-based click recognition, and smooth cursor transitions to enhance usability. Extensive testing demonstrated dependable performance, with strong responsiveness and cursor-tracking accuracy across a variety of user profiles and lighting conditions. Compared to conventional hardware-dependent systems, this approach maintains competitive performance at a fraction of the cost. User feedback also underlines the approach's potential to transform technology accessibility, particularly for those with mobility challenges. Future development will focus on improving scalability, integrating state-of-the-art machine learning models, and offering multi-monitor support. This study highlights how combining computer vision and human-computer interaction technologies can democratize access and inclusivity in digital interfaces.
Introduction
Overview:
This project introduces an affordable and user-friendly eye-controlled mouse system to improve accessibility for individuals with physical impairments. Traditional assistive technologies like hardware-based eye trackers are expensive and less accessible. Using open-source tools (Mediapipe, OpenCV, PyAutoGUI), this system enables cursor control through eye movement and mouse clicks via blinking, while also monitoring user fatigue.
Key Features:
Real-time video input from a standard webcam.
Face and eye tracking using Mediapipe’s facial mesh (468 landmarks).
Cursor control and blink-based clicking with smooth transitions.
Fatigue detection via blink rate and duration, triggering rest alerts.
Tkinter-based GUI for real-time feedback and customization (e.g., sensitivity settings).
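The "smooth transitions" feature above can be approximated with exponential smoothing of the raw cursor target, so jittery gaze estimates do not translate into jittery cursor motion. A minimal sketch follows; the `alpha` value is a hypothetical sensitivity setting (the paper exposes sensitivity through the GUI but does not specify the smoothing method), not a value taken from the source.

```python
def smooth(prev, target, alpha=0.3):
    """Exponential moving average for cursor positions.

    Blends the previous cursor position toward the new gaze target.
    A smaller alpha gives smoother but slower cursor motion;
    alpha=0.3 is an illustrative default, not the paper's value.
    """
    return (prev[0] + alpha * (target[0] - prev[0]),
            prev[1] + alpha * (target[1] - prev[1]))
```

In the full system, the smoothed coordinates would be passed to a call such as `pyautogui.moveTo(x, y)` on each frame, and `alpha` could be bound to the GUI's sensitivity slider.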
Literature Review Highlights:
Commercial systems (e.g., Tobii) are accurate but expensive.
Open-source computer vision allows for effective eye tracking with basic webcams.
Studies emphasize HCI applications, fatigue detection, and dynamic algorithmic adaptation for better usability.
Methodology:
Capture live video via webcam.
Detect eye landmarks using Mediapipe.
Map eye movement to screen coordinates for cursor control.
Identify blinks using distance thresholds between eye landmarks.
Trigger mouse clicks based on voluntary blink detection.
Implement fatigue monitoring by analyzing blink frequency and duration.
Test system under varying lighting and user conditions.
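The coordinate mapping, blink detection, and fatigue monitoring steps above can be sketched as small helper functions, kept separate from the camera loop so they are easy to test. This is a sketch under stated assumptions: the 0.2 blink threshold, the blinks-per-window fatigue limit, and the specific landmark points fed in are illustrative, not values from the paper. In the full system, Mediapipe's face mesh supplies normalized landmark coordinates per frame, and PyAutoGUI issues the actual cursor moves and clicks.

```python
from collections import deque


def landmark_to_screen(x_norm, y_norm, screen_w, screen_h):
    """Map a Mediapipe landmark (normalized to [0, 1]) to screen pixels."""
    return int(x_norm * screen_w), int(y_norm * screen_h)


def eye_aspect_ratio(upper, lower, left, right):
    """Ratio of vertical to horizontal eye opening from four eye landmarks.

    The ratio drops sharply when the eye closes, which is what the
    distance-threshold blink test relies on.
    """
    vert = ((upper[0] - lower[0]) ** 2 + (upper[1] - lower[1]) ** 2) ** 0.5
    horiz = ((left[0] - right[0]) ** 2 + (left[1] - right[1]) ** 2) ** 0.5
    return vert / horiz


def is_blink(ratio, threshold=0.2):
    """Classify a frame as a blink; 0.2 is an illustrative threshold."""
    return ratio < threshold


class FatigueMonitor:
    """Flag possible fatigue when blink frequency in a sliding window is high.

    window_s and max_blinks are hypothetical defaults; the paper analyzes
    blink frequency and duration but does not publish its exact limits.
    """

    def __init__(self, window_s=60.0, max_blinks=25):
        self.window_s = window_s
        self.max_blinks = max_blinks
        self.times = deque()

    def record_blink(self, t):
        self.times.append(t)

    def is_fatigued(self, now):
        # Drop blinks that fell out of the sliding window, then compare.
        while self.times and now - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) > self.max_blinks
```

A per-frame loop would then read a webcam frame with OpenCV, run Mediapipe's face mesh, feed selected eye landmarks through these helpers, and call `pyautogui.moveTo(...)` or `pyautogui.click()` accordingly. Distinguishing voluntary blinks from involuntary ones typically requires an additional duration check (e.g. the eye must stay closed for several consecutive frames), which is omitted here for brevity.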
Results:
95% precision in cursor tracking under good lighting.
Blink detection accuracy was high, even across users with varied facial features.
Fatigue alerts aligned well with user feedback.
Most users learned to operate the system comfortably within about 10 minutes.
Performance decreased slightly in poor lighting or for users wearing glasses.
Conclusion
This study introduces a user-friendly eye-controlled mouse system that uses Mediapipe to track gaze in real time and control the cursor. By addressing issues of cost and usability, the system provides a viable option for those with physical limitations, enabling hands-free interaction with digital devices.
Combining blink-based clicking with fatigue monitoring further improves the user experience and lays the groundwork for more intuitive HCI systems. Experimental results confirm the system's performance and demonstrate its potential for practical use [8].
Future research will concentrate on improving robustness under varying lighting, applying state-of-the-art machine learning models for increased precision, and extending functionality to multi-monitor configurations. Partnerships with medical experts could also investigate its potential for therapeutic uses, such as motor disability rehabilitation [6].
This study adds to the expanding field of accessible technology by bridging the gap between affordability and functionality, highlighting inclusivity and empowerment via innovation.
The Eye-Controlled Mouse System demonstrates the potential of computer vision in enhancing accessibility and user interaction [8]. By integrating fatigue detection and customizable settings, the system offers a robust and adaptable solution for hands-free computer control. Future improvements could include enhanced accuracy with 3D eye tracking, support for additional gestures, and improved performance optimization.
References
[1] Chen, W. X., Cui, X. Y., Zheng, J., Zhang, J. M., Chen, S., and Yao, Y. D. (2019) ‘Gaze Gestures and Their Applications in Human-Computer Interaction with a Head-Mounted Display’, arXiv preprint arXiv:1910.07428.
[2] Eye of Horus: Open Source Eye Tracking Assistance. (n.d.) Available at: https://www.instructables.com/Eye-of-Horus-Open-Source-Eye-Tracking-Assistance/ (Accessed: 30 December 2024).
[3] Eye-LCOS-Tracker: Low Cost Open Source Eye Tracker. (n.d.) Available at: https://github.com/asterics/eye-lcos-tracker (Accessed: 30 December 2024).
[4] Jacob, R. J. K., and Karn, K. S. (2003) ‘Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises’, The Mind's Eye, Elsevier, pp. 573-605.
[5] Low Cost Open Source Eye Tracking. (n.d.) Available at: https://hackaday.io/project/153293-low-cost-open-source-eye-tracking (Accessed: 30 December 2024).
[6] Pantanowitz, A., Kim, K., Chewins, C., Tollman, I. N. K., and Rubin, D. M. (2020) ‘Addressing the Eye-Fixation Problem in Gaze Tracking for Human-Computer Interface Using the Vestibulo-ocular Reflex’, arXiv preprint arXiv:2009.02132.
[7] Precision Gaze Mouse. (n.d.) Available at: https://precisiongazemouse.org (Accessed: 30 December 2024).
[8] Eye Tracking Based Control System for Natural Human-Computer Interaction. (2017) Journal of Healthcare Engineering, 2017, Article ID 5748315.