This research introduces a method of human-computer interaction that controls a virtual keyboard through eye-blink detection. The system operates in real time and employs computer vision algorithms, supported by open-source libraries such as OpenCV, to accurately detect and interpret voluntary blinks. It is specifically intended for people with significant motor disabilities who are unable to use conventional input devices such as a mouse or keyboard. By offering an alternative mode of communication and control, this interface serves as a valuable tool in the field of assistive technology. The paper outlines the overall system architecture, implementation strategy, performance evaluation, and the potential impact of this solution on accessibility and user independence.
Introduction
Background & Motivation
Traditional input methods (touchscreens, keyboards) are inaccessible to individuals with severe physical disabilities.
There is a growing need for non-invasive, low-cost, hands-free digital interfaces for communication and digital access.
Proposed Solution
A vision-based virtual keyboard controlled by eye movements and intentional blinks:
Utilizes OpenCV, dlib, and a standard webcam to detect facial landmarks and monitor eye blinks.
Users navigate with gaze and select characters via blinks, enabling hands-free typing.
Literature Review Highlights
Early works used blinks as control signals in HCI (e.g., Królak, Kotani).
EEG-based systems are accurate but expensive and complex.
Recent webcam-based systems are more practical but suffer from false detections and user fatigue.
This project addresses these with adaptive thresholding, real-time detection, and user-focused design.
Research Gaps Identified
EEG systems: Accurate but costly and non-scalable.
Webcam systems: Affected by lighting, lack personalization, and are prone to false detections.
Lack of adaptability: Few systems adjust to user fatigue, blink strength, or individual behavior.
Research Objectives
Build a low-cost, vision-based blink detection system.
Integrate it into a responsive virtual keyboard.
Use adaptive blink thresholding for higher accuracy.
Methodology
Eye Aspect Ratio (EAR) calculation using facial landmarks to identify blinks.
Adaptive thresholding to distinguish involuntary from intentional blinks.
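The two steps above can be sketched as follows. The EAR is the standard ratio of vertical to horizontal landmark distances, EAR = (‖p2−p6‖ + ‖p3−p5‖) / (2‖p1−p4‖), over the six per-eye points of dlib's 68-point model; the calibration fraction used for the adaptive threshold is an illustrative assumption, not a value reported in the paper.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR for six (x, y) eye landmarks ordered p1..p6 as in dlib's
    68-point model: corners at p1/p4, upper lid p2/p3, lower lid p6/p5."""
    p = np.asarray(eye, dtype=float)
    vert1 = np.linalg.norm(p[1] - p[5])   # ||p2 - p6||
    vert2 = np.linalg.norm(p[2] - p[4])   # ||p3 - p5||
    horiz = np.linalg.norm(p[0] - p[3])   # ||p1 - p4||
    return (vert1 + vert2) / (2.0 * horiz)

def adaptive_threshold(open_eye_ears, fraction=0.75):
    """Derive a per-user blink threshold from a short calibration window
    of open-eye EAR samples; `fraction` is an assumed tuning constant."""
    return float(np.mean(open_eye_ears)) * fraction
```

A frame is then flagged as "eye closed" whenever its EAR falls below the calibrated threshold; calibrating per user is what lets the system adapt to individual blink strength.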
Participants:
10 users (3 with motor impairments) performed typing tasks (e.g., names, phrases).
Standard indoor lighting, minimal setup, and real-world simulation.
Evaluation Metrics:
Blink Detection Accuracy
Typing Speed (Characters Per Minute)
Error Rate
User Satisfaction (via Likert scale)
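The speed and error metrics above can be computed per typing task as sketched below; the character-level mismatch count is a simple stand-in, since the paper does not specify its exact error-rate definition.

```python
def typing_metrics(typed, target, seconds):
    """Return (characters per minute, error rate) for one typing task.
    Error rate is a plain per-position character mismatch count against
    the target phrase, an assumed simplification."""
    cpm = len(typed) * 60.0 / seconds
    n = max(len(typed), len(target))
    errors = sum(
        1 for i in range(n)
        if (typed[i] if i < len(typed) else None)
        != (target[i] if i < len(target) else None)
    )
    return cpm, (errors / n if n else 0.0)
```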
Results & Analysis
Blink Detection Accuracy: 92%
Typing Speed: Up to 28 CPM
Error Rate: Low, with reduced false positives due to adaptive thresholding
User Feedback: Generally positive, though some fatigue reported
The system works reliably on consumer-grade hardware, making it suitable for assistive use in education and healthcare.
System Workflow (Algorithm Summary)
Capture live video from webcam.
Detect face and extract eye landmarks using dlib.
Compute EAR to detect blinks.
Classify blink (intentional or natural) using threshold logic.
Navigate and select keys on virtual keyboard via gaze and blink.
Display output and update interface for the next input.
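The classification step of this workflow can be sketched in isolation: the capture and landmark stages (OpenCV, dlib) are replaced by a precomputed per-frame EAR series, and a blink counts as intentional when the eye stays closed for a sustained run of frames. The threshold and frame counts are illustrative assumptions, not the paper's tuned values.

```python
def classify_blinks(ear_series, threshold=0.21, min_frames=3, max_frames=12):
    """Scan a per-frame EAR series and return the index of the last
    closed frame of each *intentional* blink: EAR < threshold for
    min_frames..max_frames consecutive frames.  Shorter runs are treated
    as natural blinks or noise; longer runs as eye closure or fatigue."""
    events, run = [], 0
    for i, ear in enumerate(ear_series):
        if ear < threshold:
            run += 1
        else:
            if min_frames <= run <= max_frames:
                events.append(i - 1)
            run = 0
    if min_frames <= run <= max_frames:      # blink still open at end of series
        events.append(len(ear_series) - 1)
    return events
```

In the live system, each detected event would trigger selection of the currently highlighted key, after which the interface updates for the next input.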
Conclusion
The outcomes of this study confirm the practical viability of eye-blink detection as an effective input method for virtual keyboards. The real-time system achieved high accuracy and was positively received by users, particularly those with motor impairments. With technology becoming more embedded in everyday activities, ensuring digital accessibility for all individuals is essential, and this system marks a meaningful advancement toward more inclusive human-computer interaction. Future enhancements may include refining blink-detection algorithms to better differentiate intentional from natural blinks, integrating cloud-based user profiles for personalized settings, and extending compatibility to mobile platforms to broaden usability.
References
[1] S. Geetharani and S. Keerthna, “Eyeball Based Cursor Control for Paralyzed Individuals Using Eye Blink Detection,” Int. J. Eng. Res. Technol. (IJERT), vol. 14, no. 3, 2025. [Online]. Available: https://ijert.org/eyeball-based-cursor-control-for-paralyzed-individuals-using-eye-blink-detection
[2] S. T. Ahmad, K. M. K. Hasan, A. Chowdhury, M. H. Chowdhury, and Q. D. Hossain, “EEG-Driven Eye-Blink Controlled Smart Interface for Physically Challenged,” Universal Access in the Information Society, Springer, 2024. [Online]. Available: https://link.springer.com/article/10.1007/s10209-024-01182-3
[3] M. Liu, S. Bian, Z. Zhao, B. Zhou, and P. Lukowicz, “Energy-Efficient, Low-Latency, and Non-Contact Eye Blink Detection with Capacitive Sensing,” Frontiers in Computer Science, 2024. [Online]. Available: https://www.frontiersin.org/articles/10.3389/fcomp.2024.1394397/full
[4] N. Schärer, F. Villani, A. Melatur, S. Peter, T. Polonelli, and M. Magno, “ElectraSight: Smart Glasses with Fully Onboard Non-Invasive Eye Tracking Using Hybrid Contact and Contactless EOG,” arXiv, Dec. 2024. [Online]. Available: https://arxiv.org/abs/2412.14848
[5] G. R. Chhimpa, A. Kumar, S. Garhwal et al., “Development of a Real-Time Eye-Movement-Based Computer Interface for Communication with Improved Accuracy for Disabled People Under Natural Head Movements,” J. Real-Time Image Process., Springer, 2023. [Online]. Available: https://link.springer.com/content/pdf/10.1007/s11554-023-01336-1.pdf
[6] C. Porter and G. Zammit, “Blink, Pull, Nudge or Tap? The Impact of Secondary Input Modalities on Eye Typing Performance,” in Human-Computer Interaction – INTERACT 2023, LNCS, Springer, pp. 1–12. [Online]. Available: https://link.springer.com/chapter/10.1007/978-3-031-48038-6_15
[7] H. Singh, A. R. Borah, N. H. Suthar et al., “Eye Blink—A Mode of Communication,” in ICDAI 2023, Lecture Notes in Networks and Systems, Springer, pp. 113–124, 2023. [Online]. Available: https://link.springer.com/chapter/10.1007/978-981-99-3878-0_10
[8] W. Zeng, Y. Xiao, S. Wei, J. Gan, X. Zhang, and Z. Cao, “Real-Time Multi-Person Eyeblink Detection in the Wild for Untrimmed Video,” in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), 2023.
[9] R. Daza, D. DeAlcala, A. Morales, R. Tolosana, R. Cobos, and J. Fierrez, “ALEBk: Feasibility Study of Attention Level Estimation via Blink Detection Applied to e-Learning,” arXiv preprint, 2023. [Online]. Available: https://arxiv.org/abs/2112.09165
[10] X. Meng, W. Xu, and H.-N. Liang, “An Exploration of Hands-Free Text Selection for VR HMDs Using Gaze and Blink,” arXiv preprint, 2022. [Online]. Available: https://arxiv.org/abs/2209.06825
[11] M. Liu, S. Bian, and P. Lukowicz, “Non-Contact, Real-Time Eye Blink Detection with Capacitive Sensing,” Frontiers in Computer Science, 2022. [Online]. Available: https://arxiv.org/abs/2211.05541
[12] M. P. Roy and D. Chanda, “A Robust Webcam-Based Eye Gaze Estimation System,” in Proc. 2022 Int. Conf. Innovations in Sci., Eng. Technol. (ICISET), IEEE, 2022. [Online]. Available: https://doi.org/10.1109/ICISET54810.2022.9775896