Assistive Eye Blink Technology helps individuals with paralysis communicate by analyzing their eye patterns and detecting eye blinks using OpenCV and computer vision. The system performs tasks based on different blink patterns and includes an emergency alert. It requires no physical contact and is a cost-effective solution that gives users a sense of independence and helps them carry out their day-to-day tasks.
Introduction
Purpose
This system enables people with severe physical disabilities (e.g., ALS, spinal injuries) to communicate using eye blinks and eye movements, replacing traditional verbal or physical communication methods. It leverages facial landmark detection and real-time video processing to track eye blinks, enabling the user to send messages or interact with devices.
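As one concrete illustration of this approach, the sketch below uses OpenCV for frame capture and dlib's 68-point facial landmark predictor to compute the Eye Aspect Ratio (EAR) for both eyes each frame, counting a blink when the ratio stays below a threshold for a few consecutive frames. The threshold value, frame counts, and the pre-trained model file shape_predictor_68_face_landmarks.dat are assumptions for illustration, not parameters specified by this system.

# Minimal real-time blink counter: OpenCV capture + dlib 68-point landmarks + EAR threshold.
# The threshold and model file below are assumptions; tune per user and camera.
import cv2
import dlib
from scipy.spatial import distance as dist

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmarks around one eye, ordered as in the 68-point scheme
    a = dist.euclidean(eye[1], eye[5])   # first vertical distance
    b = dist.euclidean(eye[2], eye[4])   # second vertical distance
    c = dist.euclidean(eye[0], eye[3])   # horizontal distance
    return (a + b) / (2.0 * c)

EAR_THRESHOLD = 0.21   # assumed cut-off below which the eye is treated as closed
CONSEC_FRAMES = 2      # frames the eye must stay closed to count as a blink

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")
LEFT_EYE = list(range(42, 48))    # landmark indices of the left eye
RIGHT_EYE = list(range(36, 42))   # landmark indices of the right eye

cap = cv2.VideoCapture(0)
closed_frames, blink_count = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray, 0):
        shape = predictor(gray, face)
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        ear = (eye_aspect_ratio([pts[i] for i in LEFT_EYE]) +
               eye_aspect_ratio([pts[i] for i in RIGHT_EYE])) / 2.0
        if ear < EAR_THRESHOLD:
            closed_frames += 1
        else:
            if closed_frames >= CONSEC_FRAMES:
                blink_count += 1   # a completed blink was observed
            closed_frames = 0
    cv2.putText(frame, "Blinks: %d" % blink_count, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Blink detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

The same per-frame EAR signal can feed higher-level logic such as blink-pattern commands or the emergency alert described below.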
Key Objectives
Help paralyzed patients communicate and perform tasks independently.
Provide a customizable blink pattern interface (an illustrative mapping is sketched after this list).
Offer real-time, accurate, and cost-effective communication.
Use low-resource and easily accessible technologies.
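To make the customizable blink pattern objective concrete, the sketch below shows one possible mapping from blink sequences to commands: each blink is labelled short or long by its duration, and a small, user-editable table translates the resulting sequence into an action such as raising an emergency alert. The duration cut-off, pattern table, and command names are illustrative assumptions, not part of the system's specification.

# Illustrative blink-pattern decoder; all thresholds, patterns and command names are assumptions.
LONG_BLINK_SECONDS = 0.5   # blinks held at least this long count as "L" (long), otherwise "S" (short)

# Hypothetical user-customizable pattern table.
PATTERNS = {
    ("S", "S"): "YES",
    ("S", "L"): "NO",
    ("L", "S"): "CALL_CAREGIVER",
    ("L", "L", "L"): "EMERGENCY_ALERT",
}

def classify_blinks(durations):
    """Convert a list of blink durations (seconds) into a short/long sequence."""
    return tuple("L" if d >= LONG_BLINK_SECONDS else "S" for d in durations)

def decode_command(durations):
    """Return the command mapped to a blink sequence, or None if the pattern is unknown."""
    return PATTERNS.get(classify_blinks(durations))

if __name__ == "__main__":
    print(decode_command([0.2, 0.25]))       # two short blinks -> YES
    print(decode_command([0.8, 0.9, 0.7]))   # three long blinks -> EMERGENCY_ALERT

Because the table is plain data, users or caregivers can change blink patterns and their meanings without touching the detection code.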
Literature Review Highlights
Eye-LRCN: Combines CNN and RNN for blink pattern analysis, aiding in diagnosing Computer Vision Syndrome.
Blink-based Typing: Uses DLib and SVM for on-screen keyboard navigation via blinking; employs transfer learning.
RT-BENE Dataset: Offers real-time blink detection using CNNs for public and HCI settings.
EAR Method: Detects eye blinks using the Eye Aspect Ratio (EAR) computed from facial landmarks (the formula appears after this list); suitable for fatigue detection.
Lightweight CNN Models: Optimized for low-power devices like wearables, supporting mobile healthcare.
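For reference, the Eye Aspect Ratio mentioned above is computed, following [6], from the six landmarks p_1, ..., p_6 around each eye, where p_1 and p_4 are the horizontal eye corners and the remaining points lie on the upper and lower eyelids:

\mathrm{EAR} = \frac{\lVert p_2 - p_6 \rVert + \lVert p_3 - p_5 \rVert}{2\,\lVert p_1 - p_4 \rVert}

The ratio stays roughly constant while the eye is open and drops towards zero during a blink, so thresholding EAR over a few consecutive frames signals a blink.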
Limitations
Background movement can interfere with blink detection.
User fatigue may occur with extended use.
Reliance on a single input method means temporary eye problems can limit usability.
Conclusion
Assistive eye blink technology offers a solution for people with paralysis. The system allows them to connect and interact more effectively using eye blinks, providing a new and better way to communicate. Through eye-tracking techniques, users can express themselves more effectively through eye movements.
The user-friendly web interface makes the system easily accessible. It is more than technology; it gives people a voice and helps them be independent, confident, and connected.
References
[1] G. de la Cruz, M. Lira, O. Luaces, and B. Remeseiro, “Eye-LRCN: A Long-Term Recurrent Convolutional Network for Eye Blink Completeness Detection,” Neural Networks, 2024.
[2] P. B. Jain, S. Bhat, G. Pujari, V. Hiremath, and D. C., “Eye Typing - Vision Based Human Activity Control,” in Proceedings of IEEE International Conference on Signal Processing, Communication, and Engineering Systems, 2022.
[3] P. A. Shinde, S. N. Lanjwal, R. A. Kalantre, and S. P. Pawar, “Eye Blink Based Typing Software for Paralyzed Patient,” in Proceedings of International Conference on Recent Trends in Engineering and Technology, 2021.
[4] K. Cortacero, T. Fischer, and Y. Demiris, “RT-BENE: A Dataset and Baselines for Real-Time Blink Estimation in Natural Environments,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2019.
[5] T. Appel, T. Santini, and E. Kasneci, “Brightness- and Motion-Based Blink Detection for Head-Mounted Eye Trackers,” in Proceedings of ACM Symposium on Eye Tracking Research and Applications (ETRA), 2016.
[6] T. Soukupová and J. Cech, “Real-Time Eye Blink Detection Using Facial Landmarks,” in Proceedings of International Conference on Computer Vision Theory and Applications (VISAPP), 2016.
[7] P. Fogelton and W. Benesova, “Eye Blink Completeness Detection,” in Proceedings of International Conference on Image Processing Theory, Tools and Applications (IPTA), 2018.
[8] M. Jordan, A. Pegatoquet, A. Castagnetti, J. Raybaut, and P. L. Coz, “Deep Learning for Eye Blink Detection Implemented at the Edge,” in Proceedings of IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2020.