Authors: Mahender K, Mahesh Chandra A, Malika. P, Mallesh E, Malleswari Neelam, Mallikarjun O, Ragipati Karthik
In an era dominated by digital interactions, human-computer engagement holds pivotal significance in daily life. This project addresses the accessibility challenges faced by individuals, particularly those with physical disabilities, by introducing an innovative eye-tracking-based pointer control system. Computers are widely used today, yet they remain a hindrance for people with physical disabilities, who often cannot use a computer independently and must rely on help from others. The proposed project ensures that even people with physical disabilities can operate a computer independently and with ease. The system is not limited to people with disabilities: it benefits any user who wants to operate a system efficiently and quickly, and it has been observed that people who use a computer for long hours experience discomfort in their wrists. The system works by using a webcam to capture live images of the user. A particular region of interest, the eyes in this case, is isolated, and image-processing techniques are applied to improve eye tracking. The cursor is controlled through functions in a Python library: the user's face and eyes are captured by the webcam, and the eye movements drive the cursor.
II. LITERATURE REVIEW
Machine Learning (ML) plays a pivotal role in the development of eye-tracking-based pointer control systems, aiming to enhance human-computer interaction, particularly for individuals with physical disabilities. This literature review delves into existing research, emphasizing strengths, limitations, and identified gaps in the application of ML in this context.
A. Strengths of Existing Approach
B. Limitations in Current Approaches
III. PROBLEM STATEMENT
Developing an eye-tracking system to control a pointer through machine learning aims to enhance user-interface interaction by accurately predicting and responding to eye movements.
The goal is to create an eye-tracking-based pointer control system using machine learning, aiming to provide a means for users to control a computer's cursor solely through their eye movements. The project seeks to overcome the challenges posed by conventional input methods, making computing more inclusive and intuitive for a diverse range of users.
A. Model and Algorithm Used
Support Vector Machines (SVMs): SVMs are a powerful machine learning algorithm that can be used for classification and regression tasks. They have been shown to be effective for eye-tracking applications such as gaze estimation and pointing prediction.
Support Vector Machines are applied in eye-tracking ML projects by training on eye-movement data. Features such as gaze coordinates are extracted, and the SVM learns to map them to pointer movements. The collected data is labeled for training, and the SVM's accuracy is validated. Once integrated, the SVM controls the pointer in real time based on eye gaze, offering precise human-computer interaction. Tuning its parameters enhances performance, making SVMs a robust choice for eye-tracking systems that control pointers.
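The training-and-prediction flow described above can be sketched as follows. This is an illustrative example, not the authors' implementation: the calibration data is synthetic, and scikit-learn's SVR is assumed as the SVM regressor for mapping gaze features to screen coordinates.

```python
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
from sklearn.svm import SVR

# Synthetic calibration set: normalized pupil positions -> normalized gaze
# targets (identity plus noise stands in for a real calibration session).
rng = np.random.default_rng(0)
gaze_features = rng.uniform(0.0, 1.0, size=(200, 2))
targets = gaze_features + rng.normal(0.0, 0.01, size=(200, 2))

# One SVR (RBF kernel) per output dimension: screen x and screen y.
model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(gaze_features, targets)

# Map a new gaze sample to pixel coordinates on a 1920x1080 screen.
x, y = model.predict([[0.5, 0.5]])[0] * np.array([1920, 1080])
```

In a deployed system, the synthetic data would be replaced by labeled samples from a calibration routine (the user fixates on known screen points), and the `C` and `epsilon` parameters would be tuned as the text suggests.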
B. Data Preprocessing Techniques
V. EXPERIMENTAL RESULTS
1) Accuracy: The percentage of time that the model predicts the user's intended cursor movement correctly.
2) Precision: The percentage of predicted cursor movements that are within a certain distance of the user's intended cursor movement.
3) Recall: The percentage of the user's intended cursor movements for which the model's prediction falls within a certain distance of the intended movement.
4) F1 score: The harmonic mean of precision and recall.
These metrics can be used to assess the overall performance of the model in predicting cursor movements. For example, if the model has an accuracy of 90%, precision of 80%, recall of 70%, and F1 score of 75%, then it is predicting the user's intended cursor movements correctly 90% of the time, and 80% of the predicted cursor movements are within a certain distance of the user's intended cursor movement.
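The metrics above combine in a standard way. As a quick sketch (the counts below are made up purely to demonstrate the formulas, not measured results from this system):

```python
# Illustrative counts for a within-threshold cursor-prediction task.
tp = 70   # predictions within the distance threshold of an intended movement
fp = 20   # predictions outside the threshold
fn = 30   # intended movements the model failed to predict within threshold

precision = tp / (tp + fp)                          # 70 / 90
recall = tp / (tp + fn)                             # 70 / 100
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

print(round(precision, 3), round(recall, 3), round(f1, 3))  # → 0.778 0.7 0.737
```

Note that the F1 score is pinned between precision and recall, closer to the smaller of the two, which is why the worked example in the text (precision 80%, recall 70%) yields an F1 of roughly 75%.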
The specific model evaluation metrics that are most important for this project will depend on the specific goals of the system. For example, if the goal is to develop a system that is highly accurate, then the accuracy metric would be most important. If the goal is to develop a system that is fast and responsive, then the precision and recall metrics would be more important. The developers of the eye tracking pointer control system should choose the model evaluation metrics that are most relevant to their specific goals and use them to track the performance of the model over time.
VI. FUTURE WORK
Improve the accuracy and reliability of eye tracking, develop more intuitive interactions, and make eye-tracking systems affordable and easy to use.
Traditional input devices like mice and touchpads may not be accessible to individuals with physical disabilities, making it difficult to interact with computers. This project aims to address this issue by creating an eye-tracking-based pointer control system using machine learning techniques. This system would allow users to control a computer's cursor or pointer solely through the movement of their eyes. The project leverages state-of-the-art machine learning algorithms to accurately track and interpret the user's eye movements, enabling them to interact with computers more effectively. The primary objective of this project is to design, develop, and implement an eye-tracking-based pointer control system that enables users, especially those with physical disabilities, to manipulate a computer's cursor solely through the movement of their eyes. The specific goals include leveraging state-of-the-art machine learning algorithms to accurately track and interpret eye movements.
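The pipeline described above (webcam frame, eye localization, cursor update) can be sketched in Python. The mapping function below is illustrative, not the authors' exact implementation; the Haar-cascade eye detector and the PyAutoGUI cursor call in the commented loop are assumed choices consistent with the webcam-plus-Python-library setup the paper describes.

```python
def eye_to_cursor(eye_x, eye_y, frame_w, frame_h,
                  screen_w=1920, screen_h=1080):
    """Map an eye center in camera coordinates to screen coordinates."""
    # Mirror horizontally so moving the eye right moves the cursor right
    # (the webcam sees the user's motion reversed).
    sx = (1.0 - eye_x / frame_w) * screen_w
    sy = (eye_y / frame_h) * screen_h
    # Clamp so the cursor never leaves the display.
    return (min(max(sx, 0), screen_w - 1), min(max(sy, 0), screen_h - 1))

# Real-time loop (requires a webcam; shown as a sketch only):
# import cv2, pyautogui
# eyes = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
# cap = cv2.VideoCapture(0)
# while True:
#     ok, frame = cap.read()
#     if not ok:
#         break
#     gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
#     for (ex, ey, ew, eh) in eyes.detectMultiScale(gray):
#         x, y = eye_to_cursor(ex + ew / 2, ey + eh / 2,
#                              frame.shape[1], frame.shape[0])
#         pyautogui.moveTo(x, y)
```

In practice, the raw eye coordinates would be smoothed (for example with a moving average) before moving the cursor, since frame-to-frame detection jitter otherwise makes the pointer shake.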
Copyright © 2023 Mahender K, Mahesh Chandra A, Malika. A, Mallesh E, Malleswari Neelam, Mallikarjun O, Ragipati Karthik. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.