Abstract
The process of human-computer interaction has changed as computer technology has advanced. The mouse is a great tool for human-computer interaction. This study offers a way to move the pointer without using any physical device, while other hand movements can be applied to operations such as dragging and clicking objects. The suggested system needs only a camera as an input device, used together with OpenCV and Python. The output from the camera is shown on a connected display so that the user can adjust it further. The Python tools used to build this system are NumPy and MediaPipe, a very effective framework that offers quick solutions for AI tasks. The system allows users to move the computer cursor with hand movements, optionally while holding colored markers. Certain finger movements can be used to carry out actions such as dragging and left-clicking. This research presents a hand gesture detection system for controlling a virtual mouse, enabling more natural human-computer interaction.
Introduction
Overview
The project focuses on enhancing Human-Computer Interaction (HCI) by replacing traditional input devices like the mouse and keyboard with hand gesture-based controls. Using only a standard webcam, the system can:
Track hand and finger movements
Count fingers
Recognize gestures
Control volume
Simulate mouse operations like click, drag, and move
Eventually emulate a virtual keyboard
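The volume-control capability listed above is typically driven by the distance between the thumb tip and the index fingertip. A minimal sketch of that mapping follows; the working distance range (0.03–0.25 in normalized coordinates) and the 0–100 scale are illustrative assumptions, not values from the paper:

```python
import math

import numpy as np

# Map the thumb-index fingertip distance (in normalized image
# coordinates) to a 0-100 volume level. np.interp clamps inputs
# outside the assumed working range.
def gesture_volume(thumb, index):
    d = math.hypot(index[0] - thumb[0], index[1] - thumb[1])
    return float(np.interp(d, (0.03, 0.25), (0, 100)))
```

In the full system, the returned level would be passed to an OS audio API on each frame.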
Technology Used
Languages & Tools: Python, OpenCV, NumPy, MediaPipe, and other Python libraries.
Input: Webcam (no extra hardware needed).
Algorithms: Hand tracking using 21 landmarks, gesture recognition, finger counting.
Modules:
Hand Tracking: Detects and tracks the hand using palm detection and landmark recognition.
Finger Counter: Determines the number of extended fingers using spatial analysis of landmarks.
Gesture Volume Control: Adjusts audio levels by measuring distances between fingers.
Virtual Mouse: Moves the cursor and performs mouse operations based on index finger gestures.
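The Finger Counter module's spatial test can be sketched as below, assuming MediaPipe's 21-landmark hand model (fingertips at indices 4, 8, 12, 16, 20; PIP joints two indices lower) and (x, y) coordinates normalized to the frame, with y increasing downward. The right-hand thumb rule is an illustrative simplification:

```python
# Fingertip indices in MediaPipe's 21-landmark hand model:
# thumb, index, middle, ring, pinky.
TIP_IDS = [4, 8, 12, 16, 20]

def count_fingers(lm, right_hand=True):
    """Count extended fingers from 21 (x, y) landmark pairs."""
    fingers = 0
    # Thumb: compare x of the tip (4) with the joint below it (3);
    # the direction of the test depends on which hand is shown.
    if right_hand:
        fingers += lm[4][0] < lm[3][0]
    else:
        fingers += lm[4][0] > lm[3][0]
    # Other four fingers: extended when the tip is above
    # (has a smaller y than) its PIP joint.
    for tip in TIP_IDS[1:]:
        fingers += lm[tip][1] < lm[tip - 2][1]
    return fingers
```

In the live system, the landmark list would come from MediaPipe's hand-tracking output on each webcam frame.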
Proposed System
Uses skin color detection and gesture recognition to control the cursor.
Recognizes specific gestures like showing only the index finger to move the cursor.
Aims to be modular, adaptable, and cost-effective, with gesture commands that are intuitive and require minimal movement.
Designed for real-time interaction, compatible with various environments and lighting conditions.
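The cursor movement described above boils down to mapping an index-fingertip position in the camera frame to screen coordinates. A hedged sketch, in which the screen size, frame margin, and smoothing factor are assumed values:

```python
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080   # assumed target resolution
MARGIN = 0.1                      # ignore the outer 10% of the frame
SMOOTH = 0.3                      # 0 = frozen cursor, 1 = no smoothing

def map_to_screen(x, y, prev_x, prev_y):
    """Map a normalized fingertip position (0..1) to screen coordinates,
    with exponential smoothing against the previous cursor position."""
    # np.interp clamps inputs outside the margin band, so the user
    # never has to reach the very edge of the camera frame.
    sx = np.interp(x, (MARGIN, 1 - MARGIN), (0, SCREEN_W))
    sy = np.interp(y, (MARGIN, 1 - MARGIN), (0, SCREEN_H))
    return (prev_x + SMOOTH * (sx - prev_x),
            prev_y + SMOOTH * (sy - prev_y))
```

A library such as pyautogui or autopy would then move the OS cursor to the returned position on each frame.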
Key Advantages
No physical mouse required — purely vision-based.
Real-time interaction through webcam feed.
Portable and low-cost alternative to hardware-based devices.
Applications extend to smart TVs, sign language recognition, assistive technologies, and more.
Performance Metrics
Evaluated by accuracy, frames per second (FPS), response time (ms), and gesture precision.
Systems with ~95% accuracy are considered high-performing.
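Of the metrics above, FPS is usually estimated per frame as the reciprocal of the inter-frame interval; a small sketch (the function name is illustrative):

```python
import time

def fps(prev_time, now=None):
    """Return (fps_estimate, timestamp) given the previous frame's
    timestamp; pass `now` explicitly for testing."""
    now = time.time() if now is None else now
    return 1.0 / (now - prev_time), now
```

In the capture loop, the returned timestamp is carried into the next call, and the estimate is typically overlaid on the frame with cv2.putText.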
Challenges
Ensuring real-time responsiveness and low latency.
Handling varied lighting conditions and skin tones.
Mapping gestures consistently across users.
Maintaining ergonomic design to reduce fatigue.
Cross-platform compatibility for broader adoption.
Conclusion
This system uses real-time camera input together with hand tracking and gesture recognition to carry out mouse operations, including gesture-based volume control and right-click. It is designed to perform all common mouse functions using computer vision methods. However, the variety of human skin tones and fluctuating lighting conditions can make consistent performance difficult to achieve, since the accuracy of many vision algorithms is affected by illumination irregularities. The findings imply that making these algorithms more dependable in any setting will greatly increase the system's effectiveness. By eliminating the need for hand-held devices, this approach also saves space and is especially helpful for presentations.
Since the proposed model has greater accuracy, the AI virtual mouse can be used for real-world applications; it can also help reduce the spread of COVID-19, since the proposed mouse system operates virtually through hand gestures without a traditional physical mouse. The model has some limitations, such as a small decrease in accuracy in the right-click mouse function and some difficulty in clicking and dragging to select text. Hence, our next step is to overcome these limitations by improving the fingertip detection algorithm to produce more accurate results.