The mouse is one of the great inventions in computer technology. Nowadays, a Bluetooth or wireless mouse still has some limitations, as it requires a battery for power and a dongle to connect to a PC. These issues can be addressed by the proposed gesture-based AI virtual mouse, which captures hand motions and detects hand tips with a webcam or built-in camera, since gestures are a powerful means of communication between people. Based on hand gestures, the computer can be controlled almost entirely, performing right-clicking, left-clicking, scrolling, and cursor functions without a physical mouse. The proposed system can therefore help limit the spread of COVID-19 by removing the need for physical contact with shared devices when controlling the computer.
The use of computers has become an integral part of our daily lives, and human-computer interaction is becoming ever more convenient. While most people take these facilities for granted, people with disabilities face many difficulties in using these devices properly. This study presents a gesture-based AI virtual mouse system that uses computer vision to perform mouse operations on a computer using hand motions and hand tip detection. The suggested system's main goal is to use a web camera or a computer's built-in camera to execute mouse cursor and scroll functions instead of a standard mouse device.
Computer vision is used to identify hand gestures and hand tips as a means of human-computer interaction (HCI) with the computer. Using a built-in camera or a web camera, the gesture-based AI virtual mouse tracks the fingertips of a hand gesture to move the cursor and perform mouse cursor operations and scrolling.
The gesture-based AI virtual mouse framework is written in the Python programming language and makes use of OpenCV, a computer vision library. The proposed framework uses the MediaPipe package for tracking the hands and the tips of the hands, and the AutoPy package for moving the cursor around the PC's window screen and performing functions such as left-clicking, right-clicking, and scrolling. The proposed model's results indicated a high level of accuracy, and the model performs well in real-world applications using only a CPU rather than a GPU.
II. RELATED WORKS
There have been some analogous virtual mouse works that recognise hand gestures by having the user wear a glove, or by using coloured tips on the fingers, but these are less accurate for mouse functionality. With gloves, recognition is less exact, and gloves are unsuitable for some users; with coloured tips, recognition sometimes fails when colour detection fails. In other cases, the hand gesture interface has been detected using a camera.
Quam presented an early hardware-based system in 1990, which required the user to wear a DataGlove. Although Quam's proposed approach produces more accurate results, some gesture commands cannot be performed with it.
In 2013, Monika B. Gandhi, Sneha U. Dudhane, and Ashwini M. Patil proposed a study titled "Cursor Control System Using Hand Gesture Recognition". The constraint of this study is that stored frames must be processed for hand segmentation and skin pixel recognition.
In 2016, Vinay Kr. Pasi, Saurabh Singh, and Pooja Kumari proposed "Cursor Control Using Hand Gestures" in the IJCA Journal. The system proposes using different colour bands to perform distinct mouse functions. The drawback is that the mouse functions depend on the detection of distinct colours.
In 2018, Chaithanya C, Lisho Thomas, Naveen Wilson, and Abhilash SS proposed "Virtual Mouse Using Hand Gesture", which relies on colour detection for recognition. However, only a few mouse functions are supported.
III. ALGORITHM USED IN THE HAND TRACKING
The MediaPipe framework is utilised for hand motion detection and tracking, and the OpenCV library is used for computer vision. To track and recognise hand movements and hand tips, the programme employs machine learning ideas.
MediaPipe is an open-source framework from Google used for building machine-learning pipelines. Because it is built to handle time-series data, the MediaPipe framework is useful for cross-platform development. The MediaPipe framework is multimodal, which means it can be applied to a variety of audio and video streams.
OpenCV is a computer vision library that includes image-processing algorithms for object detection. Through its Python bindings, OpenCV can be used to create real-time computer vision applications. The OpenCV library is used to process images and videos and to perform analysis such as face and object detection.
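As context for the tip Ids used later, MediaPipe's hand model returns 21 landmarks per detected hand, with the five fingertips at indices 4, 8, 12, 16, and 20. The sketch below shows how a finger-up pattern can be derived from those landmarks in plain Python; the actual MediaPipe calls are indicated in comments, and the thumb heuristic is a common simplification assumed here, not taken from this paper.

```python
# MediaPipe's hand model outputs 21 (x, y) landmarks per detected hand;
# indices 4, 8, 12, 16 and 20 are the five fingertips in its numbering.
# With the real library the landmarks would come from:
#   hands = mp.solutions.hands.Hands()
#   results = hands.process(frame_rgb)
TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(landmarks):
    """Return [thumb, index, middle, ring, pinky] booleans from a list of
    21 (x, y) points, with y growing downwards as in image coordinates.

    A finger counts as up when its tip is above the joint two landmarks
    below it; the thumb is compared on the x axis instead (a common
    simplification, assuming a right hand facing the camera).
    """
    states = [landmarks[TIP_IDS[0]][0] > landmarks[TIP_IDS[0] - 1][0]]
    for tip in TIP_IDS[1:]:
        states.append(landmarks[tip][1] < landmarks[tip - 2][1])
    return states

# Synthetic demo: all joints at (100, 300), then raise the index finger.
pts = [(100, 300)] * 21
pts[8], pts[6] = (100, 100), (100, 200)
print(fingers_up(pts))  # [False, True, False, False, False]
```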
The flowchart of the real-time gesture-based AI virtual mouse system in Figure 3 explains the many functions and conditions used in the system.
A. Capturing and Processing the Video
The AI virtual mouse system uses a webcam to capture each frame until the program terminates. To detect the hands frame by frame, each video frame is converted from the BGR to the RGB colour space.
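The capture-and-convert step described above can be sketched as follows. The OpenCV calls (`cv2.VideoCapture`, `cv2.cvtColor` with `cv2.COLOR_BGR2RGB`) are shown in comments; the pure-Python helper illustrates the same per-pixel BGR-to-RGB channel swap on a nested-list "frame".

```python
# With OpenCV, the per-frame loop body is typically:
#   cap = cv2.VideoCapture(0)
#   success, frame = cap.read()
#   frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
# The helper below makes the same BGR -> RGB channel swap explicit.

def bgr_to_rgb(frame):
    """Swap the channel order of a frame given as nested [B, G, R] lists."""
    return [[list(reversed(pixel)) for pixel in row] for row in frame]

# A tiny 1x2 "frame": one pure-blue and one pure-red pixel in BGR order.
frame_bgr = [[[255, 0, 0], [0, 0, 255]]]
print(bgr_to_rgb(frame_bgr))  # [[[0, 0, 255], [255, 0, 0]]]
```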
B. Rectangular Region for Moving through the Window (Virtual Screen Matching)
The transformational algorithm is used by the AI virtual mouse system to transform the co-ordinates of the fingertip from the webcam screen to the full-screen computer window for controlling the mouse. When the hands are detected and we determine which finger is capable of executing the desired mouse action, a rectangular box is generated in the webcam region in relation to the computer window, where we move the mouse cursor around the window, as shown in Figure 5.
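The rectangle-to-screen transformation described above amounts to a linear mapping from the box drawn in the webcam frame onto the display. The sketch below assumes illustrative dimensions (a 640×480 webcam frame with a 100 px margin, mapped to a 1920×1080 display); these values are not from the paper.

```python
def map_to_screen(x, y, rect, screen_w, screen_h):
    """Linearly map a fingertip point inside the webcam rectangle
    rect = (x1, y1, x2, y2) onto the full computer screen."""
    x1, y1, x2, y2 = rect
    sx = (x - x1) / (x2 - x1) * screen_w
    sy = (y - y1) / (y2 - y1) * screen_h
    # Clamp so the cursor never leaves the screen.
    return min(max(sx, 0), screen_w), min(max(sy, 0), screen_h)

# Illustrative values: a 640x480 frame with a 100 px margin on each side,
# mapped onto a 1920x1080 display. The cursor would then be moved with
# autopy.mouse.move(sx, sy).
rect = (100, 100, 540, 380)
print(map_to_screen(320, 240, rect, 1920, 1080))  # (960.0, 540.0)
```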
C. Functions of the Mouse Using Computer Vision to Detect Hand Gestures and Hand Tip Detection
1. For the Mouse Pointer that Moves in the Computer Window: If the index finger is up with tip Id = 1, the mouse cursor is made to move around the window of the computer using the AutoPy package of Python, as shown in Figure 6.
2. For the Mouse to Perform Right Button Click: The computer is configured to perform a right mouse button click using the AutoPy Python module if both the index finger with tip Id = 1 and the middle finger with tip Id = 2 are up and the distance between the two fingers is less than 40 px, as shown in Figure 7.
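The two gesture conditions above can be sketched as a small classifier; the AutoPy calls that would be triggered are shown in comments. The 40 px threshold follows the text, while the fingers-up list and tip coordinates are assumed inputs for illustration, not the paper's exact implementation.

```python
import math

def fingers_distance(p1, p2):
    """Euclidean distance in pixels between two fingertip points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def classify_gesture(fingers, index_tip, middle_tip, click_threshold=40):
    """Map a finger-up pattern to a mouse action as described in the text.

    fingers is a list of booleans [thumb, index, middle, ring, pinky].
    The 40 px click threshold follows the paper; other values are
    illustrative assumptions.
    """
    _, index_up, middle_up, _, _ = fingers
    if index_up and not middle_up:
        return "move"          # autopy.mouse.move(x, y)
    if (index_up and middle_up
            and fingers_distance(index_tip, middle_tip) < click_threshold):
        return "right_click"   # autopy.mouse.click(autopy.mouse.Button.RIGHT)
    return "none"

print(classify_gesture([False, True, False, False, False], (320, 200), (340, 210)))  # move
print(classify_gesture([False, True, True, False, False], (320, 200), (340, 210)))   # right_click
```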
V. FUTURE SCOPE
The proposed gesture-based AI virtual mouse has various flaws, such as a slight loss of accuracy when using the right click mouse function, and the model has some difficulty selecting text by clicking and dragging. These are some of the drawbacks of the proposed gesture-based AI virtual mouse system, which will be addressed in future research.
Furthermore, the proposed system can be extended to handle virtual keyboard as well as mouse functionality, which is another prospective use of Human-Computer Interaction (HCI).
The gesture-based AI virtual mouse system can be used for a variety of purposes, including reducing the space required for a physical mouse and enabling use in situations where a physical mouse is unavailable. The technology reduces the need for gadgets while also improving human-computer interaction.
Without the need of gadgets, the system can be utilized to control robots and automation systems.
Using hand motions, the AI virtual system can sketch 2D and 3D pictures.
Without the usage of wireless or cable mouse devices, AI virtual mouse can be utilized to play virtual reality and augmented reality games.
Because touching shared equipment during the COVID-19 outbreak risks spreading the virus, the proposed AI virtual mouse can be used to control PC mouse functions without using the real mouse.
This technology can be used by those who have difficulty with their hands to handle computer mouse functions.
As an HCI technique, the proposed system can be utilized to control robots in the field of robotics.
The proposed approach can be used in design and architecture to develop prototypes virtually.
The fundamental purpose of the gesture-based AI virtual mouse system is to control mouse cursor functions with hand gestures instead of using a hardware mouse. The proposed system can be implemented utilising a webcam or a built-in camera that detects and interprets hand motions and hand tips in order to perform certain mouse actions. Because the proposed mouse system may be utilised virtually using hand gestures rather than the standard physical mouse, it can be used for real-world applications as well as to decrease the spread of COVID-19.
REFERENCES
Katona, Jozsef. 2021. "A Review of Human–Computer Interaction and Virtual Reality Research Fields in Cognitive InfoCommunications," Applied Sciences 11, no. 6: 2646. https://doi.org/10.3390/app11062646
C. Hsieh, D. Liou and D. Lee, "A real time hand gesture recognition system using motion history image," 2010 2nd International Conference on Signal Processing Systems, 2010, pp. V2-394-V2-398, doi: 10.1109/ICSPS.2010.5555462.
D. L. Quam, "Gesture recognition with a DataGlove," IEEE Conference on Aerospace and Electronics, 1990, pp. 755-760 vol.2, doi: 10.1109/NAECON.1990.112862.
P. Nandhini, J. Jaya and J. George, "Computer vision system for food quality evaluation — A review," 2013 International Conference on Current Trends in Engineering and Technology (ICCTET), 2013, pp. 85-87, doi: 10.1109/ICCTET.2013.6675916.
Google AI Blog, MediaPipe.
Tutorials Point, OpenCV.
Tran, DS., Ho, NH., Yang, HJ. et al. "Real-time virtual mouse system using RGB-D images and fingertip detection," Multimedia Tools and Applications 80, 10473–10490 (2021). https://doi.org/10.1007/s11042-020-10156-5
K. H. Shibly, S. Kumar Dey, M. A. Islam and S. Iftekhar Showrav, "Design and Development of Hand Gesture Based Virtual Mouse," 2019 1st International Conference on Advances in Science, Engineering and Robotics Technology (ICASERT), 2019, pp. 1-5, doi: 10.1109/ICASERT.2019.8934612.
J. Dulayatrakul, P. Prasertsakul, T. Kondo and I. Nilkhamhang, "Robust implementation of hand gesture recognition for remote human-machine interaction," 2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE), 2015, pp. 247-252, doi: 10.1109/ICITEED.2015.7408950.
Intellias Blog.