This paper proposes a way to control the cursor position with the bare hands, without any additional electronic device, while operations such as clicking and dragging of objects are performed with different hand gestures. Only authorized persons can access the system, which is enforced through facial recognition. The face recognition module tracks target faces in live video images taken with a video camera. The application is based on face detection, feature extraction and recognition algorithms, which automatically detect and identify the human face of the person in front of the camera; if the person is authorized, he or she can access the computer. We use a transformational algorithm and the K-cosine algorithm for face detection and finger detection.
Hand gestures have been used for communication in human society for generations, and gestures are arguably the most natural way of interacting with anyone, so it is reasonable to apply them to the machines we use. In this work, we demonstrate real-time gesture recognition. The complete process is divided into four steps: frame capturing, image processing, region extraction and feature matching. It focuses on extracting features of the human hand and then matching those features to recognize the movement of the hand.
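The four steps above can be sketched as a minimal processing skeleton; the function bodies here are illustrative stand-ins (assumptions), not the actual image processing used in this work:

```python
import numpy as np

# Illustrative four-stage pipeline; each stage is a placeholder standing in
# for the real frame-capturing / image-processing / region-extraction /
# feature-matching steps described in the text.

def capture_frame():
    # Stand-in for a web-cam frame: a small synthetic 8-bit RGB image.
    return np.zeros((4, 4, 3), dtype=np.uint8)

def process_image(frame):
    # e.g. denoising / colour-space handling; here just a grayscale average.
    return frame.mean(axis=2)

def extract_region(image):
    # e.g. skin-color segmentation; here a simple intensity threshold.
    return image > 127

def match_features(region):
    # e.g. comparing hand features between frames; here the mask's pixel count.
    return int(region.sum())

result = match_features(extract_region(process_image(capture_frame())))
print(result)  # 0 for the all-black synthetic frame
```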
The face recognition system focuses especially on frontal human faces: once the face is recognized, the person can access the device. Face recognition consists of several steps, such as face tracking, face detection and classification with Haar cascade classifier features.
A. Aim and Objectives of the Research Work
For most laptops, the touchpad is not the most comfortable or convenient input device.
Reduce the cost of hardware.
B. Essential Project Features
Handle simple operations.
II. EXISTING SYSTEM
The existing system consists of a mouse, either wired or wireless, to control the cursor; now hand gestures can be used to operate the system instead. The existing virtual mouse control system performs simple mouse operations using colored fingertips as detection objects, captured by a web-cam: the web-cam senses colors such as red, green and blue to operate the system. In contrast, the proposed approach can perform basic mouse operations such as minimize, drag, scroll up, scroll down, left-click and right-click using bare-hand gestures without any colored finger, because a skin-color recognition system is more flexible than the existing one.
The existing system uses static hand recognition, such as fingertip identification, hand shape and number of fingers, to define each action explicitly, which makes the system more complex to understand and difficult to use.
III. PROPOSED SYSTEM
The system works by identifying the color of the hand and deciding the position of the cursor accordingly, but several conditions and scenarios make it difficult for the algorithm to run in a real environment, as shown in Fig. 1:
Noise in the environment.
Lighting conditions in the environment.
Different skin textures.
Background objects of the same color as skin.
The proposed system works for skin of any tone and performs accurately in any lighting condition. For clicking, the user only needs to form an angle of less than 15 degrees between two fingers, unlike algorithms that require colored tapes to control the mouse. This work can be a pioneer in its field and a source of further research; the project can be developed at zero cost and easily integrated with existing systems.
The system also includes face recognition, which increases security so that only an authorized person can operate the system after his face is classified.
IV. APPLICATION OF PROPOSED WORK
This work can easily replace the traditional mouse system that has been in existence for decades. With this algorithm, the user can control the mouse without the fuss of any other hardware device; this is done using hand-gesture recognition with input from a web-cam.
The following steps are included in developing the algorithm:
The first step is to capture the image using the camera.
The system then detects and recognizes the face.
Then the position of the human hand is stored in the system using the regular coordinate system.
When the second frame is captured, the position of the hand in that frame is also stored in the system.
The positions of the hand in the two frames are then compared, and the cursor moves accordingly.
Now, for clicking, the angle between two fingers of the hand is measured; if the angle is less than 15 degrees, the system responds with a left-click. In this way, the complete working of the mouse can be achieved with bare hands.
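The click criterion described above can be expressed as a small geometric check. This is a sketch, not the paper's actual code: the coordinates below are made-up examples, and only the 15-degree threshold comes from the text.

```python
import math

def finger_angle(center, tip_a, tip_b):
    """Angle (degrees) at the hand's center between two fingertip directions."""
    ax, ay = tip_a[0] - center[0], tip_a[1] - center[1]
    bx, by = tip_b[0] - center[0], tip_b[1] - center[1]
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    return math.degrees(math.acos(dot / (na * nb)))

def is_left_click(center, tip_a, tip_b, threshold_deg=15.0):
    # Per the text: an angle under 15 degrees is treated as a left-click.
    return finger_angle(center, tip_a, tip_b) < threshold_deg

# Nearly parallel fingertips -> click; widely spread fingertips -> no click.
print(is_left_click((0, 0), (100, 10), (100, 20)))  # True
print(is_left_click((0, 0), (100, 0), (0, 100)))    # False
```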
With the aid of a web-cam, we want to develop completely free-hand identification software for laptops and PCs through this article. The project places a strong emphasis on developing software that allows users to click and move the cursor using their hands.
A. Activating Camera
The first step is to activate the camera so that the system can receive input. To do this, we must assign the camera's resources to a variable; the command used for this is cam = cv2.VideoCapture(0). This command enables the system's connected camera and allows it to provide input, as indicated in Fig. 2.
B. Web-Cam Extraction of Angle
The skin color is identified using a mask and the kernel function, which uses RGB values ranging from [92, 56, 54] to [255, 223, 196] to distinguish it from the other colors of the background. Noise is then removed from the input using open and close kernels.
The open kernel and close kernel operate under the straightforward premise that only proper input is sent on to the system for processing when the pixelated noise value exceeds the recorded threshold, as shown in Fig. 3.
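The masking and noise-removal idea can be sketched in NumPy alone, assuming the RGB bounds quoted above; the hand-rolled 3x3 erosion and dilation below stand in for OpenCV's open and close kernels and are not the paper's actual implementation.

```python
import numpy as np

LOWER = np.array([92, 56, 54])     # RGB lower bound quoted in the text
UPPER = np.array([255, 223, 196])  # RGB upper bound quoted in the text

def skin_mask(frame):
    """Boolean mask of pixels whose R, G and B all fall in the skin range."""
    return np.all((frame >= LOWER) & (frame <= UPPER), axis=2)

def erode(mask):
    # 3x3 erosion: a pixel survives only if its whole neighbourhood is set.
    p = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def dilate(mask):
    # 3x3 dilation: a pixel is set if any neighbour is set.
    p = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def denoise(mask):
    # "Open" (erode then dilate) removes isolated noise pixels;
    # "close" (dilate then erode) fills small holes in the hand region.
    opened = dilate(erode(mask))
    return erode(dilate(opened))

# A lone skin-colored pixel in a non-skin background is treated as noise.
frame = np.zeros((5, 5, 3), dtype=np.uint8)
frame[2, 2] = [120, 100, 90]            # inside the skin range
print(denoise(skin_mask(frame)).sum())  # 0: the isolated pixel is removed
```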
C. Extracting Angle
To move the cursor, the first step is to find the middle of the hand, which can be determined using the following steps.
Capture the person's image.
Apply face detection algorithms to detect the face.
Use the Viola-Jones and KLT algorithms to extract the region of interest in a rectangular bounding box.
Convert to grayscale, apply histogram equalization and resize to 100x100, i.e., apply pre-processing.
If in the enrollment phase,
then store the result in the database.
Apply PCA (for feature extraction).
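The pre-processing and PCA steps above can be sketched in NumPy alone. The Gram-matrix (eigenfaces-style) decomposition and nearest-neighbour resize are standard implementation choices assumed here, not necessarily the ones used in this work.

```python
import numpy as np

def preprocess(face_rgb, size=100):
    """Grayscale -> histogram equalization -> resize to size x size."""
    gray = face_rgb.mean(axis=2).astype(np.uint8)
    # Histogram equalization: map each level through the normalised CDF.
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    lut = np.round(255 * cdf / cdf[-1]).astype(np.uint8)
    eq = lut[gray]
    # Nearest-neighbour resize (stand-in for cv2.resize).
    ys = np.arange(size) * eq.shape[0] // size
    xs = np.arange(size) * eq.shape[1] // size
    return eq[np.ix_(ys, xs)]

def pca_basis(faces, k=2):
    """PCA over flattened faces: top-k principal directions."""
    X = np.stack([f.ravel().astype(float) for f in faces])
    X -= X.mean(axis=0)
    # Eigen-decomposition of the small Gram matrix (eigenfaces trick).
    vals, vecs = np.linalg.eigh(X @ X.T)
    order = np.argsort(vals)[::-1][:k]
    basis = X.T @ vecs[:, order]
    return basis / np.linalg.norm(basis, axis=0)

# Enrollment sketch on synthetic "faces" (random images stand in for crops).
rng = np.random.default_rng(0)
faces = [preprocess(rng.integers(0, 256, (120, 90, 3), dtype=np.uint8))
         for _ in range(5)]
basis = pca_basis(faces, k=2)
print(basis.shape)  # (10000, 2)
```

At recognition time, a probe face would be pre-processed the same way and projected onto `basis` before comparison with the stored enrollments.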
a. var_leftmost → tuple(hull[hull[:, :, 0].argmin()])
b. var_rightmost → tuple(my_con[my_con[:, :, 0].argmax()])
c. var_topmost → tuple(my_con[my_con[:, :, 1].argmin()])
d. var_bottommost → tuple(my_con[my_con[:, :, 1].argmax()])
The part of the code above is responsible for finding the middle point of the hand; the coordinates of the midpoint will be used to move the cursor in different directions depending on the movement of the user's hand.
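The extreme-point commands above follow the standard OpenCV idiom for contour arrays of shape (N, 1, 2). A self-contained sketch, with a toy diamond contour standing in for a real hand:

```python
import numpy as np

def extreme_points(contour):
    """Extreme points of an OpenCV-style contour of shape (N, 1, 2)."""
    leftmost   = tuple(contour[contour[:, :, 0].argmin()][0])
    rightmost  = tuple(contour[contour[:, :, 0].argmax()][0])
    topmost    = tuple(contour[contour[:, :, 1].argmin()][0])
    bottommost = tuple(contour[contour[:, :, 1].argmax()][0])
    return leftmost, rightmost, topmost, bottommost

def hand_midpoint(contour):
    # Midpoint of the bounding extremes, used as the cursor anchor.
    (lx, _), (rx, _), (_, ty), (_, by) = extreme_points(contour)
    return (int(lx + rx) // 2, int(ty + by) // 2)

# A toy diamond-shaped "hand" contour in OpenCV's (N, 1, 2) layout.
contour = np.array([[[10, 50]], [[50, 10]], [[90, 50]], [[50, 90]]])
print(hand_midpoint(contour))  # (50, 50)
```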
E. Flow Chart
Capture the frame from the web-cam and process it; after processing, convert the image from HSV to RGB format.
Create a filter that generates a mask of the skin color.
If the input provided through the web-cam is skin-colored, calculate the midpoint of the image; otherwise, continue processing frames from the web-cam.
If the angle between the two points is less than 15 degrees, perform a left-click; otherwise move the cursor in the direction of the input image.
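The branching of the flow chart can be summarised as a single per-frame decision function; the action names are illustrative assumptions, not identifiers from the paper's code.

```python
def frame_action(is_skin, angle_deg, threshold_deg=15.0):
    """One decision step of the flow chart: what to do with the current frame."""
    if not is_skin:
        return "reprocess_frame"  # no skin-colored region found; grab next frame
    if angle_deg is not None and angle_deg < threshold_deg:
        return "left_click"       # fingers closer than 15 degrees
    return "move_cursor"          # otherwise follow the hand's midpoint

print(frame_action(False, None))  # reprocess_frame
print(frame_action(True, 10.0))   # left_click
print(frame_action(True, 40.0))   # move_cursor
```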
VI. RESULTS AND EVALUATION
The goal of this work was to increase the machine's responsiveness to, and interaction with, human behavior. The sole aim of this article was to develop a technology that is portable, inexpensive and compatible with any common operating system.
The proposed system controls the mouse pointer by sensing the user's hand and moving the pointer in that hand's general direction. The mechanism controls basic mouse actions such as left-clicking, dragging, and cursor movement.
The method identifies the human hand by skin color and tracks it continually to move the cursor; when the angle between the fingers of the hand is less than 15 degrees, the procedure executes the left-click action.