Abstract
Presentations are essential in both academic and professional environments, yet conventional slide navigation methods often limit the presenter’s movement and interaction. Gesture recognition offers a more intuitive, hands-free alternative, removing the dependence on physical remotes or keyboards. This project presents a gesture-controlled system for managing PowerPoint slides and adjusting computer volume, leveraging embedded systems and wireless communication. The setup includes an APDS-9960 gesture sensor for hand gesture detection, an Arduino Uno for processing inputs, and an HC-05 Bluetooth module for transmitting commands to a computer. A custom-built desktop application developed with ASP.NET interprets these gestures in real time, allowing users to perform actions such as slide transitions, volume adjustments, and presentation pauses. The system is designed with accessibility in mind, aiming to support users with disabilities and improve the overall efficiency of presentations. By enabling touchless interaction, this solution enhances user experience and engagement. Future developments may include AI-based gesture recognition, support for multiple commands, and potential integration with AR/VR platforms to broaden its application in human-computer interaction.
Introduction
The APDS-9960 is a compact, energy-efficient module that integrates gesture recognition, proximity sensing, ambient light measurement, and color detection; it is widely used for touchless human-machine interaction in devices such as smartphones, gaming systems, and smart-home products. Gesture recognition enables hands-free control by interpreting hand movements as commands, improving hygiene and accessibility.
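As a minimal illustration of how the sensor is read, the sketch below initializes the APDS-9960 and polls it for gesture events. It assumes the SparkFun APDS-9960 Arduino library; the library choice and the polling (rather than interrupt-driven) style are assumptions for illustration, not prescriptions from this project.

```cpp
// Minimal gesture-polling sketch (assumes the SparkFun APDS-9960 library).
#include <Wire.h>
#include <SparkFun_APDS9960.h>

SparkFun_APDS9960 apds;

void setup() {
  Serial.begin(9600);
  // init() configures I2C and loads the sensor's default settings.
  if (!apds.init()) {
    Serial.println(F("APDS-9960 initialization failed"));
  }
  // Start the gesture engine; the argument enables the interrupt line,
  // but the loop below simply polls the sensor instead.
  if (!apds.enableGestureSensor(true)) {
    Serial.println(F("Gesture sensor could not be enabled"));
  }
}

void loop() {
  if (apds.isGestureAvailable()) {
    switch (apds.readGesture()) {
      case DIR_UP:    Serial.println(F("UP"));    break;
      case DIR_DOWN:  Serial.println(F("DOWN"));  break;
      case DIR_LEFT:  Serial.println(F("LEFT"));  break;
      case DIR_RIGHT: Serial.println(F("RIGHT")); break;
      case DIR_NEAR:  Serial.println(F("NEAR"));  break;
      case DIR_FAR:   Serial.println(F("FAR"));   break;
      default:        break;  // no recognizable gesture
    }
  }
}
```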
This study develops a gesture-based system that uses the APDS-9960 sensor, an Arduino Uno, and an HC-05 Bluetooth module to control PowerPoint presentations and computer volume wirelessly. The work covers the hardware setup, the microcontroller firmware for real-time gesture detection, and a desktop application built with ASP.NET that interprets the transmitted commands.
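To make the pipeline concrete, the sketch below maps each detected gesture to a single-character command and forwards it to the HC-05 over a software serial port. The pin assignments and the one-byte command protocol are illustrative assumptions, not the project's actual encoding; on the desktop side, the companion application would read these bytes from the paired Bluetooth serial port and translate them into the corresponding slide or volume actions.

```cpp
// Gesture-to-command bridge (pins and command bytes are hypothetical).
#include <Wire.h>
#include <SoftwareSerial.h>
#include <SparkFun_APDS9960.h>

SoftwareSerial bt(10, 11);   // RX, TX: assumed wiring to the HC-05 TXD/RXD
SparkFun_APDS9960 apds;

void setup() {
  bt.begin(9600);            // HC-05 default data-mode baud rate
  apds.init();
  apds.enableGestureSensor(true);
}

void loop() {
  if (!apds.isGestureAvailable()) return;
  switch (apds.readGesture()) {
    case DIR_RIGHT: bt.write('N'); break;  // 'N': next slide
    case DIR_LEFT:  bt.write('P'); break;  // 'P': previous slide
    case DIR_UP:    bt.write('U'); break;  // 'U': volume up
    case DIR_DOWN:  bt.write('D'); break;  // 'D': volume down
    case DIR_NEAR:  bt.write('S'); break;  // 'S': pause/resume
    default:        break;                 // ignore unrecognized readings
  }
}
```

Keeping the protocol to single bytes keeps the Arduino-side latency negligible and makes the host-side parser trivial, which matters for the real-time response targets reported below.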
Applications include enhancing presentation control for professionals, educators, people with disabilities, and public speakers, as well as use in smart homes, theaters, and VR environments. Benefits include increased mobility, accessibility, low cost, and contactless operation.
Testing showed that the system achieves about 92% accuracy in gesture recognition with fast response times (approximately 200–300 ms) and stable Bluetooth communication within 5–10 meters, though performance can vary with lighting conditions and gesture consistency. Future improvements could involve AI-based gesture adaptation, an expanded gesture vocabulary, and broader application integration.
Conclusion
This project successfully implemented a gesture-based control system for desktop interaction using the APDS-9960 sensor, Arduino Uno, HC-05 Bluetooth module, and an ASP.NET desktop application. The system allows users to control PowerPoint presentations, scroll through content, and adjust media playback using simple hand gestures, eliminating the need for physical input devices like keyboards or remotes. It demonstrates the potential of gesture recognition technology to offer a more natural, accessible, and hygienic way to interact with computers.
With an average gesture recognition accuracy of 92% and a response time of 200–300 milliseconds, the system proved effective for real-time use. The Bluetooth communication was reliable within a 5–10 meter range, ensuring smooth wireless connectivity. However, challenges such as decreased performance in bright lighting conditions and inconsistencies in gesture execution among new users were observed, pointing to areas for further optimization.
Overall, the system provides a practical, hands-free alternative to conventional input methods. It is particularly useful in scenarios where mobility, accessibility, and hygiene are important. This work lays the groundwork for future enhancements, including AI-based gesture adaptation, expanded gesture sets, and integration with more advanced applications, ultimately contributing to the evolution of human-computer interaction.