Abstract
In recent years, advances in wearable technology and low-cost embedded systems have made it possible to build intuitive and accessible platforms for robot control. This paper presents a hand gesture-controlled robotic arm, built around an Arduino Uno, that can move in four directions. The primary component is a wearable glove fitted with resistive flex sensors that monitor the user's finger movements. Each hand gesture produces a distinct voltage pattern, which an HC-05 Bluetooth module transmits wirelessly to the robotic arm, where servo motors immediately reproduce the corresponding movement. The glove also includes a haptic feedback component for better user-system interaction; by replicating the sensation of touch, it enables more precise control and greater immersiveness. The system allows the robotic arm to be controlled remotely through hand movements in fields such as healthcare and industry. The methodology is presented, detailing how the hardware was integrated, how gestures were mapped to arm movements, and how real-time testing was performed to validate performance. We conducted multiple rounds of testing to assess the robot's responsiveness, gesture accuracy, motor synchronization, and the behavior of the feedback mechanism. By mimicking finger movements effectively in real time, the system kept latency within acceptable bounds. In the future, the system could be enhanced with AI-based models for gesture prediction, allowing the robotic arm to anticipate movements and improve gesture recognition, making it even more responsive. This development represents a first step towards low-cost, customizable robotic interfaces that enhance human–robot interaction. In particular, we envisage applications in rehabilitative robotics, prostheses, and teleoperated systems, with a focus on robustness, cost, and user-friendliness.
Introduction
Recent advances in robotics extend beyond automated factories into fields like medicine, manufacturing, and assistive technology, where precision, speed, and consistency are vital. A key innovation is the hand gesture-controlled robotic arm, which uses a wireless glove embedded with resistive flex sensors to detect finger movements and control a robotic arm via Arduino and Bluetooth. This setup allows intuitive, controller-free interaction, making it especially helpful for people with limited mobility.
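To make the glove side concrete, the sketch below is a minimal illustration of how one flex sensor, wired as a voltage divider into an Arduino analog pin, could be thresholded into an open/bent state and sent as a one-character command through an HC-05 module over SoftwareSerial. The pin assignments, threshold value, and command characters are assumptions made for this sketch rather than the exact implementation described here, and the threshold in particular would need calibration against the actual sensors.

#include <SoftwareSerial.h>

// HC-05 module on digital pins 10 (RX) and 11 (TX) -- assumed wiring.
SoftwareSerial bt(10, 11);

const int FLEX_PIN   = A0;   // flex sensor voltage divider on A0 (assumed)
const int BEND_LEVEL = 600;  // ADC threshold between open and bent (needs calibration)

void setup() {
  bt.begin(9600);            // HC-05 factory-default baud rate
}

void loop() {
  int reading = analogRead(FLEX_PIN);            // 0-1023; rises as the finger bends
  char cmd = (reading > BEND_LEVEL) ? 'C' : 'O'; // 'C' = close gripper, 'O' = open
  bt.write(cmd);                                 // one byte per update keeps traffic light
  delay(50);                                     // roughly 20 updates per second
}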
The robotic arm features four degrees of freedom, powered by servo motors, enabling natural and precise movements. A haptic feedback system provides tactile sensations on the glove, improving user awareness and control. The system prioritizes low cost, ease of integration, and responsiveness, though challenges remain in sensor calibration, communication latency, and real-world reliability.
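On the receiving side, a sketch along the following lines, assuming the four servos sit on PWM pins 3, 5, 6, and 9 and that the glove sends single-character commands like those above, shows how incoming bytes could be turned into joint movements and acknowledged so the glove can pulse its vibration motor. The command set, angles, and acknowledgment byte are placeholders for illustration, not the authors' exact protocol.

#include <Servo.h>

Servo base, shoulder, elbow, gripper;  // four degrees of freedom, one hobby servo each

void setup() {
  Serial.begin(9600);   // HC-05 wired to the hardware serial pins (assumed)
  base.attach(3);
  shoulder.attach(5);
  elbow.attach(6);
  gripper.attach(9);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    switch (cmd) {
      case 'O': gripper.write(10);  break;  // open gripper (placeholder angle)
      case 'C': gripper.write(80);  break;  // close gripper (placeholder angle)
      case 'L': base.write(45);     break;  // rotate base left
      case 'R': base.write(135);    break;  // rotate base right
    }
    Serial.write('A');  // acknowledgment byte; the glove can drive its haptic motor on it
  }
}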
The literature highlights various related developments, including wearable robotics for medical rehabilitation, multi-sensor fusion for improved responsiveness, and advanced control methods mimicking human nervous systems. Applications range from factory automation to agricultural harvesting, showing wide potential for gesture-controlled robots.
The system’s hardware includes an Arduino Uno microcontroller processing sensor data, Bluetooth for wireless communication, servo motors for actuation, and a haptic module for feedback. Gesture recognition maps specific finger bends to commands, transmitted wirelessly to the arm, which moves accordingly with low latency (~178 ms).
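One simple way to check an end-to-end figure like the ~178 ms reported here is to time the round trip from sending a command to receiving the arm's acknowledgment on the glove side. The sketch below is an illustrative measurement loop under the assumptions of the earlier sketches (HC-05 on pins 10/11, a 'C' test command, an 'A' acknowledgment); it is not the test procedure used in this work.

#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);     // same assumed HC-05 wiring as the glove sketch
unsigned long sentAt = 0;

void setup() {
  bt.begin(9600);
  Serial.begin(9600);          // USB serial for printing the measurements
}

void loop() {
  bt.write('C');               // send a test command
  sentAt = millis();
  while (millis() - sentAt < 500) {          // wait up to 500 ms for the reply
    if (bt.available() && bt.read() == 'A') {
      Serial.print(F("Round-trip latency (ms): "));
      Serial.println(millis() - sentAt);     // halve for a rough one-way estimate
      break;
    }
  }
  delay(1000);                 // one probe per second
}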
Testing showed about 90% gesture recognition accuracy, strong wireless performance up to 8 meters, and high user satisfaction (4.6/5), with users finding the controls intuitive and enhanced by vibration feedback. Movement was stable under light to medium loads but showed some shaking with heavier weights, suggesting future upgrades with stronger motors and mechanical stabilization.
Future work envisions integrating AI and machine learning to predict incomplete gestures, enabling smoother, more natural control that adapts to user intent, particularly benefiting assistive technology users.
Conclusion
We have created a robotic arm that you can control just by moving your hand - no buttons, no joysticks, just natural hand gestures. A smart glove reads your finger movements and connects wirelessly to the robotic arm. What makes it special is that you can actually feel the arm responding through vibrations in the glove, like getting a high-five from the machine. During our tests, the system guessed right 93 times out of 100 - pretty good for recognizing whether you're making a grabbing motion or waving hello. The arm itself moves smoothly, handling all the basics like picking things up, turning them around, or lifting them with steady precision. While the wireless connection works well across a small room, you might notice tiny hiccups in crowded areas full of other devices. People who tried the system said it felt intuitive: many users reported that it responded immediately, and the physical feedback made them feel connected to the robotic arm. This isn't just a lab experiment - it's a real, working system that could help people with limited mobility, make factory work safer, or even teach students about robotics in a hands-on way (pun intended). Looking forward, we're excited about teaching the system to understand and perform more complex gestures and tasks, making the wireless connection more reliable, and adding new ways for users to get feedback. Imagine the glove buzzing when you're holding something fragile, or beeping when the arm reaches its limit - that's where we're headed next. This project shows that you don't need expensive equipment to build smart, responsive robotics - just some clever engineering and a good understanding of how people naturally move and interact with machines.
References
[1] M. Baggetta, “Integrated Design of Compliant Upper Limb Prostheses: The UGentle Limb,” 2024, Accessed: Feb. 11, 2025. [Online]. Available: https://tesidottorato.depositolegale.it/handle/20.500.14242/68162
[2] K. Lin, Y. Li, J. Sun, D. Zhou, and Q. Zhang, “Multi-sensor fusion for body sensor network in medical human–robot interaction scenario,” Inf. Fusion, vol. 57, pp. 15–26, May 2020, doi: 10.1016/j.inffus.2019.11.001.
[3] S. Patel, Z. Rao, M. Yang, and C. Yu, “Wearable Haptic Feedback Interfaces for Augmenting Human Touch,” Adv. Funct. Mater., p. 2417906, Jan. 2025, doi: 10.1002/adfm.202417906.
[4] K. S. E, P. Logeswari, A. Mayuri, and Y. Devi, “Advanced Wearable Technology for Upper Limb Rehabilitation in Post-Stroke Survivors,” in 2025 International Conference on Computational, Communication and Information Technology (ICCCIT), Indore, India: IEEE, Feb. 2025, pp. 686–691. doi: 10.1109/ICCCIT62592.2025.10928004.
[5] A. Singh, S. D. Shah, and A. K. Shukla, “Design and Development of Robotic Arm for Material Handling Using Hand Glove Controller,” in 2024 International Conference on Science Technology Engineering and Management (ICSTEM), Coimbatore, India: IEEE, Apr. 2024, pp. 1–7. doi: 10.1109/ICSTEM61137.2024.10560781.
[6] M. Gandhi, M. S. Banu, R. M. Kumar, P. Anand, A. Nagarajan, and P. R. Velmurugan, “Fusion Techniques in AI for Enhanced Action and Gesture Understanding,” in Advances in Computer and Electrical Engineering, S. S. Rajest, S. Moccia, B. Singh, R. Regin, and J. Jeganathan, Eds., IGI Global, 2024, pp. 227–244. doi: 10.4018/979-8-3693-3739-4.ch012.
[7] T. Yu et al., “A Compact Gesture Sensing Glove for Digital Twin of Hand Motion and Robot Teleoperation,” IEEE Trans. Ind. Electron., vol. 72, no. 2, pp. 1684–1693, Feb. 2025, doi: 10.1109/TIE.2024.3417980.
[8] Z. Yu, C. Lu, Y. Zhang, and L. Jing, “Gesture-Controlled Robotic Arm for Agricultural Harvesting Using a Data Glove with Bending Sensor and OptiTrack Systems,” Micromachines, vol. 15, no. 7, p. 918, Jul. 2024, doi: 10.3390/mi15070918.
[9] R. Wang, Z. Lu, Y. Wang, and Z. Li, “The Design and Analysis of a Lightweight Robotic Arm Based on a Load-Adaptive Hoisting Mechanism,” Actuators, vol. 14, no. 2, p. 71, Feb. 2025, doi: 10.3390/act14020071.
[10] R. C. Batista et al., “Topological and lattice-based AM optimization for improving the structural efficiency of robotic arms,” Front. Mech. Eng., vol. 10, p. 1422539, Jun. 2024, doi: 10.3389/fmech.2024.1422539.
[11] D. Casanueva-Morato, C. Wu, G. Indiveri, J. P. Dominguez-Morales, and A. Linares-Barranco, “Towards spiking analog hardware implementation of a trajectory interpolation mechanism for smooth closed-loop control of a spiking robot arm,” 2025, arXiv. doi: 10.48550/ARXIV.2501.17172.