This project aims to enhance the independence of individuals with mobility impairments by integrating a robotic arm with a wheelchair to assist with drinking. The robotic arm is designed to facilitate the user's access to beverages by automating the process through voice detection followed by face recognition to navigate a drink to the user's mouth. By employing advanced sensors and precise control mechanisms, the system ensures accuracy and safety during operation. The project delves into the challenges of seamless integration between the robotic arm and the wheelchair, addressing issues of stability and user interface. The ultimate goal is to improve daily living experiences for wheelchair users, making the act of drinking more accessible and efficient. This innovation promises to provide greater autonomy and convenience for individuals with physical disabilities.
Introduction
Overview
The wheelchair-mounted robotic arm for drink assistance is an advanced assistive technology designed to enhance independence, safety, and quality of life for individuals with mobility impairments. It addresses the daily challenge of drinking without caregiver support, offering a dignified and user-friendly solution through robotics and smart technology integration.
Key Features and Innovations
Multi-degree-of-freedom robotic arm: Mimics natural human movements to fetch and assist with beverages.
Control methods: Offers joystick, voice commands, touchscreen, and face tracking for user adaptability.
Smart integration: Compatible with mobile apps, home automation, and includes preset drink routines.
Safety mechanisms: Includes collision detection, emergency stop, and grip adjustment based on container type.
Customizable interface: Supports user-defined preferences and drink positions.
Hydration tracking: Optional feature to remind users to drink at intervals.
Ergonomics and aesthetics: Designed to be unobtrusive and socially acceptable in appearance.
Benefits
Enhanced autonomy in daily activities.
Reduced reliance on caregivers.
User-friendly with multiple input options (voice, button, camera).
Real-time feedback via LCD.
Lightweight and wheelchair-compatible.
Minimal maintenance and modular for future upgrades.
Safe and precise operation with responsive sensors and machine learning for user adaptability.
System Design and Components
A. Hardware
Robotic arm with stable, lightweight frame and precise motion.
Gripper/cup holder designed for various containers.
Camera module (e.g., Raspberry Pi) for face and mouth position detection.
IR and proximity sensors for object detection.
Microphones for voice input.
Servo motors (e.g., MG995) for motion control.
LCD display for status monitoring.
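To illustrate how the camera module's face/mouth detection could drive the servo motors, the sketch below converts a detected mouth position in pixels into pan/tilt angles. The camera resolution, field-of-view values, and servo centre position are assumed parameters, not measured figures from the project.

```python
# Sketch: convert a detected mouth position (pixels) into pan/tilt
# servo angles. Resolution, field of view, and centre angle are
# assumed values, not project measurements.
IMG_W, IMG_H = 640, 480        # assumed camera resolution
FOV_H, FOV_V = 62.2, 48.8      # assumed field of view in degrees
SERVO_CENTER = 90.0            # MG995 mid position on a 0-180 degree range

def mouth_to_servo_angles(mouth_x: float, mouth_y: float) -> tuple[float, float]:
    """Map mouth pixel coordinates to (pan, tilt) servo angles."""
    # Normalized offset from the image centre, in [-0.5, 0.5].
    dx = (mouth_x - IMG_W / 2) / IMG_W
    dy = (mouth_y - IMG_H / 2) / IMG_H
    pan = SERVO_CENTER + dx * FOV_H
    tilt = SERVO_CENTER + dy * FOV_V
    # Clamp to the servo's mechanical range.
    clamp = lambda a: max(0.0, min(180.0, a))
    return clamp(pan), clamp(tilt)
```

A mouth detected at the image centre maps to the servo centre position; offsets scale linearly with the assumed field of view.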
B. Control System
Manual controls: Joysticks, switches for those with partial mobility.
Voice control: For hands-free interaction.
Face tracking: To detect user orientation and mouth position.
Central control unit: Integrates inputs, processes data, and issues commands to actuators.
C. Software and Processing
Programmed using Python/C++.
Utilizes machine learning for personalized interaction.
Real-time feedback loop for error detection and motion adjustment.
Capable of logging usage data and system health.
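The real-time feedback loop could be sketched as a per-tick update: compare the target and measured arm positions, apply a proportional correction, and halt immediately on an obstacle flag. The gain and tolerance values are assumed tuning parameters, not the project's actual constants.

```python
# Sketch of one tick of the real-time feedback loop. Gain and
# tolerance are assumed tuning values.
def feedback_step(target: float, measured: float, obstacle: bool,
                  gain: float = 0.5, tolerance: float = 0.5) -> tuple[float, bool]:
    """Return (new_position, done); halts in place if an obstacle is seen."""
    if obstacle:
        return measured, True          # stop immediately for safety
    error = target - measured
    if abs(error) <= tolerance:
        return measured, True          # within tolerance: motion complete
    return measured + gain * error, False
```

Called repeatedly, the position converges geometrically toward the target, and an obstacle reading at any tick freezes the arm where it is.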
Workflow Summary
Power Initialization: Activates components like sensors, Raspberry Pi, and control unit.
Sensor Activation: IR and proximity sensors detect objects and user position.
Configuration: User-defined settings stored in memory.
Data Acquisition: Captures voice, visual, and proximity input.
Display and Output: LCD shows system status and feedback.
Feedback Loop: Stops or adjusts the arm upon detecting obstacles.
Testing & Optimization: Functional testing with user feedback ensures reliability.
Troubleshooting: Issues like misalignment or delayed response are analyzed and fixed.
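The workflow steps above can be sketched as a small state machine. The state names mirror the listed steps; the linear transition table and the final idle state are simplifying assumptions.

```python
# Sketch of the workflow as a state machine. The linear transition
# table and final "idle" state are simplifying assumptions.
TRANSITIONS = {
    "power_init": "sensor_activation",
    "sensor_activation": "configuration",
    "configuration": "data_acquisition",
    "data_acquisition": "display_output",
    "display_output": "feedback_loop",
    "feedback_loop": "idle",
}

def run_workflow(start: str = "power_init") -> list[str]:
    """Walk the states in order and return the executed sequence."""
    visited, state = [], start
    while state in TRANSITIONS:
        visited.append(state)
        state = TRANSITIONS[state]
    visited.append(state)
    return visited
```

In a real controller each state would invoke its hardware routine before advancing, and the feedback-loop state would repeat until the motion completes.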
Technical Components
Arduino Nano: Microcontroller for processing sensor input and controlling motion.
Raspberry Pi 5: For camera processing and machine learning tasks.
FC51 IR Sensor: Detects objects within 3–80 cm range.
USB Microphone: Noise-canceling voice input for commands.
MG995 Servo Motor: Drives the robotic arm with precise movement.
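As an illustration of how the FC51's digital output might be handled, the sketch below debounces consecutive readings before declaring an object present. It assumes the sensor's output is active-low (reads 0 when an object is in range), and the required sample count is an assumed tuning value.

```python
# Sketch: debouncing the FC51 IR sensor's digital output. Assumes an
# active-low output (0 = object in range); the sample count is an
# assumed tuning value.
REQUIRED_SAMPLES = 3

def detect_object(samples: list[int], required: int = REQUIRED_SAMPLES) -> bool:
    """True only if the last `required` readings all report an object."""
    if len(samples) < required:
        return False
    return all(reading == 0 for reading in samples[-required:])
```

Requiring several consecutive matching samples filters out single-sample noise before the control unit reacts to an obstacle.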
Project Objectives
Understand the drink assistance needs of mobility-impaired individuals.
Review existing assistive tech and identify gaps.
Design a customizable, adaptive robotic arm with:
Precision and stability
User-friendly interface
Compatibility with various wheelchair types
Conduct user testing to evaluate:
Satisfaction and usability
Reduction in caregiver reliance
Technical and economic feasibility
Conclusion
In conclusion, the wheelchair-mounted robotic arm for drink assistance has demonstrated strong potential for enhancing the independence and quality of life of individuals with mobility impairments. The robotic arm's precision, stability, and user-friendly interface enable effortless drink retrieval and manipulation, significantly reducing reliance on caregivers. Extensive testing indicates high user satisfaction, increased confidence, and reduced assistance requirements. The system's adaptability and customization ensure compatibility with various wheelchair models and user needs.