Myoelectric systems have long been used in assistive technologies, but their high cost and dependence on functional nerves limit their accessibility for individuals with severe physical impairments. As an alternative, this paper presents a non-invasive, cost-effective wheelchair control system based on electroencephalogram (EEG) signals. The system employs Brain-Computer Interface (BCI) technology to interpret brain activity and eye movements, enabling users to control the wheelchair through mental commands and gaze gestures. A NeuroSky Mindwave headset is used to capture EEG signals, which are then processed using signal processing techniques to assess cognitive states such as concentration and relaxation. Eye-tracking is integrated to improve command precision. The interpreted signals are converted into directional commands and transmitted wirelessly to an ESP32 microcontroller, which controls the wheelchair's movement.
Experimental results demonstrate an 85% accuracy rate in translating brain and eye activity into movement commands, supporting intuitive and reliable navigation. By combining EEG signal processing with eye-tracking, this system enhances mobility and independence for users with significant motor disabilities. The proposed approach offers a practical, user-friendly alternative to traditional assistive mobility devices.
Introduction
This paper presents the design, implementation, and evaluation of a non-invasive brain-controlled wheelchair system aimed at improving mobility and independence for individuals with severe motor disabilities such as amyotrophic lateral sclerosis (ALS), multiple sclerosis (MS), spinal cord injury, and cerebral palsy. The system leverages an EEG-based Brain–Computer Interface (BCI) to enable hands-free wheelchair control by translating brain signals and eye gestures into movement commands.
EEG signals are captured using the NeuroSky Mindwave Mobile 2 headset, which records brain activity via scalp electrodes. These signals are wirelessly transmitted to a computer, where Python’s MNE library performs noise filtering, feature extraction, and frequency analysis. Key brainwave bands (delta, theta, alpha, beta, and gamma) and eye blink patterns are mapped to specific wheelchair commands such as start, left, right, stop, and emergency brake. The processed commands are sent to an ESP32 microcontroller, which controls DC motors through an H-bridge driver, enabling real-time wheelchair movement. Additional safety features include obstacle detection and emergency braking.
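To make this mapping stage concrete, the sketch below computes relative band powers from a window of raw samples and converts them, together with a blink count, into a command string. It is a minimal sketch only: the 512 Hz sampling rate matches the Mindwave Mobile 2, but the thresholds, the specific band-to-command assignments, and the helper names are illustrative assumptions, not the authors' exact parameters.

```python
# Minimal sketch of the band-power-to-command mapping stage.
# Thresholds and command assignments are placeholder assumptions.
import numpy as np
from scipy.signal import welch

FS = 512  # NeuroSky Mindwave Mobile 2 sampling rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

def band_powers(eeg, fs=FS):
    """Relative power per canonical EEG band, from Welch's PSD estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    total = psd.sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

def to_command(powers, blinks):
    """Map band powers plus a blink count to a wheelchair command."""
    if blinks >= 3:               # rapid triple blink: emergency brake
        return "BRAKE"
    if powers["beta"] > 0.35:     # sustained concentration: move forward
        return "START"
    if powers["alpha"] > 0.40:    # relaxation: stop
        return "STOP"
    return "IDLE"

# Example on two seconds of synthetic data:
window = np.random.randn(2 * FS)
print(to_command(band_powers(window), blinks=0))
```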
The methodology outlines a complete signal pipeline: signal detection, acquisition, transmission, mapping, and motor control logic. Mathematical modeling of EEG signals, Fourier analysis, band-pass filtering, power spectral density (PSD), and PWM-based motor control form the theoretical foundation of the system. Combining EEG frequency features with eye-blink detection improves command reliability and reduces false activations.
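The PWM-based motor-control step can be sketched on the ESP32 side as follows. The authors build their firmware in the Arduino IDE; this equivalent MicroPython rendering, with assumed GPIO pins 25/26 on the H-bridge inputs and placeholder duty values, is for illustration only.

```python
# Illustrative ESP32-side motor control in MicroPython (the paper's firmware
# is built in the Arduino IDE). Pin numbers and duty values are assumptions.
from machine import Pin, PWM

left = PWM(Pin(25), freq=1000)   # H-bridge input for the left motor
right = PWM(Pin(26), freq=1000)  # H-bridge input for the right motor

# Duty cycles range 0-1023 on the ESP32 port of MicroPython.
DUTIES = {"START": (700, 700),   # both wheels forward
          "LEFT":  (300, 700),   # slow the left wheel to turn left
          "RIGHT": (700, 300),   # slow the right wheel to turn right
          "STOP":  (0, 0),
          "BRAKE": (0, 0)}       # emergency brake: cut both channels

def drive(cmd):
    """Apply the duty-cycle pair for a received command string."""
    dl, dr = DUTIES.get(cmd, (0, 0))  # unknown commands default to stop
    left.duty(dl)
    right.duty(dr)
```

A differential drive of this shape also leaves room for the obstacle-detection logic mentioned above to force the BRAKE state regardless of the incoming command.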
Experimental results demonstrate accurate real-time translation of EEG signals into wheelchair movements with low latency and high reliability. Testing with both live EEG signals and publicly available datasets (PhysioNet) confirmed effective signal extraction, command assignment, and motor execution via the ESP32.
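Replaying a public recording through the same pipeline can be outlined as below. The paper does not name the exact PhysioNet dataset, so the EEG Motor Movement/Imagery set bundled with MNE is used here purely as a stand-in; `band_powers` and `to_command` are the helpers from the earlier sketch.

```python
# Sketch: running a PhysioNet recording through the same feature pipeline.
# The specific dataset (EEG Motor Movement/Imagery) is an assumption;
# band_powers/to_command come from the earlier sketch.
import mne
from mne.datasets import eegbci

paths = eegbci.load_data(1, [3])                   # subject 1, one motor run
raw = mne.io.read_raw_edf(paths[0], preload=True)  # load the EDF recording
raw.filter(0.5, 50.0)                              # same band-pass as live use
signal = raw.get_data(picks="eeg")[0]              # one channel, headset-like
fs = raw.info["sfreq"]                             # 160 Hz for this dataset
print(to_command(band_powers(signal, fs=fs), blinks=0))
```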
To further enhance accuracy, machine learning–based validation was introduced. Using features extracted from EEG signals, three classifiers—SVM, Random Forest, and KNN—were evaluated. The SVM classifier achieved the best performance with 87% accuracy, high precision, and strong recall. The trained SVM model was deployed on the ESP32 using TinyML and TensorFlow Lite, enabling real-time prediction and improved responsiveness.
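The classifier comparison can be reproduced in outline with scikit-learn. In this sketch the feature matrix is random placeholder data standing in for the per-window band-power features, and the hyperparameters shown (RBF kernel, 100 trees, k = 5) are common defaults rather than the authors' settings; the TinyML/TensorFlow Lite deployment step is not shown.

```python
# Outline of the SVM / Random Forest / KNN comparison with scikit-learn.
# X and y below are random placeholders for band-power windows and labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((200, 5))     # 200 windows x 5 band-power features
y = rng.integers(0, 4, 200)  # four command classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("Random Forest", RandomForestClassifier(n_estimators=100)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)
    print(name)
    print(classification_report(y_te, clf.predict(X_te), zero_division=0))
```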
Conclusion
This study presents a brain-controlled wheelchair system that leverages EEG signals and eye movements to assist individuals with severe mobility impairments. The non-invasive and cost-effective BCI technology converts brain activity into real-time control commands. These commands are then processed and transmitted to the ESP32 microcontroller, enabling precise motor control for seamless wheelchair navigation. The integration of advanced signal processing ensures accurate command recognition, enhancing the system's reliability and responsiveness.
The results of the testing demonstrated that the system successfully extracted events from raw EEG signals, assigned the correct movement commands, and transmitted them to the ESP32, whose firmware is built and uploaded through the Arduino IDE.
The wheelchair accurately responded to commands for starting, turning, stopping, and emergency braking, offering a reliable and intuitive control mechanism. The integration of EEG signal processing, eye-tracking technology, and microcontroller-based motor control ensures that the system provides a seamless user experience with high accuracy and responsiveness.
This system represents a promising solution for improving the mobility and independence of individuals with neuromuscular disorders, offering a practical, non-invasive alternative to traditional assistive devices. Future research can enhance signal processing accuracy, expand system capabilities, and improve the user interface for wider applications. Ultimately, this project highlights the potential of combining advanced neuroscience and robotics to enhance the quality of life for differently-abled individuals.