Abstract
The development of autonomous vehicles has attracted significant attention due to their potential to enhance safety, efficiency, and accessibility in transportation. This study focuses on the development of NavDrive, a low-cost autonomous vehicle prototype built around a Raspberry Pi 3, an Arduino, and Behavioral Cloning (BC), an imitation-learning approach implemented with a Convolutional Neural Network (CNN). The methodology comprises three key phases: data collection, model training, and implementation. Data were collected by driving the car manually on a custom-designed indoor track made of white sheets and black tape, with the prototype controlled through an RF-based transmitter and receiver. The prototype is equipped with a Raspberry Pi camera and supporting modules, enabling a sequential CNN model to be trained to mimic human driving behavior and then deployed on the Raspberry Pi for autonomous navigation. The outcomes of this project demonstrate the feasibility of creating a functional autonomous vehicle using readily available and affordable hardware, offering valuable insights into the challenges and successes of such an implementation and highlighting its potential as an educational prototype for exploring autonomous driving principles.
Introduction
Background and Motivation
The rise of autonomous vehicles (AVs) is driven by their potential to increase road safety, improve efficiency, and enhance mobility. Key enablers of AV technology are Artificial Intelligence (AI) and Machine Learning (ML), allowing vehicles to perceive, decide, and act without human input. To make AV development accessible, platforms like Raspberry Pi and Arduino are widely used for educational and low-cost prototyping.
Behavioral Cloning in Autonomous Driving
Behavioral cloning is an ML technique where an agent learns to imitate a human driver by mapping environmental input (e.g., camera images) to control actions (e.g., steering). It is simple to implement and effective in structured environments but struggles with situations not well-represented in training data.
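Concretely, behavioral cloning reduces to supervised regression from recorded observations to the expert's actions. The short sketch below is illustrative only: it assumes a Keras-style regression model and uses made-up array names for the recorded frames and steering labels; it is not the project's exact training routine.

    import numpy as np

    def behavioral_cloning(model, frames, steering, epochs=10):
        # frames: camera images recorded while a human drove, shape (N, H, W, C)
        # steering: the driver's steering values logged at the same timestamps
        model.compile(optimizer="adam", loss="mse")   # imitation error measured as MSE
        return model.fit(np.asarray(frames), np.asarray(steering), epochs=epochs)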
NavDrive Project Overview
The NavDrive project focuses on building a low-cost autonomous driving system using behavioral cloning and vision-based control. The key objectives were:
Collecting driving data using a Raspberry Pi camera on custom indoor tracks.
Training a Convolutional Neural Network (CNN) to predict steering angles from images.
Deploying the model on the Raspberry Pi for real-time autonomous navigation.
Research Contributions
End-to-end implementation of behavioral cloning using affordable hardware (Raspberry Pi + Arduino).
Development of custom tracks, data collection, model training, and deployment workflow.
Demonstration of successful real-time driving on simple tracks and analysis of challenges on complex ones.
Practical insights into embedded systems and machine learning integration.
Literature Review Highlights
AVs have evolved from theory to practice using sensor fusion, path planning, and control systems.
Small-scale AVs serve as cost-effective educational tools and research platforms.
Behavioral cloning has shown success in lane following tasks using CNNs.
CNNs are well-suited for vision tasks like object detection, lane keeping, and steering prediction.
Methodology
Hardware Setup
The system includes:
Raspberry Pi 3 B: runs ML model and processes images.
Pi Camera: captures forward-facing visuals.
Arduino Uno: controls motors and steering.
L298N motor driver, DC motors, servo motor, battery, and manual controller.
Serial communication via USB allows coordination between Pi and Arduino.
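A minimal sketch of the Pi-side end of that serial link is shown below; the port name, baud rate, and the one-line "steering,throttle" command format are assumptions rather than the project's exact protocol.

    import serial

    # Open the USB serial connection to the Arduino (port and baud are assumptions).
    arduino = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1)

    def send_command(steering_deg, throttle_pct):
        # Send one steering/throttle command as a newline-terminated ASCII line.
        arduino.write(f"{steering_deg:.1f},{throttle_pct:.0f}\n".encode("ascii"))

    send_command(12.5, 40)   # e.g. steer 12.5 degrees right at 40% throttle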
Software Implementation
Python on Raspberry Pi for image capture, preprocessing (via OpenCV), ML inference (via TensorFlow/Keras), and serial communication.
C/C++ on Arduino for interpreting commands and controlling motors.
Threading on the Pi allows image capture, processing, and data logging to run concurrently.
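The sketch below illustrates this producer/consumer split, assuming the camera is exposed through OpenCV's VideoCapture and using an illustrative queue size and frame resolution.

    import queue
    import threading
    import cv2

    frames = queue.Queue(maxsize=8)            # hand-off buffer between the two threads

    def capture_loop():
        cam = cv2.VideoCapture(0)              # assumes the Pi camera is visible as /dev/video0
        while True:
            ok, frame = cam.read()
            if ok and not frames.full():
                frames.put(frame)              # producer: push frames as they arrive

    threading.Thread(target=capture_loop, daemon=True).start()

    while True:
        frame = frames.get()                   # consumer: preprocess, log, or run inference
        small = cv2.resize(frame, (200, 66))   # e.g. shrink to the CNN input size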
Behavioral Cloning Framework
Data collected by manually driving the RC car on tracks while recording camera images and steering inputs.
A CNN model is trained to map image data to steering angles.
Training involves image preprocessing, data augmentation, and loss monitoring (e.g., MSE).
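A sequential CNN of the kind described above can be sketched as follows; the specific layer configuration is an assumption modeled on the widely used NVIDIA end-to-end layout, not the project's exact architecture.

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Conv2D, Flatten, Dense, Dropout

    model = Sequential([
        Conv2D(24, (5, 5), strides=(2, 2), activation="elu", input_shape=(66, 200, 3)),
        Conv2D(36, (5, 5), strides=(2, 2), activation="elu"),
        Conv2D(48, (5, 5), strides=(2, 2), activation="elu"),
        Conv2D(64, (3, 3), activation="elu"),
        Conv2D(64, (3, 3), activation="elu"),
        Flatten(),
        Dense(100, activation="elu"),
        Dropout(0.5),
        Dense(50, activation="elu"),
        Dense(10, activation="elu"),
        Dense(1),                               # predicted steering angle (regression)
    ])
    model.compile(optimizer="adam", loss="mse")  # MSE loss, as noted above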
Project Phases
Data Collection
Custom tracks: oval, circle, and S-shape.
Manual driving while recording synchronized image-steering data into CSV files.
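The logging step might look like the sketch below, where each captured frame is written to disk and paired with its steering value as a CSV row; the file names and column layout are assumptions.

    import csv
    import os
    import time
    import cv2

    os.makedirs("images", exist_ok=True)
    log = open("driving_log.csv", "a", newline="")
    writer = csv.writer(log)

    def record_sample(frame, steering):
        path = f"images/{time.time():.3f}.jpg"
        cv2.imwrite(path, frame)               # save the synchronized camera frame
        writer.writerow([path, steering])      # pair it with the steering input
        log.flush()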
Model Training
CNN trained using preprocessed and augmented data.
Trained on GPU-powered laptop using Keras/TensorFlow.
Model evaluated using training/validation loss and saved for deployment.
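A training sketch under the same assumptions: driving_log.csv holds (image path, steering) rows, preprocessing is a simple resize and rescale, and augmentation is a horizontal flip with negated steering. Here, model refers to the CNN sketched in the framework section.

    import csv
    import cv2
    import numpy as np
    from sklearn.model_selection import train_test_split

    X, y = [], []
    with open("driving_log.csv") as f:
        for path, steering in csv.reader(f):
            img = cv2.resize(cv2.imread(path), (200, 66)) / 255.0
            s = float(steering)
            X.append(img); y.append(s)
            X.append(np.fliplr(img)); y.append(-s)   # flipped copy balances left/right turns

    X_train, X_val, y_train, y_val = train_test_split(
        np.array(X), np.array(y), test_size=0.2)

    history = model.fit(X_train, y_train, validation_data=(X_val, y_val),
                        epochs=20, batch_size=32)    # monitor training/validation MSE
    model.save("navdrive_model.h5")                  # saved for deployment on the Pi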
Model Deployment and Testing
Trained CNN deployed on Raspberry Pi.
Real-time image input leads to steering predictions sent to Arduino.
Vehicle tested on tracks in autonomous mode, demonstrating success on simple tracks and challenges on complex paths (like the S-shape).
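A deployment-loop sketch tying the pieces together is given below; the model file name, input size, and constant test throttle are assumptions, and send_command is the illustrative serial helper from the hardware section.

    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    model = load_model("navdrive_model.h5")     # trained CNN copied onto the Pi
    cam = cv2.VideoCapture(0)

    while True:
        ok, frame = cam.read()
        if not ok:
            continue
        img = cv2.resize(frame, (200, 66)) / 255.0
        steering = float(model.predict(img[np.newaxis], verbose=0)[0, 0])
        send_command(steering, 30)              # constant low throttle while testing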
Discussion and Results
The system successfully demonstrated basic autonomous navigation on simple tracks.
Training and validation metrics confirmed effective learning by the CNN.
Struggles on complex tracks revealed limitations in data diversity and model generalization.
Despite limitations, the project validated the feasibility of using low-cost platforms for practical AV research and education.
Conclusion
The NavDrive project demonstrated the development of a low-cost automated driving system built on a Raspberry Pi and an Arduino with real-time, vision-based control through behavioral cloning and a Convolutional Neural Network. The three-phase approach, encompassing data collection, model training, and implementation, resulted in a prototype capable of autonomous navigation on custom-designed indoor tracks. The project highlights the potential of readily available and affordable hardware for exploring complex concepts in autonomous robotics and provides valuable insights into the practical aspects of implementing behavioral cloning for autonomous driving.