Abstract
Fruit spoilage leads to significant economic losses and quality degradation in the agricultural supply chain. Manual inspection of sweet lime freshness is time-consuming, inconsistent, and prone to human error. To address this issue, an AI-based automated fruit spoilage detection system is proposed using the YOLOv8 deep learning model and Raspberry Pi for real-time edge deployment. The system captures images of sweet lime using a Raspberry Pi camera and processes them through a trained YOLOv8 object detection model to classify fruits as Good or Rotten. The model is trained on a labeled dataset with bounding box annotations and optimized for lightweight inference to run efficiently on Raspberry Pi. The system provides real-time detection with bounding boxes, class labels, and confidence scores, achieving a mean average precision (mAP) of approximately 92% and a frame rate of 8–12 FPS on edge hardware. This low-cost and portable solution reduces manual effort, minimizes human misclassification, and enables automated fruit quality assessment for small vendors, warehouses, and smart agriculture applications. The proposed system demonstrates the feasibility of deploying deep learning–based computer vision models on edge devices for real-time food quality monitoring.
Introduction
The project focuses on developing an AI-based automated fruit spoilage detection system for sweet lime, a highly perishable citrus fruit. Traditional manual inspection methods are time-consuming, inconsistent, and unable to detect early-stage spoilage, leading to economic losses and reduced consumer satisfaction. The proposed system leverages Artificial Intelligence (AI), Computer Vision, and edge computing to provide a low-cost, real-time solution for fruit quality assessment.
Key Highlights:
Technology Used:
YOLOv8 (You Only Look Once, version 8) deep learning model for object detection and classification.
Raspberry Pi 4 with Pi Camera for real-time image capture and on-device processing.
Python, OpenCV, and Ultralytics YOLO framework for image processing and model deployment.
Methodology:
Image Acquisition: Raspberry Pi Camera captures sweet lime images under various lighting conditions.
Preprocessing: Images are resized, normalized, and enhanced for better analysis.
Feature Extraction & Detection: The YOLOv8 CNN backbone extracts visual features (color, texture, shape) and predicts fruit condition (Good or Rotten) with bounding boxes and confidence scores.
Edge Deployment: Lightweight YOLOv8n model optimized for Raspberry Pi enables real-time inference without cloud dependency.
Output Display: Results are shown on a monitor, indicating fruit condition and confidence percentages.
Objectives:
Collect and annotate a labeled sweet lime dataset.
Train and optimize YOLOv8 for accurate, real-time classification.
Reduce human effort, minimize errors, and provide a portable fruit grading solution for small markets and smart agriculture applications.
Results:
Real-time detection at 8–12 FPS, suitable for small-scale operations.
The system effectively identifies fresh and spoiled sweet limes under normal lighting, with minor accuracy degradation under low-light conditions.
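A frame-rate figure like 8–12 FPS is typically measured by averaging per-frame inference latency after a short warm-up. The helper below is a hypothetical measurement utility, not part of the reported system.

```python
# Hypothetical FPS measurement helper: averages per-frame latency of an
# inference callable over a sequence of frames, excluding warm-up frames.
import time

def measure_fps(infer, frames, warmup=5):
    """Return average frames-per-second of `infer`, skipping warm-up frames."""
    timings = []
    for i, frame in enumerate(frames):
        t0 = time.perf_counter()
        infer(frame)                    # e.g. model.predict(frame)
        if i >= warmup:                 # first calls are slower (JIT, caches)
            timings.append(time.perf_counter() - t0)
    return len(timings) / sum(timings)
```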
Conclusion
This project successfully developed an AI-based sweet lime spoilage detection system using the YOLOv8 object detection model deployed on a Raspberry Pi for real-time edge processing. The system is capable of detecting and classifying sweet lime as Good or Rotten with high accuracy by analyzing visual features such as color variation, texture changes, and surface defects. The trained lightweight YOLOv8n model achieved strong performance metrics and provided real-time inference at approximately 8–12 FPS on Raspberry Pi, making it suitable for practical small-scale applications.
The proposed solution reduces manual inspection effort, minimizes human error, and offers a low-cost, portable, and automated fruit quality assessment system for vendors, warehouses, and smart agriculture environments. The results demonstrate the feasibility of deploying deep learning–based computer vision models on edge devices for real-time food quality monitoring and highlight the potential for extending the system to multi-fruit classification and automated sorting in future work.