Maintaining plant health and ensuring suitable environmental conditions are essential aspects of modern agriculture. This work introduces an integrated system that combines artificial intelligence, mobile robotics, and Internet of Things (IoT) technologies to enable automated plant monitoring and greenhouse management. The proposed system employs a mobile robotic rover equipped with a vision unit to capture images of plant leaves. These images are processed by a YOLOv8-based deep learning algorithm to identify and classify plant diseases with high accuracy. In parallel, environmental parameters such as soil moisture, temperature, and humidity are continuously measured through ESP32-based embedded sensor modules. The system adopts a distributed control framework built on ESP32 microcontrollers, allowing seamless interaction between sensing components and actuators. When the soil moisture level falls below a predefined threshold, the system automatically activates a water pump to irrigate the plants; similarly, if the ambient temperature exceeds the desired limit, a cooling fan is triggered to lower it. These automated control actions keep plants consistently under optimal growth conditions in a disease-free environment. Real-time data processing enables intelligent decision-making for irrigation and climate control. Furthermore, a mobile application interface allows users to remotely monitor system status, receive instant notifications, and make informed decisions based on collected data.
Introduction
Background and Motivation
Modern agriculture requires intelligent systems to detect plant diseases early and maintain optimal environmental conditions in controlled environments like greenhouses. Traditional manual monitoring is slow, inconsistent, and not scalable. Existing solutions often lack mobility and integrated control, creating a need for a unified, autonomous system that combines disease detection and environmental management.
Proposed System
This work presents a mobile robotic platform integrating deep learning, IoT sensing, and automated actuation. The rover, equipped with an IP camera and environmental sensors (soil moisture, temperature, humidity), navigates the greenhouse to capture plant images and monitor conditions in real time. A YOLOv8-based deep learning model processes images for disease detection, while environmental data are compared to thresholds to trigger automated irrigation and cooling via ESP32-controlled actuators. A mobile app provides remote monitoring, alerts, and real-time insights.
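To illustrate how the detection stage might consume the model's output, the sketch below filters per-image detections by a confidence threshold and reports the dominant disease class. The detection format, class names, and 0.5 threshold are illustrative assumptions, not the authors' actual code; in practice the (class, confidence) pairs would come from the YOLOv8 inference results.

```python
# Illustrative post-processing of YOLOv8-style detections.
# Each detection is a (class_name, confidence) pair; the class names
# and the 0.5 threshold below are assumptions for the sketch.

def filter_detections(detections, min_conf=0.5):
    """Keep only detections whose confidence meets the threshold."""
    return [(cls, conf) for cls, conf in detections if conf >= min_conf]

def summarize(detections):
    """Return the most confident disease class, or 'healthy' if none remain."""
    kept = filter_detections(detections)
    if not kept:
        return "healthy"
    return max(kept, key=lambda d: d[1])[0]
```

For example, `summarize([("leaf_blight", 0.82), ("rust", 0.31)])` discards the low-confidence detection and reports `"leaf_blight"`.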
Novel Features
AI-enabled mobile plant inspection with real-time YOLOv8 detection.
Automated closed-loop climate control based on sensor feedback.
Decentralized IoT framework using ESP32 controllers for sensing, communication, and actuation.
Integrated analysis of plant health and environmental conditions.
Remote monitoring and intelligent user support through a mobile application.
Methodology
Visual Data Processing: Continuous image capture and analysis by YOLOv8 to classify plant diseases.
Environmental Monitoring: Sensors track soil moisture, temperature, and humidity.
Automated Decision Logic: System actuates water pumps and fans when thresholds are crossed.
User Interface: Mobile app provides real-time updates and alerts.
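The automated decision logic above can be sketched as a simple threshold comparison mapping sensor readings to actuator states. The specific threshold values (30 % moisture, 32 °C) are assumptions for illustration; the paper only states that predefined thresholds are used.

```python
# Sketch of the threshold-based control loop. The threshold values
# below are assumptions, not the system's actual configuration.

MOISTURE_THRESHOLD = 30.0  # percent; pump activates below this
TEMP_THRESHOLD = 32.0      # degrees Celsius; fan activates above this

def control_actions(moisture_pct, temp_c):
    """Map current sensor readings to actuator states (pump, fan)."""
    return {
        "pump": moisture_pct < MOISTURE_THRESHOLD,
        "fan": temp_c > TEMP_THRESHOLD,
    }
```

On the actual hardware this logic would run on the ESP32, with the returned states driving the pump relay and fan via GPIO pins.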
Hardware and Software
Hardware: ESP32 microcontrollers, robotic chassis with DC motors, camera, soil moisture and DHT sensors, motor driver.
Software: Embedded C, Python, TensorFlow/CNN for deep learning, mobile application for user interface.
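Analog soil moisture sensors on the ESP32 return raw 12-bit ADC values (0–4095) that must be calibrated to a percentage before threshold comparison. The sketch below shows one common linear calibration; the `ADC_DRY` and `ADC_WET` constants are hypothetical and would need to be measured for the actual sensor in dry soil and in water.

```python
# Converting a raw ESP32 ADC reading (12-bit, 0-4095) to a moisture
# percentage. ADC_DRY and ADC_WET are hypothetical calibration values.

ADC_DRY = 3200  # assumed raw reading in dry soil
ADC_WET = 1300  # assumed raw reading in saturated soil

def moisture_percent(raw):
    """Linearly map a raw ADC value to 0-100 % moisture, clamped."""
    span = ADC_DRY - ADC_WET
    pct = (ADC_DRY - raw) / span * 100.0
    return max(0.0, min(100.0, pct))
```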
Experimental Results
YOLOv8 model demonstrated high precision, recall, and mean Average Precision (mAP) for disease detection.
Rover reliably captured images and environmental data in real time.
Automated control actions (irrigation and cooling) responded effectively to environmental changes.
The system integration ensured seamless communication, early disease detection, and efficient greenhouse management.
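The precision, recall, and mAP figures above follow the standard detection-metric definitions. As a generic reminder (not the authors' evaluation script), precision and recall derive from true-positive, false-positive, and false-negative counts:

```python
# Standard detection metrics from TP/FP/FN counts; generic formulas,
# not the authors' evaluation code.

def precision(tp, fp):
    """Fraction of predicted detections that are correct."""
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fn):
    """Fraction of ground-truth instances that were detected."""
    return tp / (tp + fn) if tp + fn else 0.0

def f1(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r) if p + r else 0.0
```

mAP additionally averages the area under the precision-recall curve across classes and IoU thresholds, which YOLOv8's tooling computes automatically.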
Conclusion
This work presented a unified system for plant health monitoring and environmental control by integrating deep learning, IoT-based sensing, and a mobile robotic rover. The YOLOv8-based model enabled accurate and real-time detection of plant diseases, while the ESP32-driven sensor network ensured continuous monitoring of key environmental parameters. Automated actuation mechanisms, including irrigation and temperature control, were effectively triggered based on predefined thresholds.
Experimental evaluation demonstrated reliable performance in both disease detection and system operation. The rover platform facilitated real-time data acquisition, and the system consistently responded to environmental variations with timely control actions. The results highlight the potential of the proposed approach to improve efficiency and reduce manual intervention in greenhouse management.
Future work will focus on enhancing model robustness with larger datasets, expanding system scalability, and improving autonomous navigation for broader agricultural applications.
References
[1] D. P. Hughes and M. Salathé, “An open access repository of images on plant health,” arXiv:1511.08060, 2015.
[2] S. Mohanty, D. Hughes, and M. Salathé, “Using deep learning for image-based plant disease detection,” Frontiers in Plant Science, vol. 7, 2016.
[3] L. Da Xu, W. He, and S. Li, “Internet of Things in industries: A survey,” IEEE Trans. Industrial Informatics, vol. 10, no. 4, pp. 2233–2243, 2014.
[4] N. Wang, N. Zhang, and M. Wang, “Wireless sensors in agriculture,” Computers and Electronics in Agriculture, vol. 50, no. 1, pp. 1–14, 2006.
[5] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You Only Look Once: Unified, Real-Time Object Detection,” Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2016.
[6] G. Jocher et al., “YOLOv8 by Ultralytics,” GitHub repository, 2023.