Abstract
Turmeric and ginger are economically vital spice crops cultivated for their underground rhizomes, which are highly susceptible to diseases such as soft rot, rhizome rot, and bacterial wilt. Traditional detection methods rely on visual inspection or post-harvest diagnosis, often resulting in delayed treatment and significant yield loss. This research proposes an AI-driven framework for early rhizome disease detection using a multimodal approach that integrates deep learning, hyperspectral imaging, and IoT-based environmental sensing. Convolutional Neural Networks (CNNs), enhanced through transfer learning, are employed to classify rhizome health from subterranean image data, while sensor fusion techniques correlate soil moisture, temperature, and pH with disease onset. The system also incorporates time-series forecasting and natural language interfaces to deliver real-time alerts and treatment recommendations to farmers. By focusing on rhizome-level analysis, an area largely overlooked in existing literature, this study aims to improve diagnostic accuracy, reduce crop losses, and promote sustainable spice farming through intelligent, accessible technology.
Introduction
Turmeric and ginger—important spice crops valued for their underground rhizomes—are highly vulnerable to diseases such as soft rot, rhizome rot, and bacterial wilt. These infections often remain hidden beneath the soil, making early detection difficult and leading to major yield losses, misdiagnosis, and increased chemical use. Traditional methods relying on manual inspection and post-harvest analysis are labor-intensive and unreliable for subsurface disease identification.
To address this challenge, the study introduces a multimodal AI framework that combines underground imaging (hyperspectral and near-infrared), environmental sensor data (soil moisture, temperature, pH), and deep learning models, particularly CNNs enhanced through transfer learning. The system uses semantic segmentation to pinpoint infected regions and integrates image data with IoT-based sensor readings using a fusion model that includes LSTM layers. This approach aims to detect rhizome diseases early, improve diagnostic accuracy, and provide farmers with real-time, actionable insights through mobile and web applications.
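The fusion design described above can be sketched as follows. This is a minimal illustration assuming PyTorch, with a toy CNN standing in for the transfer-learned backbone and illustrative layer sizes (the class name `RhizomeFusionNet` and all dimensions are assumptions, not the paper's actual architecture): image features and an LSTM summary of the sensor time series are concatenated before classification.

```python
# Minimal sketch, assuming PyTorch. The tiny CNN below stands in for the
# transfer-learned backbone; all sizes are illustrative, not the paper's.
import torch
import torch.nn as nn

class RhizomeFusionNet(nn.Module):
    """Fuses CNN image features with an LSTM summary of sensor readings."""
    def __init__(self, n_classes=3, sensor_dim=3, hidden=32):
        super().__init__()
        # Small CNN backbone producing a fixed-length image feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch, 32)
        )
        # LSTM over the soil moisture / temperature / pH time series
        self.lstm = nn.LSTM(sensor_dim, hidden, batch_first=True)
        self.head = nn.Linear(32 + hidden, n_classes)

    def forward(self, image, sensors):
        img_feat = self.cnn(image)               # (batch, 32)
        _, (h, _) = self.lstm(sensors)           # h: (1, batch, hidden)
        fused = torch.cat([img_feat, h[-1]], 1)  # concatenate modalities
        return self.head(fused)                  # class logits

model = RhizomeFusionNet()
# Batch of 2 images (3x64x64) and 24 time steps of 3 sensor channels
logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 24, 3))
print(logits.shape)  # torch.Size([2, 3])
```

Concatenating the final LSTM hidden state with pooled image features is one common way to realize the CNN-LSTM fusion the text describes; the paper's actual fusion layer may differ.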
Literature review findings show that most AI-based crop disease research focuses on leaf diseases due to ease of imaging, while rhizome-level detection remains largely unexplored. CNNs, transfer learning, hyperspectral imaging, and IoT monitoring have shown promise individually, but have rarely been combined for subterranean disease detection. A major gap identified is the lack of rhizome-specific datasets and models capable of handling soil occlusion and subsurface imaging challenges.
The problem statement reinforces the need for a reliable early-detection system for rhizome diseases in turmeric and ginger. Environmental conditions strongly influence disease progression, but are seldom integrated into current AI models. This research attempts to fill that gap with a scalable, sustainable, and intelligent system tailored to rhizome-level pathology.
Methodology includes collecting NIR/hyperspectral rhizome images and continuous soil sensor data, preprocessing images with noise reduction and segmentation, training CNN and transfer-learning models (EfficientNet-B3, ResNet50, MobileNetV2), using U-Net for segmentation, and fusing sensor and image features through a CNN-LSTM architecture. The system is deployed via a user-friendly dashboard where farmers can upload images and receive disease classification, confidence scores, and treatment recommendations.
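The noise-reduction and segmentation steps in the preprocessing stage can be sketched with plain numpy. This is a hedged stand-in: the 3x3 mean filter and global threshold below are illustrative choices (the paper's exact filters and the U-Net segmentation are not reproduced here), and the function names are assumptions.

```python
# Illustrative numpy sketch of the preprocessing step; the 3x3 mean
# filter and fixed threshold are assumptions, not the paper's pipeline.
import numpy as np

def denoise(img):
    """3x3 mean filter: a simple stand-in for the noise-reduction step."""
    padded = np.pad(img, 1, mode="edge")
    # Average the 9 shifted copies of the padded image.
    acc = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
              for i in range(3) for j in range(3))
    return acc / 9.0

def segment(img, thresh=0.5):
    """Global threshold producing a binary infection-candidate mask."""
    return (img > thresh).astype(np.uint8)

rng = np.random.default_rng(0)
img = rng.random((64, 64))          # toy grayscale rhizome image in [0, 1)
mask = segment(denoise(img))
print(mask.shape, mask.dtype)
```

In the actual system the binary mask would come from the trained U-Net rather than a fixed threshold; the sketch only shows where such a mask sits in the pipeline.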
Experimental results show high performance: EfficientNet-B3 achieved 95.8% accuracy, outperforming ResNet50 and a custom CNN. U-Net segmentation achieved an IoU of 87.4%, effectively identifying infection zones. Multimodal fusion with sensor data further improved accuracy to 96.1% and revealed strong correlations between environmental spikes (e.g., soil moisture) and disease onset. Inference times (≈1.8 seconds per image) and dashboard response speeds confirm suitability for real-time field deployment.
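The IoU figure reported for the U-Net can be made concrete with a short numpy sketch of the metric itself, computed here on toy masks (the data is invented for illustration, not drawn from the paper's experiments):

```python
# Intersection-over-Union for binary masks; toy data, pure numpy.
import numpy as np

def iou(pred, target):
    """IoU between two binary segmentation masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0

pred = np.zeros((4, 4), int); pred[:2, :] = 1      # predicted infection zone
target = np.zeros((4, 4), int); target[:3, :] = 1  # ground-truth zone
print(round(iou(pred, target), 3))  # 8 / 12 -> 0.667
```

An IoU of 87.4% thus means the predicted and annotated infection regions overlap in about 87% of their combined area.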
Conclusion
This research introduces a new AI-based framework for early detection of rhizome diseases in turmeric and ginger, addressing a critical gap in spice farming where subterranean infections such as soft rot and bacterial wilt frequently go unnoticed until advanced stages. By integrating deep learning models, particularly CNNs enhanced through transfer learning, with hyperspectral imaging and IoT-based environmental sensing, the system enables accurate, real-time classification of rhizome health. Experimental results demonstrate high diagnostic performance, with multimodal fusion improving prediction reliability and responsiveness. The proposed solution not only reduces crop losses and fungicide overuse but also empowers farmers through multilingual interfaces and accessible mobile tools, contributing to sustainable, data-driven spice farming practices.