Cotton productivity plays a crucial role in the global agricultural economy; however, various leaf diseases significantly threaten crop yield and fiber quality. Early and accurate disease detection is essential for effective crop management, yet traditional inspection methods are time-consuming, labor-intensive, and dependent on expert knowledge, often leading to inconsistent results. Conventional machine learning approaches also face limitations in real-world agricultural environments due to variations in lighting conditions, complex backgrounds, and similarities between disease symptoms. To address these challenges, this research proposes an intelligent framework called Cotton Plant Disease Identification Using ResMobNet with Attention-Guided Localization and Severity Analysis (CPDI-RMN).
The proposed system integrates advanced image preprocessing, hybrid feature extraction, deep learning classification, and attention-based localization into a unified disease detection framework. Cotton leaf images are first collected from a comprehensive dataset and preprocessed through resizing, noise removal, contrast enhancement, and Min–Max normalization to improve visual quality and ensure stable model training. Data augmentation techniques such as rotation, flipping, zooming, and brightness adjustment are then applied to increase dataset diversity and improve model robustness against overfitting.
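The normalization and augmentation steps listed above can be illustrated with a minimal sketch. Min–Max normalization rescales each image to the [0, 1] range, and a simple brightness scaling is shown as one example of the augmentations; the function names and the brightness factor are illustrative, not from the paper.

```python
import numpy as np

def min_max_normalize(img):
    """Rescale pixel intensities to the [0, 1] range (Min-Max normalization)."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # constant image: avoid division by zero
        return np.zeros_like(img)
    return (img - lo) / (hi - lo)

def adjust_brightness(img, factor=1.2):
    """Brightness augmentation: scale intensities and clip to the 8-bit range."""
    return np.clip(img.astype(np.float64) * factor, 0, 255).astype(np.uint8)

# Example: normalize a small 8-bit grayscale patch
patch = np.array([[10, 60], [110, 210]], dtype=np.uint8)
norm = min_max_normalize(patch)       # values now span exactly 0.0 .. 1.0
```

In practice such normalization is applied per image (or per channel) before the images are fed to the network, so that training is not dominated by raw intensity differences between capture conditions.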
For feature enhancement, contour visualization and geometric feature representation are combined with texture analysis using the Gray-Level Co-occurrence Matrix (GLCM) and Laplacian filtering. The core of the framework is the ResMobNet hybrid architecture, which integrates ResNet-50, EfficientNet-B3, and MobileNet-V2 to capture multi-scale spatial and texture features while maintaining computational efficiency. Gradient-weighted Class Activation Mapping (Grad-CAM) is employed to generate attention maps for disease localization, followed by segmentation to isolate infected regions. Disease severity is then quantified by calculating the percentage of infected leaf area and classifying it as mild, moderate, or severe. Experimental results using five-fold cross-validation show that the CPDI-RMN model achieves 98.85% classification accuracy, outperforming baseline CNN, ANN, ResNet, and MobileNet-V2 models. Additionally, the attention-based localization achieves 96.8% Intersection over Union (IoU) and a 98.0% Dice score, indicating highly accurate detection of diseased regions. Overall, the proposed framework provides a reliable and scalable solution for intelligent cotton disease monitoring and supports precision agriculture through data-driven crop management.
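For reference, the localization metrics reported above can be computed from binary masks as follows. This is a generic sketch of Intersection over Union and the Dice score over boolean predicted and ground-truth masks, not code from the paper; the empty-mask convention (score 1 when both masks are empty) is an assumption.

```python
import numpy as np

def iou(pred, truth):
    """Intersection over Union between two boolean masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:                    # both masks empty: define IoU as 1
        return 1.0
    return np.logical_and(pred, truth).sum() / union

def dice(pred, truth):
    """Dice score: 2|A n B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / total

pred  = np.array([[1, 1], [0, 0]])
truth = np.array([[1, 0], [0, 0]])
# iou(pred, truth) -> 0.5, dice(pred, truth) -> 2/3
```

Note that Dice is always at least as large as IoU for the same pair of masks, which is consistent with the reported 98.0% Dice score exceeding the 96.8% IoU.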
Introduction
Cotton is an essential global crop that supports millions of farmers, contributes to rural employment, and plays a major role in the textile industry and international trade. However, cotton plants are highly vulnerable to diseases caused by bacteria, fungi, and viruses, which can reduce yield and affect fiber quality. Major cotton diseases include Bacterial Blight, Fusarium Wilt, Verticillium Wilt, Root Rot, and Cotton Leaf Curl Disease (CLCuD). These diseases cause symptoms such as leaf spots, yellowing, curling, wilting, root damage, and stunted growth, leading to significant production losses.
Traditional disease detection methods rely on manual field inspection, which is time-consuming, subjective, and prone to human error. To overcome these limitations, modern technologies such as Artificial Intelligence (AI), machine learning, and deep learning are increasingly used for automatic disease detection. Deep learning models, especially Convolutional Neural Networks (CNNs) and pre-trained architectures such as ResNet, MobileNet, and EfficientNet, can identify diseases from leaf images with high accuracy, even when training data are limited.
This study proposes a deep learning–based system called CPDI-RMN for cotton disease identification and severity assessment. The framework includes image preprocessing (resizing, noise removal, contrast enhancement, and normalization), data augmentation (rotation, flipping, zooming, brightness changes), and advanced feature extraction techniques such as contour analysis and GLCM texture features. The model uses an ensemble of ResNet-50, EfficientNet-B3, and MobileNet-V2 for accurate classification, while Grad-CAM is used to highlight infected regions. The system also evaluates disease severity by measuring the affected leaf area.
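The severity assessment described above can be sketched as the ratio of infected pixels to total leaf pixels, given binary masks for the leaf and the infected region (e.g. derived from Grad-CAM-guided segmentation). The 10% and 30% category cut-offs used here are illustrative assumptions; the paper does not state its thresholds.

```python
import numpy as np

def severity(leaf_mask, infected_mask, mild_max=0.10, moderate_max=0.30):
    """Percentage of leaf area infected, mapped to a severity category.

    leaf_mask / infected_mask: boolean arrays of the same shape.
    mild_max / moderate_max: assumed thresholds (not specified in the paper).
    """
    leaf = leaf_mask.astype(bool)
    infected = np.logical_and(infected_mask.astype(bool), leaf)
    leaf_area = leaf.sum()
    ratio = infected.sum() / leaf_area if leaf_area else 0.0
    if ratio <= mild_max:
        label = "mild"
    elif ratio <= moderate_max:
        label = "moderate"
    else:
        label = "severe"
    return 100.0 * ratio, label

leaf = np.ones((10, 10), dtype=bool)   # whole patch is leaf
spot = np.zeros((10, 10), dtype=bool)
spot[:2, :] = True                      # 20 of 100 leaf pixels infected
# severity(leaf, spot) -> (20.0, "moderate")
```

Restricting the infected mask to pixels inside the leaf mask prevents background segmentation noise from inflating the severity percentage.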