Abstract
Tea is among the most commercially significant crops in northeastern India, yet its leaves are highly prone to a range of fungal, bacterial, and pest-induced diseases that erode both yield and quality each season. While deep learning has shown strong promise for automated plant disease detection, most existing work on tea either overlooks model interpretability or relies on architectures too heavy for practical field use. In this paper, we describe an Attention-Enhanced EfficientNet-B3 model, referred to as AE-EffNet, that incorporates a Convolutional Block Attention Module (CBAM) after the final feature extraction stage of the backbone. The CBAM helps the network concentrate on disease-relevant spatial regions and feature channels rather than spreading its capacity over background noise. We trained and evaluated the model on a dataset of 5,867 tea leaf images spanning six classes: algal spot, brown blight, gray blight, healthy, helopeltis, and red spot. The images were drawn from a publicly available Kaggle repository and supplemented with field-collected samples from tea gardens in the Silchar region of Assam, India. On the held-out test set of 881 images, AE-EffNet achieved a classification accuracy of 98.98%, with macro-averaged precision, recall, and F1-score of 0.989, 0.990, and 0.990, respectively. Grad-CAM heat maps indicate that the model attends to lesion areas and disease-specific visual patterns rather than irrelevant background features. The framework maintains a compact parameter footprint by adding only a lightweight attention block on top of the already efficient backbone. This makes it a viable framework for deployment on mobile or edge devices in resource-constrained plantation settings.
Introduction
This paper presents AE-EffNet, a deep learning-based system for detecting diseases in tea leaves, addressing a major agricultural problem in India, where tea production is economically critical but foliar diseases cause an estimated 10–15% yield loss annually.
Traditional disease detection relies on manual visual inspection, which is slow, subjective, and unreliable for early-stage or visually similar diseases. To overcome this, the study uses Convolutional Neural Networks (CNNs) and modern architectures such as EfficientNet, enhanced with attention mechanisms for better accuracy and real-world applicability.
The proposed model, AE-EffNet, is built on EfficientNet-B3 as the backbone and incorporates a Convolutional Block Attention Module (CBAM) to focus on important diseased regions such as spots, lesions, and discoloration. It also uses Grad-CAM to improve explainability by showing which parts of the leaf influenced the prediction.
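The CBAM refinement described above factors into a channel-attention step followed by a spatial-attention step, each producing a sigmoid-gated mask over the feature map. The following NumPy sketch is illustrative only: the MLP weights `w1`/`w2` stand in for learned parameters, and the spatial branch's learned 7×7 convolution is replaced here by a fixed mean filter to keep the sketch dependency-free.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W); w1: (C//r, C), w2: (C, C//r) form a shared bottleneck MLP
    avg = x.mean(axis=(1, 2))                         # global average pool -> (C,)
    mx = x.max(axis=(1, 2))                           # global max pool     -> (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)      # ReLU bottleneck
    scale = sigmoid(mlp(avg) + mlp(mx))               # per-channel gate in (0, 1)
    return x * scale[:, None, None]

def spatial_attention(x, k=7):
    # pool across channels, then smooth with a k x k window
    # (real CBAM learns this k x k kernel; a fixed mean filter is an assumption)
    feat = (x.mean(axis=0) + x.max(axis=0)) / 2.0
    pad = k // 2
    padded = np.pad(feat, pad, mode="edge")
    H, W = feat.shape
    smoothed = np.empty_like(feat)
    for i in range(H):
        for j in range(W):
            smoothed[i, j] = padded[i:i + k, j:j + k].mean()
    return x * sigmoid(smoothed)[None, :, :]          # per-pixel gate in (0, 1)

def cbam(x, w1, w2):
    # channel attention first, then spatial attention, as in CBAM
    return spatial_attention(channel_attention(x, w1, w2))
```

Because both gates are sigmoid outputs, CBAM only rescales activations; it never flips their sign, which is what lets it suppress background regions without destroying the backbone's features.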
The model is trained on a dataset of 5,867 tea leaf images across six classes (five diseases and one healthy class), collected from Kaggle and real tea gardens in Assam (Silchar region). Data augmentation techniques are applied to improve robustness, and class imbalance is handled using weighted loss functions.
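Handling class imbalance with a weighted loss typically means scaling each sample's cross-entropy term by an inverse-frequency class weight. The paper does not specify its exact weighting scheme, so the sketch below assumes the common "balanced" heuristic (total samples divided by classes times per-class count):

```python
import numpy as np

def class_weights(counts):
    # inverse-frequency weights; all equal to 1 when classes are balanced
    counts = np.asarray(counts, dtype=float)
    return counts.sum() / (len(counts) * counts)

def weighted_cross_entropy(logits, labels, weights):
    # logits: (N, K), labels: (N,) integer classes, weights: (K,)
    z = logits - logits.max(axis=1, keepdims=True)            # stable log-softmax
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    per_sample = -log_probs[np.arange(len(labels)), labels]
    w = weights[labels]
    return (w * per_sample).sum() / w.sum()                   # weighted mean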
The system achieves 98.98% test accuracy and a 0.990 macro F1-score, outperforming many previous studies and demonstrating strong applicability under realistic field conditions.
The literature review highlights:
CNNs like VGG, ResNet, and EfficientNet are widely used for plant disease detection
Attention mechanisms (especially CBAM) improve accuracy and focus on relevant regions
Tea disease detection research is still limited but improving with hybrid and lightweight models
Explainability tools like Grad-CAM are important for real-world agricultural adoption
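The Grad-CAM visualizations highlighted above reduce to a short computation over the last convolutional layer: average the class-score gradients spatially to get per-channel importances, take the importance-weighted sum of the feature maps, and apply a ReLU. A minimal NumPy sketch of that standard formulation (array names are illustrative):

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    # feature_maps, gradients: (C, H, W) activations of the last conv layer
    # and the gradients of the target class score w.r.t. those activations
    alpha = gradients.mean(axis=(1, 2))                       # channel weights (C,)
    cam = np.maximum((alpha[:, None, None] * feature_maps).sum(axis=0), 0.0)
    if cam.max() > 0:
        cam = cam / cam.max()                                 # normalize to [0, 1]
    return cam                                                # (H, W) heat map
```

In practice the (H, W) map is upsampled to the input resolution and overlaid on the leaf image, which is how the lesion-focused heat maps reported in this study are produced.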
The methodology covers dataset preparation, augmentation, the CBAM-enhanced EfficientNet architecture, and optimized training with AdamW and a cosine learning-rate schedule.
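The cosine learning-rate schedule has a simple closed form: the rate decays from a base value to a floor along half a cosine period. The sketch below is a plausible version; the warmup phase and the specific base and minimum rates are assumptions, since the study states only that AdamW with cosine scheduling was used.

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-3, min_lr=1e-6, warmup=0):
    # optional linear warmup, then cosine decay from base_lr down to min_lr
    if warmup and step < warmup:
        return base_lr * (step + 1) / warmup
    t = (step - warmup) / max(1, total_steps - warmup)        # progress in [0, 1]
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t))
```

The schedule starts at `base_lr`, falls slowly at first, fastest mid-training, and flattens out near `min_lr`, which tends to stabilize the final epochs of fine-tuning.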
Overall, the study proposes a high-accuracy, lightweight, and explainable AI system for real-world tea disease detection, designed specifically to handle field conditions and support farmers in early disease identification.
Conclusion
In this work, we presented AE-EffNet, an attention-enhanced variant of EfficientNet-B3 designed for multi-class tea leaf disease classification. The model augments the EfficientNet-B3 feature extractor with a CBAM attention module, enabling focused learning of disease-discriminative features, while Grad-CAM makes its predictions visually interpretable. Trained on 5,867 images spanning six classes, the model achieves 98.98% accuracy on the held-out test set.
As future work, we plan to build a mobile or web application so that workers in the field can use this technology to detect diseases and take timely protective measures. We also aim to develop a comprehensive, end-to-end diagnostic pipeline that integrates mobile data collection at the edge with centralized cloud processing. Finally, since a single tea leaf may suffer from multiple concurrent infections in real-world scenarios, we will investigate transitioning the architecture to a multi-label classification framework capable of detecting and disentangling overlapping symptoms.
References
[1] Tea Board of India, “Tea Statistics,” Annual Report 2023–2024, Ministry of Commerce and Industry, Government of India, 2024.
[2] S. Mukhopadhyay, M. Paul, R. Pal, and D. De, “Tea leaf disease detection using multi-objective image segmentation,” Multimed. Tools Appl., vol. 80, pp. 753–771, 2021.
[3] R. Hazarika, S. Sarmah, and K. K. Sarma, “Tea pest and disease management: challenges and opportunities in Assam,” Indian J. Agric. Res., vol. 56, no. 3, pp. 285–294, 2022.
[4] A. Latha, R. S. Raj, and G. Manikandan, “A review on deep learning techniques for plant leaf disease detection,” J. Plant Pathol., vol. 103, pp. 441–460, 2021.
[5] S. P. Mohanty, D. P. Hughes, and M. Salathé, “Using deep learning for image-based plant disease detection,” Front. Plant Sci., vol. 7, p. 1419, 2016.
[6] L. Li, S. Zhang, and B. Wang, “Plant disease detection and classification by deep learning: A review,” IEEE Access, vol. 9, pp. 56683–56698, 2021.
[7] M. Tan and Q. V. Le, “EfficientNet: Rethinking model scaling for convolutional neural networks,” in Proc. ICML, pp. 6105–6114, 2019.
[8] K. P. Ferentinos, “Deep learning models for plant disease detection and diagnosis,” Comput. Electron. Agric., vol. 145, pp. 311–318, 2018.
[9] R. R. Selvaraju et al., “Grad-CAM: Visual explanations from deep networks,” Int. J. Comput. Vis., vol. 128, no. 2, pp. 336–359, 2020.
[10] A. G. Howard et al., “MobileNets: Efficient convolutional neural networks for mobile vision applications,” arXiv:1704.04861, 2017.
[11] S. Woo, J. Park, J. Lee, and I. S. Kweon, “CBAM: Convolutional block attention module,” in Proc. ECCV, pp. 3–19, 2018.
[12] R. R. Selvaraju et al., “Grad-CAM: Visual explanations from deep networks via gradient-based localization,” in Proc. ICCV, pp. 618–626, 2017.
[13] S. Datta, “Tea Leaf Disease Dataset,” Kaggle, 2023. [Online]. Available: https://www.kaggle.com/datasets/saikatdatta1994/tea-leaf-disease
[14] E. C. Too, L. Yujian, S. Njuki, and L. Yingchun, “A comparative study of fine-tuning deep learning models for plant disease identification,” Comput. Electron. Agric., vol. 161, pp. 272–279, 2019.
[15] J. Chen, D. Zhang, Y. A. Nanehkaran, and D. Li, “Detection of rice plant diseases based on deep transfer learning,” J. Sci. Food Agric., vol. 100, no. 7, pp. 3246–3256, 2020.
[16] Y. Li et al., “Tea leaf disease and insect identification based on improved MobileNetV3,” Front. Plant Sci., vol. 15, p. 1459292, 2024.
[17] S. Srivastav, S. Kumar, and M. Gupta, “Tea leaf disease detection using CNN,” in Proc. Int. Conf. IoT in Social, Mobile, Analytics and Cloud, pp. 210–214, 2022.
[18] Y. Chen et al., “Detection and identification of tea leaf diseases based on AX-RetinaNet,” Sci. Rep., vol. 12, p. 2183, 2022.
[19] Y. Xia et al., “Classification and identification of tea diseases based on improved YOLOv7 model of MobileNeXt,” Sci. Rep., vol. 14, p. 11799, 2024.
[20] O. F. Shikdar et al., “Enhancing tea leaf disease recognition with attention mechanisms and Grad-CAM visualization,” arXiv:2512.17987, 2025.
[21] P. Bhuyan, P. K. Singh, and S. K. Das, “Res4net-CBAM: A deep CNN with CBAM for tea leaf disease diagnosis,” Multimed. Tools Appl., vol. 83, pp. 1–23, 2023.
[22] S. Ghosal et al., “An explainable deep machine vision framework for plant stress phenotyping,” Proc. Natl. Acad. Sci., vol. 115, no. 18, pp. 4613–4618, 2018.
[23] M. Shoaib et al., “Deep learning-based segmentation and classification of leaf images for tomato plant disease,” Front. Plant Sci., vol. 13, p. 1031748, 2022.
[24] A. Mahmood et al., “Deep learning framework using UAV imagery for multi-disease detection in cereal crops,” Sci. Rep., vol. 16, p. 3339, 2026.
[25] A. Buslaev et al., “Albumentations: Fast and flexible image augmentations,” Information, vol. 11, no. 2, p. 125, 2020.