Agricultural pests cause substantial economic, social, and environmental damage around the globe. Accurate identification and categorization of these pests is essential to the strategies used to control them. This work presents DeepPestNet, a CNN built specifically for accurately identifying nine classes of agriculturally important pests. Based on transfer learning from EfficientNetB0, DeepPestNet incorporates additional convolutional and attention layers to boost pest-recognition performance. Training and evaluation on a broad, annotated dataset show that DeepPestNet achieves a test accuracy of 97.78%, with stable performance indicators, including precision, recall, and F1-scores, for every pest class. DeepPestNet is therefore an effective basis for automated pest-recognition systems, giving farmers and agricultural specialists the opportunity to reduce crop losses and pesticide application.
1. Introduction
Agricultural productivity, crucial for food security and economies, is heavily threatened by pests such as insects, pathogens, and weeds, which cause global losses worth billions of dollars. Traditional pest control methods (manual scouting and pesticide use) are often costly, time-consuming, environmentally damaging, and only partially effective. There is a clear need for sustainable, accurate, and intelligent pest identification systems.
2. Deep Learning in Pest Management
Advancements in deep learning, particularly Convolutional Neural Networks (CNNs), offer powerful tools for pest recognition through image analysis. These models can automatically identify pests with high accuracy, reducing human dependency.
3. DeepPestNet Model
DeepPestNet is a custom CNN model built on EfficientNetB0 and enhanced with:
Transfer learning: Utilizes pre-trained models for efficient learning.
Attention mechanisms: Directs the model to focus on critical image features, improving accuracy in pest classification.
Batch normalization: Improves training stability and speed.
Robust architecture: Can generalize well across pests with different shapes, colors, and sizes.
Nine pest classes: Trained on a large, diverse, and annotated dataset.
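As a rough illustration of the transfer-learning step listed above, the following Keras sketch loads EfficientNetB0 pre-trained on ImageNet and freezes its weights so that only the new pest-specific layers are trained; the exact configuration used for DeepPestNet is not given here, so the settings below are assumptions.

```python
import tensorflow as tf
from tensorflow.keras.applications import EfficientNetB0

# Load EfficientNetB0 without its ImageNet classification head and freeze it,
# so only the newly added pest-specific layers are updated during training.
base_model = EfficientNetB0(
    include_top=False,
    weights="imagenet",
    input_shape=(224, 224, 3),
)
base_model.trainable = False
```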
4. Literature Review Highlights
UAVs integrated with computer vision (OpenCV) enable real-time crop monitoring [1].
Traditional ML methods (SVM, contour, and color histograms) aid pest detection but are limited [2].
Cascaded CNNs and web-based detection tools show success in field and greenhouse pest monitoring [3], [5].
Data augmentation significantly enhances model performance, especially in data-scarce regions [6].
5. Proposed Model Architecture
The DeepPestNet architecture consists of:
EfficientNetB0 for feature extraction.
Custom convolutional layers with ReLU/LeakyReLU, batch normalization, and dropout layers.
Attention layers to enhance focus on relevant image areas.
Softmax classifier for multi-class pest detection (9 classes).
Adam optimizer and categorical crossentropy loss used for training.
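A minimal Keras sketch of this architecture is given below. The exact layer sizes, dropout rates, and the form of the attention block are not specified, so the values here are illustrative assumptions; a squeeze-and-excitation-style channel gate stands in for the attention layers.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import EfficientNetB0

NUM_CLASSES = 9  # nine pest classes

def channel_attention(x, reduction=8):
    """SE-style channel gate standing in for the attention layers (assumed design)."""
    channels = x.shape[-1]
    gate = layers.GlobalAveragePooling2D()(x)
    gate = layers.Dense(channels // reduction, activation="relu")(gate)
    gate = layers.Dense(channels, activation="sigmoid")(gate)
    gate = layers.Reshape((1, 1, channels))(gate)
    return layers.Multiply()([x, gate])

inputs = layers.Input(shape=(224, 224, 3))

# EfficientNetB0 backbone used for feature extraction.
base = EfficientNetB0(include_top=False, weights="imagenet")(inputs)

# Custom convolutional block with batch normalization, attention, and dropout.
x = layers.Conv2D(256, 3, padding="same")(base)
x = layers.BatchNormalization()(x)
x = layers.LeakyReLU(0.1)(x)
x = channel_attention(x)          # focus on pest-relevant regions
x = layers.Dropout(0.3)(x)

# Classification head with a softmax output over the nine classes.
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(128, activation="relu")(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```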
6. Data Augmentation & Preprocessing
Real-time data augmentation (rotation, zooming, flipping, etc.) improves generalization.
Normalization (rescaling pixel values to 0–1) and image resizing (to 224x224) are essential for model input compatibility and faster convergence.
Dataset partitioning ensures unbiased training and evaluation.
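A short sketch of this preprocessing pipeline, assuming Keras' ImageDataGenerator, is shown below; the directory path and the exact augmentation ranges are illustrative, not values from the paper.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Real-time augmentation plus 0-1 normalization and a held-out validation split.
train_gen = ImageDataGenerator(
    rescale=1.0 / 255,        # normalize pixel values to the 0-1 range
    rotation_range=30,
    zoom_range=0.2,
    horizontal_flip=True,
    validation_split=0.2,     # partition data for unbiased evaluation
)

train_data = train_gen.flow_from_directory(
    "pest_dataset/",          # hypothetical path: one sub-folder per pest class
    target_size=(224, 224),   # resize to the model's expected input size
    batch_size=32,
    class_mode="categorical",
    subset="training",
)
val_data = train_gen.flow_from_directory(
    "pest_dataset/",
    target_size=(224, 224),
    batch_size=32,
    class_mode="categorical",
    subset="validation",
)
```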
7. DeepCNN Extension
An extension of the core model, DeepCNN, further refines pest detection through:
Additional convolutional layers.
Use of various filter sizes to capture features at multiple scales.
Enhanced feature extraction through dropout, batch normalization, and attention mechanisms.
Final layers flatten and classify data using dense layers with LeakyReLU and softmax for output probabilities.
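The multi-scale idea behind this extension can be sketched as parallel convolutions with different kernel sizes whose outputs are concatenated, so features are captured at several scales before the dense LeakyReLU and softmax layers. The filter counts, kernel sizes, and input shape below are assumptions for illustration only.

```python
from tensorflow.keras import layers, models

# Example input: a feature map produced by earlier layers (assumed shape).
inputs = layers.Input(shape=(7, 7, 256))

# Parallel branches with different filter sizes capture multi-scale features.
branches = []
for k in (1, 3, 5):
    b = layers.Conv2D(64, k, padding="same")(inputs)
    b = layers.BatchNormalization()(b)
    b = layers.LeakyReLU(0.1)(b)
    branches.append(b)

x = layers.Concatenate()(branches)
x = layers.Dropout(0.3)(x)

# Flatten and classify with dense LeakyReLU layers and a softmax output.
x = layers.Flatten()(x)
x = layers.Dense(128)(x)
x = layers.LeakyReLU(0.1)(x)
outputs = layers.Dense(9, activation="softmax")(x)

multi_scale_head = models.Model(inputs, outputs)
```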
8. Feature Extraction
Feature extraction is performed using KerasLayer with EfficientNetB0 from TensorFlow Hub, which captures rich visual patterns from input images. These features are refined and passed through the classification pipeline for accurate pest identification.
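A minimal sketch of this step is shown below, assuming the public EfficientNetB0 feature-vector module on TensorFlow Hub; treat the module handle as an assumption if a different one was used.

```python
import tensorflow as tf
import tensorflow_hub as hub

# EfficientNetB0 feature extractor loaded as a KerasLayer from TensorFlow Hub.
feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/efficientnet/b0/feature-vector/1",
    trainable=False,              # keep the pre-trained weights fixed
    input_shape=(224, 224, 3),
)

# The extracted 1280-dimensional feature vector feeds the classification head.
model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(9, activation="softmax"),
])
```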
Conclusion
The proposed model demonstrates impressive performance in pest classification, achieving a high accuracy of 97.78% on the test set. The use of EfficientNetB0 as a feature extractor, combined with a custom convolutional architecture, attention mechanisms, and dropout layers, has enabled the model to effectively capture and classify pest features from the dataset. The results highlight the model's robustness and ability to generalize well to unseen data, as evidenced by high precision, recall, and F1-scores across all classes.
The extensive training process and detailed performance metrics show that our model not only excels in distinguishing between different pest classes but also outperforms other contemporary models in terms of accuracy. The training and validation loss curves indicate strong initial learning capabilities, although some signs of overfitting towards the end of the training suggest room for further optimization.
References
[1] Shankar, A. K. Veeraraghavan, Uvais, K. Sivaraman, and S. S. Ramachandran, "Application of UAV for Pest, Weeds and Disease Detection Using Open Computer Vision," 2024.
[2] Ashok, J. Jayachandran, S. Sankara Gomathi, and M. Jayaprakasan, "Pest Detection and Identification by Applying Color Histogram and Contour Detection by SVM Model," International Journal of Engineering and Advanced Technology (IJEAT), vol. 8, no. 3S, pp. 463-467, Feb. 2019.
[3] D. J. A. Rustia, J.-J. Chao, L.-Y. Chiu, Y.-F. Wu, J.-Y. Chung, J.-C. Hsu, and T.-T. Lin, "Automatic Greenhouse Insect Pest Detection and Recognition Based on a Cascaded Deep Learning Classification Method," Journal of Economic Entomology, vol. 113, no. 6, pp. 2377-2385, Nov. 2020, doi: 10.1111/jen.12834.
[4] T. Kasinathan, D. Singaraju, and S. R. Uyyala, "Insect Classification and Detection in Field Crops Using Modern Machine Learning Techniques," Information Processing in Agriculture, vol. 8, no. 3, pp. 446-457, Sept. 2021, doi: 10.1016/j.inpa.2021.01.005.
[5] A. Columba-Guanoluisa, J. Aimacaña-Chuquimarca, M. Rosas-Lara, and J. C. Mendoza-Tello, "Machine Algorithm-Based Web Prototype for Crop Pest Detection," Faculty of Engineering and Applied Sciences, Central University of Ecuador, 2021.
[6] K. Kusrini, S. Suputa, A. Setyanto, I. M. A. Artha, H. Priantoro, K. Chandramouli, and E. Izquierdo, "Data Augmentation for Automated Pest Classification in Mango Farms," Computers and Electronics in Agriculture, vol. 179, p. 105842, Dec. 2020, doi: 10.1016/j.compag.2020.105842.
[7] A. Sayeed, N. Ayesha, and M. A. Sayeed, "Detecting Crows on Sowed Crop Fields using Simplistic Image Processing Techniques by OpenCV in Comparison with TensorFlow Image Detection API," International Journal for Research in Applied Science & Engineering Technology (IJRASET), vol. 8, no. 3, pp. 61-66, Mar. 2020.
[8] E. Karar, F. Alsunaydi, S. Albusaymi, and S. Alotaibi, "A New Mobile Application of Agricultural Pests Recognition Using Deep Learning in Cloud Computing System," Alexandria Engineering Journal, vol. 60, no. 5, pp. 4545-4555, Sep. 2021, doi: 10.1016/j.aej.2021.03.009.
[9] K. Rangarajan Aravind and P. Raja, "Automated Disease Classification in (Selected) Agricultural Crops Using Transfer Learning," Automatika, vol. 61, no. 2, pp. 260-272, 2020, doi: 10.1080/00051144.2020.1728911.