Early detection of plant diseases mitigates their adverse effects on crops. Convolutional neural networks, a deep learning technique, are applied extensively in machine vision and pattern recognition. However, deep learning models have large numbers of parameters, so training takes longer and the models are hard to run on small devices. The proposed model has been trained and tested on three plant disease datasets.
Introduction
Crop diseases caused by bacteria and fungi severely affect crop quality and yield. Deep learning, especially Convolutional Neural Networks (CNNs), has become popular for automatic feature extraction and accurate plant disease identification. However, existing deep learning models like AlexNet, VGGNet, ResNet, and Inception are computationally heavy, limiting their use on low-resource agricultural devices.
To address this, the paper proposes a novel lightweight CNN architecture combining Inception and Residual connections, replacing standard convolutions with depthwise separable convolutions. This significantly reduces the number of parameters and computational cost without sacrificing accuracy.
Key contributions:
New CNN model with Inception-residual architecture for better feature extraction and performance.
Use of depthwise separable convolutions to drastically cut down parameters and complexity.
Evaluation on three diverse plant disease datasets (PlantVillage, RiceDisease, CassavaDisease) with images from lab and real-field conditions.
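The core efficiency idea above, replacing a standard convolution with a depthwise (per-channel) spatial convolution followed by a pointwise 1×1 channel mix, can be sketched in plain numpy. This is an illustrative sketch, not the paper's implementation; the function name and array layout (`H, W, C`) are assumptions.

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_kernels):
    """Depthwise separable convolution: a per-channel (depthwise) spatial
    convolution followed by a 1x1 (pointwise) convolution across channels.

    x          : (H, W, C_in) input feature map
    dw_kernels : (K, K, C_in) one spatial kernel per input channel
    pw_kernels : (C_in, C_out) 1x1 kernels mixing channels
    """
    H, W, C_in = x.shape
    K = dw_kernels.shape[0]
    pad = K // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))  # "same" padding

    # Depthwise step: each channel is convolved with its own K x K kernel,
    # so no cross-channel mixing happens here.
    dw = np.zeros((H, W, C_in))
    for i in range(H):
        for j in range(W):
            patch = xp[i:i + K, j:j + K, :]           # (K, K, C_in)
            dw[i, j, :] = np.sum(patch * dw_kernels, axis=(0, 1))

    # Pointwise step: a 1x1 convolution mixes the channels.
    return dw @ pw_kernels                            # (H, W, C_out)
```

Splitting the spatial and channel-mixing work is what cuts the parameter count: the two steps together need far fewer weights than one full K×K×C_in×C_out kernel bank.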
Results:
Achieved high training and validation accuracy (around 99% for PlantVillage and Rice datasets).
Cassava dataset showed lower validation accuracy (~76%) due to complex backgrounds and class imbalance.
Model uses roughly 7.7 times fewer parameters than conventional InceptionV3, making it suitable for deployment on resource-limited devices.
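The parameter saving can be checked with simple arithmetic. A standard K×K convolution needs K·K·C_in·C_out weights, while a depthwise separable one needs K·K·C_in + C_in·C_out. The layer sizes below are assumed for illustration; the paper's 7.7× figure applies to the whole model, not any single layer.

```python
def conv_params(k, c_in, c_out):
    # Standard convolution: every output channel has a k x k x c_in kernel.
    return k * k * c_in * c_out

def separable_params(k, c_in, c_out):
    # Depthwise (k x k per input channel) plus pointwise (1 x 1 channel mix).
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 128, 256               # assumed layer sizes
std = conv_params(k, c_in, c_out)          # 294912
sep = separable_params(k, c_in, c_out)     # 33920
print(f"standard: {std}, separable: {sep}, reduction: {std / sep:.1f}x")
```

For this assumed layer the reduction is about 8.7×; the factor is roughly 1/C_out + 1/K², so it grows with kernel size and output width.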
Background:
The paper reviews related work using various CNN architectures for plant disease detection, highlighting the need for lightweight, efficient models. It also explains CNN basics, residual networks (ResNet), and depthwise separable convolutions.
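The two structural ideas reviewed here, Inception-style parallel branches and ResNet-style residual connections, combine as: run several branches on the same input, concatenate their outputs along the channel axis, then add the block input back. A generic shape-level numpy sketch follows (this is not the paper's architecture; branch widths and the use of 1×1 mixes only are assumptions to keep it short).

```python
import numpy as np

def pointwise(x, w):
    """1x1 convolution expressed as a channel-mixing matrix multiply."""
    return x @ w  # (H, W, C_in) @ (C_in, C_out) -> (H, W, C_out)

def inception_residual_block(x, rng):
    """Parallel branches -> channel concat -> residual (identity) add."""
    H, W, C = x.shape
    # Two parallel branches, each producing C // 2 channels. A real
    # Inception module would also use 3x3 / 5x5 spatial convolutions;
    # 1x1 channel mixes keep this sketch minimal.
    b1 = np.maximum(pointwise(x, rng.standard_normal((C, C // 2))), 0)
    b2 = np.maximum(pointwise(x, rng.standard_normal((C, C // 2))), 0)
    out = np.concatenate([b1, b2], axis=-1)   # back to (H, W, C)
    return out + x                            # residual connection

rng = np.random.default_rng(0)
y = inception_residual_block(np.ones((8, 8, 16)), rng)
print(y.shape)
```

Because the concatenated branch output matches the input channel count, the identity shortcut can be added directly, which is what lets gradients flow through deep stacks of such blocks.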
Overall, the proposed model offers a practical solution for real-time, low-resource plant disease detection with competitive accuracy.
Conclusion
Deep learning has proven to be an effective approach to detecting plant diseases. In this work, we proposed a lightweight CNN model with Inception modules, residual connections, and depthwise separable convolutions that reduced parameters by 70% and accelerated training. The model recorded test accuracies of 99.39% on PlantVillage, 99.66% on Rice, and 76.59% on the imbalanced Cassava dataset, outperforming traditional models such as plain CNNs and ResNet. It achieves higher accuracy and efficiency than past research. Future work will apply the model to the detection of weeds and pests and assess its performance on additional datasets and locations.