Enhancing Agricultural Pest Management with YOLO V5: A Detection and Classification Approach
Abstract
Due to the growing population and the numerous ecological challenges that affect crop yields, the need to modernize the crop production process has become increasingly critical. Swiftly managing potential threats to crops can have a substantial impact on overall crop production. Pests represent a significant menace, capable of causing substantial losses if they are not controlled effectively and in a timely manner. In this study, a deep learning-based method for pest identification is introduced. The approach leverages the YOLO (You Only Look Once) single-shot object detection algorithm in combination with the pre-trained Darknet architecture to categorize pests into nine distinct classes. The study utilizes a publicly available dataset sourced from Kaggle, comprising a total of 7,046 images. The results show an overall accuracy of 83%, with a notably low training and validation loss of 0.02%. Moreover, the model exhibits a notable improvement in localization, delivering a precision of 0.83, a recall of 0.83, an mAP@0.5 of 0.833, and an mAP@0.5:0.95 of 0.783.
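For readers who wish to reproduce a comparable pipeline, the sketch below shows how a nine-class pest detector could be trained and evaluated with the open-source Ultralytics YOLOv5 repository. It is a minimal sketch, not the authors' code: the dataset layout, YAML path, class names, and hyperparameters are illustrative assumptions, while the validation step reports the standard precision, recall, mAP@0.5, and mAP@0.5:0.95 figures of the kind quoted in the abstract.

```python
"""
Illustrative training/evaluation sketch for a nine-class pest detector using
the Ultralytics YOLOv5 repository (https://github.com/ultralytics/yolov5).
Run from the cloned repository root. Paths, class names, and hyperparameters
are assumptions, not values taken from the paper.
"""
import subprocess
from pathlib import Path

# Hypothetical dataset description in the YOLOv5 data-YAML format.
# Images and labels are expected in the usual images/{train,val} and
# labels/{train,val} layout, one normalized "class x y w h" box per line.
DATA_YAML = """\
path: datasets/pests   # dataset root (assumed)
train: images/train
val: images/val
nc: 9                  # nine pest classes, as in the study
names: [aphids, armyworm, beetle, bollworm, grasshopper,
        mites, mosquito, sawfly, stem_borer]   # illustrative names
"""

def main() -> None:
    yaml_path = Path("pests.yaml")
    yaml_path.write_text(DATA_YAML)

    # train.py is the standard training entry point of the YOLOv5 repository;
    # it fine-tunes the pretrained yolov5s.pt checkpoint on the custom data.
    subprocess.run(
        ["python", "train.py",
         "--img", "640", "--batch", "16", "--epochs", "100",
         "--data", str(yaml_path), "--weights", "yolov5s.pt"],
        check=True,
    )

    # val.py reports precision, recall, mAP@0.5, and mAP@0.5:0.95.
    # The weights path below is the default location of the first training run.
    subprocess.run(
        ["python", "val.py",
         "--data", str(yaml_path),
         "--weights", "runs/train/exp/weights/best.pt"],
        check=True,
    )

if __name__ == "__main__":
    main()
```

The validation script prints per-class and overall detection metrics, which map directly onto the precision, recall, and mAP values reported above.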
Copyright (c) 2023 Asif Raza, Muhammad Kashif Shaikh, Osama Ahmed Siddiqui, Asher Ali, Afshan Khan
This work is licensed under a Creative Commons Attribution 4.0 International License.