ORIGINAL RESEARCH article
Front. Plant Sci.
Sec. Sustainable and Intelligent Phytoprotection
Volume 16 - 2025 | doi: 10.3389/fpls.2025.1655564
A lightweight deep convolutional neural network development for soybean leaf disease recognition
Provisionally accepted
Affiliations:
1. Henan University of Science and Technology, Luoyang, China
2. College of Biological and Agricultural Engineering, Jilin University, Changchun 130025, China; Henan University of Science and Technology, Luoyang, China
3. Wah Engineering College, University of Wah, Quaid Avenue, Wah Cantt, District Rawalpindi, Punjab, Pakistan; Henan University of Science and Technology, Luoyang, China
4. School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China; Henan University of Science and Technology, Luoyang, China
Soybean is one of the world's major oil-bearing crops and plays an important role in the human diet. However, the frequent occurrence of soybean leaf diseases poses a serious threat to yield and quality during cultivation. Rapid identification of soybean leaf diseases can support efficient disease control and subsequent precision application of treatments. In this study, a lightweight deep convolutional neural network (CNN) based on multiscale feature extraction fusion (MFEF) combined with a dense connectivity (DC) network, named MFEF-DCNet, was proposed for soybean leaf disease identification. In MFEF-DCNet, an MFEF module for soybean leaves was constructed from a convolutional attention module and depthwise separable convolutions to improve the model's feature extraction capability, and the multiscale features were fused through dense connections in the backbone network to improve generalization. Experiments were conducted on soybean leaf images covering eight disease and deficiency classes (bacterial blight, cercospora leaf blight, downy mildew, frogeye leaf spot, healthy, potassium deficiency, soybean rust, and target spot). The results showed that MFEF-DCNet achieved an accuracy of 0.9470, an average precision of 0.9510, an average recall of 0.9480, and an F1-score of 0.9490 for soybean leaf disease identification, and it offered advantages in classification accuracy and convergence speed over VGG16, ResNet50, DenseNet201, EfficientNetB0, Xception, and MobileNetV3_small. In addition, MFEF-DCNet reached an accuracy of 0.9024 on locally collected soybean disease data, indicating favorable performance in practical applications. The proposed model and the experience gained in this study could provide useful guidance for automated disease identification in soybean and other crops.
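To make the architectural ideas in the abstract concrete, the sketch below shows one plausible way to combine multiscale depthwise separable convolutions, a convolutional attention module, and dense (concatenative) connections into an eight-class leaf-disease classifier. The branch kernel sizes, channel widths, attention form, and class names are illustrative assumptions; they are not the authors' published configuration.

```python
# Minimal PyTorch sketch of an MFEF-style block with dense connectivity.
# All layer sizes and the attention design are assumptions for illustration.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class ChannelAttention(nn.Module):
    """Simple squeeze-and-excitation style channel attention (assumed form)."""
    def __init__(self, ch, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(ch // reduction, ch, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)


class MFEFBlock(nn.Module):
    """Multiscale branches (3x3, 5x5, 7x7) fused by concatenation + attention."""
    def __init__(self, in_ch, branch_ch=32):
        super().__init__()
        self.branches = nn.ModuleList(
            [DepthwiseSeparableConv(in_ch, branch_ch, k) for k in (3, 5, 7)])
        self.attn = ChannelAttention(branch_ch * 3)
        self.out_ch = branch_ch * 3

    def forward(self, x):
        return self.attn(torch.cat([b(x) for b in self.branches], dim=1))


class MFEFDCNetSketch(nn.Module):
    """Stack of MFEF blocks whose inputs and outputs are densely concatenated,
    followed by global pooling and an 8-way classifier (the eight classes
    studied in the paper)."""
    def __init__(self, num_classes=8, num_blocks=3):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, 3, stride=2, padding=1)
        blocks, ch = [], 32
        for _ in range(num_blocks):
            block = MFEFBlock(ch)
            blocks.append(block)
            ch += block.out_ch  # dense connectivity: channels accumulate
        self.blocks = nn.ModuleList(blocks)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(ch, num_classes))

    def forward(self, x):
        x = self.stem(x)
        for block in self.blocks:
            x = torch.cat([x, block(x)], dim=1)  # concatenate input and output
        return self.head(x)


if __name__ == "__main__":
    model = MFEFDCNetSketch()
    logits = model(torch.randn(1, 3, 224, 224))
    print(logits.shape)  # torch.Size([1, 8])
```

The dense connections keep earlier multiscale features available to later blocks at the cost of growing channel counts, which is the same trade-off DenseNet makes; the depthwise separable branches keep the parameter count low, consistent with the lightweight design the abstract describes.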
Keywords: soybean leaf diseases, disease diagnosis, multiscale feature extraction fusion, dense connectivity, deep convolutional neural networks
Received: 28 Jun 2025; Accepted: 10 Sep 2025.
Copyright: © 2025 Zhang, Bao, Guan, Wang, Wang, Cui, Niu, Wang, Ali, Zhang and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Yakun Zhang, Henan University of Science and Technology, Luoyang, China
Yafei Wang, School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China, Henan University of Science and Technology, Luoyang, China
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.