AUTHOR=Xu Laixiang, Duan Yiru, Cai Zhaopeng, Huang Wenwen, Zhai Fengyan, Zhao Junmin TITLE=DSA-net: a lightweight and efficient deep learning-based model for pea leaf disease identification JOURNAL=Frontiers in Plant Science VOLUME=16 YEAR=2025 URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2025.1642453 DOI=10.3389/fpls.2025.1642453 ISSN=1664-462X ABSTRACT=Introduction: Pea is a nutrient-dense, functionally diverse vegetable, but its leaf diseases directly reduce yield and quality. Most existing approaches for identifying pea leaf diseases suffer from inefficient feature extraction, high sensitivity to environmental conditions, and limited large-scale applicability, and therefore cannot meet modern agriculture's demands for accuracy, real-time processing, and low cost. Methods: We propose DSA-Net, a deep learning model for pea leaf disease identification based on an improved MobileNet-V3-Small combined with a deformable convolution strategy, a self-attention mechanism, and an additive attention mechanism. First, deformable convolution is added to MobileNet-V3-Small to strengthen the modeling of geometric variations in disease features. Second, a self-attention mechanism is integrated to improve recognition of the global features of complex diseases. Finally, an additive attention strategy is employed to enhance the channel and spatial response relationships in lesion areas with blurred edges. The experimental pea leaf dataset consists of 7,915 samples in five categories: one healthy-leaf class and four diseases (brown spot, leaf miner, powdery mildew, and root rot). Results: The experimental results indicate that the proposed DSA-Net achieves an average recognition accuracy of 99.12% with a parameter size of only 1.48 M. Discussion: The lightweight design facilitates future deployment on edge devices. The proposed technique considerably improves the diagnostic accuracy of pea leaf diseases and has substantial potential for application in agriculture.
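
The abstract describes the architecture only at a high level (MobileNet-V3-Small backbone plus deformable convolution, self-attention, and additive attention). The sketch below is an illustrative reconstruction of that idea, not the authors' implementation: layer placement, channel widths, attention head counts, and the class name DSANetSketch are all assumptions, using torchvision's DeformConv2d and a Bahdanau-style additive spatial attention head.

```python
# Illustrative sketch only (not the authors' code): a DSA-Net-style classifier that
# appends a deformable-convolution block, multi-head self-attention, and an additive
# spatial attention head to a MobileNet-V3-Small backbone.
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v3_small
from torchvision.ops import DeformConv2d


class DeformableBlock(nn.Module):
    """3x3 deformable convolution whose sampling offsets are predicted from the input."""
    def __init__(self, channels: int):
        super().__init__()
        # 2 offsets (dx, dy) per position of the 3x3 kernel = 18 offset channels.
        self.offset = nn.Conv2d(channels, 2 * 3 * 3, kernel_size=3, padding=1)
        self.deform = DeformConv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.Hardswish()

    def forward(self, x):
        return self.act(self.bn(self.deform(x, self.offset(x))))


class AdditiveSpatialAttention(nn.Module):
    """Bahdanau-style additive attention over spatial positions, pooled to a vector."""
    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        self.proj = nn.Conv2d(channels, hidden, kernel_size=1)
        self.score = nn.Conv2d(hidden, 1, kernel_size=1)

    def forward(self, x):                                   # x: (B, C, H, W)
        scores = self.score(torch.tanh(self.proj(x)))       # (B, 1, H, W)
        weights = torch.softmax(scores.flatten(2), dim=-1)  # softmax over H*W
        return (x.flatten(2) * weights).sum(dim=-1)         # (B, C)


class DSANetSketch(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        backbone = mobilenet_v3_small(weights=None)
        self.features = backbone.features                   # last feature map: 576 channels
        c = 576
        self.deform_block = DeformableBlock(c)
        self.self_attn = nn.MultiheadAttention(embed_dim=c, num_heads=4, batch_first=True)
        self.additive_attn = AdditiveSpatialAttention(c)
        self.classifier = nn.Linear(c, num_classes)

    def forward(self, x):
        f = self.deform_block(self.features(x))             # (B, 576, H, W)
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)                # (B, H*W, 576)
        tokens, _ = self.self_attn(tokens, tokens, tokens)   # global self-attention
        f = tokens.transpose(1, 2).reshape(b, c, h, w)
        return self.classifier(self.additive_attn(f))


if __name__ == "__main__":
    model = DSANetSketch(num_classes=5)                      # healthy + 4 disease classes
    logits = model(torch.randn(2, 3, 224, 224))
    print(logits.shape)                                       # torch.Size([2, 5])
```

Attaching the three modules after the backbone (rather than inside individual inverted-residual blocks) is a simplifying choice here; the paper's reported 1.48 M parameter budget suggests the actual placement and widths differ, so treat this only as a structural reference for the abstract's description.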