
ORIGINAL RESEARCH article

Front. Plant Sci.

Sec. Technical Advances in Plant Science

Volume 16 - 2025 | doi: 10.3389/fpls.2025.1673567

Computer vision based steering path visualization of headland in a soybean field

Provisionally accepted
Yuyang Ren1, Bo Zhang2*, Yang Li3, Changhai Chen4, Wenxiao Li2, Yongcai Ma2
  • 1College of Information and Electrical Engineering, Heilongjiang Bayi Agricultural University, Daqing, China
  • 2Heilongjiang Bayi Agricultural University, Daqing, China
  • 3Beijing Simulation Center, Beijing, China
  • 4Heilongjiang Academy of Agricultural Sciences, Harbin, China

The final, formatted version of the article will be published soon.

To address the insufficient accuracy of autonomous steering in soybean headland areas, this study proposes a dynamic navigation line visualization method that fuses deep learning with feature detection, enhancing path planning for autopilot systems during the soybean V3-V8 growth stages. First, an improved lightweight YOLO-PFL model was used for efficient headland detection (precision 94.1%, recall 92.7%, mAP@0.5 95.6%), with 1.974 M parameters and 4.816 GFLOPs, meeting the embedded deployment requirements of agricultural machinery. A 3D positioning model built on binocular stereo vision kept distance error within 2%, 4%, and 6% over the ranges 0-3 m, 3-7 m, and 7-10 m, respectively. Second, interference-resistant crop row centerlines were extracted (mean orientation-angle error of -0.473°, the negative sign indicating a slight systematic leftward bias relative to the reference axis; mean absolute error (MAE) of 3.309°) by first enhancing contours through HSV colour space conversion and morphological operations, then fitting feature points extracted from ROIs and crop row intersection areas with the least squares method. This process resolved centerline offsets caused by straw, weeds, illumination changes, and holes or adhesion regions in the segmentation. Finally, the 3D positioning and orientation parameters were fused to generate circular arc paths in the world coordinate system, which were dynamically projected into the image plane to visualise the navigation lines. Experiments show that the method generates real-time steering paths with acceptable errors, offering a navigation reference for automatic wheeled machines in soybean fields and technical support for the advancement of intelligent precision agriculture equipment.
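The centerline-extraction step the abstract describes (HSV segmentation and morphological cleanup, then a least-squares fit over row-wise feature points) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the binary crop mask has already been produced by the HSV thresholding and morphological operations, and fits the line x = a·y + b to the horizontal centroid of each image row.

```python
import numpy as np

def fit_centerline(mask):
    """Least-squares centerline fit on a binary crop mask.

    For each image row, the horizontal centroid of the mask pixels is
    taken as a feature point; the line x = a*y + b is then fitted to
    those points. The mask is assumed to come from prior HSV
    thresholding plus morphological opening/closing (not shown here).
    """
    ys, xs = [], []
    for y in range(mask.shape[0]):
        cols = np.flatnonzero(mask[y])
        if cols.size:
            ys.append(y)
            xs.append(cols.mean())  # row-wise centroid feature point
    # Fit x as a function of y, since crop rows run roughly vertically
    # in the image; a is the slope, b the intercept.
    a, b = np.polyfit(ys, xs, 1)
    angle = np.degrees(np.arctan(a))  # orientation relative to image vertical
    return a, b, angle
```

Fitting against y rather than x avoids the degenerate case of a near-vertical row, which a conventional y = f(x) fit handles poorly.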
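The final projection step, generating a circular arc in the world frame and mapping it onto the image plane, can be illustrated with a standard pinhole camera model. This is a sketch under assumed conventions (camera-aligned world frame with X lateral, Y down, Z forward; a fixed camera height of 1.2 m; illustrative intrinsics K), not the paper's exact coordinate-system chain.

```python
import numpy as np

def project_arc(center_xz, radius, theta0, theta1, K, n=30):
    """Sample a circular steering arc on the ground plane and project
    it into the image with a pinhole model u = K @ [X, Y, Z]^T / Z.

    center_xz : (cx, cz) arc center on the ground, in metres.
    radius    : arc radius in metres.
    theta0/1  : start/end angles of the arc segment, in radians.
    K         : 3x3 camera intrinsic matrix (assumed known).
    """
    cx, cz = center_xz
    thetas = np.linspace(theta0, theta1, n)
    # Arc points in a camera-aligned frame: X lateral, Y down, Z forward.
    # Y = 1.2 m is an assumed camera height above the ground plane.
    pts = np.stack([cx + radius * np.cos(thetas),
                    np.full(n, 1.2),
                    cz + radius * np.sin(thetas)], axis=1)
    uv = (K @ pts.T).T            # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]  # perspective divide -> (u, v) pixels
```

Re-sampling and re-projecting the arc on every frame is what makes the overlaid navigation line dynamic as the machine moves through the headland.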

Keywords: YOLO-PFL model, 3D localisation, feature detection, Crop row centreline extraction, Navigation line visualisation

Received: 26 Jul 2025; Accepted: 17 Sep 2025.

Copyright: © 2025 Ren, Zhang, Li, Chen, Li and Ma. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Bo Zhang, zbxiha@126.com

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.