ORIGINAL RESEARCH article
Front. Robot. AI
Sec. Multi-Robot Systems
Volume 12 - 2025 | doi: 10.3389/frobt.2025.1648309
This article is part of the Research Topic "Robotic Perception, Multi-Robot Exploration, Mapping, and Autonomous Computing".
GNV2-SLAM: Vision SLAM system for cowshed inspection robots
Provisionally accepted
- 1Longmen Laboratory, Luoyang 471000, China
- 2College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471003, China
- 3Collaborative Innovation Center of Machinery Equipment Advanced Manufacturing of Henan Province, Luoyang 471003, China
Simultaneous Localization and Mapping (SLAM) has emerged as one of the foundational technologies enabling mobile robots to achieve autonomous navigation, and it has garnered significant attention in recent years. To address the limitations of traditional SLAM systems operating in dynamic environments, this paper proposes GNV2-SLAM, an advanced system built upon ORB-SLAM2 and specifically designed for cowshed inspection. The system incorporates a lightweight object detection network, GNV2, based on YOLOv8, which employs GhostNetv2 as its backbone network. The CBAM attention mechanism and the SCDown downsampling module were introduced to reduce model complexity while preserving detection accuracy. Experimental results indicate that the GNV2 network achieves substantial model compression while maintaining high performance: mAP@0.5 increased by 1.04%, reaching 95.19%; model parameters decreased by 41.95%; computational cost was reduced by 36.71%; and model size shrank by 40.44%. Moreover, GNV2-SLAM incorporates point and line feature extraction techniques that effectively mitigate the loss of feature points caused by excessive dynamic targets or blurred images. Testing on the TUM dataset demonstrates that GNV2-SLAM significantly outperforms the traditional ORB-SLAM2 system in positioning accuracy and robustness in dynamic environments. Specifically, the root mean square error (RMSE) of the absolute trajectory error (ATE) was reduced by 96.13%, alongside decreases of 88.36% and 86.19% in the translation and rotation drift of the relative pose error (RPE), respectively.
In terms of tracking evaluation, GNV2-SLAM completes the tracking of a single frame within 30 ms, demonstrating competitive real-time performance. Following deployment of the system on inspection robots and experimental trials conducted in a cowshed environment, the results indicate that when the robot operates at speeds of 0.4 m/s and 0.6 m/s, the pose trajectory output by GNV2-SLAM closely matches the robot's actual movement trajectory. This provides robust support for the automation of inspection tasks within livestock houses.
Keywords: SLAM, YOLOv8, GNV2-SLAM, Cowshed inspection, Computer Vision
Received: 18 Jul 2025; Accepted: 28 Aug 2025.
Copyright: © 2025 Xinwu, Li, Xin, Yu, Xie and Zhang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Du Xinwu, Longmen Laboratory, Luoyang 471000, China (du_xinwu@sina.com)
Tingting Li, College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471003, China
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.