ORIGINAL RESEARCH article

Front. Robot. AI

Sec. Industrial Robotics and Automation

Volume 12 - 2025 | doi: 10.3389/frobt.2025.1585544

This article is part of the Research Topic: Innovations in Industry 4.0: Advancing Mobility and Manipulation in Robotics.

A Multi-Robot Collaborative Manipulation Framework for Dynamic and Obstacle-Dense Environments: Integration of Deep Learning for Real-Time Task Execution

Provisionally accepted
Afnan Ahmed Adil1, Saber Sakhrieh1, Jinane Mounsef1*, Noel Maalouf2
  • 1Rochester Institute of Technology Dubai, Dubai, United Arab Emirates
  • 2School of Engineering, Lebanese American University, Byblos, Lebanon

The final, formatted version of the article will be published soon.

This paper presents a multi-robot collaborative manipulation framework, implemented in the Gazebo simulation environment, that enables mobile manipulators to execute autonomous tasks in dynamic, obstacle-dense environments. The system consists of multiple mobile robot platforms, each equipped with a robotic manipulator, a simulated RGB-D camera, and a 2D LiDAR sensor on the mobile base, supporting task coordination, object detection, and advanced collision avoidance within a simulated warehouse setting. A leader-follower architecture governs collaboration, allowing teams to form dynamically for tasks that require combined effort, such as transporting heavy objects. Task allocation and control follow a centralized control architecture in which the leader robot coordinates subordinate units based on high-level task assignments. The framework incorporates deep learning-based object detection (YOLOv2) to identify target objects using the simulated RGB-D camera mounted on each manipulator's end-effector. Path planning uses a sampling-based algorithm (RRT*) integrated with the LiDAR data for precise obstacle avoidance and localization, and provides real-time path rerouting for safe navigation when dynamic obstacles, such as humans or other moving entities, intersect planned paths. This functionality ensures uninterrupted task execution and enhances safety in human-robot shared spaces. High-level task scheduling and control transitions are managed with MATLAB and Stateflow logic, while ROS handles data communication between MATLAB, Simulink, and Gazebo. The multi-robot architecture is adaptable, allowing the team size for collaborative tasks to be configured according to load requirements and environmental complexity.
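The abstract does not give implementation details for the sampling-based planner or the rerouting trigger. As an illustration only, the sketch below shows a minimal 2D RRT-style planner (the paper uses RRT*; the rewiring step that distinguishes RRT* is omitted here for brevity) together with a `path_blocked` check that could serve as a rerouting trigger when a dynamic obstacle, such as a passing human, intersects the current path. All names, parameters, and the circular-obstacle model are hypothetical and not taken from the paper.

```python
import math
import random

def collides(p, q, obstacles, step=0.05):
    """Check the segment p->q against circular obstacles (x, y, radius)
    by sampling points along it."""
    n = max(1, int(math.dist(p, q) / step))
    for i in range(n + 1):
        t = i / n
        x = (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))
        if any(math.dist(x, (ox, oy)) <= r for ox, oy, r in obstacles):
            return True
    return False

def rrt(start, goal, obstacles, bounds=(0.0, 10.0), iters=2000,
        step=0.5, seed=0):
    """Grow a rapidly-exploring random tree from start toward goal;
    return a collision-free path of waypoints, or None on failure."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # Goal-biased sampling: aim at the goal 10% of the time.
        sample = goal if rng.random() < 0.1 else (
            rng.uniform(*bounds), rng.uniform(*bounds))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        t = min(1.0, step / d) if d > 0 else 0.0
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        if collides(near, new, obstacles):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < step and not collides(new, goal, obstacles):
            # Backtrack from the new node to the root to recover the path.
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

def path_blocked(path, obstacles):
    """Rerouting trigger: does any segment of the current path now
    intersect an (updated) obstacle set?"""
    return any(collides(p, q, obstacles) for p, q in zip(path, path[1:]))
```

In this sketch, a monitoring loop would call `path_blocked` against the latest obstacle estimates from LiDAR; on a hit, the robot would replan by calling `rrt` again with the updated obstacle set, matching the real-time rerouting behavior the abstract describes.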
By integrating deep learning-based visual processing with YOLOv2 object detection, the system efficiently identifies, picks, and transports objects to designated locations, demonstrating the scalability of the multi-robot framework for future applications in logistics automation, collaborative manufacturing, and dynamic human-robot interaction scenarios.
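The abstract states that team size is configured based on load requirements, with the leader coordinating followers, but gives no allocation rule. Purely as an illustration, the sketch below shows one simple leader-side policy under the assumption of equal per-robot carrying capacity; the function name `form_team` and all parameters are hypothetical, not from the paper.

```python
import math

def form_team(robots, payload_kg, capacity_kg=5.0):
    """Leader-side team sizing: choose the fewest idle followers whose
    combined (assumed equal) per-robot capacity covers the payload.

    robots: dict mapping robot name -> busy flag (True if occupied).
    Returns the list of assigned robot names, or None if too few are idle.
    """
    needed = max(1, math.ceil(payload_kg / capacity_kg))
    idle = [name for name, busy in robots.items() if not busy]
    if len(idle) < needed:
        return None  # not enough free robots; the leader re-queues the task
    return idle[:needed]
```

For example, with two idle robots and a 5 kg per-robot capacity, an 8 kg payload would yield a two-robot team, while a 12 kg payload could not be served until a third robot becomes free. A real system would likely also weigh robot positions and battery levels, which this sketch ignores.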

Keywords: Multi-robot system, Cooperative object transportation, path planning, RRT*, YOLO, dynamic obstacles

Received: 28 Feb 2025; Accepted: 04 Jul 2025.

Copyright: © 2025 Adil, Sakhrieh, Mounsef and Maalouf. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Jinane Mounsef, Rochester Institute of Technology Dubai, Dubai, United Arab Emirates

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.