Advanced Sensing, Learning and Control for Effective Human-Robot Interaction

  • 1,200 total downloads
  • 9,781 total views and downloads

About this Research Topic

This Research Topic is still accepting articles.

Background

In recent years, the field of robotics has made remarkable strides in advancing the capabilities of robots to interact with humans in various contexts. The key to enhancing these interactions lies in the development of sophisticated sensing, learning, and control mechanisms that enable robots to understand and respond effectively to human actions and intentions. In this Research Topic, we aim to showcase recent research progress related to sensing, learning, and control in human-robot interaction. This topic anticipates a future where robots are more integrated into our daily work and lives. It is essential that they are not only capable of performing tasks but also able to do so intuitively and safely, in a manner conducive to effective communication and cooperation with human partners.

This Research Topic aims to address the challenges of sensing, learning, and control in human-robot interaction. In the realm of sensing, we focus on recent advances in multiple sensing modalities, ranging from cameras and depth cameras to tactile sensors and force feedback mechanisms. These advances enable robots to perceive the environment and human operators' actions with a level of detail and precision that was previously unattainable. Regarding robot learning, our emphasis will be on advanced learning methods and frameworks for robot actions and policies, as well as on planning human-robot interaction strategies that enhance interaction in unstructured environments. Control systems serve as the bridge between sensing and learning, enabling robots to execute tasks with precision and safety. In this collection, our primary focus will be on the advanced control mechanisms of modern robots, which are capable of coordinating intricate movements and interactions in real time.

The Research Topic will encompass a wide range of subjects, such as new sensor technology, intelligent perception, robot learning, safe human-robot interaction, advanced control, networked control, and ethical considerations. Its primary focus is recent advancements in multidisciplinary technologies with practical applications and wide-reaching real-world implications. The scope includes, but is not limited to, the following areas:
• Novel sensor and hardware design
• Intelligent multi-camera perception and fusion
• Computer vision and cognition
• Sensing and control systems for collaborative robots
• Teleoperation
• Advanced control techniques, including shared control and networked control
• Learning from human demonstrations
• Reinforcement learning for robot interactions
• Efficient and safe human-robot interactions
• Ethics in human-robot interactions
• Other related subjects.

Article types and fees

This Research Topic accepts the following article types, unless otherwise specified in the Research Topic description:

  • Brief Research Report
  • Data Report
  • Editorial
  • FAIR² Data
  • General Commentary
  • Hypothesis and Theory
  • Methods
  • Mini Review
  • Opinion

Articles that are accepted for publication by our external editors following rigorous peer review incur a publishing fee charged to authors, institutions, or funders.

Keywords: Human-Robot Interaction, Robot Learning, Intelligent Perception, Advanced Control, Safety and Ethics

Important note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.

Manuscripts can be submitted to this Research Topic via the main journal or any other participating journal.

Impact

  • 9,781 Topic views
  • 7,150 Article views
  • 1,200 Article downloads