Research Topic

Active Vision and Perception in Human-Robot Collaboration

About this Research Topic

Humans naturally interact and collaborate in unstructured social environments, which produce an overwhelming amount of information and may nevertheless hide behaviorally relevant variables. Finding the underlying design principles that allow humans to adaptively find and select relevant information is important not only for Robotics but also for other fields, such as Computational Neuroscience and Interaction Design.

Current solutions cover specific tasks, e.g., autonomous cars, and usually employ overly redundant, expensive, and computationally demanding sensory systems that attempt to cover the wide range of sensing conditions such systems may encounter. Adaptive control of the sensors and of the perception process is a key solution found by nature to cope with such problems, as shown by the foveal anatomy of the eye and its high mobility.
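To make this idea concrete, here is a minimal sketch (in Python, under illustrative assumptions) of such adaptive sensor control: a simulated agent with a narrow "foveal" sensor must localize a target among N locations, choosing one fixation at a time and updating a Bayesian belief after each observation, instead of sensing everywhere at once. The sensor model, the greedy fixation rule, and all parameter values are assumptions made for illustration, not a prescribed method.

import numpy as np

# Toy setting: a target hides in one of N discrete locations; the
# "foveal" sensor gives informative readings only where it fixates.
rng = np.random.default_rng(0)
N = 10
true_loc = int(rng.integers(N))   # hidden target location
belief = np.full(N, 1.0 / N)      # uniform prior over locations

P_HIT = 0.9   # assumed P(detection | target at the fixated location)
P_FA = 0.05   # assumed P(detection | target elsewhere)

for step in range(20):
    # Greedy "active vision": fixate the currently most probable
    # location (a simple stand-in for maximizing information gain).
    fix = int(np.argmax(belief))

    # Simulate one noisy foveal observation at the fixation point.
    detected = rng.random() < (P_HIT if fix == true_loc else P_FA)

    # Bayesian belief update from that single fixation.
    likelihood = np.where(np.arange(N) == fix,
                          P_HIT if detected else 1.0 - P_HIT,
                          P_FA if detected else 1.0 - P_FA)
    belief = likelihood * belief
    belief /= belief.sum()

    if belief.max() > 0.95:  # confident enough: stop sensing
        break

print(f"true location: {true_loc}, estimate: {int(np.argmax(belief))}")

A handful of well-chosen fixations typically suffices, which is the point: perception stays cheap when the sensor is steered by the current belief rather than engineered to cover every condition up front.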


Alongside this interest in “active” vision, collaborative robotics has recently progressed to human-robot interaction in real manufacturing settings. Measuring and modelling task-specific gaze behaviours appears essential for smooth human-robot interaction. Indeed, anticipatory control for human-in-the-loop architectures, which can enable robots to collaborate with humans proactively, relies heavily on observing the gaze and action patterns of the human partner.
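As a hedged illustration of this reliance on gaze (not a reference implementation), the short Python sketch below anticipates which object a partner intends to act on from a stream of fixated-object labels, such as a head-mounted eye tracker might provide; proactive gaze typically reaches the target of an action before the hand does. The function name, decay constant, and confidence threshold are illustrative assumptions.

from collections import defaultdict

def predict_target(fixations, decay=0.8, threshold=0.6):
    """Guess the intended object from a fixation stream (oldest first).

    Recent fixations are weighted more heavily via exponential decay;
    a prediction is returned only once one object clearly dominates.
    """
    scores = defaultdict(float)
    weight = 1.0
    for obj in reversed(list(fixations)):  # newest fixation first
        scores[obj] += weight
        weight *= decay
    total = sum(scores.values())
    if total == 0.0:
        return None, 0.0
    best = max(scores, key=scores.get)
    confidence = scores[best] / total
    return (best, confidence) if confidence >= threshold else (None, confidence)

# Example: gaze shifts to the screwdriver before the reach begins,
# so a robot could fetch it proactively.
stream = ["bolt", "bolt", "screwdriver", "screwdriver", "screwdriver"]
print(predict_target(stream))  # -> ('screwdriver', ~0.73)

In a human-in-the-loop architecture, such a predictor would feed the robot's anticipatory controller, which is why measuring and modelling task-specific gaze behaviour matters.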

We are interested in manuscripts that present novel computational and robotic models, theories, and experimental results, as well as reviews, relevant to understanding how humans actively control their perception during social interaction and in which conditions they fail, and how these insights may enable natural interaction between humans and artificial systems in non-trivial conditions. Topics of interest include (but are not limited to):

• Active perception for intention and action prediction
• Activity and action recognition in the wild
• Active perception for social interaction
• Human-robot collaboration in unstructured environments
• Human-robot collaboration in the presence of sensory limits
• Joint human-robot search and exploration
• Testing setups for social perception in real or virtual environments
• Setups for transferring active perception skills from humans to robots
• Machine learning methods for active social perception
• Benchmarking and quantitative evaluation with human-subject experiments
• Gaze-based factors for intuitive human-robot collaboration
• Active perception modelling for social interaction and collaboration
• Head-mounted eye tracking and gaze estimation during social interaction
• Estimation and guidance of partner situation awareness and attentional state in human-robot collaboration
• Multimodal social perception
• Adaptive social perception
• Egocentric vision in social interaction
• Explicit and implicit sensorimotor communication
• Social attention
• Natural human-robot (machine) interaction
• Collaborative exploration
• Joint attention
• Multimodal social attention
• Attentive activity recognition
• Belief and mental state attribution in robots


Keywords: Active Vision, Social Perception, Intention Prediction, Egocentric Vision, Natural Human-Robot Interaction


Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.


About Frontiers Research Topics

With their unique mixes of varied contributions from Original Research to Review Articles, Research Topics unify the most influential researchers, the latest key findings and historical advances in a hot research area! Find out more on how to host your own Frontiers Research Topic or contribute to one as an author.


Submission Deadlines

Abstract: 31 July 2020
Manuscript: 31 December 2020

Participating Journals

Manuscripts can be submitted to this Research Topic via the following journals:

