About this Research Topic
Artificial intelligence is currently developing rapidly. Human-robot cooperation and symbiosis have become an integral part of human life and are increasingly shaping our world. New research and discussion have arisen around how next-generation human-robot cooperation models will differ from today's. Beyond physical action commands (such as moving forward and backward or turning left and right), task-level intents will become the main control instructions sent to one or several robots, both now and in the future. To realize high-level communication and collaboration between humans and robots, multimodal physiological signals are an essential means of conveying human instructions and intentions, since human-robot cooperation starts from, and centers on, the human.
Human interaction channels, including the central nervous system, peripheral nerves, and motor behavior, have been studied to build models of their internal mechanisms. Because neural signals convey human intentions and instructions, it is essential to identify their features and information content through neuroscientific research. Recent studies tend to focus on fusing multiple physiological signals to describe and represent more complex human behavior and intentions, and on developing correlations among multimodal physiological signals, which helps to build human-robot interaction systems. Through research on communication patterns between humans and their surroundings, the nervous system mechanisms of the human body, internal models, and cross-modal correlations, we aim to explore the biomedical principles of physiological information as well as new human-robot interaction paradigms and algorithms. We also aim to exploit natural human behavior in communication with the outside world and thus enable more complex human-robot collaboration and supervision.
We are looking for original research that relates to:
- The correlation of multimodal physiological signals
- Models and simulations of human neural pathways (from the central nervous system to peripheral nerves and motor behavior)
- Implementation of multimodal signals in human-robot interaction
- Brain-computer interfaces with regard to:
o Neural typing
o Speech recognition
o Eye tracking
o Hand gesture recognition
o Emotion identification for human-robot interaction
The Guest Editors would like to express their profound gratitude to Dr. Jinghan Wu for her secretarial work in preparing this Research Topic. We also thank Drs. Shuang Yang and Ruihao Li for their valuable work in initiating and contributing to this article collection.
Keywords: Multimodal physiological signal, Cross-modal human-robot interaction, Hybrid brain-computer interface, Speech recognition, Emotion identification
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.