Research Topic

Recognizing the State of Emotion, Cognition and Action from Physiological and Behavioural Signals

About this Research Topic

Nowadays, people spend a significant amount of time interacting with devices and computing systems. These devices and systems, however, play a passive role during interaction, unlike humans, who can observe their partners to know when and how to provide assistance. Seamless blending of humans and technology for intelligent interaction is becoming more important than ever. One key aspect is to enable machines to understand users’ state of emotion, cognition and action (herein termed ‘user state’ for simplicity). Building this ability into machines is of critical use in a wide variety of human-machine collaboration contexts, spanning from safe driving to assistance for people with disabilities. For example, an autonomous car can ‘observe’ a user’s state of emotion (e.g., negative), cognition (e.g., overloaded), and action (e.g., glancing down) to determine when to give safety reminders or take control. The aim of this research is to empower machines to understand user state so that human and machine can collaborate in the best form to augment human ability.

Recognition of users’ state of emotion, cognition and action is a fundamentally multidisciplinary field. The key research questions that need to be addressed include: (i) what are the physiological and behavioural cues that are critically important in understanding user state? (ii) how can we develop better representations of each individual state (e.g., distress), given the complex psychological processes in which our physiological and behavioural signals are often affected by multiple states? (iii) how can we develop computational methods, applicable to real-life tasks, that automatically recognize the state of interest, whether cognition, emotion, action, or all of them?

Because we often express emotion while performing cognitive tasks and responding with actions, these multiple co-occurring user states make recognition complicated in real life and require substantial research. Studies that examine the measurable differences and recognition performance in less controlled tasks, and that propose effective and robust computational methods for user state recognition, will help shed light on these problems.

The main objective of this Research Topic is to bring together current advances in the field. Topics of interest include, but are not limited to:

● Theoretical frameworks, experimental studies, or reviews of the relationship between user state and physiological/behavioural cues
● Methods for processing physiological and behavioural signals and recognizing indicators of the state of emotion, cognition, and action
● Machine learning techniques focused specifically on user state recognition
● Methods for multimodal user-state recognition
● Robustness issues in recognizing user state in less controlled contexts and everyday life
● Wearable technologies for user state analysis
● Emotion/cognition/action recognition-powered assistive technologies and applications
● Novel user-state recognition systems and their applications


Keywords: Computational Psychophysiology, Affective Computing, Human Activity Recognition, Human Computer Interaction, Human Factors, Assistive Technology, Signal Processing, Pattern Recognition, Machine Learning, Electrophysiology/Electrochemistry, Wearable Sensor


Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.

About Frontiers Research Topics

With their unique mixes of varied contributions from Original Research to Review Articles, Research Topics unify the most influential researchers, the latest key findings and historical advances in a hot research area! Find out more on how to host your own Frontiers Research Topic or contribute to one as an author.

Topic Editors


Submission Deadlines

Manuscript: 13 September 2021

Participating Journals

Manuscripts can be submitted to this Research Topic via the following journals:


