With the rapid development of new technologies equipped with multiple sensors and AI, the synchronised integration of multimodal information to better adapt to user needs is becoming increasingly realistic. Behavioural modelling based on multimodal sensing has opened a new door to a more natural perception of the user and to socially intelligent systems that can perceive a user's physical, psychological and cognitive states. Socially intelligent systems are well suited to data-driven personalisation, and one area that can greatly benefit from such technologies is wellbeing. Wellbeing refers to physical, mental and social wellness; it is fundamental to an individual's overall health and extends beyond the classical definition of health. Recent years have witnessed a surge of assistive technologies that exploit multimodal data to inform users about wellness-relevant metrics, including, but not limited to, heart rate, steps, screen time and physical activity. However, the number of applications that take advantage of AI and multimodal data to diagnose, prevent and manage wellbeing-related issues remains very limited. Such solutions can take the form of intangible or tangible AI: the former has no physical form and communicates through text, sound or images, while the latter has a physical form to interact with, such as a social robot. AI-enabled technology that integrates multimodal information such as voice, speech, facial expression, body posture, gesture, activity and physiology can be used to assess wellbeing and provide timely interventions, and thus to promote wellbeing at different levels (e.g. individual or societal). Since multimodal behavioural AI for wellbeing is still in its early stages, it is important to explore its potential opportunities and impact, as well as the ethical considerations needed to create socially responsible technologies.
The key aim of this Research Topic collection is to investigate how multimodal processing of human behavioural data using artificial intelligence can promote wellbeing at the individual, interpersonal or societal level, as well as how society and current world challenges can influence or inform the approaches and techniques in multimodal behavioural AI for wellbeing.
Suggested topics include, but are not limited to, the following, with an emphasis on multimodality and its application to wellbeing:
• Multimodal behaviour modelling
• Multimodal interaction processing
• Multimodal human-computer interaction
• Multimodal human-robot interaction
• Multimodal affective computing
• Multimodal behaviour synthesis
• Multimodal computer-aided diagnosis
• Multimodal mobile systems for behavioural sensing and diagnosis
• Multimodal data collection and annotations
• Collaborative systems
• Virtual connectivity
• Post-COVID technologies
• Social computing
• Ethical issues
• Applications in assistive living, healthcare, education, etc.
Keywords: Artificial Intelligence, Wellbeing, Human-Computer Interaction, Behaviour Modelling, Multimodal Analysis
All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.