About this Research Topic

Manuscript Submission Deadline 21 January 2024

This Research Topic is still accepting articles. For authors aiming to contribute, please submit your manuscript today

Today, extended reality technologies are becoming increasingly accessible and are therefore an increasingly plausible area of investigation for creating sonic artefacts within immersive environments. Music composition and audio systems benefit from such immersive 3D environments because their mixed spatial features open up new avenues for musical expression, compositional thinking and human interaction with computing systems. These avenues include challenging the boundaries between physical and virtual instruments, spatialising sound, investigating new methods for sonic interactivity, using machine learning/AI to create human-led interactions with mixed spatial media, and using the extended reality space to create accessible formats for enjoying audiovisual content. With successful enquiry, new methods for music composition, artistic immersion, data auralisation/sonic insights (regarding data sonification) and accessing artistic content become possible within extended reality.

The goal of this Research Topic is to investigate how new compositional and interactive audio systems can be designed and built (including accessibility enquiries) within extended reality, to inform new methods for artistic practice within this emerging medium. This is a challenging area because of the lack of literature and artistic output applying artistic practice to extended reality tools, owing to the novelty of this space. It is also challenging because artistic communities have traditionally lacked the technical knowledge to pair artistic practice with ML/AI affordances. Recent advances in this field include processing human biometrics with AI towards advanced sonic-centred HCI and developing smart music tutors.

This Research Topic aims to address all applications of interactive audio under the umbrella of extended reality, including the creation of artistic products, for example:

- Interactive music composition systems (artefacts and demonstrations)
- Gestural interfaces for sonic-centric human-computer interaction (development of hardware & applications)
- Audio-visual interaction systems for educational outcomes (e.g., data visualisation and accessibility)
- Internet-of-things (IoT) centred interaction within immersive environments
- Machine learning and AI within immersive environments

The Research Topic Coordinator for this Research Topic is Dr Chris Rhodes. Chris is primarily a Lecturer in Digital Media Production at University College London, UK. He also holds a Research Fellowship (EPSRC Doctoral Prize) at the University of Manchester, UK, investigating AI and music performance. Chris' research interests concern the use of interactive and immersive technologies towards future creative music systems.

Keywords: Interactive Audio, Interactive Music Composition, Extended Reality, Virtual Reality, Augmented Reality, Mixed Reality, 3D Interactive Environments, Human-computer Interaction


Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.


