
SYSTEMATIC REVIEW article

Front. Virtual Real., 14 July 2021
Sec. Virtual Reality and Human Behaviour
This article is part of the Research Topic Presence and Beyond: Evaluating User Experience in AR/MR/VR

A Systematic Review of Physiological Measurements, Factors, Methods, and Applications in Virtual Reality

  • Human-Computer Interaction (HCI) Group, Informatik, University of Würzburg, Würzburg, Germany

Measurements of physiological parameters provide an objective, often non-intrusive, and (at least semi-)automatic evaluation and utilization of user behavior. In addition, specific hardware devices of Virtual Reality (VR) often ship with built-in sensors, i.e. eye-tracking and movement sensors. Hence, the combination of physiological measurements and VR applications seems promising. Several approaches have investigated the applicability and benefits of this combination for various fields of application. However, the range of possible application fields, coupled with potentially useful and beneficial physiological parameters, sensor types, target variables and factors, and analysis approaches and techniques, is manifold. This article provides a systematic overview and an extensive state-of-the-art review of the usage of physiological measurements in VR. We identified 1,119 works that make use of physiological measurements in VR. Within these, we identified 32 approaches that focus on the classification of characteristics of experience common in VR applications. The first part of this review categorizes the 1,119 works by field of application, i.e. therapy, training, entertainment, and communication and interaction, as well as by the specific target factors and variables measured by the physiological parameters. An additional category summarizes general VR approaches applicable to all specific fields of application since they target typical VR qualities. In the second part of this review, we analyze the target factors and variables with regard to the respective methods used for an automatic analysis and, potentially, classification. For example, we highlight which measurement setups have proven to be sensitive enough to distinguish different levels of arousal, valence, anxiety, stress, or cognitive workload in the virtual realm. This work may prove useful for all researchers who want to use physiological data in VR and who want a good overview of prior approaches, their benefits, and their potential drawbacks.

1 Introduction

Virtual Reality (VR) provides the potential to expose people to a large variety of situations. One advantage it has over the exposure to real situations is that the creator of the virtual environment can easily and reliably control the stimuli that are presented to an immersed person (Vince, 2004). Usually, the presented stimuli are not arbitrary but intentionally chosen to evoke a certain experience in the user, e.g. anxiety, relaxation, stress, or presence. Researchers require tools that help them to determine whether the virtual environment fulfills its purpose and how users respond to certain stimuli. Evaluation methods are an essential part of the development and research of VR.

Evaluation techniques can be divided into implicit and explicit methods (Moon and Lee, 2016; Marín-Morales et al., 2020). Explicit methods require users to explicitly and actively express their own experience. Hence, they can also be called subjective methods. Examples include interviews, think-aloud protocols, and questionnaires. In the evaluation of VR, questionnaires are the most prominent explicit method. They are very versatile and designed for the quantification of various characteristics of experience. Some assess VR-specific phenomena, e.g. presence (Slater et al., 1994; Witmer and Singer, 1998), simulator sickness (Kennedy et al., 1993), or the illusion of virtual body-ownership (Roth and Latoschik, 2020). Other questionnaires capture more generic characteristics of experience, but are still useful in many VR scenarios, e.g. workload (Hart and Staveland, 1988) or affective reactions (Watson et al., 1988; Bradley and Lang, 1994).

Traditionally, questionnaires and other explicit methods also bring some disadvantages. There is a variety of self-report biases that can influence the way people respond to questions. A common example is the social desirability bias. It refers to the idea that subjects tend to choose a response that they expect to meet social expectations instead of one that reflects their true experience (Corbetta, 2003; Grimm, 2010). Other common examples of response biases are the midpoint bias, where people tend to choose neutral answers (Morii et al., 2017), or extreme responding, where people tend to choose the extreme choices on a rating scale (Robins et al., 2009). In general, there is a variety of characteristics and circumstances that can negatively influence the human capacity for self-evaluation. For a detailed description of erroneous self-assessment, refer to Dunning et al. (2004). Another point that can limit the validity of questionnaires is that one never knows whether the questions were understood by participants (Rowley, 2014). The complexity of the information and the language skills of the respondents can influence how questions are interpreted and thus how answers turn out (Redline et al., 2003; Richard and Toffoli, 2009). Another problem is that explicit methods often separate the evaluation from the underlying stimulus. Thus, they rely on a correct recollection of the experience. People might not be able to remember how exactly they were feeling while interacting with a piece of software (Cairns and Cox, 2008). This is especially relevant for VR, as leaving the virtual environment can lead to a change in the evaluation of the experience (Schwind et al., 2019). In addition, some mental processes are not even accessible to consciousness and are therefore not captured by explicit methods (Barsade et al., 2009).

Implicit evaluation methods avoid many of those drawbacks. In contrast to explicit measures, they do not require the user to actively participate in the evaluation. Rather, they analyze user behavior based on the response to a certain stimulus or event. This can be done either by direct observation or by analysis of physiological data. These implicit methods can also be referred to as objective methods, as they do not rely on the ability of subjects to assess their own condition. Implicit evaluation that is based on physiological data has the advantage that it can assess both automatic and deliberate processes. With automatic processes, we refer to organic activations that are unconsciously controlled by the autonomic nervous system, e.g. bronchial dilation or the activation of sweat secretion (Jänig, 2008; Laight, 2013). These activations cannot be observed from the outside. With measures like electrodermal activity, electrocardiography, or electroencephalography, however, we can assess them. This allows quantification of how current stimuli are processed by the nervous system (Jänig, 2008; Laight, 2013). Deliberate processes, on the other hand, do not depend on unconscious activations of the autonomic nervous system. Nevertheless, physiological measures can help to understand these processes. Electromyography, for example, measures the strength of contraction of skeletal muscles (De Luca, 2006). Thus, this signal can also be under voluntary control of the human being.

Physiological measurements offer decisive advantages. They can be taken during exposure, they do not depend on memory, they can capture sub-conscious states, data can be collected fairly unobtrusively, and they yield quantitative data that can be leveraged for machine-learning approaches. A depiction of the discussed structure of evaluation methods can be found in Figure 1. An overview of the physiological measures that are considered in this work can be found in Table 1. It also contains abbreviations for the measurements that are used from now on.


FIGURE 1. Categorization of evaluation methods. The overview should not be considered comprehensive, but merely as an orientation.


TABLE 1. Physiological measures that are commonly used in VR applications.

The availability of easy-to-use wearable sensors is spurring the use of physiological data. EEG headsets such as the EPOC+ or the Muse 2, wrist- and chest-worn trackers from POLAR, fitbit, and Apple, as well as the EMPATICA E4 all make it easier to collect physiological data. In addition, VR headsets already come with built-in sensors that can be used for behavior analysis. Data from gyroscopes and accelerometers, included in VR headsets and controllers, provide direct information about movement patterns. Moreover, eye-tracking devices from tobii or Pupil Labs can be used to easily extend VR headsets so they deliver even more data, e.g. pupil dilation or blinking rate.

Due to the aforementioned advantages in combination with the availability of easy-to-use sensors and low-cost head-mounted displays (HMD) (Castelvecchi, 2016), the number of research approaches that combine physiological data with VR has increased considerably in recent years. In their meta-review about emotion recognition in VR with physiological data, Marín-Morales et al. (2020) even report an exponential growth of this field. Researchers who want to use physiological measures for their own VR application, however, are faced with a very rapidly growing field that offers a wide range of possibilities. As previously implied, there are a variety of signals that can be collected with a variety of sensors. While it is clear that a presence questionnaire is used to assess presence, such a 1-to-1 linkage of measure and experience is not possible for physiological measures. Their usage in VR applications is therefore anything but trivial. A structured reappraisal of the field is necessary.

This systematic review consists of two parts that address the following issues:

In the first part of this article, we examine the different use cases of physiological measurements in VR. We collect a broad selection of works that use physiological measures to assess the state of the user in the virtual realm. Then we categorize the works into specific fields of application and explain the functionality of the physiological measurements within those fields. As a synthesis of this part, we describe a list of the main purposes of using physiological data in VR. This serves as a broad, state-of-the-art overview of how physiological measures can be used in the field of VR.

However, knowing what this data can be used for is only half the battle. We still need to know how to work with this data in order to gain knowledge about a user’s experience. Hence, in the second part of this paper, we discuss concrete ways to collect and interpret physiological data in VR. Works that tell us a lot about how to obtain data and what can be deduced from it are classification approaches, i.e. approaches that classify different levels of certain characteristics of experience. The works from this domain usually adopt a characteristic of experience as their independent variable. Subjects in those studies were exposed to stimuli known to elicit a certain experience, such as anxiety. These studies then examined the extent to which the change in experience was reflected in physiological measurements. Thus, the works focus on the physiological measures themselves and their ability to quantify a particular experience. This review of classifiers therefore provides a clear overview of signals, sensors, tools, and algorithms that have been sensitive enough to distinguish different levels of the targeted experience in a VR setup. They show concrete procedures on how to extract the information that is hidden in the physiological data.
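To make this more concrete, the following minimal sketch (in Python, using scikit-learn) illustrates the general shape of such a classification procedure: physiological streams are cut into windows, simple features are extracted, and a classifier is evaluated with cross-validation, as typically reported by the works reviewed below. The feature choices, window length, and sampling rate are illustrative assumptions and do not reproduce any specific approach.

```python
# Minimal sketch of a typical classification pipeline: window the physiological
# signals, extract simple features, and evaluate a classifier with cross-validation.
# Feature choices and window length are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def window_features(hr, eda, fs=4.0, win_s=30.0):
    """Cut heart-rate and EDA streams into fixed windows and compute simple features."""
    win = int(fs * win_s)
    feats = []
    for start in range(0, min(len(hr), len(eda)) - win + 1, win):
        h = np.asarray(hr[start:start + win], dtype=float)
        e = np.asarray(eda[start:start + win], dtype=float)
        feats.append([h.mean(), h.std(), e.mean(), np.diff(e).clip(min=0).sum()])
    return np.array(feats)

def evaluate(X, y):
    """Cross-validated accuracy of an SVM on labeled feature windows (e.g. low vs. high stress)."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=5)  # accuracy, as often reported in Table 2
    return scores.mean(), scores.std()
```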

2 Methods

On the basis of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement (Liberati et al., 2009), we searched and assessed literature to find papers that make use of physiological data in VR. We always searched for one specific signal in combination with VR, so the search terms consisted of two parts connected with an AND. Thus, the individual search terms can be summarized in one large query: (“Virtual Reality” OR “Virtual Environment” OR “VR” OR “HMD”) AND (Pupillometry OR “Pupil* Size” OR “Pupil* Diameter” OR “Pupil* Dilation” OR “Pupil” OR “Eye Tracking” OR “Eye-Tracking” OR “Eye-Tracker*” OR “Gaze Estimation” OR “Gaze Tracking” OR “Gaze-Tracking” OR “Eye Movement” OR “EDA” OR “Electrodermal Activity” OR “Skin Conductance” OR “Galvanic Skin Response” OR “GSR” OR “Skin Potential Response” OR “SPR” OR “Skin Conductance Response” OR “SCR” OR “EMG” OR “Electromyography” OR “Muscle Activity” OR “Respiration” OR “Breathing” OR “Heart Rate” OR “Pulse” OR “Skin Temperature” OR “Thermal Imaging” OR “Surface Temperature” OR “Blood Pressure” OR “Blood Volume Pressure” OR “EEG” OR “Electroencephalography”). The terms had to be included in the title, abstract, or keywords of an article. Queried databases were ACM Digital Library, Web of Science, PubMed, APA PsycInfo, PsynDex, and IEEE Xplore. The date of the search was October 15, 2020.
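As an illustration of this two-part search strategy, the following sketch assembles the individual per-signal queries programmatically; the signal list shown here is only an excerpt of the full term list quoted above.

```python
# Illustrative sketch: each signal term is combined with the VR terms via AND,
# so the individual queries together are equivalent to the single large OR-query above.
vr_terms = ['"Virtual Reality"', '"Virtual Environment"', '"VR"', '"HMD"']
signal_terms = ['"EDA"', '"Electrodermal Activity"', '"Heart Rate"', '"EEG"']  # excerpt only

vr_part = "(" + " OR ".join(vr_terms) + ")"
for term in signal_terms:
    query = f"{vr_part} AND ({term})"
    print(query)  # each query is run against title, abstract, and keywords per database
```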

We gathered the results and inserted them into a database, together with some additional papers that were known to be relevant to the topic. We removed duplicates and then started the screening process. Papers were excluded if they were from completely different domains (VR, for example, does not only stand for “Virtual Reality”), if new sensors or algorithms were only introduced (but not actually used), if they only dealt with augmented reality, or if they merely presented the idea of using physiological data in VR (but did not actually do it). Furthermore, we excluded poster presentations, abstracts, reviews, and works that were not written in English. Left after this screening process were all works that present a use case in which the sought-after physiological measures were used together with a VR application. We usually screened papers on the basis of title and abstract. About 10% required inspection of the full text to determine whether they met the criteria. If the full text of those works was not available, they were also excluded. We used the papers that were left after this process for the first part of this work. During the screening process, we began to note certain recurring fields of application and compiled a list of categories and sub-categories of fields of application. We then tagged the papers with these categories according to their field of application. We also noted, again with the help of tags, the purpose for which the physiological data was used.

As already explained in the introduction (Section 1), in the further course of the review we focused on classification approaches. During the aforementioned tagging process, we identified papers that deal with some kind of classification. Those papers were then examined for their eligibility to be included in the second part of the review. The criterion for inclusion was that a paper presents a classification based on physiological data that was captured during exposure to immersive VR (CAVE-based or HMD-based). In order to be included, the work also had to distinguish different levels of current experience (e.g. high vs. low stress) and not different groups of people (e.g. children with and without ADHD). Excluded were classifiers that aim at the recognition of user input, an adaptation of the system, or the recognition of the used technology. Also excluded were works that only look for correlations between signals and certain events, as well as classification approaches based on desktop VR or on non-physiological data.

3 Results

An overview of the specific phases of the search for literature and the results can be seen in Figure 2. In total, the literature research yielded 4,943 different works. After the first screening process, 1,408 works were left over. They all show examples of how physiological data can be used in immersive VR to assess the state of the user. Figure 3 shows the distribution of the papers over the years since 1995.


FIGURE 2. Process and results of the literature review. The diagram is adapted from Liberati et al. (2009).


FIGURE 3. Distribution of the 1,408 papers that remained after screening, by year of publication.

From this point forward, we chose to continue with works from 2013 and later, thus shifting the focus to current trends. The numbers show that most of the papers were published in recent years (see Figure 3). The year 2013 was the first for which we found more than 50 papers. This left 1,119 of the 1,408 papers, namely those published in 2013 or later. During the screening process described in Section 2, we identified five major fields of application to which most works can be assigned. These domains are therapy and rehabilitation, training and education, entertainment, functional VR properties, and general VR properties. In the first part of the discussion section, we use this division to give a broad overview of the usage of physiological measures in VR (see Section 4.1).

After screening and checking for eligibility, 32 works that deal with the classification of experience in VR were left for further qualitative analysis. Each of the 32 works uses physiological measures as dependent variables. As the independent variable, they manipulate the intensity of a target characteristic of experience. Thus, the works show the extent to which the physiological measures were able to reflect the manipulation of the independent variable. In our results, the most commonly assessed characteristic of experience was arousal, used in nine works, followed by valence and anxiety, both used in six works. Five works classify stress, and four classify cognitive workload. The following characteristics of experience were measured in only one work each: visual fatigue, moments of insight, cybersickness, and understanding. An overview of the 32 works can be found in Table 2. This overview shows which characteristics of experience the works assessed, which measures and sensors they used for their approach, and which classification algorithms were chosen for the interpretation of the data. Table 3 provides a separate overview of the sensors that were used in the 32 works. In Table 4, we list different tools that were used in various classification approaches to record, synchronize, and process the physiological data. In the second part of the discussion, we deal with the listed characteristics of experience individually and summarize the corresponding approaches with a focus on signals and sensors (see Section 4.2).


TABLE 2. Overview of classification approaches based on physiological data collected in fully immersive VR. If not stated otherwise, the values presented in the Results column refer to the accuracy achieved in a cross-validation or on a separate test set. These values serve only as a rough guide to the success of the method and are not directly comparable.


TABLE 3. Sensors used in the identified classification approaches.


TABLE 4. Various tools that were used in the identified classification approaches for the recording, synchronization, and processing of physiological data.

4 Discussion

As already indicated in Section 3, the discussion of this work is divided into two parts.

4.1 Part 1: Fields of Application for Physiological Data in Virtual Reality

In the first part, we give a categorized overview of the copious use cases of physiological measures in VR. This overview is based on the 1,119 works and the fields of application that we identified during the screening process. This section is structured according to those fields. We highlight which works belong to which fields and how physiological measures are used in them. We summarize this overview by listing high-level purposes for which physiological data are used in VR. This overview cannot cover all existing works; what we describe are the types of work that occurred most frequently. It is, after all, an abstraction of the field. With only a few exceptions, all the examples we list here are HMD-based approaches.

4.1.1 Therapy and Rehabilitation

Therapy and rehabilitation applications are frequent fields of application for physiological measurements. Here, we talk about approaches that try to reduce or completely negate the effects or causes of diseases and accidents.

4.1.1.1 Exposure Therapy

A very common type of therapy that leverages physiological data in virtual reality is exposure therapy. Heart rate, skin conductivity, or the respiration rate are often used to quantify anxiety reactions to stimuli that can be related to a phobia. Common examples for this are public speaking situations (Kothgassner et al., 2016; Kahlon et al., 2019), standing on elevated places (Gonzalez et al., 2016; Ramdhani et al., 2019), confrontations with spiders (Hildebrandt et al., 2016; Mertens et al., 2019), being locked up in a confined space (Shiban et al., 2016b; Tsai et al., 2018), or reliving a war-scenario (Almeida et al., 2016; Maples-Keller et al., 2019).

Physiological measures can also be used to evaluate the progress of the therapy. Shiban et al. (2017) created a virtual exposure application for the treatment of aviophobia. Heart rate and skin conductance were measured as indicators for the fear elicited by a virtual airplane flight. The exposure session consisted of three flights, while a follow-up test session, one week later, contained two flights. By analyzing the psychophysiological data throughout the five flights, the researchers were able to show that patients continuously got used to the fear stimulus.

Another way in which the physiological data can be utilized is for an automatic adaptation of the exposure therapy system. Bălan et al. (2020) used a deep learning approach for the creation of a fear-level classifier based on heart rate, GSR, and EEG data. This classifier was then used as part of a virtual acrophobia therapy in which the immersed person stands on the roof of a building. Based on a target anxiety level and the output of the fear classifier, the system can steer the height of the building and thus the intensity of the exposure. A similar approach comes from Herumurti et al. (2019) in the form of an exercise system for people with public speaking anxiety. Here, the behavior of a virtual audience depends on the heart rate of the user, i.e. the audience pays attention, pays no attention, or mocks the speaker.
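The following minimal sketch illustrates the kind of adaptation loop described here: a fear-level estimate is compared against a target anxiety level, and the exposure intensity (here, the floor the user virtually stands on) is raised or lowered accordingly. The level encoding, step size, and bounds are illustrative assumptions, not the authors' implementation.

```python
def adapt_exposure(current_floor, predicted_fear, target_fear,
                   min_floor=0, max_floor=20):
    """One step of the adaptation loop: move exposure intensity towards the target anxiety level."""
    if predicted_fear < target_fear:
        current_floor += 1   # user calmer than intended: increase exposure
    elif predicted_fear > target_fear:
        current_floor -= 1   # user more fearful than intended: back off
    return max(min_floor, min(max_floor, current_floor))
```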

Research on exposure therapy applications also frequently compares different stimuli, systems, or groups of people. Physiological signals often represent a reference value that enhances such approaches. Comparisons have been made between traditional and virtual exposure therapy (Levy et al., 2016), between fear-inducing stimuli in VR and AR (Li et al., 2017; Yeh et al., 2018), or simply between phobic and healthy subjects (Breuninger et al., 2017; Kishimoto and Ding, 2019; Freire et al., 2020; Malta et al., 2020).

4.1.1.2 Relaxation Applications

Many approaches work with the idea of using a virtual environment to let people escape from their current situation and immerse themselves in a more relaxing environment. Physiological stress indicators can help to assess the efficacy of these environments. Common examples include scenes with a forest (Yu et al., 2018; Browning et al., 2019; Wang X. et al., 2019; De Asis et al., 2020), a beach (Ahmaniemi et al., 2017; Anderson et al., 2017), mountains (Ahmaniemi et al., 2017; Zhu et al., 2019), or an underwater scenario (Soyka et al., 2016; Liszio et al., 2018; Fernandez et al., 2019). Other works go one step further and manipulate the virtual environment based on the physiological status of the immersed person. Such biofeedback applications are very common in the realm of relaxation applications and aim to make users aware of their inner processes. The form this feedback takes can vary considerably. Blum et al. (2019) chose a virtual beach scene at sunset with palms, lamps, and a campfire. Their system calculates a real-time feedback parameter based on heart rate variability as an indicator of relaxation. This parameter determines the cloud coverage in the sky and whether the campfire and lamps are lit. Fominykh et al. (2018) present a similar virtual beach where the sea waves become higher and the clouds become darker when the heart rate of the user rises. Patibanda et al. (2017) present the serious game Life Tree, which aims to teach a stress-reducing breathing technique. The game revolves around a tree that is bare at the start. By exhaling, the player can blow leaves towards the tree. The leaves turn green if the player breathes rhythmically and brown if not. Also, the color of the tree itself changes as the player practices correct breathing. Parenthoen et al. (2015) realized biofeedback with the help of EEG data by animating ocean waves according to the surface cerebral electromagnetic waves of the immersed person. Most of these works aim at transferring users from their stressful everyday life into a meditative state. Refer to Döllinger et al. (2021) for a systematic review of such works.
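A minimal sketch of such a biofeedback mapping, in the spirit of Blum et al. (2019), could look as follows: a relaxation score is derived from heart rate variability (here RMSSD over a sliding window, an assumed choice) and mapped to scene parameters such as cloud coverage and whether the campfire is lit. The thresholds and the mapping itself are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (in ms), a common HRV feature."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def scene_parameters(rr_window_ms, rmssd_low=20.0, rmssd_high=60.0):
    """Map an HRV-based relaxation score in [0, 1] to scene parameters (assumed thresholds)."""
    relaxation = np.clip((rmssd(rr_window_ms) - rmssd_low) /
                         (rmssd_high - rmssd_low), 0.0, 1.0)
    return {
        "cloud_coverage": 1.0 - float(relaxation),  # fewer clouds when relaxed
        "campfire_lit": bool(relaxation > 0.5),     # fire and lamps as a reward
    }
```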

Relaxation applications can be used not only to escape the stress of everyday life but also as a distraction from painful medical procedures and conditions. This has been applied in different contexts, e.g. during intravenous cannulation of cancer patients (Wong et al., 2020), preparation for knee surgery (Robertson et al., 2017), a stay in an intensive care unit (Ong et al., 2020), or a dental extraction procedure (Koticha et al., 2019). Physiological stress indicators are commonly used to compare the effects of the virtual distraction to control groups (Ding et al., 2019; Hoxhallari et al., 2019; Rao et al., 2019).

4.1.1.3 Physical Therapy

VR stroke therapies often aim for the rehabilitation of impaired extremities. Here, virtual environments are commonly used to enhance motivation with gamification (Ma et al., 2018; Solanki and Lahiri, 2020) or to offer additional feedback, e.g. with a virtual mirror (Patel et al., 2015; Patel et al., 2017). In the domain of motor rehabilitation, EMG data can be of particular importance. It can be used to demonstrate the basic effectiveness of the system by showing that users of the application really activate the targeted muscles (Park et al., 2016; Drolet et al., 2020). This is of special interest when impairments do not allow visible movement of the target limb (Patel et al., 2015). Another strategy to combine VR stroke therapy and EMG signals is a feedback approach. Here, the strength of the muscle activation is made available to the user visually or audibly, which can result in positive therapy effects as the user becomes aware of internal processes (Dash et al., 2019; Vourvopoulos et al., 2019). The use of psychophysiological data in stroke therapy is not necessarily restricted to EMG. Calabro et al. (2017) created a virtual gait training for lower limb paralysis and compared it to a non-VR version of the therapy. With the help of EEG measurements, they showed that the VR version was especially useful for activating brain areas that are responsible for motor learning.

Parkinson’s disease also requires motor rehabilitation. Researchers have used physiological data to assess the anxiety experience of Parkinson patients with impaired gait under different elevations or on a virtual plank (Ehgoetz Martens et al., 2015; Ehgoetz Martens et al., 2016; Kaur et al., 2019). This data has helped researchers and therapists to understand the experience of the patients and to adapt the therapy accordingly.

4.1.1.4 Addiction Therapy

Another field of application where physiological measurements prove useful is the therapy of drug addictions. Gamito et al. (2014) showed that virtual cues have the potential to elicit a craving for nicotine in smokers. With the help of eye-tracking, they demonstrated that smokers exhibit a significantly higher number of eye fixations on cigarettes and tobacco packages. In similar studies, Thompson-Lake et al. (2015) and García-Rodríguez et al. (2013) showed that virtual, smoking-related cues can cause an increase in the heart rate of addicts. Yong-Guang et al. (2018) and Ding X. et al. (2020) did the same for methamphetamine users. They found evidence that meth users show significant differences in EEG, GSR, and heart rate variability measurements when being exposed to drug-related stimuli in a virtual environment. Based on this, Wang Y.-G. et al. (2019) created a VR counter-conditioning procedure for methamphetamine users. With this virtual therapy, they were able to suppress cue-induced reactions in patients with meth dependence. The use of physiological data to study the effects of addiction-related stimuli presented in VR has also been applied to gambling (Bruder and Peters, 2020; Detez et al., 2019).

A summary of the works discussed in this section can be found in Table 5.


TABLE 5. Overview of the works from the field of therapy and rehabilitation that were discussed in Section 4.1.1. The Measures column refers to the physiological measures used in the work. The entries of the Independent Variables column often do not cover everything that was considered in the work. Entries in the Purposes column refer to the categories listed in Section 4.1.6.

4.1.2 Training and Education

A considerable number of VR applications help people to learn new skills, enhance existing ones, or acquire knowledge in a certain area. A major reason why physiological data comes in handy in training and teaching applications is its potential to indicate the cognitive workload and stress state of a human subject.

4.1.2.1 Simulator Training

Training simulators from various domains include an estimation of mental workload based on physiological data, e.g. surgery training (Gao et al., 2019), virtual driving (Bozkir et al., 2019), or flight simulation (Zhang S. et al., 2017). One way to use knowledge about cognitive load is by adapting task difficulty. Dey et al. (2019a) created a VR training task that requires the user to select a target object, defined by a combination of shape and color. The system uses the EEG alpha band signal to determine how demanding the task is. Based on this information the system can steer the difficulty of the task by altering the number and properties of distractors. In this way, it can be ensured that the task is neither too easy nor too difficult. In the application of Faller et al. (2019) the user has to navigate a plane through rings. The difficulty, i.e. the size of the rings, can be adjusted based on EEG data. With this approach, they were able to keep trainees on an arousal level that is ideal for learning.
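A hedged sketch of the signal processing behind such adaptations is shown below: the alpha band power (8–12 Hz) of one EEG channel is estimated with Welch's method and used to add or remove distractors. Channel choice, thresholds, and the direction of the adaptation are assumptions for illustration and are not taken from the cited works.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(eeg_channel, fs=256):
    """Alpha band (8-12 Hz) power of one EEG channel, estimated via Welch's method."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=fs * 2)
    band = (freqs >= 8) & (freqs <= 12)
    return float(np.trapz(psd[band], freqs[band]))

def adjust_distractors(n_distractors, alpha, low_thresh, high_thresh):
    """Steer task difficulty from alpha power; thresholds are assumed per-user calibration values."""
    if alpha > high_thresh:
        return n_distractors + 1          # spare capacity assumed: make the task harder
    if alpha < low_thresh:
        return max(0, n_distractors - 1)  # high load assumed: make the task easier
    return n_distractors
```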

In certain fields, physiological measures can even be used to determine the difference between experts and novices. Clifford et al. (2018) worked with a VR application for the training of aerial firefighting. It is a multi-user system that requires communication from the trainees. To cause additional stress, the system includes a scenario where the communication is distorted. They evaluated the system with novice and experienced firefighters. By analyzing the heart rate variability of subjects, they were able to show that the communication disruptions were effective in eliciting stress in the subjects. More interestingly, however, experts showed an increased ability to maintain their heart rate variability compared to the inexperienced firefighters. This indicates that they were better able to cope with the stress (Clifford et al., 2020). Currie et al. (2019) worked with a similar approach. Their virtual training environment focuses on a high-fidelity surgical procedure. Eye-tracking was used to gain information about the attention patterns of users. A study with novice and expert operators showed that the expert group had significantly greater dwell time and more fixations on support displays (screens with X-ray images or vital signs). Melnyk et al. (2021) showed how this knowledge can be used to support learning, as they augmented surgical training by using expert gaze patterns to guide the trainees.

In simpler cases, stress and workload indicators are used to substantiate the basic effectiveness of virtual stimuli in training. Physiological indicators can be used to show that implemented scenarios are really able to elicit desired stress responses (Loreto et al., 2018; Prachyabrued et al., 2019; Spangler et al., 2020).

4.1.2.2 Virtual Classrooms

Physiological measures can also benefit classical teacher-student scenarios. Rahman et al. (2020) present a virtual education environment in which the teacher is provided with a visual representation of the gaze behavior of students. This allows teachers to identify distracted or confused students, which can benefit the transfer of knowledge. Yoshimura et al. (2019) developed a strategy to deal with inattentive listeners. They constructed an educational virtual environment in which eye-tracking is used to identify distracted students. The system can then present visual cues, e.g. arrows or lines, that direct the attention of the pupils towards critical objects that are currently being discussed. In the educational environment of Khokhar et al. (2019), the knowledge about inattentive students is provided to a pedagogical agent. Sakamoto et al. (2020) tested pupil metrics for their suitability to provide information about a person's comprehension. They recorded gaze behavior during a learning task in VR and compared it to the subjective comprehension ratings of subjects. In a similar example, Orlosky et al. (2019) used eye movement and pupil size data to build a support vector machine that predicts whether a user understood a given term. Even the experience of flow can be assessed with the help of physiological indicators (Bian et al., 2016). Information about the attention and comprehension of students can be used to optimize teaching scenarios. This is an illustrative example of how physiological data can augment virtual learning spaces and create possibilities that would be unthinkable in real-life ones.

4.1.2.3 Physical Training

Our discussion of training applications has thus far revolved around mental training. However, there are also applications for physical training in VR. Again, physiological data can be used to emphasize the basic effectiveness of the application. Changes in heart rate or oxygen consumption can show that virtual training elicits physical exertion and give insights into its extent (Mishra and Folmer, 2018; Xie et al., 2018; Debska et al., 2019; Kivelä et al., 2019). This can also provide a reference value for the comparison of real-life and virtual exercising. Works like those of Zeng et al. (2017) and McDonough et al. (2020) compared a VR-based bike exercise with a traditional one. Their assessment of exertion with the help of BP measurements showed no significant difference between the virtual and analog exercises. Measurements of the subjectively perceived exertion, however, showed that participants of the VR-based training felt significantly less physiological fatigue.

As in other fields of application, physiological data can be used to adapt the virtual environment. Campbell and Fraser (2019) present an application where the trainee rides a stationary bike while wearing an HMD. In the virtual environment, the user is represented by a cyclist avatar. The goal of the training is to cover as much distance as possible in the virtual world; however, the speed at which the avatar moves is determined by the heart rate of the user. This way, the difficulty cannot be reduced by simply lowering the resistance of the bike, and unfit users have the opportunity to cover more distance. In the exercise environment of Kirsch et al. (2019), the music tempo is adapted to the user's heart rate, which was perceived as motivating by the trainees. Other works simply display the physiological data to the users so they can keep track of their physical exertion in real time (Yoo et al., 2018; Kojić et al., 2019; Greinacher et al., 2020).

A summary of the works discussed in this section can be found in Table 6.


TABLE 6. Overview of the works from the field of training and education that were discussed in Section 4.1.2. The Measures column refers to the physiological measures used in the work. The entries of the Independent Variables column often do not cover everything that was considered in the work. Entries in the Purposes column refer to the categories listed in Section 4.1.6.

4.1.3 Entertainment

Another field of application comprises VR systems that are primarily built for entertainment purposes, i.e. games and videos. Often, researchers use physiological data to get information about the arousal a video or a game elicits (Shumailov and Gunes, 2017; Ding et al., 2018; Mavridou et al., 2018b; Ishaque et al., 2020). Physiological measures can also be used as explicit game features. For example, progress may be denied if a player is unable to adjust their heart rate to a certain level (Houzangbe et al., 2019; Mosquera et al., 2019). Additionally, the field of view in a horror game can be adjusted depending on the heartbeat (Houzangbe et al., 2018). Kocur et al. (2020) present a novel way to help novice users in a shooter game by introducing a gaze-based aiming assistant. If the user does not hit a target with their shot, the assistant can guess, based on the gaze, what the actual target was. If the shot is close enough to the intended target, it hits nevertheless. Moreover, eye-tracking can be used to optimize VR video streaming. Yang et al. (2019) used gaze-tracking to analyze the user's attention and leveraged this knowledge to reduce the bandwidth of video streaming by lowering the quality of those parts of a scene that are not being looked at.

A summary of the works discussed in this section can be found in Table 7.


TABLE 7. Overview of the works from the field of entertainment that were discussed in Section 4.1.3. The Measures column refers to the physiological measures used in the work. The entries of the Independent Variables column often do not cover everything that was considered in the work. Entries in the Purposes column refer to the categories listed in Section 4.1.6.

4.1.4 Functional Virtual Reality Properties

Within this section, we describe applications that make use of common techniques of VR. These are applications that include embodiment, agent interaction, or multiuser VR. We are talking about functional properties that may or may not be part of the system. These properties can also be part of the fields of application that we discussed before, e.g. therapy and training. Nevertheless, we have identified them as separate fields because physiological measurements have their own functions in applications that use embodiment, agent interaction, or multiuser VR. Researchers who want to use those techniques in their own applications can find separate information about the role of the physiological measurements here.

4.1.4.1 Applications With Embodiment

A range of VR applications use avatars as a representation of the user in the virtual realm (Lugrin et al., 2018; Lugrin et al., 2019b; Wolf et al., 2020). VR has the potential to elicit the illusion of owning a digital body, which can be referred to as the Illusion of Virtual Body Ownership (Lugrin et al., 2015; Roth et al., 2017). This concept is an extension of the rubber-hand illusion (Botvinick and Cohen, 1998), with the consequence that the feeling of ownership is often based on the synchrony of multisensory information, e.g. visuo-tactile or visuo-motor (Tsakiris et al., 2006; Slater et al., 2008). Physiological data can provide information about whether and to what extent the virtual body is perceived as one's own. One way to provide objective evidence for the illusion of body ownership is to threaten the artificial body part while measuring the skin response to determine whether the person shows an anxiety reaction (Armel and Ramachandran, 2003; Ehrsson et al., 2007). One of the most common paradigms, still used in more recent literature, is to threaten the virtual body (part) with a knife stab (González-Franco et al., 2014; Ma and Hommel, 2015; Preuss and Ehrsson, 2019). Alchalabi et al. (2019) present an approach that uses EEG data to estimate embodiment. They worked with a conflict between visual feedback and motor control: subjects had to perform a walking task on a treadmill that was replicated by their virtual representation. However, the avatar stopped walking prematurely while the subject was still moving in real life. This modification of the feedback was reflected in EEG data, and results showed a strong correlation between the subjective level of embodiment and brain activation over the motor and pre-motor cortex. Relations between EEG patterns and the illusion of body ownership were also shown in virtual variations of the rubber-hand illusion (González-Franco et al., 2014; Skola and Liarokapis, 2016). Furthermore, there is also evidence that the feeling of ownership and agency over a virtual body or limb can be reflected in skin temperature regulation (Macauda et al., 2015; Tieri et al., 2017).

Other works connect embodiment and physiological measures by investigating how the behavior or properties of an avatar can change physiological responses. In their study, Czub and Kowal (2019) introduced a visuo-respiratory conflict, i.e. the avatar that represented the subjects showed a different respiration rate than its owner. They found out that the immersed subjects actually adapted their respiration rate to their virtual representation. The frequency of breathing increased when the breathing animation of the avatar was played faster and vice versa. Kokkinara et al. (2016) showed that activity of the virtual body, i.e. climbing a hill, can increase the heart rate of subjects, even if they are sitting on a chair in real life.

Another link between physiological measures and body-ownership can be made when those measures are used as input for the behavior of the avatar. Betka et al. (2020) executed a study in which they measured the respiration rate of the subjects and mapped it onto the avatar that was used as their virtual representation. Results showed that congruency of breathing behavior is an important factor for the sense of agency and the sense of ownership over the virtual body.

4.1.4.2 Applications With Agent Interaction

In the last section, we focused on applications that use avatars to embody users in the virtual environment. Now we move from the virtual representation of the user to the virtual representation of an artificial intelligence, so-called agents (Luck and Aylett, 2000).

Physiological data is often used to analyze and understand the interaction between a human user and agents. The study of Gupta et al. (2019), which was revised in Gupta et al. (2020), aimed to learn about the trust between humans and agents. The primary task of this study comprised a shape selection in which subjects had to find a target object that was defined by shape and color. An agent was implemented that gave hints about the direction in which the object could be found. There were two versions of the agent, of which one always gave an accurate hint and the other did not. With the help of a secondary task, an additional workload was induced. EEG, GSR, and heart rate variability were captured throughout the experiment as objective indicators of the cognitive workload of the subjects. In the EEG data, Gupta et al. (2020) found a significant main effect for the accuracy of the agent's hints. That means subjects who received correct hints showed less cognitive load. The authors interpret this as a sign of trust towards the agent, as the subjects did not seem to put any additional effort into the shape selection task once they got correct hints from the virtual assistant. In another example, Krogmeier et al. (2019) investigated the effects of bumping into a virtual character. In their study, they manipulated the haptic feedback during the collision. They explored how this encounter and the introduction of haptic feedback changed the physiological arousal of the subjects, gauged with EDA. In a related study, Swidrak and Pochwatko (2019) showed a heart rate deceleration in people who are touched by a virtual human. Another facet of human-agent interaction is the role of different facial expressions and how they affect physiological responses (Mueller et al., 2017; Ravaja et al., 2018; Kaminskas and Sciglinskas, 2019).

Other works investigate different kinds of agents and use physiological data as a reference for their comparison. Volante et al. (2016) investigated different styles of virtual humans, i.e. visually realistic vs. cartoon-like vs. sketch-like. The agents were depicted as patients in a hospital who showed a progressive deterioration of their medical condition. With the help of EDA data, the authors were able to quantify emotional responses towards those virtual humans and analyze how these responses were affected by the visual appearance. Other works compared gaze behavior during contact with real people and agents (Syrjamaki et al., 2020) or the responses to virtual crowds showing different emotions (Volonte et al., 2020).

Another category can be seen in studies that leverage agents to simulate certain scenarios and use physiological data to test the efficacy of these scenarios in eliciting desired emotional responses. At this point, there is a relatively large overlap with the previously discussed exposure therapies (Section 4.1.1.1). Applications aimed at the treatment of social anxiety often include exposure to a virtual audience that aims to generate a certain atmosphere (Herumurti et al., 2019; Lugrin et al., 2019a; Streck et al., 2019). Kothgassner et al. (2016) asked participants of their study to speak in front of a real and a virtual audience. Heart rate, heart rate variability, and saliva cortisol secretion were assessed. For both groups, these stress indicators increased similarly, which demonstrates the fundamental usefulness of such therapy systems, as the physiological response to a virtual audience was comparable to that to a real one. Other studies investigated stress reactions depending on the size (Mostajeran et al., 2020) or displayed emotions (Barreda-Angeles et al., 2020) of the audience. The potential of virtual audiences to elicit stress is not only applicable to people with social anxiety. Research approaches that investigate human behavior and experience under stress can use a speech task in front of a virtual audience as a stressor. This setup is known as the Trier Social Stress Test, which has often been transferred to the virtual realm (Delahaye et al., 2015; Shiban et al., 2016a; Kothgassner et al., 2019; Zimmer et al., 2019; Kerous et al., 2020). Social training applications that work with virtual audiences are also available specifically for people with autism. Again, physiological measurements help to understand the condition of the user and thus to adjust the training (Kuriakose et al., 2013; Bekele et al., 2016; Simões et al., 2018). Physical training applications can also use physiological data to determine the effect of agents. Murray et al. (2016) worked with a virtual aerobic exercise, i.e. rowing on an ergometer. One cohort of their study had a virtual companion that performed the exercise alongside the subject. In a related study, Haller et al. (2019) investigated the effect of a clapping virtual audience on performance in a high-intensity interval training. In both examples, the effect of the agents was evaluated by comparing heart rates, which indicate changes in physical effort and can thus show whether the presence of agents changes training behavior.

4.1.4.3 Applications With Multiuser Virtual Reality

In multiuser VR applications, two or more users can be present and interact with each other at the same time (Schroeder, 2010). This concept offers the possibility of exchanging physiological data among those users. Dey et al. (2018) designed three different collaborative virtual environments comprising puzzles that must be solved together. They evaluated those environments in a user study in which one group received auditory and haptic feedback about the heart rate of the partner. Results indicated that participants who received the feedback felt the presence of the collaborator more strongly. There is even evidence that heart rate feedback received from a partner can cause an adaptation of one's own heart rate (Dey et al., 2019b). In a similar approach, Salminen et al. (2019) used an application that shares EEG and respiration information among subjects in a virtual meditation exercise. The feedback was depicted as a glowing aura that pulsates according to the respiration rate and is visualized with different colors, depending on brain activity. Users who had this kind of feedback perceived more empathy towards the other user. Desnoyers-Stewart et al. (2019) built an application that deliberately aims to achieve such a synchronization of physiological signals in order to establish a connection between users. Another way in which multiuser VR applications can benefit from physiological measures is in terms of communication. Lou et al. (2020) present a hardware solution that uses EMG sensors to track facial muscle activity. These activities are then translated into a set of facial expressions that can be displayed by an avatar. This offers the possibility of adding nonverbal cues to interpersonal communication in VR.

A summary of the works discussed in this section can be found in Table 8.


TABLE 8. Overview of the works from the field of functional VR properties that were discussed in Section 4.1.4. The Measures column refers to the physiological measures used in the work. The entries of the Independent Variables column often do not cover everything that was considered in the work. Entries in the Purposes column refer to the categories listed in Section 4.1.6.

4.1.5 General Virtual Reality Properties

Our last field of application focuses on properties that are relevant for every VR application as they are inherent to the medium itself. These are cybersickness and presence. Here we are talking about non-functional properties of a VR system, as they can occur to varying degrees. These varying degrees of cybersickness and presence are either actively manipulated or passively observed. In both cases, consideration of physiological measurements can provide interesting insights.

4.1.5.1 Presence

Presence describes the experience of a user of being situated in the virtual instead of the real environment (Witmer and Singer, 1998). Hence, knowledge about the extent to which a virtual environment can elicit the feeling of presence in a user is relevant in most VR applications. Beyond the classic presence questionnaires from Slater et al. (1994) or Witmer and Singer (1998), there are also approaches that aim to determine presence based on physiological data.

Athif et al. (2020) present a comprehensive study that relates presence factors to physiological signals. They worked with a VR forest scenario in which the player needs to collect mushrooms that spawn randomly. This scenario was implemented in six different gradations based on the four factors of presence described by Witmer and Singer (1998): distraction, control, sensory, and realism. The base version fulfilled the requirements for all these factors, four versions suppressed one factor each, and one version suppressed all factors simultaneously. In the study, participants were presented with each of the scenarios while their physiological reactions were measured. The data showed that EEG features indicated changes in presence particularly well, while ECG and EDA features did not. Signals from temporal and parietal regions of the brain showed correlations with the suppression of the specific presence factors. In a similar investigation, Dey et al. (2020) implemented two versions of a cart ride through a virtual jungle. Their high-presence version was realized through higher visual fidelity, more control, and object-specific sound. In this setup, they were able to show a significant increase in the heart rate of people presented with the high-presence version, whereas EDA showed no systematic changes. The study of Deniaud et al. (2015) showed correlations between presence questionnaire scores, skin conductance, and heart rate variability. Other studies, in turn, found heart rate or EDA data to be weak indicators of presence (Felnhofer et al., 2014; Felnhofer et al., 2015). Szczurowski and Smith (2017) suggest gauging presence through a comparison of virtual and real stimuli. Accordingly, high presence is characterized in such a way that exposure to the virtual stimulus elicits physiological responses similar to those elicited by the real stimulus. As such, one could take any physiological measure to gauge presence, as long as one has a comparative value from a real-life stimulus.

The exact relationship between specific physiological measures and the experience of presence still seems ambiguous. This may also be due to the fact that the concept of presence is understood and defined differently. Only recently, Latoschik and Wienrich (2021) introduced a new theoretical model for VR experiences which also shows a new perspective on presence. Just as the understanding of presence evolves, so does the measurement of it.

4.1.5.2 Cybersickness

Cybersickness can be described as a set of adverse symptoms that are induced by the visual stimuli of virtual and augmented reality applications (Stauffert et al., 2020). Common symptoms include headache, dizziness, nausea, disorientation, or fatigue (Kennedy et al., 1993; LaViola, 2000). There are multiple theories on what might be the causes of cybersickness, whereas the most common revolve around sensory mismatches and postural instability (Rebenitsch and Owen, 2016).

Besides questionnaires and tests for postural instability, the assessment of the physiological state of a VR user is one of the common ways to measure cybersickness (Rebenitsch and Owen, 2016). In recent years, researchers have used several approaches to assess physiological measures and find out how much they correlate with cybersickness. Gavgani et al. (2017) used a virtual roller-coaster ride that subjects were asked to ride on three consecutive days. This roller-coaster ride was quite effective at inducing cybersickness, as only one of fourteen subjects completed all rides while the others terminated theirs due to nausea. However, it took the participants significantly more time to abort the ride on the third day compared to the first, which suggests habituation. During the 15-min rides, heart rate, respiration rate, and skin conductance were monitored, and participants had to give a subjective assessment of their felt motion sickness. Results demonstrated that the nausea level of subjects continuously increased over the course of the ride. The measurement of the forehead skin conductance was the best physiological correlate of the gradually increasing nausea. A virtual roller-coaster ride was also leveraged in the study of Cebeci et al. (2019). Here, pupil dilation, heart rate, blink count, and saccades were analyzed. In this study, the average heart rate and the mean saccade speed were highest when cybersickness symptoms occurred. Moreover, they found a correlation between the blink count, nausea, and oculomotor discomfort (Kennedy et al., 1993). Approaches that use physiological data to assess cybersickness mainly use this data for the sake of comparison. This can serve to gain knowledge about the connection between unpleasant VR experiences and latency jitter (Stauffert et al., 2018), navigation techniques (Líndal et al., 2018), or the display type (Guna et al., 2019; Gersak et al., 2020; Guna et al., 2020). Plouzeau et al. (2018) used cybersickness indicators in an adaptation mechanism for their VR application. They introduced a navigation method that allows the user to move and rotate in the virtual environment with the help of two joysticks. The acceleration of the navigation is adapted according to an objective indicator of simulator sickness, i.e. EDA. When the EDA increases, the acceleration decreases proportionally, and vice versa.
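A minimal sketch of such an adaptation rule could look as follows; the baseline handling and the gain are illustrative assumptions rather than the authors' exact formula.

```python
def navigation_acceleration(base_accel, eda_now, eda_baseline, gain=0.5):
    """Scale the navigation acceleration down as EDA rises above a per-user baseline."""
    increase = max(0.0, eda_now - eda_baseline)           # only react to EDA increases
    accel = base_accel * (1.0 - gain * increase / max(eda_baseline, 1e-6))
    return max(0.0, accel)                                # acceleration never becomes negative
```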

A summary of the works discussed in this section can be found in Table 9.

TABLE 9. Overview of the works from the field of general VR properties that were discussed in Section 4.1.5. The Measures column refers to the physiological measures used in the work. The entries of the Independent Variables column often do not cover everything that was considered in the work. Entries in the Purposes column refer to the categories listed in Section 4.1.6.

4.1.6 High-Level Purposes

Throughout this section, we gave an overview of the usage of physiological measures in VR to assess the state of the user. We listed fields of application and explained concretely how physiological measures are used in them. Across the fields of application, physiological measures are used for recurring purposes. To summarize this overview, we step back to the meta-level and highlight these recurring themes for the usage of physiological data in VR. The categories are not mutually exclusive and are not always clearly separable.

• Stimuli Comparison: Physiological measures can be used to determine how the response to a virtual stimulus compares to the response to another (virtual) stimulus. In these cases, the independent variable is the stimulus and the dependent variable is the physiological measure. Examples include works that compare responses to real-life situations and their virtual counterparts (Chang et al., 2019; Syrjamaki et al., 2020). Others compare how different kinds of virtual audiences impact stress responses (Barreda-Angeles et al., 2020; Mostajeran et al., 2020).

• Group Comparison: Physiological measures can be used to determine how the response to the same stimulus compares between groups of people. In these studies the independent variable is the user group and the dependent variable is the physiological measure. Examples include works that compare phobic with non-phobic subjects (Breuninger et al., 2017; Kishimoto and Ding, 2019) or subjects with and without autism (Simões et al., 2018).

• Process Analysis: Physiological measures can be used to determine how the response changes over the course of a virtual simulation. In these cases, the independent variable is the time of measurement and the dependent variable is the physiological measure. Thus, the effect of the appearance of a certain stimulus can be determined, e.g. a knife attack (González-Franco et al., 2014) or a noise burst (Mueller et al., 2017).

• Progress: Physiological measures can be used to determine a change in response to the same stimulus across multiple exposures. In these studies, the independent variable is the number of exposures or sessions and the dependent variable is the physiological measure. This is often done to quantify the progress of a therapy or training (Lee et al., 2015; Shiban et al., 2017) but can, for example, also be used to determine a habituation to cybersickness-inducing stimuli (Gavgani et al., 2017).

• Correlation: Physiological measures can be used to establish a relationship between the measure and a second variable (see the sketch after this list). Usually, both measures are dependent variables of the research setup. Typical examples assess the relationship between physiological and subjective measurements, e.g. of embodiment (Alchalabi et al., 2019) or cybersickness (John, 2019).

• Classification: Physiological measures can be used to differentiate users based on the response to a virtual stimulus. The goal of these approaches is to determine if the information in the physiological data is sufficient to reflect the changes in the independent variable. Examples can be the classification of specific groups of people, e.g. healthy and addicted people (Ding X. et al., 2020) or people under low and high stress (Ishaque et al., 2020).

• Feedback: Physiological measures can be presented to the user or a second person to make latent and unconscious processes visible. This is particularly common in relaxation applications where the stress level can be visualized for the user (Patibanda et al., 2017; Blum et al., 2019) but it can also be used to inform the supervisor of a therapy or training session about the user’s condition (Bayan et al., 2018; Streck et al., 2019). This purpose differs from the previous ones in that the physiological measurement is no longer intended to indicate the manipulation of an independent variable.

• Adaption: Physiological measurements can be used to adapt the system status to the state of the user. A typical example is the adaption of training and therapy systems based on effort and stress indicators (Campbell and Fraser, 2019; Bălan et al., 2020). Like the feedback purpose, the measurements here are not used for comparisons; however, while feedback approaches focus on visualizing the physiological data, adaption changes the behavior of the application.
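
As a minimal illustration of the correlation purpose mentioned above, the following Python sketch relates a hypothetical EDA feature (number of skin conductance responses) to a hypothetical simulator sickness score. All values are invented and only demonstrate the kind of analysis, not any of the cited results:

    import numpy as np
    from scipy import stats

    # Hypothetical per-participant values: number of skin conductance responses
    # during the simulation and the corresponding simulator sickness score.
    scr_count = np.array([4, 9, 2, 7, 12, 5, 8, 3, 10, 6])
    ssq_score = np.array([11.2, 29.9, 7.5, 22.4, 41.1, 15.0, 26.2, 9.3, 33.7, 18.7])

    # Spearman's rank correlation: questionnaire scores are ordinal and
    # physiological features are rarely normally distributed.
    rho, p_value = stats.spearmanr(scr_count, ssq_score)
    print(f"rho = {rho:.2f}, p = {p_value:.4f}")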

4.2 Part 2: Characteristics of Experience and Their Measurement in Virtual Reality

In the second part of the discussion, we focus on the results of the search for classification approaches depicted in Table 2. Here, we discuss approaches that expose participants to a particular VR stimulus that is known to trigger a particular characteristic of experience. The focus of these studies is on how well this manipulation is reflected by the physiological measurements. We use those classification approaches to show which measures, sensors, and algorithms have been used to gauge the targeted characteristic of experience. A universal solution for measuring and interpreting those specific experiences does not exist, as this is usually context dependent. Hence, this work cannot give strict guidelines on which measures should be used in which case; the field is too diverse and the focus of this work too broad.

A comparison of the accuracies of the specific approaches should be treated with caution, as they were partly obtained under different circumstances. The results provide a rough guide to how well the classification works and should not be compared one-to-one. All the classifiers reported here are, in principle, successful in distinguishing different levels of an experience. This means all examples show combinations of signals, sensors, and algorithms that can work for the assessment of experience in VR.

Our review of classification approaches showed that in immersive VR there are some main characteristics of experience that are predominantly assessed with the help of physiological data. These experiences are arousal, valence, stress, anxiety, and cognitive workload. These constructs are similar and interrelated: stress and anxiety can be seen as forms of hyperarousal, and cognitive workload itself can be seen as a stress factor (Gaillard, 1993; Blanco et al., 2019). Nevertheless, most works focus on one of these characteristics of experience, and they differ in how they elicit and assess them. The discussion is separated according to these characteristics of experience. The reader should still keep in mind that the constructs are related.

4.2.1 Arousal and Valence

Studies from this domain usually base their work on the Circumplex Model of Affect (Russell and Mehrabian, 1977; Posner et al., 2005). This model arranges human emotions in a two-dimensional coordinate system. One axis represents arousal, i.e. the activation of the neural system, and the other represents valence, i.e. how positively or negatively an emotion is perceived. Hence, classifiers from this category usually distinguish high and low levels of arousal or positive and negative valence. Arousal-inducing scenes often comprise a virtual roller-coaster ride (Hofmann et al., 2018; Teo and Chia, 2018; Bilgin et al., 2019) or dynamic mini-games (Shumailov and Gunes, 2017; Ding Y. et al., 2020). Emotional scenes are often used to manipulate valence (Shumailov and Gunes, 2017; Mavridou et al., 2018b; Zheng et al., 2020). Such scenes can be taken from a database (Samson et al., 2016) or be tested in a pre-study to determine which emotions they trigger (Zhang W. et al., 2017).

The most commonly used physiological measure for the classification of arousal and valence is EEG. The trend here seems to be towards the more comfortable wearable EEG sensors, e.g. an EEG headset (Teo and Chia, 2018; Bilgin et al., 2019; Ding Y. et al., 2020; Suhaimi et al., 2020) or textile electrodes inside the HMD (Xu et al., 2019). Some works use cardiovascular data in addition to the EEG information (Marín-Morales et al., 2018; Mavridou et al., 2018b). Deviating from the EEG approach, Zheng et al. (2020) leveraged pupillometry and Shumailov and Gunes (2017) used forearm EMG to classify arousal and valence. Both examples also worked with comfortable and easy-to-set-up sensors.

The deep neural network for the two-level classification of emotional arousal of Teo and Chia (2018) achieved an accuracy of 96.32% in a 10-fold cross validation. This result was achieved with only the data from the Muse 4-channel EEG headband. For a binary valence classification, Shumailov and Gunes (2017) reported an F1 value of 0.85. This value was achieved with the help of a support vector machine and EMG armband data captured while participants played VR games. The highest value for classifying arousal and valence at the same time (four classes) comes from Suhaimi et al. (2020). Their random-forest classifier achieved an accuracy of 82.49% in a 10-fold cross validation, distinguishing four different emotions embedded in the valence-arousal model.
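
To make the typical evaluation pipeline of such classifiers concrete, the following Python sketch runs a random-forest classifier in a stratified 10-fold cross validation on placeholder EEG band-power features and four placeholder valence-arousal classes. It is a generic illustration of the procedure, not a reproduction of any of the cited works:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    rng = np.random.default_rng(42)

    # Placeholder feature matrix: one row per trial, e.g. band-power values from
    # a handful of EEG channels. Real features would come from preprocessed data.
    n_trials, n_features = 200, 16
    X = rng.normal(size=(n_trials, n_features))
    # Placeholder labels: four quadrants of the valence-arousal plane.
    y = rng.integers(0, 4, size=n_trials)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    # With random placeholder data the accuracy stays near chance level (0.25).
    print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")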

When a researcher wants to assess arousal in a virtual environment, EEG signals appear to be the go-to indicators. In addition, cardiovascular data also appears to be useful for this purpose. Six out of the eight presented arousal classifiers successfully used one or both of these signals to distinguish high and low arousal in the virtual realm. The systematic review of Marín-Morales et al. (2020) about the recognition of emotions in VR generally confirms this impression. They list sixteen works that assessed arousal in VR, fifteen of which used EEG or heart rate variability signals. However, the review of Marín-Morales et al. (2020) also shows that nine of the sixteen works used EDA data to estimate arousal, a signal that did not appear among the recent classification approaches we found. One reason for this could be that, among the works listed by Marín-Morales et al. (2020), the older ones tended to leverage EDA for arousal assessment, whereas this review only considers literature from the last few years. Nevertheless, this does not mean that EDA measurements are no longer important for estimating arousal. Only recently, Granato et al. (2020) found the skin conductance level to be one of the most informative features when it comes to the assessment of arousal. Also worth mentioning is the work of Shumailov and Gunes (2017), which showed that forearm EMG is also suitable for the classification of arousal levels. They showed this in a setup where subjects moved a lot while playing VR games, whereas other approaches usually gather their data in setups where subjects must remain still. Due to movement artifacts, it is questionable to what extent the other classifiers are transferable to setups that include a lot of motion. As for the sensors, various works showed that EEG data collected with easy-to-use headsets is sufficient to distinguish arousal levels in VR (Teo and Chia, 2018; Bilgin et al., 2019; Ding Y. et al., 2020).
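
For readers who want to include EDA, the following Python sketch extracts two very simple features, a crude tonic skin conductance level and a count of phasic skin conductance responses, from a synthetic recording. The moving-average decomposition, thresholds, and sampling rate are assumptions made for illustration; dedicated toolboxes provide far more robust decompositions:

    import numpy as np
    from scipy.signal import find_peaks

    def simple_eda_features(eda, fs):
        """Very simplified EDA features: mean tonic level and phasic peak count."""
        window = int(5 * fs)                       # crude 5-second moving average
        tonic = np.convolve(eda, np.ones(window) / window, mode="same")
        phasic = eda - tonic                       # edge effects ignored in this toy
        # Count skin conductance responses as phasic peaks above 0.05 microsiemens,
        # separated by at least one second.
        peaks, _ = find_peaks(phasic, height=0.05, distance=int(fs))
        return {"scl_mean": float(tonic.mean()), "scr_count": int(len(peaks))}

    # Synthetic 60-second recording sampled at 32 Hz: slow drift plus small bumps.
    fs = 32.0
    t = np.arange(0, 60, 1 / fs)
    eda = 5 + 0.01 * t + 0.1 * np.exp(-((t % 15) - 5) ** 2)
    print(simple_eda_features(eda, fs))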

The differentiation of positive and negative valence appears to follow a similar pattern as arousal. Our results showed that EEG data was used most frequently for its assessment. Also notable is the attempt to classify arousal based on facial expressions. Even if an HMD is worn, this is possible through facial EMG (Mavridou et al., 2018a).

4.2.2 Stress

Studies that work on the classification of stress often used some kind of dynamic or unpredictable virtual environment to elicit the desired responses, e.g. a roller-coaster ride (Ishaque et al., 2020) or a guard patrolling a dark room (Ham et al., 2017). Stress is usually modulated with an additional assignment, e.g. an arithmetic task (Cho et al., 2017) or a Stroop task (Ishaque et al., 2020).

Looking at the signals used to classify stress, it is noticeable that every approach measures cardiovascular activity, either with optical sensors on the finger (Cho et al., 2017; Ham et al., 2017) or with electrical sensors (Tartarisco et al., 2015; Robitaille and McGuffin, 2019; Ishaque et al., 2020). Additional measures used by these studies are EDA (Cho et al., 2017; Ishaque et al., 2020), skin temperature (Cho et al., 2017), respiration (Ishaque et al., 2020), or motion activity (Tartarisco et al., 2015; Robitaille and McGuffin, 2019).

The kernel-based extreme learning machine of Cho et al. (2017) distinguished five stress levels and achieved an accuracy of over 95% in a leave-one-out cross validation. Their classifier was trained with PPG, EDA, and skin temperature signals that were gathered in a relatively simple setup with four finger electrodes. In an even simpler setup, with only one finger-worn PPG sensor and a linear discriminant analysis, Ham et al. (2017) achieved an accuracy of approximately 80% for three different classes. Tartarisco et al. (2015) took an approach with a wearable chest band. They collected ECG, respiration, and motion data and trained a neuro-fuzzy neural network that achieved an accuracy of 83% for four different classes.
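
A minimal Python sketch of such a setup is shown below: a linear discriminant analysis evaluated with leave-one-out cross validation on placeholder PPG-derived features and three placeholder stress levels. The data and feature names are invented, and in real studies a leave-one-subject-out scheme is usually preferable to avoid leakage between segments of the same person:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder features per recording segment, e.g. mean heart rate, RMSSD,
    # and LF/HF ratio derived from the PPG signal.
    X = rng.normal(size=(90, 3))
    y = rng.integers(0, 3, size=90)  # three hypothetical stress levels

    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
    print(f"leave-one-out accuracy: {scores.mean():.3f}")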

Traditionally, heart rate variability is regarded as one of the most important indicators of stress (Melillo et al., 2011; Kim et al., 2018). This coincides with our results, as the most commonly used signals for stress classification were PPG and ECG. This impression is also confirmed when considering non-classification approaches in VR. For example, if one looks at the VR adaptations of the Trier Social Stress Test mentioned in Section 4.1.4, one finds that in all the listed examples the heart activity is measured. Two other signals frequently used in research to indicate changes in stress level are EDA (Kurniawan et al., 2013; Anusha et al., 2017; Bhoja et al., 2020) and skin temperature (Vinkers et al., 2013; Herborn et al., 2015). Both signals and heart rate variability were compared by Cho et al. (2017) in VR. Results indicated that PPG and EDA provided more information about the stress level of the immersed people than the skin temperature, with PPG features being best suited for the distinction of stress. The combination of EDA and cardiovascular data seems to be a good compromise for measuring stress in VR.
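
As an illustration of how such cardiovascular features can be derived, the following Python sketch detects pulse peaks in a synthetic PPG trace and computes three common heart rate variability features (mean heart rate, SDNN, RMSSD). The peak-detection parameters and the synthetic signal are assumptions; real pipelines add filtering and artifact rejection:

    import numpy as np
    from scipy.signal import find_peaks

    def ppg_hrv_features(ppg, fs):
        """Toy HRV features from a PPG trace: mean heart rate, SDNN, RMSSD."""
        # Assume the heart rate stays below ~180 bpm -> peaks at least 0.33 s apart.
        peaks, _ = find_peaks(ppg, distance=int(0.33 * fs), prominence=0.3)
        ibi_ms = np.diff(peaks) / fs * 1000.0      # inter-beat intervals in ms
        return {
            "mean_hr_bpm": 60000.0 / ibi_ms.mean(),
            "sdnn_ms": ibi_ms.std(ddof=1),
            "rmssd_ms": np.sqrt(np.mean(np.diff(ibi_ms) ** 2)),
        }

    # Synthetic 30-second pulse wave at roughly 72 bpm, sampled at 100 Hz.
    fs = 100.0
    t = np.arange(0, 30, 1 / fs)
    ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
    print(ppg_hrv_features(ppg, fs))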

Our results also suggest that the better classification results were achieved with the help of more obtrusive sensors, like multiple electrodes on the fingers or the body (Cho et al., 2017; Ishaque et al., 2020). This is somewhat problematic, as many VR scenarios require a considerable amount of movement interaction, and individual electrodes distributed over the body could be bothersome. Approaches with more comfortable chest bands showed somewhat worse accuracy values, yet were able to effectively classify different levels of stress (Tartarisco et al., 2015; Robitaille and McGuffin, 2019). Future research could aim at improving the quality of stress indicators in VR based on unobtrusive sensors. In addition to chest bands, wrist-worn devices could be used, given that they can deliver the important cardiovascular and EDA signals. So far, the focus in these scenarios has been on creating stress; relaxation environments that do the opposite could also provide data to train future classifiers.

4.2.3 Cognitive Workload

As the name suggests, VR studies that work on the estimation of cognitive workload often used mentally demanding tasks whose difficulty can be manipulated. These can be abstract assignments like the n-back task (Tremmel et al., 2019; Tremmel, 2020) or a cube puzzle (Collins et al., 2019), but also more concrete scenarios like a flight simulator with different difficulty levels (Kakkos et al., 2019). It is in these scenarios that cognitive workload differs from the other characteristics of experience listed here. While the other experiences can be placed somewhere in the Circumplex Model of Affect and therefore have an emotional character, the focus here is on a mental effort that must be performed by the subjects. In contrast to the stress simulations, here it is purely a matter of the cognitive demands of the tasks and not of environmental factors that are supposed to create additional stress.

Cognitive workload classifiers most frequently worked with EEG signals (Kakkos et al., 2019; Siravenha et al., 2019; Tremmel et al., 2019; Tremmel, 2020). An exception to this is the work of Collins et al. (2019), who approached the classification of workload in VR with PPG and EDA signals.

It is also Collins et al. (2019) who reached the highest accuracy among the cognitive workload classifiers listed here. Based on cardiovascular information collected with a wristband, they created a random forest classifier that predicts three different levels of cognitive workload with an accuracy of 91.75%. Among the EEG-based approaches, Kakkos et al. (2019) report the highest accuracy. With data from 64 scalp electrodes, they trained a linear discriminant analysis classifier that reached an accuracy of 89% for a prediction of three different workload levels.
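
Since most of these classifiers build on spectral EEG features, the following Python sketch shows one common way to compute band-power features with Welch's method. The band boundaries, channel count, and sampling rate are assumptions, and the random input merely stands in for preprocessed EEG segments:

    import numpy as np
    from scipy.signal import welch

    def band_power_features(eeg, fs):
        """Average band power per channel; `eeg` has shape (n_channels, n_samples)."""
        bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
        freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
        features = []
        for low, high in bands.values():
            mask = (freqs >= low) & (freqs < high)
            features.append(psd[:, mask].mean(axis=-1))
        return np.concatenate(features)  # n_channels x n_bands values

    # Example: 10 seconds of random data standing in for 8-channel EEG at 250 Hz.
    rng = np.random.default_rng(3)
    eeg = rng.normal(size=(8, 2500))
    print(band_power_features(eeg, fs=250.0).shape)  # (32,)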

Older studies established heart rate features as the most reliable predictors of cognitive workload (Hancock et al., 1985; Vogt et al., 2006). More recent works argue for EEG data as the most promising signal for classifying workload (Christensen et al., 2012; Hogervorst et al., 2014). This trend is also visible in our results, as almost all of the classifiers for workload used EEG. However, Collins et al. (2019) showed that a cognitive workload classification in VR can also work with PPG signals. So both heart rate and EEG features seem usable for workload classification in VR. A recent review on the usage of physiological data to assess cognitive workload also shows that cardiovascular and EEG data are two main measures for this purpose (Charles and Nixon, 2019). Charles and Nixon (2019) report that the second most used signals for the assessment of cognitive workload are ocular measures, i.e. blink rate and pupil size. Those measures did not appear at all in the classifiers for workload that we found. Closing this gap could be a task for future research, especially because of the availability of sensors that allow capturing pupillometry data inside an HMD.
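
If future work wants to close this gap, features like blink rate and mean pupil diameter are straightforward to compute from the pupil stream of an HMD-integrated eye tracker. The following Python sketch assumes, as one common convention, that samples recorded with closed eyelids are reported as NaN; other SDKs use zeros or validity flags, so the data layout and thresholds here are illustrative assumptions:

    import numpy as np

    def ocular_features(pupil_mm, fs, min_blink_s=0.1):
        """Blink rate and mean pupil diameter from a pupil-diameter stream.

        Assumes samples recorded with closed eyelids are reported as NaN.
        """
        closed = np.isnan(pupil_mm)
        edges = np.diff(closed.astype(int))        # +1 = eye closes, -1 = eye opens
        starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
        if closed[0]:
            starts = np.r_[0, starts]
        if closed[-1]:
            ends = np.r_[ends, closed.size - 1]
        durations = (ends - starts) / fs
        blinks = int(np.sum(durations >= min_blink_s))
        minutes = pupil_mm.size / fs / 60.0
        return {"blink_rate_per_min": blinks / minutes,
                "mean_pupil_mm": float(np.nanmean(pupil_mm))}

    # Example: 60 seconds at 120 Hz with three artificial ~150 ms blinks.
    fs = 120.0
    pupil = np.full(int(60 * fs), 3.5)
    for start_s in (10, 30, 50):
        i = int(start_s * fs)
        pupil[i:i + 18] = np.nan
    print(ocular_features(pupil, fs))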

Regarding the sensors, we found that all the EEG devices used for workload classification were quite cumbersome (caps with multiple wet electrodes). The classification with more comfortable devices like the Emotiv Epoc or the Muse headset is still pending. Regarding pulse sensors, Collins et al. (2019) already showed that the data from a convenient wristband can be sufficient to distinguish workload levels; however, more examples are needed to confirm this impression.

4.2.4 Anxiety

The classification of anxiety is closely related to the virtual exposure therapies presented in Section 4.1.1. This becomes particularly clear when one considers the scenarios in which the data for the classifiers were gathered: either exposure to different altitudes (Hu et al., 2018; Wang et al., 2018; Bălan et al., 2020) or a speech in front of a crowd (Salkevicius et al., 2019).

The studies of Hu et al. (2018) and Wang et al. (2018) worked with more cumbersome sensors, i.e. over 30 scalp electrodes, for capturing EEG data. The convolutional neural network of Hu et al. (2018) reached an accuracy of 88.77% in a 10-fold cross validation when classifying four different levels of acrophobia. The support vector machine of Wang et al. (2018) reached an accuracy of 96.20% in a 5-fold cross validation, yet only distinguished three levels of fear.

Salkevicius et al. (2019) present a VR anxiety classification based on a wearable sensor. With the help of the Empatica E4 wristband, they collected PPG, EDA, and skin temperature data. They created a fusion-based support vector machine that classifies four different levels of anxiety. In a 10×10-fold cross validation, it reached an accuracy of 86.10%, which is comparable to what Hu et al. (2018) achieved with a more elaborate 30-electrode setup.
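
The following Python sketch illustrates one simple fusion strategy, feature-level fusion, in which per-segment features from PPG, EDA, and skin temperature are concatenated before training a support vector machine. All features, labels, and dimensions are placeholders, and this is not a reconstruction of the specific fusion scheme used by Salkevicius et al. (2019):

    import numpy as np
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    n_segments = 120

    # Placeholder per-segment features from the three wristband modalities.
    ppg_feats = rng.normal(size=(n_segments, 4))    # e.g. mean HR, RMSSD, SDNN, LF/HF
    eda_feats = rng.normal(size=(n_segments, 3))    # e.g. SCL, SCR count, SCR amplitude
    temp_feats = rng.normal(size=(n_segments, 2))   # e.g. mean and slope of skin temperature
    y = rng.integers(0, 4, size=n_segments)         # four hypothetical anxiety levels

    # Feature-level fusion: concatenate the modalities, then scale and classify.
    X = np.hstack([ppg_feats, eda_feats, temp_feats])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    print(f"10-fold accuracy: {cross_val_score(clf, X, y, cv=cv).mean():.3f}")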

Anxiety is usually characterized by sympathetic activation. Therefore, many past studies have found correlations between anxiety levels and numerous features of cardiovascular activity and EDA measurements (Kreibig, 2010). In VR applications, too, most researchers use heart rate variability and EDA data to make anxiety measurable (Marín-Morales et al., 2020). In our results, however, this combination only appeared in the study of Salkevicius et al. (2019). From this work, it can be concluded that heart rate variability, EDA, and skin temperature data are in general suitable for distinguishing different anxiety levels in VR. Moreover, it showed that the fusion of these three signals can considerably increase the quality of the prediction, which is particularly useful when using a wristband, like the Empatica E4, that conveniently delivers all of these signals.

Our results indicate the suitability of EEG data as a sensitive measure for anxiety in VR. Each of the fear-related approaches, except that of Salkevicius et al. (2019), used brain activity for classification. Additionally, the combination with cardiovascular measures seems to work well (Balan et al., 2019; Bălan et al., 2020). As with cognitive workload, the EEG data here has mainly been captured with comprehensive electrode setups. Future work could attempt classification with the more comfortable headsets. Future anxiety classification approaches could also include EMG signals from the orbicularis oculi muscle, a measure that can serve to identify startle responses (Maples-Keller et al., 2019; Mertens et al., 2019).

4.2.5 Other Classifiers

We also found classification approaches for more rarely assessed characteristics of experience. Orlosky et al. (2019) built a classifier that predicts whether a learner in a virtual environment understood a given term or not. Based on eye movement and pupil size data, they report a classification accuracy of 75%. Understanding is also a focus in the study of Collins et al. (2019). They use EDA information to recognize a moment of insight (Aha! moment). Also, the severity of cybersickness can be classified with physiological data. Jeong et al. (2019) used an Emotiv Epoc+ EEG headset to capture data for implementing a neural network. This network was able to detect whether someone feels sick or not with an accuracy of 98.82%. Just like cybersickness, the visual fatigue caused by an HMD is quite specific to VR. Wang Y. et al. (2019) built a classifier that could distinguish two levels of visual fatigue with an accuracy of 90.79%.

4.3 Limitations

Although this review provides a fairly comprehensive overview of the usage of physiological signals in VR, it is not without limitations. Of course, there are a variety of application areas for physiological data in VR that we have not addressed. Indeed, we have only reported a fraction of the papers that were left after the screening process. The scope of this review limits us to a rather superficial discussion of the specific fields; to generate a deeper understanding, one would have to dedicate a separate review to many of the topics. Moreover, we only focused on works that used physiological measures to gauge the state of the user. However, the measurements can also be used to issue active system commands, for example with the help of a brain-computer interface. Additionally, our discussion of classification approaches could only cover certain areas. We discussed the characteristics of experience and the signals with which they were assessed, but not the features of the specific signals.

5 Conclusion

The use of physiological measures in VR is very broad and versatile. In the first part of this review, we provided a structured overview of the field. We showed how physiological signals are used in therapy, training, and entertainment applications, as well as in connection with functional and general VR properties. We also highlighted how the knowledge obtained through physiological data is used. This ranges from the comparison of different stimuli, through the adaptation of the virtual environment, to statistical methods such as correlation. In the second part, we focused on classification approaches that show which characteristics of experience can be assessed with which measures and sensors. Approaches for the classification of arousal, valence, anxiety, stress, and cognitive workload were most prominent. EEG and cardiovascular data were most commonly used for the assessment of those dimensions. In many areas, simple and easy-to-use sensors were sufficient to distinguish different levels of an experience.

Data Availability Statement

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.

Author Contributions

AH conducted the search, processing, and ordering of the literature and took the lead in writing. ML has contributed to the categorization of the literature and the overall structuring of this review. He is also the supervisor of this project.

Funding

This research has been funded by the German Federal Ministry of Education and Research in the project VIA-VR (project number 16SV8444).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

Special thanks to Sebastian Oberdörfer and Samantha Monty for proofreading and feedback and to Murat Yalcin for assisting in machine learning questions.

Footnotes

1https://www.emotiv.com/epoc/

2https://choosemuse.com/de/muse-2/

3https://www.polar.com/

4https://www.fitbit.com/global/de/home

5https://www.apple.com/watch/

6https://www.empatica.com/research/e4/

7https://www.tobii.com

8https://pupil-labs.com

References

Ahmaniemi, T., Lindholm, H., Muller, K., and Taipalus, T. (2017). “Virtual Reality Experience as a Stress Recovery Solution in Workplace,” in 2017 IEEE Life Sciences Conference (LSC), 206–209.

Al-Khalidi, F. Q., Saatchi, R., Burke, D., Elphick, H., and Tan, S. (2011). Respiration Rate Monitoring Methods: A Review. Pediatr. Pulmonol. 46, 523–529. doi:10.1002/ppul.21416

Alchalabi, B., Faubert, J., and Labbe, D. R. (2019). “EEG Can Be Used to Measure Embodiment when Controlling a Walking Self-Avatar,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 776–783.

Alian, A. A., and Shelley, K. H. (2014). Photoplethysmography. Best Pract. Res. Clin. Anaesthesiol. 28, 395–406. doi:10.1016/j.bpa.2014.08.006

Almeida, J., Spofford, C., van ’t Wout, M., Unger, W., Philip, N., Carpenter, L., et al. (2016). Heart Rate Variability Responses to a Standardized Virtual Reality Exposure in Veterans With PTSD. Neuropsychopharmacology 41, S460–S461. doi:10.1038/npp.2016.242

Anderson, A. P., Mayer, M. D., Fellows, A. M., Cowan, D. R., Hegel, M. T., and Buckey, J. C. (2017). Relaxation with Immersive Natural Scenes Presented Using Virtual Reality. Aerosp. Med. Hum. Perform. 88, 520–526. doi:10.3357/AMHP.4747.2017

Anusha, A., Joy, J., Preejith, S., Joseph, J., and Sivaprakasam, M. (2017). “Differential Effects of Physical and Psychological Stressors on Electrodermal Activity,” in 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (IEEE), 4549–4552.

Armel, K. C., and Ramachandran, V. S. (2003). Projecting Sensations to External Objects: Evidence from Skin Conductance Response. Proc. R. Soc. Lond. B 270, 1499–1506. doi:10.1098/rspb.2003.2364

Athif, M., Rathnayake, B. L. K., Nagahapitiya, S. M. D. B. S., Samarasinghe, S. A. D. A. K., Samaratunga, P. S., Peiris, R. L., et al. (2020). “Using Biosignals for Objective Measurement of Presence in Virtual Reality Environments,” in 2020 42nd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 3035–3039.

Balan, O., Moise, G., Moldoveanu, A., Moldoveanu, F., and Leordeanu, M. (2019). “Automatic Adaptation of Exposure Intensity in vr Acrophobia Therapy, Based on Deep Neural Networks,” in 27th European Conference on Information Systems - Information Systems for a Sharing Society, ECIS 2019, Stockholm and Uppsala, Sweden, June 8–14, 2019.

Barreda-Ángeles, M., Aleix-Guillaume, S., and Pereda-Baños, A. (2020). Users' Psychophysiological, Vocal, and Self-Reported Responses to the Apparent Attitude of a Virtual Audience in Stereoscopic 360°-Video. Virtual Real. 24, 289–302. doi:10.1007/s10055-019-00400-1

Barsade, S. G., Ramarajan, L., and Westen, D. (2009). Implicit Affect in Organizations. Res. Organ. Behav. 29, 135–162. doi:10.1016/j.riob.2009.06.008

Bayan, S., Assaf, K., Yassin, M., Cherry, A., Raad, M., and Hamawy, L. (2018). “A Virtual Reality Assisted Rehabilitation System for Physical Therapy,” in 2018 International Conference on Computer and Applications (ICCA), 32–40.

Bekele, E., Wade, J., Bian, D., Fan, J., Swanson, A., Warren, Z., et al. (2016). “Multimodal Adaptive Social Interaction in Virtual Environment (masi-vr) for Children with Autism Spectrum Disorders (Asd),” in 2016 IEEE Virtual Reality (VR) (IEEE), 121–130.

Benedek, M., and Kaernbach, C. (2010). A Continuous Measure of Phasic Electrodermal Activity. J. Neurosci. Methods 190, 80–91. doi:10.1016/j.jneumeth.2010.04.028

Betka, S., Canzoneri, E., Adler, D., Herbelin, B., Bello-Ruiz, J., Kannape, O. A., et al. (2020). Mechanisms of the Breathing Contribution to Bodily Self-Consciousness in Healthy Humans: Lessons from Machine-Assisted Breathing?. Psychophysiology 57, e13564. doi:10.1111/psyp.13564

Bhoja, R., Guttman, O. T., Fox, A. A., Melikman, E., Kosemund, M., and Gingrich, K. J. (2020). Psychophysiological Stress Indicators of Heart Rate Variability and Electrodermal Activity With Application in Healthcare Simulation Research. Sim. Healthc. 15, 39–45. doi:10.1097/sih.0000000000000402

Bian, Y., Yang, C., Gao, F., Li, H., Zhou, S., Li, H., et al. (2016). A Framework for Physiological Indicators of Flow in VR Games: Construction and Preliminary Evaluation. Pers. Ubiquitous Comput. 20, 821–832. doi:10.1007/s00779-016-0953-5

Bigdely-Shamlo, N., Mullen, T., Kothe, C., Su, K.-M., and Robbins, K. A. (2015). The Prep Pipeline: Standardized Preprocessing for Large-Scale Eeg Analysis. Front. Neuroinform. 9, 16. doi:10.3389/fninf.2015.00016

Bilgin, P., Agres, K., Robinson, N., Wai, A. A. P., and Guan, C. (2019). “A Comparative Study of Mental States in 2D and 3D Virtual Environments Using EEG,” in 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 2833–2838.

Blanco, J. A., Vanleer, A. C., Calibo, T. K., and Firebaugh, S. L. (2019). Single-Trial Cognitive Stress Classification Using Portable Wireless Electroencephalography. Sensors 19, 499. doi:10.3390/s19030499

Blum, J., Rockstroh, C., and Goeritz, A. S. (2019). Heart Rate Variability Biofeedback Based on Slow-Paced Breathing With Immersive Virtual Reality Nature Scenery. Front. Psychol. 10, 2172. doi:10.3389/fpsyg.2019.02172

Botvinick, M., and Cohen, J. (1998). Rubber Hands ‘Feel’ Touch that Eyes See. Nature 391, 756. doi:10.1038/35784

Bozkir, E., Geisler, D., and Kasneci, E. (2019). “Person Independent, Privacy Preserving, and Real Time Assessment of Cognitive Load Using Eye Tracking in a Virtual Reality Setup,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 1834–1837.

Bradley, M. M., and Lang, P. J. (1994). Measuring Emotion: the Self-Assessment Manikin and the Semantic Differential. J. Behav. Ther. Exp. Psychiatry 25, 49–59. doi:10.1016/0005-7916(94)90063-9

Braithwaite, J. J., Watson, D. G., Jones, R., and Rowe, M. (2013). A Guide for Analysing Electrodermal Activity (eda) & Skin Conductance Responses (scrs) for Psychological Experiments. Psychophysiology 49, 1017–1034.

Breuninger, C., Slama, D. M., Kraemer, M., Schmitz, J., and Tuschen-Caffier, B. (2017). Psychophysiological Reactivity, Interoception and Emotion Regulation in Patients With Agoraphobia during Virtual Reality Anxiety Induction. Cogn. Ther. Res. 41, 193–205. doi:10.1007/s10608-016-9814-9

Browning, M. H. E. M., Mimnaugh, K. J., van Riper, C. J., Laurent, H. K., and LaValle, S. M. (2019). Can Simulated Nature Support Mental Health? Comparing Short, Single-Doses of 360-Degree Nature Videos in Virtual Reality With the Outdoors. Front. Psychol. 10, 2667. doi:10.3389/fpsyg.2019.02667

Bruder, L., and Peters, J. (2020). A Virtual Reality Setup for the Combined Assessment of Subjective, Behavioral and Psychophysiological Cue-Reactivity in Gambling Disorder. bioRxiv. doi:10.1101/2020.08.07.237826

Bălan, O., Moise, G., Moldoveanu, A., Leordeanu, M., and Moldoveanu, F. (2020). An Investigation of Various Machine and Deep Learning Techniques Applied in Automatic Fear Level Detection and Acrophobia Virtual Therapy. Sensors 20, 496. doi:10.3390/s20020496

Cairns, P. E., and Cox, A. L. (2008). Research Methods for Human-Computer Interaction. Cambridge, UK: Cambridge University Press.

Calabro, R. S., Naro, A., Russo, M., Leo, A., De Luca, R., Balletta, T., et al. (2017). The Role of Virtual Reality in Improving Motor Performance as Revealed by EEG: A Randomized Clinical Trial. J Neuroeng. Rehabil. 14, 53. doi:10.1186/s12984-017-0268-4

Campbell, J., and Fraser, M. (2019). “Switching it up: Designing Adaptive Interfaces for Virtual Reality Exergames,” in Proceedings of the 31st European Conference on Cognitive Ergonomics (New York, NY: Association for Computing Machinery), 177–184.

Carreiras, C., Alves, A. P., Lourenço, A., Canento, F., Silva, H., Fred, A., et al. (2015). BioSPPy: Biosignal Processing in Python. Dataset.

Castelvecchi, D. (2016). Low-Cost Headsets Boost Virtual Reality’s Lab Appeal. Nature 533, 153–154. doi:10.1038/533153a

Cebeci, B., Celikcan, U., and Capin, T. K. (2019). A Comprehensive Study of the Affective and Physiological Responses Induced by Dynamic Virtual Reality Environments. Comput. Animat. Virtual Worlds 30, e1893. doi:10.1002/cav.1893

Chang, T. P., Beshay, Y., Hollinger, T., and Sherman, J. M. (2019). Comparisons of Stress Physiology of Providers in Real-Life Resuscitations and Virtual Reality-Simulated Resuscitations. Simul. Healthc. 14, 104–112. doi:10.1097/SIH.0000000000000356

Charles, R. L., and Nixon, J. (2019). Measuring Mental Workload Using Physiological Measures: A Systematic Review. Appl. Ergon. 74, 221–232. doi:10.1016/j.apergo.2018.08.028

Cho, D., Ham, J., Oh, J., Park, J., Kim, S., Lee, N.-K., et al. (2017). Detection of Stress Levels from Biosignals Measured in Virtual Reality Environments Using a Kernel-Based Extreme Learning Machine. Sensors 17, 2435. doi:10.3390/s17102435

Christensen, J. C., Estepp, J. R., Wilson, G. F., and Russell, C. A. (2012). The Effects of Day-to-Day Variability of Physiological Data on Operator Functional State Classification. NeuroImage 59, 57–63. doi:10.1016/j.neuroimage.2011.07.091

Clifford, R. M. S., Engelbrecht, H., Jung, S., Oliver, H., Billinghurst, M., Lindeman, R. W., et al. (2020). Aerial Firefighter Radio Communication Performance in a Virtual Training System: Radio Communication Disruptions Simulated in VR for Air Attack Supervision. Vis. Comp.. doi:10.1007/s00371-020-01816-6

Clifford, R. M. S., Hoermann, S., Marcadet, N., Oliver, H., Billinghurst, M., and Lindeman, R. W. (2018). “Evaluating the Effects of Realistic Communication Disruptions in VR Training for Aerial Firefighting,” in 2018 10th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), 1–8.

Collins, J., Regenbrecht, H., Langlotz, T., Can, Y. S., Ersoy, C., and Butson, R. (2019). “Measuring Cognitive Load and Insight: A Methodology Exemplified in a Virtual Reality Learning Context,” in 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 351–362.

Corbetta, P. (2003). Social Research: Theory, Methods and Techniques. Sage.

Currie, J., Bond, R. R., McCullagh, P., Black, P., Finlay, D. D., Gallagher, S., et al. (2019). Wearable Technology-Based Metrics for Predicting Operator Performance during Cardiac Catheterisation. Int. J. Comput. Assist. Radiol. Surg. 14, 645–657. doi:10.1007/s11548-019-01918-0

Czub, M., and Kowal, M. (2019). Respiration Entrainment in Virtual Reality by Using a Breathing Avatar. Cyberpsychol. Behav. Soc. Netw. 22, 494–499. doi:10.1089/cyber.2018.0700

da Costa, R. T., de Carvalho, M. R., Ribeiro, P., and Nardi, A. E. (2018). Virtual Reality Exposure Therapy for Fear of Driving: Analysis of Clinical Characteristics, Physiological Response, and Sense of Presence. Braz. J. Psychiatry 40, 192–199. doi:10.1590/1516-4446-2017-2270

Dash, A., Yadav, A., and Lahiri, U. (2019). “Physiology-Sensitive Virtual Reality Based Strength Training Platform for Post-Stroke Grip Task,” in 2019 IEEE EMBS International Conference on Biomedical Health Informatics (BHI), 1–4.

De Asis, K. M. R., Guillem, E. J. P., Reyes, F. A. M., and Samonte, M. J. C. (2020). Serenity: A Stress-Relieving Virtual Reality Application Based on Philippine Environmental Variables. In Proceedings of the 2020 The 6th International Conference on Frontiers of Educational Technologies (New York, NY: Association for Computing Machinery), 155–159.

De Luca, C. (2006). Electromyography. Encyclopedia of Medical Devices and Instrumentation, Hoboken, USA: Wiley Online Library.

Debska, M., Polechonski, J., Mynarski, A., and Polechonski, P. (2019). Enjoyment and Intensity of Physical Activity in Immersive Virtual Reality Performed on Innovative Training Devices in Compliance with Recommendations for Health. Int. J. Environ. Res. Public Health 16, 3673. doi:10.3390/ijerph16193673

Delahaye, M., Lemoine, P., Cartwright, S., Deuring, G., Beck, J., Pflueger, M., et al. (2015). Learning Aptitude, Spatial Orientation and Cognitive Flexibility Tested in a Virtual Labyrinth after Virtual Stress Induction. BMC Psychol. 3, 22. doi:10.1186/s40359-015-0080-5

Delvigne, V., Ris, L., Dutoit, T., Wannous, H., and Vandeborre, J.-P. (2020). “Vera: Virtual Environments Recording Attention,” in 2020 IEEE 8th International Conference on Serious Games and Applications for Health (SeGAH) (IEEE), 1–7.

Deniaud, C., Honnet, V., Jeanne, B., and Mestre, D. (2015). “An Investigation into Physiological Responses in Driving Simulators: An Objective Measurement of Presence,” in 2015 Science and Information Conference (SAI), 739–748.

Desnoyers-Stewart, J., Stepanova, E. R., Pasquier, P., and Riecke, B. E. (2019). "JeL: Connecting through Breath in Virtual Reality," in Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, CHI EA '19, Glasgow, United Kingdom (New York, NY: Association for Computing Machinery), 1–6.

Detez, L., Greenwood, L.-M., Rave, R. S., Wilson, E., Chandler, T., Ries, T., et al. (2019). A Psychophysiological and Behavioural Study of Slot Machine Near-Misses Using Immersive Virtual Reality. J. Gambl. Stud. 35, 929–944. doi:10.1007/s10899-018-09822-z

Dey, A., Chatburn, A., and Billinghurst, M. (2019a). “Exploration of an EEG-Based Cognitively Adaptive Training System in Virtual Reality,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 220–226.

Dey, A., Chen, H., Hayati, A., Billinghurst, M., and Lindeman, R. W. (2019b). “Sharing Manipulated Heart Rate Feedback in Collaborative Virtual Environments,” in 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 248–257.

Dey, A., Chen, H., Zhuang, C., Billinghurst, M., and Lindeman, R. W. (2018). “Effects of Sharing Real-Time Multi-Sensory Heart Rate Feedback in Different Immersive Collaborative Virtual Environments,” in 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 165–173.

Dey, A., Phoon, J., Saha, S., Dobbins, C., and Billinghurst, M. (2020). “Neurophysiological Effects of Presence in Calm Virtual Environments,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 744–745.

Ding, J., He, Y., Chen, L., Zhu, B., Cai, Q., Chen, K., et al. (2019). Virtual Reality Distraction Decreases Pain during Daily Dressing Changes Following Haemorrhoid Surgery. J. Int. Med. Res. 47, 4380–4388. doi:10.1177/0300060519857862

Ding, N., Zhou, W., and Fung, A. Y. H. (2018). Emotional Effect of Cinematic VR Compared with Traditional 2D Film. Telemat. Inform. 35, 1572–1579. doi:10.1016/j.tele.2018.04.003

Ding, X., Li, Y., Li, D., Li, L., and Liu, X. (2020). Using Machine-Learning Approach to Distinguish Patients with Methamphetamine Dependence from Healthy Subjects in a Virtual Reality Environment. Brain Behav. 10, e01814. doi:10.1002/brb3.1814

Ding, Y., Robinson, N., Zeng, Q., Chen, D., Wai, A. A. P., Lee, T.-S., et al. (2020). “TSception:A Deep Learning Framework for Emotion Detection Using EEG,” in 2020 International Joint Conference on Neural Networks (IJCNN), 1–7.

Döllinger, N., Wienrich, C., and Latoschik, M. E. (2021). Challenges and Opportunities of Immersive Technologies for Mindfulness Meditation: A Systematic Review. Front. Virt. Real. 2, 29. doi:10.3389/frvir.2021.644683

Drolet, M., Yumbla, E. Q., Hobbs, B., and Artemiadis, P. (2020). “On the Effects of Visual Anticipation of Floor Compliance Changes on Human Gait: Towards Model-Based Robot-Assisted Rehabilitation,” in 2020 IEEE International Conference on Robotics and Automation (ICRA), 9072–9078.

Dunning, D., Heath, C., and Suls, J. M. (2004). Flawed Self-Assessment: Implications for Health, Education, and the Workplace. Psychol. Sci. Public Interest 5, 69–106. doi:10.1111/j.1529-1006.2004.00018.x

Ehgoetz Martens, K. A., Ellard, C. G., and Almeida, Q. J. (2015). Anxiety-Provoked Gait Changes are Selectively Dopa-Responsive in Parkinson’s Disease. Eur. J. Neurosci. 42, 2028–2035. doi:10.1111/ejn.12928

Ehgoetz Martens, K. A., Ellard, C. G., and Almeida, Q. J. (2016). Evaluating the Link Between Dopaminergic Treatment, Gait Impairment, and Anxiety in Parkinson’s Disease. Move. Disord. Clin. Pract. 3, 389–394. doi:10.1002/mdc3.12298

Ehrsson, H. H., Wiech, K., Weiskopf, N., Dolan, R. J., and Passingham, R. E. (2007). Threatening a Rubber Hand that You Feel is Yours Elicits a Cortical Anxiety Response. Proc. Natl. Acad. Sci. 104, 9828–9833. doi:10.1073/pnas.0610011104

Faller, J., Cummings, J., Saproo, S., and Sajda, P. (2019). Regulation of Arousal via Online Neurofeedback Improves Human Performance in a Demanding Sensory-Motor Task. Proc. Natl. Acad. Sci. U.S.A. 116, 6482–6490. doi:10.1073/pnas.1817207116

Felnhofer, A., Kothgassner, O. D., Hetterle, T., Beutl, L., Hlavacs, H., and Kryspin-Exner, I. (2014). Afraid to be There? Evaluating the Relation between Presence, Self-Reported Anxiety, and Heart Rate in a Virtual Public Speaking Task. Cyberpsychol. Behav. Soc. Netw. 17, 310–316. doi:10.1089/cyber.2013.0472

Felnhofer, A., Kothgassner, O. D., Schmidt, M., Heinzle, A.-K., Beutl, L., Hlavacs, H., et al. (2015). Is Virtual Reality Emotionally Arousing? Investigating Five Emotion Inducing Virtual Park Scenarios. Int. J. Hum. Comput. Stud. 82, 48–56. doi:10.1016/j.ijhcs.2015.05.004

Fernandez, J. A., Fusté, A., Richer, R., and Maes, P. (2019). “Deep Reality: An Underwater VR Experience to Promote Relaxation by Unconscious HR, EDA, and Brain Activity Biofeedback,” in ACM SIGGRAPH 2019 Virtual, Augmented, and Mixed Reality (New York, NY: Association for Computing Machinery).

Fominykh, M., Prasolova-Førland, E., Stiles, T. C., Krogh, A. B., and Linde, M. (2018). Conceptual Framework for Therapeutic Training With Biofeedback in Virtual Reality: First Evaluation of a Relaxation Simulator. J. Interact. Learn. Res. 29, 51–75.

Freire, R. C., Ferreira-Garcia, R., Cabo, M. C., Martins, R. M., and Nardi, A. E. (2020). Panic Attack Provocation in Panic Disorder Patients with a Computer Simulation. J. Affect. Disord. 264, 498–505. doi:10.1016/j.jad.2019.11.081

Gaillard, A. (1993). Comparing the Concepts of Mental Load and Stress. Ergonomics 36, 991–1005. doi:10.1080/00140139308967972

Gamito, P., Oliveira, J., Baptista, A., Morais, D., Lopes, P., Rosa, P., et al. (2014). Eliciting Nicotine Craving with Virtual Smoking Cues. Cyberpsychol. Behav. Soc. Netw. 17, 556–561. doi:10.1089/cyber.2013.0329

Gao, J., Liu, S., Feng, Q., Zhang, X., Jiang, M., Wang, L., et al. (2019). Subjective and Objective Quantification of the Effect of Distraction on Physician’s Workload and Performance During Simulated Laparoscopic Surgery. Med. Sci. Monit. 25, 3127–3132. doi:10.12659/MSM.914635

García-Rodríguez, O., Weidberg, S., Gutiérrez-Maldonado, J., and Secades-Villa, R. (2013). Smoking a Virtual Cigarette Increases Craving Among Smokers. Addict. Behav. 38, 2551–2554. doi:10.1016/j.addbeh.2013.05.007

Gavgani, A. M., Nesbitt, K. V., Blackmore, K. L., and Nalivaiko, E. (2017). Profiling Subjective Symptoms and Autonomic Changes Associated with Cybersickness. Auton. Neurosci. 203, 41–50. doi:10.1016/j.autneu.2016.12.004

Gersak, G., Lu, H., and Guna, J. (2020). Effect of VR Technology Matureness on VR Sickness. Multimed. Tools. Appl. 79, 14491–14507. doi:10.1007/s11042-018-6969-2

Gonzalez, D. S., Moro, A. D., Quintero, C., and Sarmiento, W. J. (2016). “Fear Levels in Virtual Environments, an Approach to Detection and Experimental User Stimuli Sensation,” in 2016 XXI Symposium on Signal Processing, Images and Artificial Vision (STSIVA), 1–6.

González-Franco, M., Peck, T. C., Rodríguez-Fornells, A., and Slater, M. (2014). A Threat to a Virtual Hand Elicits Motor Cortex Activation. Exp. Brain Res. 232, 875–887. doi:10.1007/s00221-013-3800-1

Gramfort, A., Luessi, M., Larson, E., Engemann, D. A., Strohmeier, D., Brodbeck, C., et al. (2013). Meg and Eeg Data Analysis with mne-Python. Front. Neurosci. 7, 267. doi:10.3389/fnins.2013.00267

Granato, M., Gadia, D., Maggiorini, D., and Ripamonti, L. A. (2020). An Empirical Study of Players’ Emotions in VR Racing Games Based on a Dataset of Physiological Data. Multimed. Tools Appl. 79, 33657–33686. doi:10.1007/s11042-019-08585-y

Greinacher, R., Kojić, T., Meier, L., Parameshappa, R. G., Möller, S., and Voigt-Antons, J. (2020). “Impact of Tactile and Visual Feedback on Breathing Rhythm and User Experience in VR Exergaming,” in 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX), 1–6.

Grimm, P. (2010). Social Desirability Bias. Hoboken, United States: Wiley.

Grübel, J., Weibel, R., Jiang, M. H., Hölscher, C., Hackman, D. A., and Schinazi, V. R. (2016). “Eve: A Framework for Experiments in Virtual Environments,” in Spatial Cognition X. Bremen, Germany: Springer, 159–176.

Guna, J., Gersak, G., Humar, I., Krebl, M., Orel, M., Lu, H., et al. (2020). Virtual Reality Sickness and Challenges behind Different Technology and Content Settings. Mob. Netw. Appl. 25, 1436–1445. doi:10.1007/s11036-019-01373-w

Guna, J., Gersak, G., Humar, I., Song, J., Drnovsek, J., and Pogacnik, M. (2019). Influence of Video Content Type on Users’ Virtual Reality Sickness Perception and Physiological Response. Future Gener. Comput. Syst. 91, 263–276. doi:10.1016/j.future.2018.08.049

Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., and Billinghurst, M. (2019). "AI We Trust: Investigating the Relationship between Biosignals, Trust and Cognitive Load in VR," in 25th ACM Symposium on Virtual Reality Software and Technology, VRST '19, Parramatta, Australia (New York, NY: Association for Computing Machinery).

Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., and Billinghurst, M. (2020). “Measuring Human Trust in a Virtual Assistant Using Physiological Sensing in Virtual Reality,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 756–765.

Haller, J. C., Jang, Y. H., Haller, J., Shaw, L., and Wünsche, B. C. (2019). “HIIT the Road: Using Virtual Spectator Feedback in HIIT-Based Exergaming,” in Proceedings of the Australasian Computer Science Week Multiconference, Sydney, Australia (New York, NY: Association for Computing Machinery).

Ham, J., Cho, D., Oh, J., and Lee, B. (2017). "Discrimination of Multiple Stress Levels in Virtual Reality Environments Using Heart Rate Variability," in 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 3989–3992.

Hancock, P. A., Meshkati, N., and Robertson, M. (1985). Physiological Reflections of Mental Workload. Aviat. Space Environ. Med. 56, 1110–1114.

Hart, S. G., and Staveland, L. E. (1988). Development of Nasa-Tlx (Task Load Index): Results of Empirical and Theoretical Research. Adv. Psychol. 52, 139–183. doi:10.1016/s0166-4115(08)62386-9

Herborn, K. A., Graves, J. L., Jerem, P., Evans, N. P., Nager, R., McCafferty, D. J., et al. (2015). Skin Temperature Reveals the Intensity of Acute Stress. Physiol. Behav. 152, 225–230. doi:10.1016/j.physbeh.2015.09.032

Herumurti, D., Yuniarti, A., Rimawan, P., and Yunanto, A. A. (2019). “Overcoming Glossophobia Based on Virtual Reality and Heart Rate Sensors,” in 2019 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), 139–144.

Hildebrandt, L. K., McCall, C., Engen, H. G., and Singer, T. (2016). Cognitive Flexibility, Heart Rate Variability, and Resilience Predict fine-grained Regulation of Arousal during Prolonged Threat. Psychophysiology 53, 880–890. doi:10.1111/psyp.12632

Hofmann, S. M., Klotzsche, F., Mariola, A., Nikulin, V. V., Villringer, A., and Gaebler, M. (2018). “Decoding Subjective Emotional Arousal during a Naturalistic VR Experience from EEG Using LSTMs,” in 2018 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 128–131.

Hogervorst, M. A., Brouwer, A.-M., and Van Erp, J. B. (2014). Combining and Comparing Eeg, Peripheral Physiology and Eye-Related Measures for the Assessment of Mental Workload. Front. Neurosci. 8, 322. doi:10.3389/fnins.2014.00322

Houzangbe, S., Christmann, O., Gorisse, G., and Richir, S. (2019). “Effects of Voluntary Heart Rate Control on User Engagement in Virtual Reality,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 982–983.

Houzangbe, S., Christmann, O., Gorisse, G., and Richir, S. (2018). “Fear as a Biofeedback Game Mechanic in Virtual Reality: Effects on Engagement and Perceived Usability,” in Proceedings of the 13th International Conference on the Foundations of Digital Games FDG ’18, Malmö, Sweden (New York, NY: Association for Computing Machinery).

Hoxhallari, E., Behr, I. J., Bradshaw, J. S., Morkos, M. S., Haan, P. S., Schaefer, M. C., et al. (2019). Virtual Reality Improves the Patient Experience during Wide-Awake Local Anesthesia No Tourniquet Hand Surgery: A Single-Blind, Randomized, Prospective Study. Plast. Reconstr. Surg. 144, 408–414. doi:10.1097/PRS.0000000000005831

Hu, F., Wang, H., Chen, J., and Gong, J. (2018). “Research on the Characteristics of Acrophobia in Virtual Altitude Environment,” in 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), 238–243.

Ishaque, S., Rueda, A., Nguyen, B., Khan, N., and Krishnan, S. (2020). “Physiological Signal Analysis and Classification of Stress from Virtual Reality Video Game,” in 2020 42nd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 867–870.

Jänig, W. (2008). Integrative Action of the Autonomic Nervous System: Neurobiology of Homeostasis. Cambridge, UK: Cambridge University Press.

Jeong, D., Yoo, S., and Yun, J. (2019). “Cybersickness Analysis with EEG Using Deep Learning Algorithms,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 827–835.

John, B. (2019). "Pupil Diameter as a Measure of Emotion and Sickness in VR," in Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, ETRA '19, Denver, CO (New York, NY: Association for Computing Machinery).

Kahlon, S., Lindner, P., and Nordgreen, T. (2019). Virtual Reality Exposure Therapy for Adolescents with Fear of Public Speaking: a Non-randomized Feasibility and Pilot Study. Child. Adolesc. Psychiatry Ment. Health 13, 47. doi:10.1186/s13034-019-0307-y

Kakkos, I., Dimitrakopoulos, G. N., Gao, L., Zhang, Y., Qi, P., Matsopoulos, G. K., et al. (2019). Mental Workload Drives Different Reorganizations of Functional Cortical Connectivity between 2D and 3D Simulated Flight Experiments. IEEE Trans. Neural Syst. Rehabil. Eng. 27, 1704–1713. doi:10.1109/TNSRE.2019.2930082

Kaminskas, V., and Sciglinskas, E. (2019). A Comparison of the Control Schemes of Human Response to a Dynamic Virtual 3D Face. Inf. Technol. Control 48, 250–267. doi:10.5755/j01.itc.48.2.21667

Kataoka, H., Kano, H., Yoshida, H., Saijo, A., Yasuda, M., and Osumi, M. (1998). “Development of a Skin Temperature Measuring System for Non-contact Stress Evaluation,” in Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE), 940–943.

Kaur, R., Sun, R., Ziegelman, L., Sowers, R., and Hernandez, M. E. (2019). “Using Virtual Reality to Examine the Neural and Physiological Anxiety-Related Responses to Balance-Demanding Target-Reaching Leaning Tasks,” in 2019 IEEE-RAS 19th International Conference on Humanoid Robots (Humanoids), 1–7.

Kennedy, R. S., Lane, N. E., Berbaum, K. S., and Lilienthal, M. G. (1993). Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 3, 203–220. doi:10.1207/s15327108ijap0303_3

Kerous, B., Bartecek, R., Roman, R., Sojka, P., Becev, O., and Liarokapis, F. (2020). Examination of Electrodermal and Cardio-Vascular Reactivity in Virtual Reality through a Combined Stress Induction Protocol. J. Ambient. Intell. Humaniz. Comput. 11, 6033–6042. doi:10.1007/s12652-020-01858-7

Khokhar, A., Yoshimura, A., and Borst, C. W. (2019). “Pedagogical Agent Responsive to Eye Tracking in Educational VR,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 1018–1019.

Kim, H.-G., Cheon, E.-J., Bai, D.-S., Lee, Y. H., and Koo, B.-H. (2018). Stress and Heart Rate Variability: A Meta-Analysis and Review of the Literature. Psychiatry Invest. 15, 235. doi:10.30773/pi.2017.08.17

Kirsch, K., Schatzschneider, C., Garber, C., Rosenberger, A., Kirsten, K., Ariza, O., et al. (2019). “KiVR Sports: Influencing the Users Physical Activity in VR by Using Audiovisual Stimuli in Exergames,” in Proceedings of Mensch Und Computer 2019, MuC’19, Hamburg, Germany (New York, NY: Association for Computing Machinery), 777–781.

Kishimoto, T., and Ding, X. (2019). The Influences of Virtual Social Feedback on Social Anxiety Disorders. Behav. Cogn. Psychother. 47, 726–735. doi:10.1017/S1352465819000377

Kivelä, O., Alavesa, P., Visuri, A., and Ojala, T. (2019). “Study on the Motivational and Physical Effects of Two VR Exergames,” in 2019 11th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), 1–2.

Kocur, M., Dechant, M. J., Lankes, M., Wolff, C., and Mandryk, R. (2020). “Eye Caramba: Gaze-Based Assistance for Virtual Reality Aiming and Throwing Tasks in Games,” in ACM Symposium on Eye Tracking Research and Applications, ETRA ’20, Stuttgart, Germany (New York, NY: Association for Computing Machinery).

Kojić, T., Nugyen, L. T., and Voigt-Antons, J. (2019). “Impact of Constant Visual Biofeedback on User Experience in Virtual Reality Exergames,” in 2019 IEEE International Symposium on Multimedia (ISM), 307–3073.

Kokkinara, E., Kilteni, K., Blom, K. J., and Slater, M. (2016). First Person Perspective of Seated Participants Over a Walking Virtual Body Leads to Illusory Agency Over the Walking. Sci. Rep. 6, 1–11. doi:10.1038/srep28879

Kothgassner, O. D., Felnhofer, A., Hlavacs, H., Beutl, L., Palme, R., Kryspin-Exner, I., et al. (2016). Salivary Cortisol and Cardiovascular Reactivity to a Public Speaking Task in a Virtual and Real-Life Environment. Comput. Hum. Behav. 62, 124–135. doi:10.1016/j.chb.2016.03.081

Kothgassner, O. D., Goreis, A., Kafka, J. X., Kaufmann, M., Atteneder, K., Beutl, L., et al. (2019). Virtual Social Support Buffers Stress Response: An Experimental Comparison of Real-Life and Virtual Support Prior to a Social Stressor. J. Behav. Ther. Exp. Psychiatry 63, 57–65. doi:10.1016/j.jbtep.2018.11.003

Koticha, P., Katge, F., Shetty, S., and Patil, D. P. (2019). Effectiveness of Virtual Reality Eyeglasses as a Distraction Aid to Reduce Anxiety Among 6-10-Year-Old Children Undergoing Dental Extraction Procedure. Int. J. Clin. Pediatr. Dent. 12, 297–302. doi:10.5005/jp-journals-10005-1640

Kreibig, S. D. (2010). Autonomic Nervous System Activity in Emotion: A Review. Biol. Psychol. 84, 394–421. doi:10.1016/j.biopsycho.2010.03.010

Krogmeier, C., Mousas, C., and Whittinghill, D. (2019). Human-Virtual Character Interaction: Toward Understanding the Influence of Haptic Feedback. Comput. Animat. Virtual Worlds 30, 1883. doi:10.1002/cav.1883

Kuriakose, S., Kunche, S., Narendranath, B., Jain, P., Sonker, S., and Lahiri, U. (2013). “A Step towards Virtual Reality Based Social Communication for Children with Autism,” in 2013 International Conference on Control, Automation, Robotics and Embedded Systems (CARE) (IEEE), 1–6.

Kurniawan, H., Maslov, A. V., and Pechenizkiy, M. (2013). “Stress Detection from Speech and Galvanic Skin Response Signals,” in Proceedings of the 26th IEEE International Symposium on Computer-Based Medical Systems (IEEE), 209–214.

Lai, C. Q., Ibrahim, H., Abdullah, M. Z., Abdullah, J. M., Suandi, S. A., and Azman, A. (2018). “Literature Survey on Applications of Electroencephalography (EEG),” in AIP Conference Proceedings (AIP Publishing LLC), 020070.

Laight, D. (2013). Overview of Peripheral Nervous System Pharmacology. Nurse Prescribing 11, 448–454. doi:10.12968/npre.2013.11.9.448

Latoschik, M. E., and Wienrich, C. (2021). Coherence and Plausibility, Not Presence?! Pivotal Conditions for XR Experiences and Effects, a Novel Model. arXiv.

LaViola, J. J. (2000). A Discussion of Cybersickness in Virtual Environments. ACM Sigchi Bull. 32, 47–56. doi:10.1145/333329.333344

Lee, S.-H., Kim, Y.-M., and Lee, B.-H. (2015). Effects of Virtual Reality-Based Bilateral Upper-Extremity Training on Brain Activity in Post-Stroke Patients. J. Phys. Ther. Sci. 27, 2285–2287. doi:10.1589/jpts.27.2285

Levy, F., Leboucher, P., Rautureau, G., and Jouvent, R. (2016). E-virtual Reality Exposure Therapy in Acrophobia: A Pilot Study. J. Telemed. Telecare 22, 215–220. doi:10.1177/1357633X15598243

Li, Y., Chiu, P., Yeh, S., and Zhou, C. (2017). “Effects of Virtual Reality and Augmented Reality on Induced Anxiety,” in 2017 5th International Conference on Enterprise Systems (ES), 132–138.

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., et al. (2009). The Prisma Statement for Reporting Systematic Reviews and Meta-Analyses of Studies that Evaluate Health Care Interventions: Explanation and Elaboration. J. Clin. Epidemiol. 62, e1–e34. doi:10.1016/j.jclinepi.2009.06.006

Líndal, P. J., Jóhannsdóttir, K. R., Kristjánsson, U., Lensing, N., Stühmeier, A., Wohlan, A., et al. (2018). “Comparison of Teleportation and Fixed Track Driving in VR,” in 2018 10th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), 1–7.

Liszio, S., Graf, L., and Masuch, M. (2018). The Relaxing Effect of Virtual Nature: Immersive Technology Provides Relief in Acute Stress Situations. Annu. Rev. CyberTherapy Telemed. 16, 87–93.

Liu, Y., Lin, Y., Wu, S., Chuang, C., Prasad, M., and Lin, C. (2014). “EEG-Based Driving Fatigue Prediction System Using Functional-Link-Based Fuzzy Neural Network,” in 2014 International Joint Conference on Neural Networks (IJCNN), 4109–4113.

Loreto, C. D., Chardonnet, J., Ryard, J., and Rousseau, A. (2018). “WoaH: A Virtual Reality Work-At-Height Simulator,” in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 281–288.

Lotte, F., Bougrain, L., Cichocki, A., Clerc, M., Congedo, M., Rakotomamonjy, A., et al. (2018). A Review of Classification Algorithms for EEG-Based Brain–Computer Interfaces: A 10 Year Update. J. Neural Eng. 15, 031005. doi:10.1088/1741-2552/aab2f2

Lou, J., Wang, Y., Nduka, C., Hamedi, M., Mavridou, I., Wang, F.-Y., et al. (2020). Realistic Facial Expression Reconstruction for VR HMD Users. IEEE Trans. Multimedia 22, 730–743. doi:10.1109/TMM.2019.2933338

Luck, M., and Aylett, R. (2000). Applying Artificial Intelligence to Virtual Reality: Intelligent Virtual Environments. Appl. Artif. Intell. 14, 3–32. doi:10.1080/088395100117142

Lugrin, J.-L., Ertl, M., Krop, P., Klüpfel, R., Stierstorfer, S., Weisz, B., et al. (2018). “Any “Body” There? - Avatar Visibility Effects in a Virtual Reality Game,” in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 17–24.

Lugrin, J.-L., Latoschik, M. E., Glémarec, Y., Bosser, A.-G., Chollet, M., and Lugrin, B. (2019a). “Towards Narrative-Driven Atmosphere for Virtual Classroom,” in Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, 1–6.

Lugrin, J.-L., Latt, J., and Latoschik, M. E. (2015). “Avatar Anthropomorphism and Illusion of Body Ownership in VR,” in 2015 IEEE Virtual Reality (VR) (IEEE), 229–230.

Lugrin, J.-L., Unruh, F., Landeck, M., Lamour, Y., Latoschik, M. E., Vogeley, K., et al. (2019b). “Experiencing Waiting Time in Virtual Reality,” in Proceedings of the 25th ACM Conference on Virtual Reality Software and Technology.

Ma, K., and Hommel, B. (2015). The Role of Agency for Perceived Ownership in the Virtual Hand Illusion. Conscious. Cogn. 36, 277–288. doi:10.1016/j.concog.2015.07.008

Ma, L., Zhao, X., Li, Z., Zhao, M., and Xu, Z. (2018). “A sEMG-Based Hand Function Rehabilitation System for Stroke Patients,” in 2018 3rd International Conference on Advanced Robotics and Mechatronics (ICARM), 497–502.

Macauda, G., Bertolini, G., Palla, A., Straumann, D., Brugger, P., and Lenggenhager, B. (2015). Binding Body and Self in Visuo-Vestibular Conflicts. Eur. J. Neurosci. 41, 810–817. doi:10.1111/ejn.12809

Malta, L. S., Giosan, C., Szkodny, L. E., Altemus, M. M., Rizzo, A. A., Silbersweig, D. A., et al. (2020). Development of a Virtual Reality Laboratory Stressor. Virtual Reality. doi:10.1007/s10055-020-00455-5

Maples-Keller, J. L., Rauch, S. A. M., Jovanovic, T., Yasinski, C. W., Goodnight, J. M., Sherrill, A., et al. (2019). Changes in Trauma-Potentiated Startle, Skin Conductance, and Heart Rate within Prolonged Exposure Therapy for PTSD in High and Low Treatment Responders. J. Anxiety Disord. 68, 102147. doi:10.1016/j.janxdis.2019.102147

Marín-Morales, J., Higuera-Trujillo, J. L., Greco, A., Guixeres, J., Llinares, C., Scilingo, E. P., et al. (2018). Affective Computing in Virtual Reality: Emotion Recognition from Brain and Heartbeat Dynamics Using Wearable Sensors. Sci. Rep. 8, 13657. doi:10.1038/s41598-018-32063-4

Marín-Morales, J., Llinares, C., Guixeres, J., and Alcañiz, M. (2020). Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing. Sensors 20, 5163. doi:10.3390/s20185163

Masaoka, Y., and Homma, I. (1997). Anxiety and Respiratory Patterns: Their Relationship during Mental Stress and Physical Load. Int. J. Psychophysiol. 27, 153–159. doi:10.1016/s0167-8760(97)00052-4

Mavridou, I., Seiss, E., Hamedi, M., Balaguer-Ballester, E., and Nduka, C. (2018a). “Towards Valence Detection from EMG for Virtual Reality Applications,” in 12th International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT 2018), Nottingham, United Kingdom, September 4–6, 2018.

Mavridou, I., Seiss, E., Kostoulas, T., Nduka, C., and Balaguer-Ballester, E. (2018b). “Towards an Effective Arousal Detection System for Virtual Reality,” in Proceedings of the Workshop on Human-Habitat for Health (H3): Human-Habitat Multimodal Interaction for Promoting Health and Well-Being in the Internet of Things Era, H3 ’18, Boulder, CO (New York, NY: Association for Computing Machinery).

McDonough, D. J., Pope, Z. C., Zeng, N., Liu, W., and Gao, Z. (2020). Comparison of College Students’ Blood Pressure, Perceived Exertion, and Psychosocial Outcomes During Virtual Reality, Exergaming, and Traditional Exercise: An Exploratory Study. Games Health J. 9, 290–296. doi:10.1089/g4h.2019.0196

Melillo, P., Bracale, M., and Pecchia, L. (2011). Nonlinear Heart Rate Variability Features for Real-Life Stress Detection. Case Study: Students Under Stress Due to university Examination. Biomed. Eng. Online 10, 96. doi:10.1186/1475-925x-10-96

Melnyk, R., Campbell, T., Holler, T., Cameron, K., Saba, P., Witthaus, M., et al. (2021). See Like an Expert: Gaze-Augmented Training Enhances Skill Acquisition in a Virtual Reality Robotic Suturing Task. J. Endourol. 35, 376–382. doi:10.1089/end.2020.0445

Mertens, G., Wagensveld, P., and Engelhard, I. M. (2019). Cue Conditioning Using a Virtual Spider Discriminates Between High and Low Spider Fearful Individuals. Comput. Hum. Behav. 91, 192–200. doi:10.1016/j.chb.2018.10.006

Mishra, N., and Folmer, E. (2018). “Measuring Physical Exertion in Virtual Reality Exercise Games,” in Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, Tokyo, Japan (New York, NY: Association for Computing Machinery).

Moon, S.-E., and Lee, J.-S. (2016). Implicit Analysis of Perceptual Multimedia Experience Based on Physiological Response: A Review. IEEE Trans. Multimed. 19, 340–353. doi:10.1109/TMM.2016.2614880

Morii, M., Sakagami, T., Masuda, S., Okubo, S., and Tamari, Y. (2017). How Does Response Bias Emerge in Lengthy Sequential Preference Judgments? Behaviormetrika 44, 575–591. doi:10.1007/s41237-017-0036-6

Mosquera, C., Galvan, B., Liu, E., De Vito, R., Ting, P., Costello, E. L., et al. (2019). “ANX Dread: A Virtual Reality Experience to Explore Anxiety during Task Completion,” in Proceedings of the 14th International Conference on the Foundations of Digital Games, FDG ’19, San Luis Obispo, CA (New York, NY: Association for Computing Machinery).

Mostajeran, F., Balci, M. B., Steinicke, F., Kühn, S., and Gallinat, J. (2020). “The Effects of Virtual Audience Size on Social Anxiety during Public Speaking,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 303–312.

Mueller, E., Stolz, C., and Endres, D. (2017). Threat-Conditioned Contexts Modulate the Late Positive Potential to Neutral vs. Angry Avatars - A Mobile-EEG/Virtual Reality Study. Psychophysiology 54, S2. doi:10.1111/psyp.12921

Muñoz, J. E., Paulino, T., Vasanth, H., and Baras, K. (2016). “PhysioVR: A Novel Mobile Virtual Reality Framework for Physiological Computing,” in 2016 IEEE 18th International Conference on e-Health Networking, Applications and Services (Healthcom) (IEEE), 1–6.

Murray, E. G., Neumann, D. L., Moffitt, R. L., and Thomas, P. R. (2016). The Effects of the Presence of Others During a Rowing Exercise in a Virtual Reality Environment. Psychol. Sport Exerc. 22, 328–336. doi:10.1016/j.psychsport.2015.09.007

Ong, T. L., Ruppert, M. M., Akbar, M., Rashidi, P., Ozrazgat-Baslanti, T., Bihorac, A., et al. (2020). Improving the Intensive Care Patient Experience With Virtual Reality-A Feasibility Study. Crit. Care Explor. 2, e0122. doi:10.1097/CCE.0000000000000122

Orlosky, J., Huynh, B., and Hollerer, T. (2019). “Using Eye Tracked Virtual Reality to Classify Understanding of Vocabulary in Recall Tasks,” in 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 66–667.

Parenthoen, M., Murie, F., and Thery, F. (2015). “The Sea Is Your Mirror,” in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games, MIG ’15, Paris, France (New York, NY: Association for Computing Machinery), 159–165.

Park, S. K., Yang, D. J., Uhm, Y. H., Heo, J. W., and Kim, J. H. (2016). The Effect of Virtual Reality-Based Eccentric Training on Lower Extremity Muscle Activation and Balance in Stroke Patients. J. Phys. Ther. Sci. 28, 2055–2058. doi:10.1589/jpts.28.2055

Patel, J., Fluet, G., Merians, A., Qiu, Q., Yarossi, M., Adamovich, S., et al. (2015). “Virtual Reality-Augmented Rehabilitation in the Acute Phase post-stroke for Individuals with Flaccid Upper Extremities: A Feasibility Study,” in 2015 International Conference on Virtual Rehabilitation (ICVR), 215–223.

Patel, J., Qiu, Q., Yarossi, M., Merians, A., Massood, S., Tunik, E., et al. (2017). Exploring the Impact of Visual and Movement Based Priming on a Motor Intervention in the Acute Phase post-stroke in Persons with Severe Hemiparesis of the Upper Extremity. Disabil. Rehabil. 39, 1515–1523. doi:10.1080/09638288.2016.1226419

Patibanda, R., Mueller, F. F., Leskovsek, M., and Duckworth, J. (2017). “Life Tree: Understanding the Design of Breathing Exercise Games,” in Proceedings of the Annual Symposium on Computer-Human Interaction in Play, Amsterdam, Netherlands (New York, NY: Association for Computing Machinery), 19–31.

Plouzeau, J., Chardonnet, J., and Merienne, F. (2018). “Using Cybersickness Indicators to Adapt Navigation in Virtual Reality: A Pre-Study,” in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 661–662. doi:10.1109/VR.2018.8446192

Posner, J., Russell, J. A., and Peterson, B. S. (2005). The Circumplex Model of Affect: An Integrative Approach to Affective Neuroscience, Cognitive Development, and Psychopathology. Dev. Psychopathol. 17, 715. doi:10.1017/S0954579405050340

Prachyabrued, M., Wattanadhirach, D., Dudrow, R. B., Krairojananan, N., and Fuengfoo, P. (2019). “Toward Virtual Stress Inoculation Training of Prehospital Healthcare Personnel: A Stress-Inducing Environment Design and Investigation of an Emotional Connection Factor,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 671–679.

Preuss, N., and Ehrsson, H. H. (2019). Full-Body Ownership Illusion Elicited by Visuo-Vestibular Integration. J. Exp. Psychol. Hum. Percept. Perform. 45, 209–223. doi:10.1037/xhp0000597

Pullman, S. L., Goodin, D. S., Marquinez, A. I., Tabbal, S., and Rubin, M. (2000). Clinical Utility of Surface Emg: Report of the Therapeutics and Technology Assessment Subcommittee of the American Academy of Neurology. Neurology 55, 171–177. doi:10.1212/wnl.55.2.171

Quintero, L., Papapetrou, P., and Muñoz, J. E. (2019). “Open-Source Physiological Computing Framework Using Heart Rate Variability in Mobile Virtual Reality Applications,” in 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR) (IEEE), 126–1267.

Rahman, Y., Asish, S. M., Fisher, N. P., Bruce, E. C., Kulshreshth, A. K., and Borst, C. W. (2020). “Exploring Eye Gaze Visualization Techniques for Identifying Distracted Students in Educational VR,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 868–877.

Ramdhani, N., Akpewila, F., Faizah, M., and Resibisma, B. (2019). “It’s So Real: Psychophysiological Reaction Towards Virtual Reality Exposure,” in 2019 5th International Conference on Science and Technology (ICST), 1–5.

Rangelova, S., Flutura, S., Huber, T., Motus, D., and André, E. (2019). “Exploration of Physiological Signals Using Different Locomotion Techniques in a VR Adventure Game,” in International Conference on Human-Computer Interaction (Springer), 601–616.

Rao, D. G., Havale, R., Nagaraj, M., Karobari, N. M., Latha, A. M., Tharay, N., et al. (2019). Assessment of Efficacy of Virtual Reality Distraction in Reducing Pain Perception and Anxiety in Children Aged 6-10 Years: A Behavioral Interventional Study. Int. J. Clin. Pediatr. Dent. 12, 510–513. doi:10.5005/jp-journals-10005-1694

Ravaja, N., Bente, G., Kätsyri, J., Salminen, M., and Takala, T. (2018). Virtual Character Facial Expressions Influence Human Brain and Facial EMG Activity in a Decision-Making Game. IEEE Trans. Affect. Comput. 9, 285–298. doi:10.1109/TAFFC.2016.2601101

Rebenitsch, L., and Owen, C. (2016). Review on Cybersickness in Applications and Visual Displays. Virtual Real. 20, 101–125. doi:10.1007/s10055-016-0285-9

Redline, C. D., Dillman, D. A., Carley-Baxter, L., and Creecy, R. (2003). “Factors that Influence Reading and Comprehension in Self-Administered Questionnaires,” in Workshop on Item-Nonresponse and Data Quality, Basel, Switzerland.

Richard, M.-O., and Toffoli, R. (2009). Language Influence in Responses to Questionnaires by Bilingual Respondents: A Test of the Whorfian Hypothesis. J. Bus. Res. 62, 987–994. doi:10.1016/j.jbusres.2008.10.016

Robertson, A., Khan, R., Fick, D., Robertson, W. B., Gunaratne, D. R., Yapa, S., et al. (2017). “The Effect of Virtual Reality in Reducing Preoperative Anxiety in Patients Prior to Arthroscopic Knee Surgery: A Randomised Controlled Trial,” in 2017 IEEE 5th International Conference on Serious Games and Applications for Health (SeGAH), 1–7.

Robins, R. W., Fraley, R. C., and Krueger, R. F. (2009). Handbook of Research Methods in Personality Psychology. New York, USA: Guilford Press.

Robitaille, P., and McGuffin, M. J. (2019). “Increased Affect-Arousal in VR Can Be Detected from Faster Body Motion with Increased Heart Rate,” in Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, I3D ’19, Montreal, Canada (New York, NY: Association for Computing Machinery).

Roth, D., and Latoschik, M. E. (2020). Construction of the Virtual Embodiment Questionnaire (VEQ). IEEE Trans. Vis. Comput. Graph. 26, 3546–3556. doi:10.1109/tvcg.2020.3023603

Roth, D., Lugrin, J.-L., Latoschik, M. E., and Huber, S. (2017). “Alpha IVBO - Construction of a Scale to Measure the Illusion of Virtual Body Ownership,” in Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 2875–2883.

Rowley, J. (2014). Designing and Using Research Questionnaires. Manage. Res. Rev. 37, 308–330.

Russell, J. A., and Mehrabian, A. (1977). Evidence for a Three-Factor Theory of Emotions. J. Res. Personal. 11, 273–294. doi:10.1016/0092-6566(77)90037-x

Saha, D. P., Martin, L. T., and Knapp, R. B. (2018). “Towards Defining a Quality-Metric for Affective Feedback in an Intelligent Environment,” in 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops) (IEEE), 609–614.

Sakamoto, K., Shirai, S., Orlosky, J., Nagataki, H., Takemura, N., Alizadeh, M., et al. (2020). “Exploring Pupillometry as a Method to Evaluate Reading Comprehension in VR-Based Educational Comics,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 422–426.

Salkevicius, J., Damasevicius, R., Maskeliunas, R., and Laukiene, I. (2019). Anxiety Level Recognition for Virtual Reality Therapy System Using Physiological Signals. Electronics 8. doi:10.3390/electronics8091039

Salminen, M., Järvelä, S., Ruonala, A., Harjunen, V., Jacucci, G., Hamari, J., et al. (2019). Evoking Physiological Synchrony and Empathy Using Social VR with Biofeedback. IEEE Trans. Affect. Comput. doi:10.1109/TAFFC.2019.2958657

Samson, A. C., Kreibig, S. D., Soderstrom, B., Wade, A. A., and Gross, J. J. (2016). Eliciting Positive, Negative and Mixed Emotional States: A Film Library for Affective Scientists. Cogn. Emot. 30, 827–856. doi:10.1080/02699931.2015.1031089

Schroeder, R. (2010). Being There Together: Social Interaction in Shared Virtual Environments. Oxford, UK: Oxford University Press.

Schwind, V., Knierim, P., Haas, N., and Henze, N. (2019). “Using Presence Questionnaires in Virtual Reality,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1–12.

Shiban, Y., Diemer, J., Brandl, S., Zack, R., Mühlberger, A., and Wüst, S. (2016a). Trier Social Stress Test In Vivo and in Virtual Reality: Dissociation of Response Domains. Int. J. Psychophysiol. 110, 47–55. doi:10.1016/j.ijpsycho.2016.10.008

Shiban, Y., Diemer, J., Müller, J., Brütting-Schick, J., Pauli, P., and Mühlberger, A. (2017). Diaphragmatic Breathing During Virtual Reality Exposure Therapy for Aviophobia: Functional Coping Strategy or Avoidance Behavior? A Pilot Study. BMC Psychiatry 17, 29. doi:10.1186/s12888-016-1181-2

Shiban, Y., Peperkorn, H., Alpers, G. W., Pauli, P., and Mühlberger, A. (2016b). Influence of Perceptual Cues and Conceptual Information on the Activation and Reduction of Claustrophobic Fear. J. Behav. Ther. Exp. Psychiatry 51, 19–26. doi:10.1016/j.jbtep.2015.11.002

Shumailov, I., and Gunes, H. (2017). “Computational Analysis of Valence and Arousal in Virtual Reality Gaming Using Lower Arm Electromyograms,” in 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), 164–169.

Simões, M., Bernardes, M., Barros, F., and Castelo-Branco, M. (2018). Virtual Travel Training for Autism Spectrum Disorder: Proof-Of-Concept Interventional Study. JMIR Serious Games 6, e5. doi:10.2196/games.8428

Singh, H., and Singh, J. (2012). Human Eye Tracking and Related Issues: A Review. Int. J. Sci. Res. Pub. 2, 1–9. doi:10.5296/ijhrs.v2i3.2217

Siravenha, A. C., Reis, M. N., Cordeiro, I., Tourinho, R. A., Gomes, B. D., and Carvalho, S. R. (2019). “Residual MLP Network for Mental Fatigue Classification in Mining Workers from Brain Data,” in 2019 8th Brazilian Conference on Intelligent Systems (BRACIS) (IEEE), 407–412.

Sirois, S., and Brisson, J. (2014). Pupillometry. Wiley Interdiscip. Rev. Cogn. Sci. 5, 679–692. doi:10.1002/wcs.1323

Skola, F., and Liarokapis, F. (2016). Examining the Effect of Body Ownership in Immersive Virtual and Augmented Reality Environments. Vis. Comput. 32, 761–770. doi:10.1007/s00371-016-1246-8

Slater, M., Pérez Marcos, D., Ehrsson, H., and Sanchez-Vives, M. V. (2008). Towards a Digital Body: The Virtual Arm Illusion. Front. Hum. Neurosci. 2, 6. doi:10.3389/neuro.09.006.2008

Slater, M., Usoh, M., and Steed, A. (1994). Depth of Presence in Virtual Environments. Presence: Teleoperators Virtual Environ. 3, 130–144. doi:10.1162/pres.1994.3.2.130

Solanki, D., and Lahiri, U. (2020). Adaptive Treadmill-Assisted Virtual Reality-Based Gait Rehabilitation for Post-Stroke Physical Reconditioning & Feasibility Study in Low-Resource Settings. IEEE Access 8, 88830–88843. doi:10.1109/ACCESS.2020.2994081

Soyka, F., Leyrer, M., Smallwood, J., Ferguson, C., Riecke, B. E., and Mohler, B. J. (2016). “Enhancing Stress Management Techniques Using Virtual Reality,” in Proceedings of the ACM Symposium on Applied Perception, SAP ’16, Anaheim, CA (New York, NY: Association for Computing Machinery), 85–88.

Spangler, D., Alam, S., Rahman, S., Crone, J., Robucci, R., Banerjee, N., et al. (2020). Multilevel Longitudinal Analysis of Shooting Performance as a Function of Stress and Cardiovascular Responses. IEEE Trans. Affect. Comput. doi:10.1109/TAFFC.2020.2995769

Stauffert, J.-P., Niebling, F., and Latoschik, M. E. (2018). “Effects of Latency Jitter on Simulator Sickness in a Search Task,” in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (IEEE), 121–127.

Stauffert, J.-P., Niebling, F., and Latoschik, M. E. (2020). Latency and Cybersickness: Impact, Causes and Measures: A Review. Front. Virtual Real. 1, 31. doi:10.3389/frvir.2020.582204

Streck, A., Stepnicka, P., Klaubert, J., and Wolbers, T. (2019). “Neomento SAD - VR Treatment for Social Anxiety,” in 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 245–2451.

Suhaimi, N. S. B., Mountstephens, J., and Teo, J. (2020). “Emotional State Classification with Distributed Random Forest, Gradient Boosting Machine and Naïve Bayes in Virtual Reality Using Wearable Electroencephalography and Inertial Sensing,” in 2020 IEEE 10th Symposium on Computer Applications Industrial Electronics (ISCAIE), 12–17.

Swidrak, J., and Pochwatko, G. (2019). “Being Touched by a Virtual Human: Relationships between Heart Rate, Gender, Social Status, and Compliance,” in Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents, IVA ’19, Paris, France (New York, NY: Association for Computing Machinery), 49–55.

Syrjamaki, A. H., Isokoski, P., Surakka, V., Pasanen, T. P., and Hietanen, J. K. (2020). Eye Contact in Virtual Reality - A Psychophysiological Study. Comput. Hum. Behav. 112, 106454. doi:10.1016/j.chb.2020.106454

Szczurowski, K., and Smith, M. (2017). “Measuring Presence: Hypothetical Quantitative Framework,” in 2017 23rd International Conference on Virtual System Multimedia (VSMM), 1–8.

Tartarisco, G., Carbonaro, N., Tonacci, A., Bernava, G. M., Arnao, A., Crifaci, G., et al. (2015). Neuro-Fuzzy Physiological Computing to Assess Stress Levels in Virtual Reality Therapy. Interact. Comput. 27, 521–533. doi:10.1093/iwc/iwv010

Tarvainen, M. P., Niskanen, J.-P., Lipponen, J. A., Ranta-Aho, P. O., and Karjalainen, P. A. (2014). Kubios HRV–Heart Rate Variability Analysis Software. Comp. Methods Programs Biomed. 113, 210–220. doi:10.1016/j.cmpb.2013.07.024

Taylor, S., Jaques, N., Chen, W., Fedor, S., Sano, A., and Picard, R. (2015). “Automatic Identification of Artifacts in Electrodermal Activity Data,” in 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (IEEE), 1934–1937.

Teo, J., and Chia, J. T. (2018). “Deep Neural Classifiers for Eeg-Based Emotion Recognition in Immersive Environments,” in 2018 International Conference on Smart Computing and Electronic Enterprise (ICSCEE), 1–6.

Thompson-Lake, D. G. Y., Cooper, K. N., Mahoney, J. J., Bordnick, P. S., Salas, R., Kosten, T. R., et al. (2015). Withdrawal Symptoms and Nicotine Dependence Severity Predict Virtual Reality Craving in Cigarette-Deprived Smokers. Nicotine Tob. Res. 17, 796–802. doi:10.1093/ntr/ntu245

Tieri, G., Gioia, A., Scandola, M., Pavone, E. F., and Aglioti, S. M. (2017). Visual Appearance of a Virtual Upper Limb Modulates the Temperature of the Real Hand: a thermal Imaging Study in Immersive Virtual Reality. Eur. J. Neurosci. 45, 1141–1151. doi:10.1111/ejn.13545

Topalovic, U., Aghajan, Z. M., Villaroman, D., Hiller, S., Christov-Moore, L., Wishard, T. J., et al. (2020). Wireless Programmable Recording and Stimulation of Deep Brain Activity in Freely Moving Humans. Neuron 108, 322–334. doi:10.1016/j.neuron.2020.08.021

Tremmel, C. (2020). Estimating Cognitive Workload in an Interactive Virtual Reality Environment Using Electrophysiological and Kinematic Activity. PhD dissertation. Norfolk, USA: Old Dominion University.

Tremmel, C., Herff, C., Sato, T., Rechowicz, K., Yamani, Y., and Krusienski, D. J. (2019). Estimating Cognitive Workload in an Interactive Virtual Reality Environment Using EEG. Front. Hum. Neurosci. 13. doi:10.3389/fnhum.2019.00401

Tremmel, C., and Krusienski, D. J. (2019). “EEG Spectral Conditioning for Cognitive-State Classification in Interactive Virtual Reality,” in 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 2806–2811.

Tsai, C.-F., Yeh, S.-C., Huang, Y., Wu, Z., Cui, J., and Zheng, L. (2018). The Effect of Augmented Reality and Virtual Reality on Inducing Anxiety for Exposure Therapy: A Comparison Using Heart Rate Variability. J. Healthc. Eng. 2018, 6357351. doi:10.1155/2018/6357351

Tsakiris, M., Prabhu, G., and Haggard, P. (2006). Having a Body versus Moving Your Body: How Agency Structures Body-Ownership. Conscious. Cogn. 15, 423–432. doi:10.1016/j.concog.2005.09.004

Van Gent, P., Farah, H., Nes, N., and van Arem, B. (2018). “Heart Rate Analysis for Human Factors: Development and Validation of an Open Source Toolkit for Noisy Naturalistic Heart Rate Data,” in Proceedings of the 6th HUMANIST Conference, 173–178.

Vanderlei, L., Pastre, C., Hoshi, R., Carvalho, T., and Godoy, M. (2009). Basic Notions of Heart Rate Variability and its Clinical Applicability. Braz. J. Cardiovasc. Surg. 24, 205–217. doi:10.1590/s0102-76382009000200018

Vince, J. (2004). Introduction to Virtual Reality. London, UK: Springer Science & Business Media.

Vinkers, C. H., Penning, R., Hellhammer, J., Verster, J. C., Klaessens, J. H., Olivier, B., et al. (2013). The Effect of Stress on Core and Peripheral Body Temperature in Humans. Stress 16, 520–530. doi:10.3109/10253890.2013.807243

Vogt, J., Hagemann, T., and Kastner, M. (2006). The Impact of Workload on Heart Rate and Blood Pressure in En-Route and tower Air Traffic Control. J. Psychophysiol. 20, 297–314. doi:10.1027/0269-8803.20.4.297

Volante, M., Babu, S. V., Chaturvedi, H., Newsome, N., Ebrahimi, E., Roy, T., et al. (2016). Effects of Virtual Human Appearance Fidelity on Emotion Contagion in Affective Inter-personal Simulations. IEEE Trans. Visual. Comput. Graph. 22, 1326–1335. doi:10.1109/TVCG.2016.2518158

Volonte, M., Hsu, Y., Liu, K., Mazer, J. P., Wong, S., and Babu, S. V. (2020). “Effects of Interacting with a Crowd of Emotional Virtual Humans on Users’ Affective and Non-Verbal Behaviors,” in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 293–302.

Vourvopoulos, A., Pardo, O. M., Lefebvre, S., Neureither, M., Saldana, D., Jahng, E., et al. (2019). Effects of a Brain-Computer Interface With Virtual Reality (VR) Neurofeedback: A Pilot Study in Chronic Stroke Patients. Front. Hum. Neurosci. 13. doi:10.3389/fnhum.2019.00210

Wagner, J., Lingenfelser, F., Baur, T., Damian, I., Kistler, F., and André, E. (2013). “The Social Signal Interpretation (Ssi) Framework: Multimodal Signal Processing and Recognition in Real-Time,” in Proceedings of the 21st ACM International Conference on Multimedia, 831–834.

Wang, Q., Wang, H., and Hu, F. (2018). “Combining EEG and VR Technology to Assess Fear of Heights,” in 2018 9th International Conference on Information Technology in Medicine and Education (ITME), 110–114.

Wang, X., Shi, Y., Zhang, B., and Chiang, Y. (2019). The Influence of Forest Resting Environments on Stress Using Virtual Reality. Int. J. Environ. Res. Public Health 16, 3263. doi:10.3390/ijerph16183263

Wang, Y.-G., Liu, M.-H., and Shen, Z.-H. (2019). A Virtual Reality Counterconditioning Procedure to Reduce Methamphetamine Cue-Induced Craving. J. Psychiatr. Res. 116, 88–94. doi:10.1016/j.jpsychires.2019.06.007

Wang, Y., Zhai, G., Chen, S., Min, X., Gao, Z., and Song, X. (2019). Assessment of Eye Fatigue Caused by Head-Mounted Displays Using Eye-Tracking. Biomed. Eng. Online 18, 111. doi:10.1186/s12938-019-0731-5

Watson, D., Clark, L. A., and Tellegen, A. (1988). Development and Validation of Brief Measures of Positive and Negative Affect: The PANAS Scales. J. Personal. Soc. Psychol. 54, 1063. doi:10.1037/0022-3514.54.6.1063

Weibel, R. P., Grübel, J., Zhao, H., Thrash, T., Meloni, D., Hölscher, C., et al. (2018). Virtual Reality Experiments with Physiological Measures. J. Vis. Exp. (138), 58318. doi:10.3791/58318

Witmer, B. G., and Singer, M. J. (1998). Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence 7, 225–240. doi:10.1162/105474698565686

Wolf, E., Döllinger, N., Mal, D., Wienrich, C., Botsch, M., and Latoschik, M. E. (2020). “Body Weight Perception of Females Using Photorealistic Avatars in Virtual and Augmented Reality,” in 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

Wong, C. L., Li, C. K., Chan, C. W. H., Choi, K. C., Chen, J., Yeung, M. T., et al. (2020). Virtual Reality Intervention Targeting Pain and Anxiety Among Pediatric Cancer Patients Undergoing Peripheral Intravenous Cannulation: A Randomized Controlled Trial. Cancer Nurs. [Epub ahead of print]. doi:10.1097/NCC.0000000000000844

Xie, B., Zhang, Y., Huang, H., Ogawa, E., You, T., and Yu, L.-F. (2018). Exercise Intensity-Driven Level Design. IEEE Trans. Vis. Comp. Graph. 24, 1661–1670. doi:10.1109/TVCG.2018.2793618

Xu, T., Yin, R., Shu, L., and Xu, X. (2019). “Emotion Recognition Using Frontal EEG in VR Affective Scenes,” in 2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC), 1–4.

Yang, S., He, Y., and Zheng, X. (2019). “FoVR: Attention-Based VR Streaming through Bandwidth-Limited Wireless Networks,” in 2019 16th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), 1–9.

Yeh, S., Li, Y., Zhou, C., Chiu, P., and Chen, J. (2018). Effects of Virtual Reality and Augmented Reality on Induced Anxiety. IEEE Trans. Neural Syst. Rehabil. Eng. 26, 1345–1352. doi:10.1109/TNSRE.2018.2844083

Yong-Guang, W., Zhi-Hua, S., and Xuan-Chen, W. (2018). Detection of Patients with Methamphetamine Dependence with Cue-Elicited Heart Rate Variability in a Virtual Social Environment. Psychiatry Res. 270, 382–388. doi:10.1016/j.psychres.2018.10.009

Yoo, S., Parker, C., and Kay, J. (2018). “Adapting Data from Physical Activity Sensors for Visualising Exertion in Virtual Reality Games,” in Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers UbiComp ’18, Singapore, Singapore (New York, NY: Association for Computing Machinery), 307–310.

Yoshimura, A., Khokhar, A., and Borst, C. W. (2019). “Eye-Gaze-Triggered Visual Cues to Restore Attention in Educational VR,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 1255–1256.

Yu, C.-P., Lee, H.-Y., and Luo, X.-Y. (2018). The Effect of Virtual Reality forest and Urban Environments on Physiological and Psychological Responses. Urban For. Urban Green. 35, 106–114. doi:10.1016/j.ufug.2018.08.013

Zeng, N., Pope, Z., and Gao, Z. (2017). Acute Effect of Virtual Reality Exercise Bike Games on College Students’ Physiological and Psychological Outcomes. Cyberpsychology Behav. Soc. Netw. 20, 453–457. doi:10.1089/cyber.2017.0042

Zhang, S., Zhang, Y., Sun, Y., Thakor, N., and Bezerianos, A. (2017). “Graph Theoretical Analysis of EEG Functional Network during Multi-Workload Flight Simulation experiment in Virtual Reality Environment,” in 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 3957–3960.

Zhang, W., Shu, L., Xu, X., and Liao, D. (2017). “Affective Virtual Reality System (AVRS): Design and Ratings of Affective VR Scenes,” in 2017 International Conference on Virtual Reality and Visualization (ICVRV) (IEEE), 311–314.

Zheng, L. J., Mountstephens, J., and Teo, J. (2020). Four-Class Emotion Classification in Virtual Reality Using Pupillometry. J. Big Data 7, 1–9. doi:10.1186/s40537-020-00322-9

Zhu, L., Tian, X., Xu, X., and Shu, L. (2019). “Design and Evaluation of the Mental Relaxation VR Scenes Using Forehead EEG Features,” in 2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC), 1–4.

Zimmer, P., Buttlar, B., Halbeisen, G., Walther, E., and Domes, G. (2019). Virtually Stressed? A Refined Virtual Reality Adaptation of the Trier Social Stress Test (TSST) Induces Robust Endocrine Responses. Psychoneuroendocrinology 101, 186–192. doi:10.1016/j.psyneuen.2018.11.010

Keywords: virtual reality, use cases, sensors, tools, biosignals, psychophysiology, HMD (Head-Mounted Display), systematic review

Citation: Halbig A and Latoschik ME (2021) A Systematic Review of Physiological Measurements, Factors, Methods, and Applications in Virtual Reality. Front. Virtual Real. 2:694567. doi: 10.3389/frvir.2021.694567

Received: 13 April 2021; Accepted: 17 June 2021;
Published: 14 July 2021.

Edited by:

Missie Smith, Independent Researcher, United States

Reviewed by:

Frank Guan, Singapore Institute of Technology, Singapore
Mariano Alcañiz, Universitat Politècnica de València, Spain

Copyright © 2021 Halbig and Latoschik. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Andreas Halbig, andreas.halbig@uni-wuerzburg.de

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.