<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="review-article" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Comput. Sci.</journal-id>
<journal-title>Frontiers in Computer Science</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Comput. Sci.</abbrev-journal-title>
<issn pub-type="epub">2624-9898</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">557608</article-id>
<article-id pub-id-type="doi">10.3389/fcomp.2021.557608</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Computer Science</subject>
<subj-group>
<subject>Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Applications of Biological and Physiological Signals in Commercial Video Gaming and Game Research: A&#x20;Review</article-title>
<alt-title alt-title-type="left-running-head">Hughes and Jorda</alt-title>
<alt-title alt-title-type="right-running-head">Physiological Signals in Video Gaming</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Hughes</surname>
<given-names>Alayna</given-names>
</name>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/868563/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Jorda</surname>
<given-names>Sergi</given-names>
</name>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/794978/overview"/>
</contrib>
</contrib-group>
<aff>Multimodal Interaction Group, Music Technology Group, ETIC, Universitat Pompeu Fabra, <addr-line>Barcelona</addr-line>, <country>Spain</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/586182/overview">Hugo Pl&#xe1;cido da Silva</ext-link>, Institute of Telecommunications (IT), Portugal</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/214441/overview">Benjamin Cowley</ext-link>, University of Helsinki, Finland</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1165895/overview">Ivan Miguel Pires</ext-link>, Universidade da Beira Interior, Portugal</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Alayna Hughes, <email>alayna.hughes@upf.edu</email>; Sergi Jorda, <email>sergi.jorda@upf.edu</email>
</corresp>
<fn fn-type="other">
<p>This article was submitted to Human&#x2013;Media Interaction, a section of the journal Frontiers in Computer Science</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>11</day>
<month>05</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>3</volume>
<elocation-id>557608</elocation-id>
<history>
<date date-type="received">
<day>30</day>
<month>04</month>
<year>2020</year>
</date>
<date date-type="accepted">
<day>16</day>
<month>04</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2021 Hughes and Jorda.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>Hughes and Jorda</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these&#x20;terms.</p>
</license>
</permissions>
<abstract>
<p>Video gaming is now available as a fully immersive experience, with inputs and outputs that respond to the user, and some experimental developers have integrated the voice, brain, or muscles as input controls. Physiological signal equipment can provide valuable information regarding the emotion of a player or patient during gameplay. In this article, we discuss five of the most common biosignals used in gaming research, their functions, and the devices that may be used to measure them. We break down those individual signals and present examples of research studies that implement them. We also discuss the usage of biological signals within commercial gaming and conclude with some possible future directions for the use of biological signals in gaming and game research.</p>
</abstract>
<kwd-group>
<kwd>EMG</kwd>
<kwd>video games</kwd>
<kwd>EEG</kwd>
<kwd>biofeedback</kwd>
<kwd>eye tracking</kwd>
<kwd>ECG</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>Game companies and researchers have been exploring the benefits of using physiological sensing devices within their video games and their research. They have experimented with brain-controlled gaming for stress control and used biosignals to retrieve emotional and interaction information from users during gameplay. Researchers have also integrated biosensors into games, to aid in physical rehabilitation as well as in the treatment of some attention and neurological disorders. In many of these studies, games have been paired with biosensors, both to record the biological data of the participants during play and to make the games adaptive using the data from those biosignals.</p>
<p>A biosignal is any signal from living beings that can be continually measured and monitored. Although the term is often used to refer to bioelectrical signals, it may indeed refer to both electrical and nonelectrical signals. Biosignals can come from different parts of the body, such as the skin, brain, heart, muscles, or eyes. Among the bioelectrical signals commonly used in research, as well as being integrated into some commercial devices, are electroencephalography (EEG), electrocardiography (ECG), electromyography (EMG), electrooculography (EOG), and electrodermal activity (EDA).</p>
<p>Within a game, these biosignals can be used for different applications. They may be used as inputs to control certain aspects or objects within a game. Biosignals that can act as controllers or data input typically include (but are not limited to): eye movements and gaze, muscle activity, skin conductance level, heart rate variation, and brain activity. In this article, we will present examples of how biosignals have been used within gaming research to aid persons with disabilities. Biosignals are also often used for therapeutic and rehabilitation purposes; EMG, in particular, has been used to train persons who are learning to use prostheses.</p>
<p>The rest of this article is structured as follows. In <italic>Overview of Biosignals</italic> we introduce each of the most commonly used biosignals and physiological signals within medical and gaming research, and in <italic>Applications of Biosignals in Gaming</italic>, we describe their applications within commercial and serious games. In the final section, we discuss the possibilities for future implementations.</p>
</sec>
<sec id="s2">
<title>Overview of Biosignals</title>
<p>In this section, we introduce the five biosignals that will be discussed in this article, namely electromyography (EMG), electroencephalography (EEG), electrodermal activity (EDA), electrooculography (EOG) together with camera-based eye tracking, and electrocardiography (ECG). Widely used in research, they can all be accessed today through noninvasive means and consumer devices, which makes them easily retrievable, even for non-researchers. New scientific and consumer hardware, along with its functionality and purpose, will also be introduced.</p>
<sec id="s2-1">
<title>Electromyography</title>
<p>Electromyography (EMG) is a technique used to measure and record the electrical activity generated by the muscles (<xref ref-type="bibr" rid="B9">Athavale and Krishnan, 2017</xref>). EMG has been widely used for rehabilitation and for control of prosthetic limbs, and also as input control for gaming, primarily by developers and researchers. A popular commercial EMG device, the MYO armband (<xref ref-type="fig" rid="F1">Figure&#x20;1A</xref>), has been used for research purposes and has also been a hacker&#x2019;s favorite in finding novel uses for EMG in nonmedical fields.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>
<bold>(A)</bold> Thalmic Labs MYO armband. Thalmic Labs 2013&#x2013;2018. <bold>(B)</bold> Imec EEG unit. The unit uses dry rubber electrodes for conductance. The unit can easily be integrated to be worn with a head-mounted device for monitoring brain signals while experiencing Virtual Reality. <bold>(C)</bold> &#x201c;Gamer using a Tobii Eye Tracker mounted on a monitor&#x201d; Tobii AB. <bold>(D)</bold> &#x201c;Bitalino Board Kit. Input for multiple biosignals including EDA and ECG.&#x201d; PLUX Wireless Biosignals S.A.</p>
</caption>
<graphic xlink:href="fcomp-03-557608-g001.tif"/>
</fig>
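As a concrete illustration of how a raw EMG stream can be turned into a game input, the following Python sketch rectifies and smooths a simulated signal and thresholds the resulting envelope into an on/off "button". This is our own minimal example on synthetic data; the function names and threshold values are illustrative and are not taken from the MYO armband or any study discussed here.

```python
import numpy as np

def emg_envelope(emg, fs, window_s=0.1):
    """Rectify raw EMG and smooth it with a moving average to obtain an amplitude envelope."""
    rectified = np.abs(emg)
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def contraction_active(emg, fs, threshold=0.2):
    """Per-sample boolean 'button press': True where the envelope exceeds the threshold."""
    return emg_envelope(emg, fs) > threshold

# Synthetic demo: one second of quiet baseline, then one second of simulated contraction.
fs = 1000
rng = np.random.default_rng(1)
rest = 0.02 * rng.standard_normal(fs)   # low-amplitude resting noise
burst = 0.5 * rng.standard_normal(fs)   # high-amplitude muscle burst
active = contraction_active(np.concatenate([rest, burst]), fs)
print(active[: fs - 100].any(), active[fs + 100 : 2 * fs - 100].mean() > 0.9)  # expected: False True
```

A real controller would add per-user calibration of the threshold and some hysteresis, so the "button" does not flicker when the envelope hovers near the threshold.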
</sec>
<sec id="s2-2">
<title>Electroencephalography</title>
<p>Electroencephalography (EEG) is the measurement of the electrical activity of the brain, through the use of electrodes that are placed at certain points on the scalp, to assess the activity of specific regions of the brain. Brain&#x2013;computer interfaces (BCIs) are EEG devices that communicate between the neural pathways and a computer. These have been developed to decode brain signals and translate them into a computer program (<xref ref-type="bibr" rid="B1">Abdulkader et&#x20;al., 2015</xref>). Researchers are able to perform tests using EEG, to understand how the brain responds to certain stimuli, such as music, or how the brain behaves in a person experiencing sleep disturbances or epilepsy. These devices have simplified the retrieval of the brain&#x2019;s electrical signals, so that even non-researchers can now use them to interact with their brainwaves.</p>
<p>There are several consumer-level EEG headsets on the market such as the Muse headband,<xref ref-type="fn" rid="FN1">
<sup>1</sup>
</xref> the Emotiv Insight,<xref ref-type="fn" rid="FN2">
<sup>2</sup>
</xref> the EPOC,<xref ref-type="fn" rid="FN3">
<sup>3</sup>
</xref> the Imec EEG<xref ref-type="fn" rid="FN4">
<sup>4</sup>
</xref> headset (<xref ref-type="fig" rid="F1">Figure&#x20;1B</xref>), or the Galea by Open BCI<xref ref-type="fn" rid="FN5">
<sup>5</sup>
</xref> (to be released in 2022). These headsets tend to come with proprietary software that allows the user to calibrate the device easily, provides artifact filtering, and offers data readouts for separate brainwaves, brain regions, and emotions. On the other hand, because of this proprietary nature, headsets such as the Emotiv and the Muse cannot be used with external software without substantial revision and hacking of the company software. Devices such as the Imec headset can be mounted within a VR headset and use flexible, dry polymer electrodes. Although researchers prefer headsets with wet electrodes (electrodes applied with conductive paste for better signal transmission), an analysis of the electrodes used in the Imec headset showed no significant differences in signal quality when tested against conventional wet electrodes (<xref ref-type="bibr" rid="B19">Chen et&#x20;al., 2014</xref>).</p>
<p>BCIs use two main methods for measuring responses in brain activity. In the method of steady-state visually evoked potentials (SSVEP), each visual stimulus flickers at a distinct, constant frequency (<xref ref-type="bibr" rid="B4">Allison, et&#x20;al., 2008</xref>). A stimulus that is presented at a frequency of 6&#x2013;8&#x20;Hz is more likely to elicit an SSVEP (<xref ref-type="bibr" rid="B40">M&#xfc;ller and Hillyard, 2000</xref>). In a BCI that uses SSVEPs, the EEG activity at each stimulus frequency is measured to infer the intent of the user. Examples of steady-state methods can be seen in the studies mentioned in this article.</p>
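To make the steady-state approach concrete, the sketch below estimates which of several flicker frequencies a user is attending to by comparing EEG spectral power at each candidate frequency. It is our own illustration on a synthetic signal; the helper names (`band_power`, `classify_ssvep`) and parameter values are hypothetical, not from any BCI discussed here.

```python
import numpy as np

def band_power(eeg, fs, freq, bandwidth=0.5):
    """Spectral power of `eeg` within `bandwidth` Hz of `freq`."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = np.abs(freqs - freq) <= bandwidth
    return spectrum[mask].sum()

def classify_ssvep(eeg, fs, candidate_freqs):
    """Return the candidate stimulus frequency with the strongest EEG response."""
    powers = [band_power(eeg, fs, f) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic demo: a 7 Hz steady-state response buried in noise.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 7 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg, fs, [6.0, 7.0, 8.0]))  # expected: 7.0
```

In a game, each on-screen target would flicker at one of the candidate frequencies, so the winning frequency identifies the target the player is looking at.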
<p>An event-related response occurs when a stimulus change elicits small electrical changes in the brain (<xref ref-type="bibr" rid="B11">Blackwood and Muir, 1990</xref>). The stimulus could be cognitive, sensory, or motor (<xref ref-type="bibr" rid="B1">Abdulkader et&#x20;al., 2015</xref>). Event-related responses appear in studies in which the researchers monitor biological changes caused by specific changes in the stimuli, such as a change in the tension or&#x20;music.</p>
</sec>
<sec id="s2-3">
<title>Eye Tracking and Electrooculography</title>
<p>Electrooculography (EOG) and eye tracking are two methods for measuring the point of gaze or the motion of an eye relative to the head. EOG tracks ocular activity through the use of sensors that are placed around the eyes, in a similar manner to EMG electrodes. These sensors record the corneo-retinal potential difference between the positive and negative sides of the eyeball (<xref ref-type="bibr" rid="B15">Bulling et&#x20;al., 2009</xref>). Alternatively, eye tracking uses a camera and image analysis for measuring the point of gaze or the motion of the eye. Eye tracking has become integrated into consumer devices such as the VIVE Pro Eye headset<xref ref-type="fn" rid="FN6">
<sup>6</sup>
</xref> and Alienware laptops, and companies such as Vive and Tobii<xref ref-type="fn" rid="FN7">
<sup>7</sup>
</xref> (<xref ref-type="fig" rid="F1">Figure&#x20;1C</xref>) market eye tracking to businesses as a tool for gathering analytical information. Eye tracking can record where a potential customer looks or holds their attention during an advertisement or shopping trip, or how immersed a person is during a task such as playing a game. Researchers, for their part, have used eye tracking to study many problems, including gambling (<xref ref-type="bibr" rid="B41">Murch et&#x20;al., 2019</xref>) and gaming disorders.</p>
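Gaze analyses like those above typically begin by segmenting raw gaze samples into fixations. The following Python sketch implements a simple dispersion-threshold (I-DT style) detector on synthetic gaze data; the thresholds and function name are illustrative and not taken from any product mentioned here.

```python
import numpy as np

def detect_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT style) fixation detection.
    Returns (start_time, end_time) pairs where gaze stays within `max_dispersion`
    degrees for at least `min_duration` seconds."""
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # Grow the window while horizontal + vertical dispersion stays small.
        while j + 1 < n and (max(x[i:j + 2]) - min(x[i:j + 2])
                             + max(y[i:j + 2]) - min(y[i:j + 2])) <= max_dispersion:
            j += 1
        if t[j] - t[i] >= min_duration:
            fixations.append((t[i], t[j]))
            i = j + 1
        else:
            i += 1
    return fixations

# Synthetic 60 Hz demo: half a second fixating (10, 10), then half a second at (20, 15).
t = np.arange(0, 1.0, 1.0 / 60)
x = np.where(t < 0.5, 10.0, 20.0)
y = np.where(t < 0.5, 10.0, 15.0)
print(len(detect_fixations(x, y, t)))  # expected: 2
```

The fixation locations and durations recovered this way are what attention and immersion analyses are usually built on.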
</sec>
<sec id="s2-4">
<title>Electrodermal Activity</title>
<p>Electrodermal activity (EDA) is a measure of the electrical activity or resistance level of the skin. It is a physiological response that is activated by the sympathetic nervous system (<xref ref-type="bibr" rid="B23">Dawson et&#x20;al., 2017</xref>). The skin&#x2019;s electrical resistance typically changes during moments of arousal or physical activity, such as exercise, nervousness, or fear response. While we notice when our heart is beating quickly, and in moments of panic or nervousness we may even perceive sweaty palms, small changes in our sweat or skin conductance are not noticeable to us; detecting these minute changes requires EDA sensors. One advantage of EDA is that it is less sensitive to noise in comparison with EMG and EEG, which pick up many noise artifacts from muscle and body movements (<xref ref-type="bibr" rid="B34">Kivikangas et&#x20;al., 2011</xref>).</p>
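One common way to quantify these minute changes is to count discrete skin-conductance responses (SCRs): transient rises of a few hundredths of a microsiemens. The Python sketch below is our own illustrative example on a synthetic signal, with arbitrary threshold values.

```python
import numpy as np

def count_scrs(eda, fs, min_rise=0.05, window_s=1.0):
    """Count skin-conductance responses: distinct stretches where the signal
    rises by at least `min_rise` microsiemens over a `window_s`-second span."""
    w = int(window_s * fs)
    rise = eda[w:] - eda[:-w]        # conductance change over the window
    above = rise > min_rise
    # Each rising edge of the boolean mask marks the onset of one response.
    return int(above[0]) + int(np.sum(above[1:] & ~above[:-1]))

# Synthetic 10 Hz demo: flat 2 uS baseline with two SCR-like bumps.
fs = 10
sig = np.full(300, 2.0)
for onset in (100, 200):
    sig[onset:onset + 20] += np.linspace(0.0, 0.3, 20)        # 2 s rise
    sig[onset + 20:onset + 60] += np.linspace(0.3, 0.0, 40)   # slow recovery
print(count_scrs(sig, fs))  # expected: 2
```

In an affective-gaming context, the SCR count per minute is a simple proxy for how strongly the player is reacting to game events.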
</sec>
<sec id="s2-5">
<title>Electrocardiography</title>
<p>Electrocardiography (ECG) is the measure of the electrical activity generated by the heart. As with EMG or EOG, the signal is recorded with surface electrodes; heart rate is typically measured using three electrodes attached to the skin of the chest, over the heart. Heart rate changes during exercise and physical activity, during times of emotional duress, and with bodily fatigue (<xref ref-type="bibr" rid="B18">Ch&#x119;&#x107; et&#x20;al., 2015</xref>). ECG can be used to measure stress and cardiac changes in a player during gameplay (<xref ref-type="bibr" rid="B65">Porter et&#x20;al., 2019</xref>). When measuring emotions and behavioral responses, ECG is often coupled with another sensor, usually EDA or&#x20;EEG.</p>
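The basic quantity such studies extract from ECG is heart rate, computed from the intervals between successive R peaks. The Python sketch below is our own minimal example on a synthetic spike train; a real pipeline would use a robust detector (such as a Pan-Tompkins style algorithm) rather than this naive threshold.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold=0.6, refractory_s=0.25):
    """Naive R-peak detector: local maxima above `threshold`,
    separated by at least a physiological refractory period."""
    peaks, last = [], None
    for i in range(1, len(ecg) - 1):
        is_peak = ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]
        if is_peak and (last is None or (i - last) / fs >= refractory_s):
            peaks.append(i)
            last = i
    return np.array(peaks)

def heart_rate_bpm(ecg, fs):
    """Mean heart rate from the R-R intervals, in beats per minute."""
    rr = np.diff(detect_r_peaks(ecg, fs)) / fs   # R-R intervals in seconds
    return 60.0 / rr.mean()

# Synthetic demo: unit-height "R spikes" every 0.8 s (75 BPM) on a flat baseline.
fs = 250
sig = np.zeros(10 * fs)
sig[::int(0.8 * fs)] = 1.0
print(round(heart_rate_bpm(sig, fs)))  # expected: 75
```

The variability of the R-R intervals (rather than their mean) is what stress-related gaming studies usually analyze.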
</sec>
</sec>
<sec id="s3">
<title>Applications of Biosignals in Gaming</title>
<p>In this section, we discuss the applications of each of the aforementioned biosignals in gaming. While much of this section addresses the uses of biosignals in game-based research, commercial applications will also be considered. We introduce some commercially available devices, as well as the challenges and possibilities of these technologies reaching a wider audience, through the release of developer-friendly software and hardware. In <xref ref-type="table" rid="T1">Table&#x20;1</xref>, we provide a reference table of the games that were reviewed in researching this article. This table includes serious games, commercial games, and games that were modified for use within a study. It lists, for each game, the type of biosignal monitored and its usage, and it summarizes most of the information contained in this section. Additionally, in <xref ref-type="table" rid="T2">Table&#x20;2</xref>, we present a taxonomy of the biosignals discussed in this article, broken down into sections by their usages and the devices available.</p>
<table-wrap id="T1" position="float">
<label>TABLE 1</label>
<caption>
<p>Games reviewed for this article. NA &#x3d; Not Applicable (Commercial Release).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left">Row</th>
<th align="center">Game</th>
<th align="center">Creator</th>
<th align="center">Genre</th>
<th align="center">Release Type</th>
<th align="center">Study</th>
<th align="center">Year</th>
<th align="center">Signal /Controller</th>
<th align="center">Purpose</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">1</td>
<td align="left">Assassin&#x2019;s Creed Rogue&#xae;</td>
<td align="left">Ubisoft</td>
<td align="left">Action</td>
<td align="left">Commercial</td>
<td align="left">NA</td>
<td align="center">2014</td>
<td align="left">Eye tracking</td>
<td align="left">Controller input</td>
</tr>
<tr>
<td align="left">2</td>
<td align="left">BrainBall</td>
<td align="left">Smart Studio, Interactive Institute</td>
<td align="left">Competition</td>
<td align="left">Research</td>
<td align="left">Hjelm, Sara Ilstedt; Browall, Carolina <xref ref-type="bibr" rid="B30">Hjelm and Browall (2000)</xref>
</td>
<td align="center">2000</td>
<td align="left">EEG</td>
<td align="left">Stress reduction</td>
</tr>
<tr>
<td align="left">3</td>
<td align="left">League of Legends&#x2122;</td>
<td align="left">Riot Games</td>
<td align="left">MOBA-Online Battle Arena</td>
<td align="left">Commercial used for research</td>
<td align="left">Lim, S., Yeo, M., and Yoon, G. <xref ref-type="bibr" rid="B67">Lim et al. (2019)</xref>
</td>
<td align="center">2018</td>
<td align="left">ECG, PPG, SKT</td>
<td align="left">Monitoring biosignals during gameplay</td>
</tr>
<tr>
<td align="left">4</td>
<td align="left">Guitar Hero</td>
<td align="left">
<ext-link ext-link-type="uri" xlink:href="https://en.wikipedia.org/wiki/Harmonix">Harmonix</ext-link>, <ext-link ext-link-type="uri" xlink:href="https://en.wikipedia.org/wiki/Budcat_Creations">Budcat Creations</ext-link>, <ext-link ext-link-type="uri" xlink:href="https://en.wikipedia.org/wiki/Vicarious_Visions">Vicarious Visions</ext-link>
</td>
<td align="left">Music/rhythm</td>
<td align="left">Revised commercial</td>
<td align="left">Armiger, Robert S. Vogelstein, R. Jacob <xref ref-type="bibr" rid="B8">Armiger and Vogelstein (2008)</xref>
</td>
<td align="center">2008</td>
<td align="left">EMG</td>
<td align="left">Physical rehabilitation/training</td>
</tr>
<tr>
<td align="left">5</td>
<td align="left">Pospos</td>
<td align="left">C. Gramatke and S. Gramatke, &#x201c;Pospos&#x2014;Im Land der Chukchuks,&#x201d; Carmen Gramatke, Sven Gramatke, 2015</td>
<td align="left">Navigation/puzzle</td>
<td align="left">Commercial used for research</td>
<td align="left">Prahm, Cosima Kayali, Fares Vujaklija, Ivan Sturma, Agnes Aszmann, Oskar <xref ref-type="bibr" rid="B47">Prahm et&#x20;al. (2017)</xref>
</td>
<td align="center">2017</td>
<td align="left">EMG</td>
<td align="left">Prosthesis training&#x2014;sustained and simultaneous contractions racing</td>
</tr>
<tr>
<td rowspan="2" align="left">6</td>
<td rowspan="2" align="left">SuperTuxKart</td>
<td rowspan="2" align="left">Baker, Steve Baker, Oliver</td>
<td rowspan="2" align="left">Racing</td>
<td rowspan="2" align="left">Commercial used for research</td>
<td align="left">Prahm, Cosima Kayali, Fares Vujaklija, Ivan Sturma, Agnes Aszmann, Oskar <xref ref-type="bibr" rid="B47">Prahm et&#x20;al. (2017)</xref>
</td>
<td align="center">
<xref ref-type="bibr" rid="B47">Prahm et&#x20;al. (2017)</xref>
</td>
<td align="left">EMG (Prahm et&#x20;al.)</td>
<td align="left">Prosthesis training&#x2014;quick and simultaneous contraction (Prahm et&#x20;al.)</td>
</tr>
<tr>
<td align="left">Soares, R., Siqueira, E., Miura, M., Silva, T., Jacobi, R. and Castanho, C. (2017).</td>
<td align="center">
<xref ref-type="bibr" rid="B50">Soares et&#x20;al. (2017)</xref>
</td>
<td align="left">ECG, EDA (Soares et&#x20;al.)</td>
<td align="left">Player behavior and biosignal device effectiveness (Soares et&#x20;al.)</td>
</tr>
<tr>
<td align="left">7</td>
<td align="left">StepMania 5</td>
<td align="left">C. Danford and G. Maynard, &#x201c;Step Mania 5,&#x201d; Chris Danford, Glenn Maynard, 2015</td>
<td align="left">Rhythm</td>
<td align="left">Commercial used for research</td>
<td align="left">Prahm, Cosima Kayali, Fares Vujaklija, Ivan Sturma, Agnes Aszmann, Oskar <xref ref-type="bibr" rid="B47">Prahm et&#x20;al. (2017)</xref>
</td>
<td align="center">2017</td>
<td align="left">EMG</td>
<td align="left">Prosthesis training&#x2014;quick, sustained, and simultaneous contractions</td>
</tr>
<tr>
<td align="left">8</td>
<td align="left">Untitled</td>
<td align="left">Custom</td>
<td align="left">Serious</td>
<td align="left">Research</td>
<td align="left">Alchalcabi, Alaa Eddin Eddin, Amer Nour Shirmohammadi, Shervin <xref ref-type="bibr" rid="B3">Alchalcabi et&#x20;al. (2017)</xref>
</td>
<td align="center">2017</td>
<td align="left">EEG</td>
<td align="left">ADHD focus training</td>
</tr>
<tr>
<td align="left">9</td>
<td align="left">MLB Home Run Derby</td>
<td align="left">MLB Advanced Media, MLB</td>
<td align="left">Sports/VR</td>
<td align="left">Commercial</td>
<td align="left">N/A</td>
<td align="center">2018</td>
<td align="left">Eye tracking</td>
<td align="left">Input/menu navigation</td>
</tr>
<tr>
<td align="left">10</td>
<td align="left">Untitled</td>
<td align="left">Custom</td>
<td align="left">Racing/driving</td>
<td align="left">Research</td>
<td align="left">Gorzkowski, Szymon, Sarwas, Grzegorz <xref ref-type="bibr" rid="B28">Gorzkowski and Sarwas (2019)</xref>
</td>
<td align="center">2019</td>
<td align="left">EMG</td>
<td align="left">EMG as control input/gesture recognition</td>
</tr>
<tr>
<td align="left">11</td>
<td align="left">ADEPT Tower Builder And Pong</td>
<td align="left">Custom</td>
<td align="left">Puzzle/challenge</td>
<td align="left">Research</td>
<td align="left">Converse, Hayes Ferraro, Teressa Jean, Daniel Mendhiratta, Vikas Naviasky, Emily Par, Mang Rimlinger, Thomas Southall, Steven Sprenkle, Jason Abshire, Pamela <xref ref-type="bibr" rid="B29">Hayes et al. (2013)</xref>
</td>
<td align="center">2013</td>
<td align="left">EMG</td>
<td align="left">Physiotherapy</td>
</tr>
<tr>
<td align="left">12</td>
<td align="left">Fly to Catch the Lotus, Flappy Bird, Control the Clouds, Jetpack</td>
<td align="left">Perifit</td>
<td align="left">Arcade, strategy</td>
<td align="left">Commercial</td>
<td align="left">Perifit</td>
<td align="center">2019</td>
<td align="left">EMG</td>
<td align="left">Rehabilitation</td>
</tr>
<tr>
<td align="left">13</td>
<td align="left">Virtual Iraq/Afghanistan</td>
<td align="left">Neuroscience and Simulation Laboratory (NeuroSim) and Institute for Creative Technologies, the University of Southern California</td>
<td align="left">Role Play/War/VR</td>
<td align="left">Research</td>
<td align="left">Parsons, Thomas D.; Reinebold, James L. <xref ref-type="bibr" rid="B63">Parsons and Reinebold (2012)</xref>
</td>
<td align="center">2012</td>
<td align="left">GSR, ECG</td>
<td align="left">Neuropsychological assessment, PTSD treatment</td>
</tr>
<tr>
<td align="left">14</td>
<td align="left">Untitled</td>
<td align="left">Sakurazawa, Shigeru Yoshida, Naofumi Munekata, Nagisa</td>
<td align="left">Arcade/strategy</td>
<td align="left">Research</td>
<td align="left">Sakurazawa, Shigeru Yoshida, Naofumi Munekata, Nagisa <xref ref-type="bibr" rid="B61">Sakurazawa et&#x20;al. (2004)</xref>
</td>
<td align="center">2004</td>
<td align="left">GSR</td>
<td align="left">Affective gaming using GSR</td>
</tr>
<tr>
<td align="left">15</td>
<td align="left">Pong</td>
<td align="left">Atari</td>
<td align="left">Arcade</td>
<td align="left">Commercial adapted for research</td>
<td align="left">Emmen, Dirrik H. G.; Lampropoulos, Georgios. Emmen and Lampropoulos (2014)</td>
<td align="center">2014</td>
<td align="left">ECG, GSR</td>
<td align="left">Biofeedback for improved user performance</td>
</tr>
<tr>
<td align="left">16</td>
<td align="left">Left 4 Dead 2</td>
<td align="left">Valve Corporation</td>
<td align="left">Action/Horror</td>
<td align="left">Commercial-testing</td>
<td align="left">NA</td>
<td align="center">2009</td>
<td align="left">GSR</td>
<td align="left">Sweat data collected to test challenge times</td>
</tr>
<tr>
<td align="left">17</td>
<td align="left">NeuroBoy&#x2122;</td>
<td align="left">Neurosky</td>
<td align="left">Challenge/Arcade</td>
<td align="left">Commercial</td>
<td align="left">NA</td>
<td align="center">2009</td>
<td align="left">EEG</td>
<td align="left">Use BCI to push, pull, move objects within the game</td>
</tr>
<tr>
<td align="left">18</td>
<td align="left">Neuro Tower Defense</td>
<td align="left">Neurosky</td>
<td align="left">Strategy</td>
<td align="left">Commercial</td>
<td align="left">NA</td>
<td align="center">2017</td>
<td align="left">EEG</td>
<td align="left">Use brain signals to defend a tower against attack and stay calm</td>
</tr>
<tr>
<td align="left">19</td>
<td align="left">Alpha World of Warcraft/World of Warcraft</td>
<td align="left">Blizzard Entertainment</td>
<td align="left">Strategy, RPG</td>
<td align="left">Commercial adapted for research</td>
<td align="left">Plass-Oude Bos, D. Reuderink, B., M&#x00FC;hl, C., G&#x00FC;rk&#x00F6;k, H., Poel, M., Nijholt, A, Heylen, D. <xref ref-type="bibr" rid="B62">Plass-Oude Bos et al. (2010)</xref>
</td>
<td align="center">2010</td>
<td align="left">EEG</td>
<td align="left">Shape shifting is controlled by alpha activity</td>
</tr>
<tr>
<td align="center">20</td>
<td align="left">Slender: The Eight Pages</td>
<td align="left">Parsec Productions</td>
<td align="left">Horror</td>
<td align="left">Commercial used for research</td>
<td align="left">Vachiratamporn, Vanus Legaspi, Roberto Moriyama, Koichi Fukui, Ken ichi Numao, Masayuki <xref ref-type="bibr" rid="B53">Vachiratamporn et&#x20;al. (2015)</xref>
</td>
<td align="center">2015</td>
<td align="left">EEG, ECG</td>
<td align="left">Biosignals monitored the players&#x2019; affective level during suspenseful and scary moments</td>
</tr>
<tr>
<td align="left">21</td>
<td align="left">Half Life 2</td>
<td align="left">Valve Corporation</td>
<td align="left">Horror-Survival</td>
<td align="left">Commercial adapted for research</td>
<td align="left">Dekker, Andrew Champion, Erik <xref ref-type="bibr" rid="B25">Dekker and Champion (2007)</xref>
</td>
<td align="center">2007</td>
<td align="left">GSR, ECG</td>
<td align="left">Biosignals changed NPCs, sound, and environment</td>
</tr>
<tr>
<td align="left">22</td>
<td align="left">Left 4 Dead</td>
<td align="left">Valve Corporation</td>
<td align="left">Horror-Survival</td>
<td align="left">Commercial adapted for research</td>
<td align="left">Bouchard, St&#xe9;phane Bernier, Fran&#xe7;ois Boivin, &#xc9;ric Morin, Brian Robillard, Genevi&#xe8;ve <xref ref-type="bibr" rid="B14">Bouchard et&#x20;al. (2012)</xref>
</td>
<td align="center">2012</td>
<td align="left">ECG, GSR</td>
<td align="left">Effectiveness of stress training for combat and stress situations</td>
</tr>
<tr>
<td align="left">23</td>
<td align="left">VRail Surfer</td>
<td align="left">Kumar and Sharma</td>
<td align="left">Sport</td>
<td align="left">Research</td>
<td align="left">D. Kumar and A. Sharma <xref ref-type="bibr" rid="B37">Kumar and Sharma (2016)</xref>
</td>
<td align="center">2016</td>
<td align="left">Eye tracking</td>
<td align="left">Use of eye tracking in VR for game navigation</td>
</tr>
<tr>
<td align="left">24</td>
<td align="left">Need For Speed&#x2122;</td>
<td align="left">EA, Criterion</td>
<td align="left">Racing</td>
<td align="left">Commercial used for research</td>
<td align="left">Balasubramanian V., Adalarasu K. <xref ref-type="bibr" rid="B10">Balasubramanian et&#x20;al. (2007)</xref>
</td>
<td align="center">1994 (Original Release)</td>
<td align="left">Eye tracking and EEG</td>
<td align="left">Driver fatigue</td>
</tr>
<tr>
<td align="left">25</td>
<td align="left">EyeGuitar</td>
<td align="left">Vickers S, Istance H, Smalley M</td>
<td align="left">Music</td>
<td align="left">Research</td>
<td align="left">Vickers S, Istance H, Smalley M. <xref ref-type="bibr" rid="B55">Vickers et&#x20;al. (2010)</xref>
</td>
<td align="center">2010</td>
<td align="left">Eye tracking</td>
<td align="left">Use of eye tracking to interact with musical video games</td>
</tr>
<tr>
<td align="left">26</td>
<td align="left">Nevermind</td>
<td align="left">Flying Mollusk</td>
<td align="left">Horror</td>
<td align="left">Commercial</td>
<td align="left">N/A</td>
<td align="center">2016 (VR)</td>
<td align="left">Heart rate, eye tracking</td>
<td align="left">Use of commercial sensors and cameras to measure game players&#x2019; biosignals</td>
</tr>
<tr>
<td align="left">27</td>
<td align="left">Open Source Asteroids (1), Wizznic (2), Don&#x2019;t Let Animatronics Stuff You Into a Suit (3)</td>
<td align="left">N/A, DusteD, N/A&#x2014;Open Source</td>
<td align="left">Arcade (1), Strategy (2), Horror (3)</td>
<td align="left">Research</td>
<td align="left">
<xref ref-type="bibr" rid="B50">Soares et&#x20;al. (2017)</xref>
</td>
<td align="center">N/A, 2010, N/A</td>
<td align="left">EDA, ECG</td>
<td align="left">Use of Bitalino to test the effectiveness of biosensors, monitoring player reaction to gameplay</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="T2" position="float">
<label>TABLE 2</label>
<caption>
<p>Taxonomy of Biosignals Reviewed in this Paper.</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left">Signal</th>
<th align="center">What it measures</th>
<th align="center">Devices mentioned in this article</th>
<th align="center">Passive examples (monitoring only)</th>
<th align="center">Active examples (affective gaming)</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">EEG</td>
<td align="left">Electrical signals in the brain</td>
<td align="left">Open BCI, Emotiv, Muse, Imec, Bitalino</td>
<td align="left">Emotion during gameplay, neurological assessment</td>
<td align="left">Environment changes, controller for gameplay</td>
</tr>
<tr>
<td align="left">EMG</td>
<td align="left">Electrical activity from muscles</td>
<td align="left">MYO band, Ottobock Myoboy, Bitalino, Perifit</td>
<td align="left">No passive examples presented in this article</td>
<td align="left">Motivation for prosthesis and rehabilitation, game controller, pelvic exercise</td>
</tr>
<tr>
<td align="left">ECG</td>
<td align="left">Heart rate variation</td>
<td align="left">Bitalino, consumer wearable devices, unnamed ECG measurement devices</td>
<td align="left">Player stress monitoring in gameplay, engagement level, stress management</td>
<td align="left">Change of experience and environment within gameplay</td>
</tr>
<tr>
<td align="left">EDA</td>
<td align="left">Skin conductance level and response, electrical activity</td>
<td align="left">Bitalino, unnamed EDA measurement devices</td>
<td align="left">Measurement of stress, fear, and panic during gameplay</td>
<td align="left">Change of experience and environment within gameplay</td>
</tr>
<tr>
<td align="left">EOG &#x2b; EYE</td>
<td align="left">Saccades, gaze, blinks, pupillometry</td>
<td align="left">Bitalino, Tobii Eye Tracking Cameras, Vive Eye Pro, Unnamed devices</td>
<td align="left">Fatigue level, point of interest, game controller using eye gaze as input</td>
<td align="left">Used as input; no affective examples presented in this article</td>
</tr>
</tbody>
</table>
</table-wrap>
<sec id="s3-1">
<title>Applications of EMG in Gaming</title>
<p>In some studies, researchers have used devices with biosensors such as EEG or EMG connected to a game, to aid persons with learning how to gain use of muscles or to test anticipation of movement for those with muscle impairment (<xref ref-type="bibr" rid="B8">Armiger and Vogelstein, 2008</xref>; <xref ref-type="bibr" rid="B45">Norman et&#x20;al., 2016</xref>; <xref ref-type="bibr" rid="B48">Rincon et&#x20;al., 2016</xref>; <xref ref-type="bibr" rid="B47">Prahm et&#x20;al., 2017</xref>). In other uses, a researcher can, for instance, monitor in real time how players react to a game by measuring their attention levels or immersion level (<xref ref-type="bibr" rid="B42">Nacke and Lindley, 2008</xref>), stress (<xref ref-type="bibr" rid="B33">Karthikeyan et&#x20;al., 2011</xref>), or panic levels (<xref ref-type="bibr" rid="B49">Russoniello et&#x20;al., 2009</xref>; <xref ref-type="bibr" rid="B58">Wang et&#x20;al., 2014</xref>).</p>
<p>Researchers have used EMG for designing experiences to train individuals with physical disabilities, to aid in their recovery (<xref ref-type="bibr" rid="B8">Armiger and Vogelstein, 2008</xref>; <xref ref-type="bibr" rid="B20">Converse et&#x20;al., 2013</xref>; <xref ref-type="bibr" rid="B29">Hayes et al., 2013</xref>). One study explored the feasibility of using an EMG device built with commercial off-the-shelf sensors and placed into an arm sleeve, for physical rehabilitation (<xref ref-type="bibr" rid="B56">Visconti et&#x20;al., 2018</xref>; <xref ref-type="bibr" rid="B20">Converse et&#x20;al., 2013</xref>; <xref ref-type="bibr" rid="B29">Hayes et al., 2013</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 11). The surface sensors were placed over the forearm just below the elbow. Two simple games were created for the experiment: a tower defense game<xref ref-type="fn" rid="FN8">
<sup>8</sup>
</xref> and a Pong-style game. EMG is also being used to build and test assistive technology for those in need of prostheses. One group of researchers modified a <italic>Guitar Hero</italic> controller with EMG sensors to allow amputees to play the game (<xref ref-type="bibr" rid="B8">Armiger and Vogelstein, 2008</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 4). Another group sought to broaden the use of a prosthetic limb as an input device with more options for a game (<xref ref-type="bibr" rid="B57">Vujaklija et&#x20;al., 2017</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, rows 5&#x2013;7). For this, they used a prosthetic forearm and hand called the Ottobock Myoboy.<xref ref-type="fn" rid="FN9">
<sup>9</sup>
</xref> The aim of the study was to use gaming to teach users how to properly control a new prosthesis while staying motivated. The outcome showed that this method was effective in training separate muscle movements and endurance. Studies such as these demonstrate ways of keeping people in rehabilitation interested and motivated while improving their physical&#x20;state.</p>
<p>A related example of a commercial EMG device was produced in 2019 by the French company Perifit,<xref ref-type="fn" rid="FN10">
<sup>10</sup>
</xref> which released a device that aids new mothers in strengthening their pelvic muscles during postnatal recovery, as well as women who suffer from bladder-control issues. To motivate these users to do the exercises, and to do them correctly, the company provides several games within its application, in which the user interacts by using the device as a controller (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 12). These games change exercises at each level, depending on which recovery program the user has chosen.</p>
<p>EMG can also act as a noninvasive and reliable signal for measuring emotions. <xref ref-type="bibr" rid="B50">Soares et&#x20;al. (2017)</xref> employed a combination of EMG and EDA signals for reading emotions during a playing experience (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, rows 6, 27). They placed EMG sensors on the facial muscles for tracking facial expressions to pinpoint positive or negative valence. For sensing EMG and EDA, these researchers used a Bitalino. The Bitalino<xref ref-type="fn" rid="FN11">
<sup>11</sup>
</xref> (<xref ref-type="fig" rid="F1">Figure&#x20;1D</xref>), from the Portuguese company Plux, is a small board that, when fitted with the corresponding sensors, can be used to measure multiple biosignals simultaneously, including EMG, EDA, EEG, and ECG. It has become popular among researchers and hobbyists due to its open-source platform, affordability, and reliability.</p>
</sec>
<sec id="s3-2">
<title>EEG in Gaming Applications</title>
<p>Whereas the aforementioned use of EMG for measuring emotions is still scarce, the use of EEG for the same purpose is much more widespread. Numerous studies have monitored EEG signals during gameplay to analyze the emotional state of the player during game interaction. One direct benefit of measuring emotions during gameplay is that the biological data can be used to change the difficulty or experience of the game (<xref ref-type="bibr" rid="B16">Carofiglio et&#x20;al., 2019</xref>).</p>
<p>These devices show great possibilities not just for the analysis of emotional states and for adaptive gaming; they can also provide opportunities for health-related research and give severely disabled persons a means of interacting with video games. Researchers have studied applications of BCIs in games in the search for new interventions for people with Autism Spectrum Disorder (<xref ref-type="bibr" rid="B26">Friedrich et&#x20;al., 2014</xref>). One study used a BCI to test the possibility of reducing attention impairments in children with autism (<xref ref-type="bibr" rid="B66">Mercado et&#x20;al., 2019</xref>). In this study, the use of a BCI with a video game improved the attention level of the children, with two showing no attention issues post-study. While researchers are using BCIs to aid those with emotional disorders or other disabilities, there still remains a gap in testing the possibilities of BCI gaming with persons with severe physical disabilities.</p>
<p>Relaxation and meditation have become popular topics in research as well as at a commercial level. In a seminal experiment, <xref ref-type="bibr" rid="B30">Hjelm and Browall (2000</xref>) created a game using a BCI to test the alleviation of stress (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 2). This experiment involved two gamers &#x201c;competing&#x201d; to become relaxed by moving a metal ball with their alpha and beta signals. More recently, <xref ref-type="bibr" rid="B2">Ahn et&#x20;al. (2014</xref>) targeted the brain&#x2019;s alpha signals in an adapted version of <italic>World of Warcraft</italic> (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 19). In this experiment, the user could control the game through conventional input as well as effect changes in the game through their alpha brain signals. The company Neurosky,<xref ref-type="fn" rid="FN12">
<sup>12</sup>
</xref> which produces BCIs, launched a game entitled <italic>Neuroboy</italic> (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 17) in 2007 to accompany its headset. As of 2020, the company had released numerous applications and games, free or paid, such as <italic>Neuro Tower Defense</italic> (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 18), a strategy game that also challenges the player to remain calm during the task of defense. Neurosky is not the only commercial headset maker that provides its own applications to go along with its BCIs; other headsets, such as Muse, are in fact marketed with the main purpose of reducing the stress of the&#x20;user.</p>
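<p>None of the systems above publish their exact signal pipelines, but the core idea of alpha-driven relaxation control can be sketched in a few lines. The following illustrative Python fragment is our own sketch, not code from any cited study: the function names, the 256 Hz sampling rate, and the alpha/beta band edges are assumptions. It estimates band power with a plain DFT and maps an EEG window to a relaxation score between 0 and 1:</p>

```python
import math

def band_power(signal, fs, low, high):
    """Mean power of the DFT bins of `signal` lying in the [low, high] Hz band."""
    n = len(signal)
    powers = []
    for k in range(n // 2 + 1):
        if low <= k * fs / n <= high:
            # Real and imaginary parts of the k-th DFT coefficient.
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            powers.append(re * re + im * im)
    return sum(powers) / len(powers)

def relaxation_score(eeg_window, fs=256):
    """Share of alpha (8-12 Hz) power relative to alpha + beta (13-30 Hz):
    closer to 1.0 means a calmer, more alpha-dominant signal."""
    alpha = band_power(eeg_window, fs, 8, 12)
    beta = band_power(eeg_window, fs, 13, 30)
    return alpha / (alpha + beta)
```

<p>A game loop could poll <monospace>relaxation_score</monospace> on each one-second window and, for instance, move Hjelm and Browall&#x2019;s metal ball toward the more relaxed player.</p>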
</sec>
<sec id="s3-3">
<title>Electrooculography and Eye-Tracking Applications in Gaming</title>
<p>Widely used in commercial and marketing studies, eye gaze is still not as commonly employed in gaming as other biosignals. It is mostly used to implement hands-free gaze-based interaction, which has clear applications in Virtual Reality (VR), as well as in allowing those with severe motor disabilities to interact with video games and other applications. In one VR example, participants played the game VRailSurfer by steering the character and avoiding obstacles through gaze and eye movement (<xref ref-type="bibr" rid="B37">Kumar and Sharma, 2016</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 23). The technique called dwell-time-based selection (<xref ref-type="bibr" rid="B31">Isokoski et&#x20;al., 2009</xref>) uses visual fixation to let the eyes act as a &#x201c;mouse emulator.&#x201d; One group tested a <italic>Guitar Hero</italic>-style game, evaluating where the eyes fixated (<xref ref-type="bibr" rid="B55">Vickers et&#x20;al., 2010</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 25). The Eyeharp is a software-based musical instrument that permits people with disabilities to play music with gaze or head movements, allowing them to play melodies simply by looking at the notes on the screen (<xref ref-type="bibr" rid="B54">Vamvakousis and Ramirez, 2016</xref>). Along a different line, <xref ref-type="bibr" rid="B10">Balasubramanian and Adalarasu (2007</xref>) combined EOG with EEG to measure the onset of fatigue while users played a driving game (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row&#x20;24).</p>
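<p>Dwell-time-based selection can be illustrated as a small state machine. The sketch below is a toy of our own; the class name and the 0.8 s default dwell time are assumptions, not values from Isokoski et&#x20;al. It fires a &#x201c;click&#x201d; once the gaze has rested on the same target long enough:</p>

```python
class DwellSelector:
    """Emulates a mouse click: a target is selected once the gaze has
    stayed on it for `dwell_s` seconds without moving away."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s
        self.target = None   # target currently under the gaze point
        self.since = None    # timestamp when fixation on `target` began

    def update(self, gaze_target, now):
        """Feed the target under the gaze point at time `now` (seconds).
        Returns the selected target when the dwell threshold is crossed,
        otherwise None."""
        if gaze_target != self.target:   # gaze moved: restart the timer
            self.target = gaze_target
            self.since = now
            return None
        if gaze_target is not None and now - self.since >= self.dwell_s:
            self.since = now             # re-arm instead of firing every frame
            return gaze_target
        return None
```

<p>Each frame, the eye tracker reports which on-screen element (if any) the gaze falls on; the selector turns sustained fixation into a discrete selection event.</p>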
<p>This eye-tracking technology has already been implemented commercially by companies such as Tobii and HTC Vive. The gaming giant Ubisoft worked with Tobii&#x2019;s eye-tracking software and hardware (<xref ref-type="fig" rid="F1">Figure&#x20;1C</xref>) to include this capability in the PC versions of more than 130 of its games. Eye tracking is also used in a process called foveated rendering, which tracks the eye and renders the image at greater quality where the eye gazes. This lowers the rendering load, since the image maintains optimum quality and frame rate inside the focal area while the periphery, which is not being attended to, is rendered at lower quality. The VR headset HTC VIVE Pro Eye, released in 2019, has eye-tracking capabilities built into the headset. Vive has also released an SDK so that developers can easily integrate this capability into their&#x20;games.</p>
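<p>The core decision in foveated rendering reduces to choosing a quality tier per screen region from its distance to the gaze point. The following is a minimal sketch of that decision; the tier radii are illustrative assumptions, not values from Tobii&#x2019;s or HTC&#x2019;s SDKs:</p>

```python
import math

def foveation_level(gaze, point, radii=(0.10, 0.25)):
    """Choose a render-quality tier for a screen `point` from its distance
    to the `gaze` point, both given in normalized [0, 1] screen coordinates.
    Returns 0 (full quality), 1 (reduced), or 2 (lowest, peripheral)."""
    d = math.dist(gaze, point)
    if d <= radii[0]:
        return 0   # foveal region: render at full resolution
    if d <= radii[1]:
        return 1   # parafoveal ring: reduced shading rate
    return 2       # periphery: lowest quality, largely unnoticed
```

<p>In a real engine this decision is made per tile or per shading sample on the GPU; the principle, though, is exactly this distance-to-gaze lookup.</p>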
</sec>
<sec id="s3-4">
<title>Electrodermal Activity in Gaming</title>
<p>Several studies have shown that as the level of challenge or speed of a game increases, the skin conductance level also rises (<xref ref-type="bibr" rid="B64">Parsons and Reinebold, 2012</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 13). Because EDA correlates with levels of nervousness or fear, integrating this biosignal into gaming brings possibilities for adaptive games, such as introducing more challenges in a level depending on the user&#x2019;s state of calm. <xref ref-type="bibr" rid="B17">Chanel et&#x20;al. (2008</xref>) used EDA (along with other biosignals) to measure the levels of excitement, boredom, and anxiety in participants during gameplay of <italic>Tetris</italic>. The company Valve used skin response to measure players&#x2019; sweat levels during testing phases of the game <italic>Left 4 Dead 2</italic> (<xref ref-type="bibr" rid="B46">Polygon, 2020</xref>).<xref ref-type="fn" rid="FN13">
<sup>13</sup>
</xref> This allowed them to measure stress levels and to study and adjust the time that the objective should take to finish. EDA has also shown promise in therapeutic games, for instance in the treatment of phobias: <xref ref-type="bibr" rid="B36">Kritikos et&#x20;al. (2021</xref>) used EDA with users with arachnophobia in a Virtual Reality environment. The users were exposed to an environment with spiders while their skin response was measured, and the intensity of the experience was controlled in real time. Researchers are also using physiological signals to solve issues in experiences, such as Virtual Reality sickness (<xref ref-type="bibr" rid="B39">Martin et&#x20;al., 2018</xref>). In this study, EDA was used along with a measurement of cardiac activity to test a detection method for users who become motion sick within&#x20;VR.</p>
</sec>
<sec id="s3-5">
<title>ECG in Gaming Applications</title>
<p>ECG is another biosignal often used in gaming for measuring emotional states during gameplay. <xref ref-type="bibr" rid="B53">Vachiratamporn et&#x20;al. (2015)</xref> used EEG and ECG to monitor players&#x2019; emotional states during a horror game (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 20). They reported that ECG was an effective means of predicting the user&#x2019;s affect during pre-scary and post-scary moments. ECG and skin conductance were also used in an experiment in which players played the game <italic>Half Life 2</italic> (<xref ref-type="bibr" rid="B25">Dekker and Champion, 2007</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 21). In this experiment, the researchers modified several aspects of the game, such as sound volume, shader effects (color changes), and rewards, to react and change depending on the readings. Nonplayable characters would try to scare the player if their readings were too calm. In 2016, the game company Flying Mollusk released a commercial game entitled <italic>Nevermind</italic> (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 26), which is usable with consumer-level heart rate monitors. The developers created this horror game with the purpose of providing a more frightening experience for the player, based upon their biological signals.</p>
<p>ECG monitoring during gameplay can also reveal the varying engagement levels of players during a game (<xref ref-type="bibr" rid="B27">Giakoumis et&#x20;al., 2011</xref>). By monitoring the engagement level of players, game developers can adjust game difficulty (Dynamic Difficulty Adjustment) and algorithms to keep players interested (<xref ref-type="bibr" rid="B34">Kivikangas et&#x20;al., 2011</xref>; <xref ref-type="bibr" rid="B59">Xue et&#x20;al., 2019</xref>). The company Valve has long been researching gameplay using biosignals (<xref ref-type="bibr" rid="B5">Ambinder, 2011</xref>), and has even patented a method for capturing player biosignals in real time to affect gameplay (<xref ref-type="bibr" rid="B13">Bond and Ambinder, 2009</xref>).</p>
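<p>As a toy stand-in for heart-rate-driven Dynamic Difficulty Adjustment, the sketch below maps a player&#x2019;s mean in-game heart rate, relative to their resting rate, to a difficulty tier. The band edges are illustrative assumptions of ours, not parameters from the cited work:</p>

```python
def target_difficulty(hr_series, rest_hr, levels=("easy", "normal", "hard")):
    """Map mean in-game heart rate, relative to the player's resting rate,
    to a difficulty tier (a toy Dynamic Difficulty Adjustment rule).
    Band edges are illustrative assumptions."""
    mean_hr = sum(hr_series) / len(hr_series)
    elevation = (mean_hr - rest_hr) / rest_hr   # fractional rise over rest
    if elevation < 0.05:     # barely engaged: push the player harder
        return levels[2]
    if elevation < 0.20:     # comfortably engaged: hold steady
        return levels[1]
    return levels[0]         # strongly elevated: ease off
```

<p>A production DDA system would use a validated engagement model rather than raw heart-rate elevation, but the feedback loop, measure, classify, adjust, is the same.</p>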
<p>ECG during gameplay can also be used to train breathing and calmness in stressful situations. An experiment was conducted with members of the Canadian Forces for this purpose (<xref ref-type="bibr" rid="B14">Bouchard et&#x20;al., 2012</xref>) (<xref ref-type="table" rid="T1">Table&#x20;1</xref>, row 22). The participants were given refresher training on stress management, which included breathing practices, and then played an adapted version of a game while their ECG and skin conductance were monitored.</p>
</sec>
<sec id="s3-6">
<title>Biosignal Use for Affective Gaming and Research</title>
<p>Biosignal data can help researchers understand how a person interacts with a game&#x2014;a technique that major companies have also employed to test game design and experience. Researchers can harness the biosignals of a player to provide real-time interaction and feedback in response to the player&#x2019;s emotional state. This implementation in which the users&#x2019; behavior or emotional state directly affects gameplay is known as affective gaming (<xref ref-type="bibr" rid="B35">Kotsia et&#x20;al., 2013</xref>). Affective gaming is a branch of affective computing, a type of computing that influences, arises from, or relates to human emotion (<xref ref-type="bibr" rid="B65">Picard, 1995</xref>). An affective program can receive input from gestures and biological signals. This is particularly helpful in gathering information in regard to learning in games (<xref ref-type="bibr" rid="B22">Cowley et&#x20;al., 2013</xref>; <xref ref-type="bibr" rid="B21">Cowley and Ravaja, 2014</xref>), as well as understanding how emotions correlate to stimuli such as music (<xref ref-type="bibr" rid="B12">Bo et&#x20;al., 2017</xref>; <xref ref-type="bibr" rid="B52">Suto and Oniga, 2018</xref>) or visuals (<xref ref-type="bibr" rid="B51">Suhaimi et&#x20;al., 2018</xref>). By designing a game to be affective, a developer or an artist can tailor the experience to the person playing the game (<xref ref-type="bibr" rid="B38">Liu et&#x20;al., 2009</xref>; <xref ref-type="bibr" rid="B43">Nacke et&#x20;al., 2011</xref>).</p>
</sec>
<sec id="s3-7">
<title>Biosignals in Commercial Gaming</title>
<p>The largest challenge for using biosignals at a massive scale in a game environment, be it for control, monitoring, affective gaming, or for providing some sort of biofeedback, is indeed the hardware. In order for biosignals to be employed at a mainstream level, leading companies such as Sony, Valve, or Microsoft would need to develop controllers with biosensors that are unobtrusive and easy to use, and game developers would need to create games that integrate these sensors and make the games affective or provide some type of biofeedback.</p>
<p>As we have reviewed, the present level of market penetration varies considerably among the different biosignals, depending on several factors such as the potential applications and benefits that these biosignals can bring, and the cost, reliability, or unobtrusiveness of the related sensors. Eye-tracking technologies, for example, which offer interesting possibilities for gaze control, with clear applications in VR and for providing control mechanisms to people with severe motor disabilities, have already been implemented commercially by companies such as Tobii and HTC&#x20;Vive.</p>
<p>Concerning EEG, although consumer-level, noninvasive headsets are being produced by companies such as Muse or Emotiv, this still constitutes a niche market, mostly targeting mindfulness and relaxation. While EEG sensing and interpretation will probably have to wait some more years to offer more reliable and useful possibilities at the consumer level, EMG, EDA, and ECG, which provide physiological information that is much simpler to measure and interpret, would seem ready for affective gaming.</p>
<p>In 2020, a Sony patent for the DualShock 5 controller was published (<xref ref-type="bibr" rid="B6">Andall and Hogarth, 2018</xref>), which incorporates the use of physiological-signal technology. The controller design includes EDA sensors and an input for possible integration of other biosignals, such as heart rate. Will this lead to the massive use of biosignals in commercial controllers, and bring affective gameplay to the mainstream? Even without making gaming affective, game companies would likely find it useful to collect users&#x2019; biodata during gameplay in order to adapt gameplay in updates or future releases. Perhaps Sony&#x2019;s proposed commercial use of biofeedback from skin conductance will bring this bio-adaptability to Playstation 5 games in the future. If implemented, this would be an advance for biofeedback as well as the first popular commercial release of this technology.</p>
</sec>
</sec>
<sec id="s4">
<title>Future Uses and Possibilities and Conclusion</title>
<p>We have presented an overview of the past and present uses of biological signals in research utilizing video games, as well as their uses in current commercial gaming and their potential for the near future. Many of the examples we have discussed come from the fields of medical and psychological research, but many possibilities remain for the use of biological and physiological signals in tandem with video games. With Virtual Reality now an accessible and commonplace technology, more researchers are integrating it into treatment for anxiety disorders. The ability to monitor a patient&#x2019;s emotional state is valuable, and a controlled environment, such as a game, gives a researcher or practitioner the opportunity to coax a response from a patient using visual and audio stimuli. The research that has been done using immersive gaming has been promising; however, more integration of biosignal observation and its use for affective experiences is needed. New devices, such as several that have been discussed in this article, will make this integration easier and will assist with issues that arise from movement artifacts.</p>
<p>Utilizing games for research can be an effective means of studying emotional and physical states. It is clear that gaming is not waning in popularity, and developers will continue to integrate more methods of interaction and adaptability into video games and controllers.</p>
</sec>
</body>
<back>
<sec id="s5">
<title>Author Contributions</title>
<p>All authors contributed to manuscript revision, read, and approved the submitted version.</p>
</sec>
<sec id="s6">
<title>Funding</title>
<p>This research has been partially supported by the project Musical AI - PID2019-111403GB-I00/AEI/10.13039/501100011033 funded by the Spanish Ministerio de Ciencia, Innovaci&#x00F3;n y Universidades (MCIU) and the Agencia Estatal de Investigaci&#x00F3;n (AEI).</p>
</sec>
<sec sec-type="COI-statement" id="s7">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<fn-group>
<fn id="FN1">
<label>1</label>
<p>Muse EEG Headset <ext-link ext-link-type="uri" xlink:href="https://choosemuse.com/?utm_source=google&#x26;utm_medium=cpc&#x26;gclid=Cj0KCQiAkKnyBRDwARIsALtxe7jBThM4p04pea4W-YIvNy-OaeEW5aTIQAzy8kYYZjBIn2vAxiam08YaAtjXEALw_wcB">https://choosemuse.com/?utm_source&#x3d;google&#x26;utm_medium&#x3d;cpc&#x26;gclid&#x3d;Cj0KCQiAkKnyBRDwARIsALtxe7jBThM4p04pea4W-YIvNy-OaeEW5aTIQAzy8kYYZjBIn2vAxiam08YaAtjXEALw_wcB</ext-link>
</p>
</fn>
<fn id="FN2">
<label>2</label>
<p>Emotiv Insight, <ext-link ext-link-type="uri" xlink:href="https://www.emotiv.com/product/emotiv-insight-5-channel-mobile-eeg/?gclid=Cj0KCQiAkKnyBRDwARIsALtxe7jJ9pVcLRQ2zoyxcoDoNJeGX4plsd_kz21Uqw6BWCaKJgP8u1VBxOcaAufUEALw_wcB">https://www.emotiv.com/product/emotiv-insight-5-channel-mobile-eeg/?gclid&#x3d;Cj0KCQiAkKnyBRDwARIsALtxe7jJ9pVcLRQ2zoyxcoDoNJeGX4plsd_kz21Uqw6BWCaKJgP8u1VBxOcaAufUEALw_wcB</ext-link>
</p>
</fn>
<fn id="FN3">
<label>3</label>
<p>Emotiv EPOC <ext-link ext-link-type="uri" xlink:href="https://www.emotiv.com/product/emotiv-epoc-14-channel-mobile-eeg/">https://www.emotiv.com/product/emotiv-epoc-14-channel-mobile-eeg/</ext-link>
</p>
</fn>
<fn id="FN4">
<label>4</label>
<p>Imec Wearable Neurotechnology <ext-link ext-link-type="uri" xlink:href="https://www.imec-int.com/en/connected-health-solutions/neurotechnology">https://www.imec-int.com/en/connected-health-solutions/neurotechnology</ext-link>
</p>
</fn>
<fn id="FN5">
<label>5</label>
<p>Open BCI <ext-link ext-link-type="uri" xlink:href="https://shop.openbci.com/collections/all/eeg">https://shop.openbci.com/collections/all/eeg</ext-link>
</p>
</fn>
<fn id="FN6">
<label>6</label>
<p>Vive Pro Eye, <ext-link ext-link-type="uri" xlink:href="https://www.vive.com/eu/product/vive-pro-eye/">https://www.vive.com/eu/product/vive-pro-eye/</ext-link>
</p>
</fn>
<fn id="FN7">
<label>7</label>
<p>Tobii Eye Tracker 4C, <ext-link ext-link-type="uri" xlink:href="https://gaming.tobii.com/tobii-eye-tracker-4c/">https://gaming.tobii.com/tobii-eye-tracker-4c/</ext-link>
</p>
</fn>
<fn id="FN8">
<label>8</label>
<p>Tower defense (TD) is a subgenre of strategy video game where the goal is to defend a player&#x2019;s territories or possessions by obstructing the enemy attackers or by stopping enemies from reaching the exits, usually achieved by placing defensive structures on or along their path of attack. <ext-link ext-link-type="uri" xlink:href="https://en.wikipedia.org/wiki/Tower_defense">https://en.wikipedia.org/wiki/Tower_defense</ext-link>
</p>
</fn>
<fn id="FN9">
<label>9</label>
<p>Ottobock Myoboy, <ext-link ext-link-type="uri" xlink:href="https://shop.ottobock.us/Prosthetics/Upper-Limb-Prosthetics/Myo-Hands-and-Components/Myo-Software/MyoBoy/p/757M11%7E5X-CHANGE">https://shop.ottobock.us/Prosthetics/Upper-Limb-Prosthetics/Myo-Hands-and-Components/Myo-Software/MyoBoy/p/757M11&#x223c;5X-CHANGE</ext-link>
</p>
</fn>
<fn id="FN10">
<label>10</label>
<p>Perifit, <ext-link ext-link-type="uri" xlink:href="https://perifit.co/pages/perifit-benefits-faster-postnatal-recovery">https://perifit.co/pages/perifit-benefits-faster-postnatal-recovery</ext-link>
</p>
</fn>
<fn id="FN11">
<label>11</label>
<p>Bitalino <ext-link ext-link-type="uri" xlink:href="https://bitalino.com/en/hardware">https://bitalino.com/en/hardware</ext-link>
</p>
</fn>
<fn id="FN12">
<label>12</label>
<p>NeuroSky, <ext-link ext-link-type="uri" xlink:href="https://store.neurosky.com/collections/apps/games">https://store.neurosky.com/collections/apps/games</ext-link>
</p>
</fn>
<fn id="FN13">
<label>13</label>
<p>(<ext-link ext-link-type="uri" xlink:href="https://www.polygon.com/2013/5/7/4307692/valve-experimenting-with-sweat-based-left-4-dead-and-eye-controlled">https://www.polygon.com/2013/5/7/4307692/valve-experimenting-with-sweat-based-left-4-dead-and-eye-controlled</ext-link>)</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Abdulkader</surname>
<given-names>S. N.</given-names>
</name>
<name>
<surname>Atia</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Mostafa</surname>
<given-names>M-S. M.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>Brain Computer Interfacing: Applications and Challenges</article-title>. <source>Egypt. Inform. J.</source> <volume>16</volume> (<issue>2</issue>), <fpage>213</fpage>&#x2013;<lpage>230</lpage>. <pub-id pub-id-type="doi">10.1016/j.eij.2015.06.002</pub-id> </citation>
</ref>
<ref id="B2">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ahn</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Choi</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jun</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users</article-title>. <source>Sensors</source> <volume>14</volume> (<issue>8</issue>), <fpage>14601</fpage>&#x2013;<lpage>14633</lpage>. <pub-id pub-id-type="doi">10.3390/s140814601</pub-id> </citation>
</ref>
<ref id="B3">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Alchalcabi</surname>
<given-names>A. E.</given-names>
</name>
<name>
<surname>Eddin</surname>
<given-names>A. N.</given-names>
</name>
<name>
<surname>Shirmohammadi</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>More Attention, Less Deficit: Wearable EEG-Based Serious Game for Focus Improvement</article-title>,&#x201d; <conf-name>2017 IEEE 5th International Conference on Serious Games and Applications for Health (SeGAH)</conf-name>, <conf-loc>Perth, WA, Australia</conf-loc>, <fpage>1</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1109/SeGAH.2017.7939288</pub-id> </citation>
</ref>
<ref id="B4">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Allison</surname>
<given-names>B. Z.</given-names>
</name>
<name>
<surname>McFarland</surname>
<given-names>D. J.</given-names>
</name>
<name>
<surname>Schalk</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Zheng</surname>
<given-names>S. D.</given-names>
</name>
<name>
<surname>Jackson</surname>
<given-names>M. M.</given-names>
</name>
<name>
<surname>Wolpaw</surname>
<given-names>J.&#x20;R.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Towards an Independent Brain-Computer Interface Using Steady State Visual Evoked Potentials</article-title>. <source>Clin. Neurophysiol.</source> <volume>119</volume> (<issue>2</issue>), <fpage>399</fpage>&#x2013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2007.09.121</pub-id> </citation>
</ref>
<ref id="B5">
<citation citation-type="web">
<person-group person-group-type="author">
<name>
<surname>Ambinder</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2011</year>). &#x201c;<article-title>Biofeedback in Gameplay: How Valve Measures Physiology to Enhance Gaming Experience</article-title>,&#x201d; <conf-name>Game Developers Conference 2011 Proceedings</conf-name>, <conf-loc>San Francisco, CA, USA</conf-loc>, <conf-date>28 February&#x2013;4 March</conf-date>, <comment>Available at: <ext-link ext-link-type="uri" xlink:href="http://www.gdcvault.com/play/1014510/Biofeedback-in-Gameplay-How-Valve">http://www.gdcvault.com/play/1014510/Biofeedback-in-Gameplay-How-Valve</ext-link>
</comment> (<comment>Accessed March 6, 2021</comment>). </citation>
</ref>
<ref id="B6">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Andall</surname>
</name>
<name>
<surname>Hogarth</surname>
</name>
</person-group> (<year>2018</year>). <source>Sensing Apparatus and Method</source>. <comment>Sony Interactive Entertainment Europe Limited, assignee. United&#x20;States Patent Application US 20200054940</comment>.</citation>
</ref>
<ref id="B8">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Armiger</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Vogelstein</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>Air-Guitar Hero: a Real-Time Video Game Interface for Training and Evaluation of Dexterous Upper-Extremity Neuroprosthetic Control Algorithms</article-title>,&#x201d; <conf-name>2008 IEEE-BIOCAS Biomedical Circuits and Systems Conference</conf-name>, <conf-loc>Baltimore, MD</conf-loc>, 1, <fpage>121</fpage>&#x2013;<lpage>124</lpage>. </citation>
</ref>
<ref id="B9">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Athavale</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Krishnan</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Biosignal Monitoring Using Wearables: Observations and Opportunities</article-title>. <source>Biomed. Signal Process. Control.</source> <volume>38</volume>, <fpage>22</fpage>&#x2013;<lpage>33</lpage>. <pub-id pub-id-type="doi">10.1016/j.bspc.2017.03.011</pub-id> </citation>
</ref>
<ref id="B10">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Balasubramanian</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Adalarasu</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2007</year>). <article-title>EMG-based Analysis of Change in Muscle Activity during Simulated Driving</article-title>. <source>J.&#x20;Bodywork Mov. Therapies</source> <volume>11</volume>, <fpage>151</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbmt.2006.12.005</pub-id> </citation>
</ref>
<ref id="B11">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Blackwood</surname>
<given-names>D. H.</given-names>
</name>
<name>
<surname>Muir</surname>
<given-names>W. J.</given-names>
</name>
</person-group> (<year>1990</year>). <article-title>Cognitive Brain Potentials and Their Application</article-title>. <source>Br. J.&#x20;Psychiatry</source> <volume>9</volume>, <fpage>96</fpage>&#x2013;<lpage>101</lpage>. <pub-id pub-id-type="doi">10.1192/s0007125000291897</pub-id> </citation>
</ref>
<ref id="B12">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bo</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Music-evoked Emotion Classification Using EEG Correlation-Based Information</article-title>. <source>Annu. Int. Conf. IEEE Eng. Med. Biol. Soc.</source> <volume>2017</volume>, <fpage>3348</fpage>&#x2013;<lpage>3351</lpage>. <pub-id pub-id-type="doi">10.1109/EMBC.2017.8037573</pub-id> </citation>
</ref>
<ref id="B13">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Bond</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Ambinder</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2009</year>). <source>Player Biofeedback for Dynamically Controlling a Video Game State</source>. <comment>United&#x20;States patent US 9511289B2. Assignee: Valve Corporation</comment>.</citation>
</ref>
<ref id="B14">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bouchard</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Bernier</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Boivin</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Morin</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Robillard</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Using Biofeedback while Immersed in a Stressful Videogame Increases the Effectiveness of Stress Management Skills in Soldiers</article-title>. <source>PloS one</source> <volume>7</volume> (<issue>4</issue>), <fpage>e36169</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0036169</pub-id> </citation>
</ref>
<ref id="B15">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bulling</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Roggen</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Tr&#xf6;ster</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Wearable EOG Goggles: Seamless Sensing and Context-Awareness in Everyday Environments</article-title>. <source>J.&#x20;Ambient Intelligence Smart Environments</source> <volume>1</volume> (<issue>2</issue>), <fpage>157</fpage>&#x2013;<lpage>171</lpage>. <pub-id pub-id-type="doi">10.3233/ais-2009-0020</pub-id> </citation>
</ref>
<ref id="B16">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Carofiglio</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>de Carolis</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>D&#x2019;Errico</surname>
<given-names>F.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>A BCI-Based Assessment of a Player&#x2019;s State of Mind for Game Adaptation</article-title>,&#x201d; <conf-name>Proceedings of 3rd Workshop on Games-Human Interaction (GHItaly19)</conf-name>. <conf-loc>Padova, Italy</conf-loc>, <conf-date>September 23, 2019</conf-date>. </citation>
</ref>
<ref id="B17">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chanel</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Rebetez</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>B&#xe9;trancourt</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Pun</surname>
<given-names>T.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>Boredom, Engagement and Anxiety as Indicators for Adaptation to Difficulty in Games</article-title>,&#x201d; <conf-name>MindTrek &#x27;08: Proceedings of the 12th international conference on Entertainment and media in the ubiquitous era</conf-name>, <conf-loc>Tampere, Finland</conf-loc>, 1, <fpage>13</fpage>&#x2013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1145/1457199.1457203</pub-id> </citation>
</ref>
<ref id="B18">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ch&#x119;&#x107;</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Olczak</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Fernandes</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Ferreira</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2015</year>). &#x201c;<article-title>Physiological Computing Gaming: Use of Electrocardiogram as an Input for Video Gaming</article-title>,&#x201d; in <conf-name>International Conference on Physiological Computing Systems</conf-name>, <conf-loc>Loire Valley, France</conf-loc>, (<publisher-name>SCITEPRESS</publisher-name>) <volume>2</volume>, <fpage>157</fpage>&#x2013;<lpage>163</lpage>. </citation>
</ref>
<ref id="B19">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>Y-H.</given-names>
</name>
<name>
<surname>de Beeck</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Vanderheyden</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Carrette</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Mihajlovi&#x107;</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Vanstreels</surname>
<given-names>K.</given-names>
</name>
<etal/>
</person-group> (<year>2014</year>). <article-title>Soft, Comfortable Polymer Dry Electrodes for High Quality ECG and EEG Recording</article-title>. <source>Sensors</source> <volume>14</volume> (<issue>12</issue>), <fpage>23758</fpage>&#x2013;<lpage>23780</lpage>. <pub-id pub-id-type="doi">10.3390/s141223758</pub-id> </citation>
</ref>
<ref id="B20">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Converse</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Ferraro</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Jean</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Mendhiratta</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Naviasky</surname>
<given-names>E.</given-names>
</name>
<etal/>
</person-group> (<year>2013</year>). &#x201c;<article-title>An EMG Biofeedback Device for Video Game Use in Forearm Physiotherapy</article-title>,&#x201d; <conf-name>Proceedings of IEEE Sensors</conf-name>, <conf-loc>Baltimore, MD, USA</conf-loc>, <conf-date>November 3&#x2013;6, 2013</conf-date>, <fpage>1</fpage>&#x2013;<lpage>4</lpage>. </citation>
</ref>
<ref id="B21">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cowley</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ravaja</surname>
<given-names>N.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Learning in Balance: Using Oscillatory EEG Biomarkers of Attention, Motivation and Vigilance to Interpret Game-Based Learning</article-title>. <source>Cogent Edu.</source> <volume>1</volume> (<issue>1</issue>), <fpage>1</fpage>&#x2013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1080/2331186x.2014.962236</pub-id> </citation>
</ref>
<ref id="B22">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cowley</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ravaja</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Heikura</surname>
<given-names>T.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Cardiovascular Physiology Predicts Learning Effects in a Serious Game Activity</article-title>. <source>Comput. Edu.</source> <volume>60</volume>, <fpage>299</fpage>&#x2013;<lpage>309</lpage>. <pub-id pub-id-type="doi">10.1016/j.compedu.2012.07.014</pub-id> </citation>
</ref>
<ref id="B23">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Dawson</surname>
<given-names>M. E.</given-names>
</name>
<name>
<surname>Schell</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Filion</surname>
<given-names>D. L.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>The Electrodermal System</article-title>,&#x201d; in <source>Handbook of Psychophysiology</source>. Cambridge Handbooks in Psychology. Editors <person-group person-group-type="editor">
<name>
<surname>Cacioppo</surname>
<given-names>J.&#x20;T.</given-names>
</name>
<name>
<surname>Tassinary</surname>
<given-names>L. G.</given-names>
</name>
<name>
<surname>Berntson</surname>
<given-names>G. G.</given-names>
</name>
</person-group> (<publisher-name>Cambridge University Press</publisher-name>), <fpage>217</fpage>&#x2013;<lpage>243</lpage>. </citation>
</ref>
<ref id="B24">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>De Luca</surname>
<given-names>C. J.</given-names>
</name>
</person-group> (<year>1984</year>). <article-title>Myoelectrical Manifestations of Localized Muscular Fatigue in Humans</article-title>. <source>Crit. Rev. Biomed. Eng.</source> <volume>11</volume> (<issue>4</issue>), <fpage>251</fpage>&#x2013;<lpage>279</lpage>. </citation>
</ref>
<ref id="B25">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dekker</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Champion</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2007</year>). &#x201c;<article-title>Please Biofeed the Zombies: Enhancing the Gameplay and Display of a Horror Game Using Biofeedback</article-title>,&#x201d; <conf-name>3rd Digital Games Research Association International Conference: Situated Play, DiGRA</conf-name>, <fpage>550</fpage>&#x2013;<lpage>558</lpage>. </citation>
</ref>
<ref id="B26">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friedrich</surname>
<given-names>E. V. C.</given-names>
</name>
<name>
<surname>Suttie</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Sivanathan</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Lim</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Louchart</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Pineda</surname>
<given-names>J.&#x20;A.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Brain-Computer Interface Game Applications for Combined Neurofeedback and Biofeedback Treatment for Children on the Autism Spectrum</article-title>. <source>Front. Neuroeng.</source> <volume>7</volume>, <fpage>21</fpage>. <pub-id pub-id-type="doi">10.3389/fneng.2014.00021</pub-id> </citation>
</ref>
<ref id="B27">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Giakoumis</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Tzovaras</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Moustakas</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Hassapis</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2011</year>). <article-title>Automatic Recognition of Boredom in Video Games Using Novel Biosignal Moment-Based Features</article-title>. <source>IEEE Trans. Affective Comput.</source> <volume>2</volume> (<issue>3</issue>), <fpage>119</fpage>&#x2013;<lpage>133</lpage>. <pub-id pub-id-type="doi">10.1109/t-affc.2011.4</pub-id> </citation>
</ref>
<ref id="B28">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gorzkowski</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Sarwas</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Exploitation of EMG Signals for Video Game Control</article-title>,&#x201d; <conf-name>20th International Carpathian Control Conference (ICCC)</conf-name>, <conf-loc>Krakow-Wieliczka, Poland</conf-loc>, <conf-date>May 26&#x2013;29, 2019</conf-date>, <fpage>1</fpage>&#x2013;<lpage>6</lpage>. </citation>
</ref>
<ref id="B29">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hayes</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Teressa</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Daniel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Mendhiratta</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Naviasky</surname>
<given-names>E. P.</given-names>
</name>
<etal/>
</person-group> (<year>2013</year>). <article-title>An EMG Biofeedback Device for Video Game Use in Forearm Physiotherapy</article-title>. <source>Proc. IEEE Sens.</source>, <fpage>1</fpage>&#x2013;<lpage>4</lpage>. <pub-id pub-id-type="doi">10.1109/ICSENS.2013.6688474</pub-id> </citation>
</ref>
<ref id="B30">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hjelm</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Browall</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2000</year>). &#x201c;<article-title>Brainball&#x2013;Using Brain Activity for Cool Competition</article-title>,&#x201d; <conf-name>Proceedings NordiCHI 2000</conf-name>, <fpage>177</fpage>. </citation>
</ref>
<ref id="B31">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Isokoski</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Joos</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>&#x160;pakov</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Martin</surname>
<given-names>B.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Gaze Controlled Games</article-title>. <source>Univ. Access Inf. Soc.</source> <volume>8</volume>, <fpage>323</fpage>&#x2013;<lpage>337</lpage>. <pub-id pub-id-type="doi">10.1007/s10209-009-0146-3</pub-id> </citation>
</ref>
<ref id="B32">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jiang</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Guan</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>B.</given-names>
</name>
</person-group> (<year>2011</year>). &#x201c;<article-title>Brain Computer Interface Based 3D Game for Attention Training and Rehabilitation</article-title>,&#x201d; <conf-name>Proceedings of the 2011&#x20;6th IEEE Conference on Industrial Electronics and Applications ICIEA</conf-name>, <conf-loc>Beijing, China</conf-loc>, <fpage>124</fpage>&#x2013;<lpage>127</lpage>. </citation>
</ref>
<ref id="B33">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Karthikeyan</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Murugappan</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Yaacob</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2011</year>). &#x201c;<article-title>A Review on Stress Inducement Stimuli for Assessing Human Stress Using Physiological Signals</article-title>,&#x201d; <conf-name>Proceedings&#x2013;2011 IEEE 7th International Colloquium on Signal Processing and its Applications (CSPA)</conf-name>, <fpage>420</fpage>&#x2013;<lpage>425</lpage>. </citation>
</ref>
<ref id="B34">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kivikangas</surname>
<given-names>J.&#x20;M.</given-names>
</name>
<name>
<surname>Chanel</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Cowley</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ekman</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Salminen</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>J&#xe4;rvel&#xe4;</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2011</year>). <article-title>A Review of the Use of Psychophysiological Methods in Game Research</article-title>. <source>J.&#x20;Gaming Virtual Worlds</source> <volume>3</volume>, <fpage>181</fpage>&#x2013;<lpage>199</lpage>. <pub-id pub-id-type="doi">10.1386/jgvw.3.3.181_1</pub-id> </citation>
</ref>
<ref id="B35">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kotsia</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Zafeiriou</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Fotopoulos</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2013</year>). &#x201c;<article-title>Affective Gaming: A Comprehensive Survey</article-title>,&#x201d; <conf-name>IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops</conf-name>, <fpage>663</fpage>&#x2013;<lpage>670</lpage>. <pub-id pub-id-type="doi">10.1109/CVPRW.2013.100</pub-id> </citation>
</ref>
<ref id="B36">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kritikos</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Alevizopoulos</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Koutsouris</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Personalized Virtual Reality Human-Computer Interaction for Psychiatric and Neurological Illnesses: A Dynamically Adaptive Virtual Reality Environment that Changes According to Real-Time Feedback from Electrophysiological Signal Responses</article-title>. <source>Front. Hum. Neurosci.</source> <volume>15</volume>, <fpage>596980</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2021.596980</pub-id> </citation>
</ref>
<ref id="B37">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kumar</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Sharma</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2016</year>). &#x201c;<article-title>Electrooculogram-Based Virtual Reality Game Control Using Blink Detection and Gaze Calibration</article-title>,&#x201d; <conf-name>International Conference on Advances in Computing, Communications and Informatics (ICACCI)</conf-name>, <conf-loc>Jaipur, India</conf-loc>, <fpage>2358</fpage>&#x2013;<lpage>2362</lpage>. </citation>
</ref>
<ref id="B67">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lim</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Yeo</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Yoon</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Comparison between Concentration and Immersion Based on EEG Analysis</article-title>. <source>Sensors</source> <volume>19</volume>, <fpage>1669</fpage>. <pub-id pub-id-type="doi">10.3390/s19071669</pub-id> </citation>
</ref>
<ref id="B38">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Agrawal</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Sarkar</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Dynamic Difficulty Adjustment in Computer Games through Real-Time Anxiety-Based Affective Feedback</article-title>. <source>Int. J.&#x20;Human-Computer Interaction</source> <volume>25</volume> (<issue>6</issue>), <fpage>506</fpage>&#x2013;<lpage>529</lpage>. <pub-id pub-id-type="doi">10.1080/10447310902963944</pub-id> </citation>
</ref>
<ref id="B39">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Martin</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Mathieu</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Pallamin</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Ragot</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Diverrez</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2018</year>). <source>Automatic Recognition of Virtual Reality Sickness Based on Physiological Signals</source>. <publisher-loc>Amsterdam, Netherlands</publisher-loc>: <publisher-name>IBC</publisher-name>.</citation>
</ref>
<ref id="B66">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mercado</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Espinosa-Curiel</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Escobedo</surname>
<given-names>L.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Developing and Evaluating a BCI Video Game for Neurofeedback Training: The Case of Autism</article-title>. <source>Multimed. Tools Appl.</source> <volume>78</volume>, <fpage>13675</fpage>&#x2013;<lpage>13712</lpage>. <pub-id pub-id-type="doi">10.1007/s11042-018-6916-2</pub-id> </citation>
</ref>
<ref id="B40">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>M&#xfc;ller</surname>
<given-names>M. M.</given-names>
</name>
<name>
<surname>Hillyard</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2000</year>). <article-title>Concurrent Recording of Steady-State and Transient Event-Related Potentials as Indices of Visual-Spatial Selective Attention</article-title>. <source>Clin. Neurophysiol.</source> <volume>111</volume> (<issue>9</issue>), <fpage>1544</fpage>&#x2013;<lpage>1552</lpage>. <pub-id pub-id-type="doi">10.1016/s1388-2457(00)00371-0</pub-id> </citation>
</ref>
<ref id="B41">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Murch</surname>
<given-names>W. S.</given-names>
</name>
<name>
<surname>Limbrick&#x2010;Oldfield</surname>
<given-names>E. H.</given-names>
</name>
<name>
<surname>Ferrari</surname>
<given-names>M. A.</given-names>
</name>
<name>
<surname>MacDonald</surname>
<given-names>K. I.</given-names>
</name>
<name>
<surname>Fooken</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Cherkasova</surname>
<given-names>M. V.</given-names>
</name>
<etal/>
</person-group> (<year>2020</year>). <article-title>Zoned in or Zoned Out? Investigating Immersion in Slot Machine Gambling Using Mobile Eye&#x2010;tracking</article-title>. <source>Addiction</source> <volume>115</volume> (<issue>6</issue>), <fpage>1127</fpage>&#x2013;<lpage>1138</lpage>. <pub-id pub-id-type="doi">10.1111/add.14899</pub-id> </citation>
</ref>
<ref id="B42">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nacke</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Lindley</surname>
<given-names>C. A.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>Flow and Immersion in First-Person Shooters</article-title>,&#x201d; <conf-name>Proceedings of the 2008 Conference on Future Play: Research, Play, Share</conf-name>, <conf-loc>Toronto, Canada</conf-loc>, 1, <fpage>81</fpage>&#x2013;<lpage>88</lpage>. <pub-id pub-id-type="doi">10.1145/1496984.1496998</pub-id> </citation>
</ref>
<ref id="B43">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nacke</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Kalyn</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Lough</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Mandryk</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2011</year>). &#x201c;<article-title>Biofeedback Game Design: Using Direct and Indirect Physiological Control to Enhance Game Interaction</article-title>,&#x201d; <conf-name>Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI &#x27;11)</conf-name>, <conf-loc>Vancouver, Canada</conf-loc>, <volume>1</volume>, <fpage>103</fpage>&#x2013;<lpage>113</lpage>. </citation>
</ref>
<ref id="B45">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Norman</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Dennison</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Wolbrecht</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Cramer</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Srinivasan</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Srinivasan</surname>
<given-names>R.</given-names>
</name>
<etal/>
</person-group> (<year>2016</year>). <article-title>Movement Anticipation and EEG: Implications for BCI-Contingent Robot Therapy</article-title>. <source>IEEE Trans. Neural Syst. Rehabil. Eng.</source> <volume>24</volume> (<issue>8</issue>), <fpage>911</fpage>&#x2013;<lpage>919</lpage>. <pub-id pub-id-type="doi">10.1109/tnsre.2016.2528167</pub-id> </citation>
</ref>
<ref id="B64">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Parsons</surname>
<given-names>T. D.</given-names>
</name>
<name>
<surname>Reinebold</surname>
<given-names>J. L.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Adaptive Virtual Environments for Neuropsychological Assessment in Serious Games</article-title>. <source>IEEE Trans. Consumer Electron.</source> <volume>2</volume>, <fpage>197</fpage>&#x2013;<lpage>204</lpage>. <pub-id pub-id-type="doi">10.1109/TCE.2012.6227413</pub-id> </citation>
</ref>
<ref id="B62">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Plass-Oude Bos</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Reuderink</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Laar</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>G&#x00FC;rk&#x00F6;k</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>M&#x00FC;hl</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Poel</surname>
<given-names>M.</given-names>
</name>
<etal/>
</person-group> (<year>2010</year>). <source>Brain-Computer Interfacing and Games</source>. <pub-id pub-id-type="doi">10.1007/978-1-84996-272-8_10</pub-id> </citation>
</ref>
<ref id="B65">
<citation citation-type="web">
<person-group person-group-type="author">
<name>
<surname>Picard</surname>
<given-names>R. W.</given-names>
</name>
</person-group> (<year>1997</year>). <source>Affective Computing</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.
</citation>
</ref>
<ref id="B63">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Porter</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Goolkasian</surname>
<given-names>P.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Video Games and Stress: How Stress Appraisals and Game Content Affect Cardiovascular and Emotion Outcomes</article-title>. <source>Front. Psychol.</source> <volume>10</volume>, <fpage>967</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2019.00967</pub-id> </citation>
</ref>
<ref id="B46">
<citation citation-type="web">
<collab>Polygon</collab> (<year>2020</year>). <article-title>Valve Experimenting with Sweat-Based Left 4 Dead and Eye-Controlled Portal 2</article-title>. <comment>Available at: <ext-link ext-link-type="uri" xlink:href="https://www.polygon.com/2013/5/7/4307692/valve-experimenting-with-sweat-based-left-4-dead-and-eye-controlled">https://www.polygon.com/2013/5/7/4307692/valve-experimenting-with-sweat-based-left-4-dead-and-eye-controlled</ext-link>
</comment> (<comment>Accessed March 11, 2020</comment>). </citation>
</ref>
<ref id="B47">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Prahm</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Kayali</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Vujaklija</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Sturma</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Aszmann</surname>
<given-names>O.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Increasing Motivation, Effort and Performance through Game-Based Rehabilitation for Upper Limb Myoelectric Prosthesis Control</article-title>,&#x201d; <conf-name>International Conference on Virtual Rehabilitation (ICVR)</conf-name>, <conf-loc>Montreal, QC, Canada</conf-loc>, <fpage>1</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1109/ICVR.2017.8007517</pub-id> </citation>
</ref>
<ref id="B48">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rincon</surname>
<given-names>A. L.</given-names>
</name>
<name>
<surname>Yamasaki</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Shimoda</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2016</year>). &#x201c;<article-title>Design of a Video Game for Rehabilitation Using Motion Capture, EMG Analysis and Virtual Reality</article-title>,&#x201d; <conf-name>International Conference on Electronics, Communications and Computers, CONIELECOMP 2016</conf-name>, <fpage>198</fpage>&#x2013;<lpage>204</lpage>. <pub-id pub-id-type="doi">10.1109/CONIELECOMP.2016.7438575</pub-id> </citation>
</ref>
<ref id="B49">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Russoniello</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>O&#x2019;Brien</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Parks</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>The Effectiveness of Casual Video Games in Improving Mood and Decreasing Stress</article-title>. <source>J.&#x20;Cyber Ther. Rehabil.</source> <volume>2</volume>, <fpage>53</fpage>&#x2013;<lpage>66</lpage>. </citation>
</ref>
<ref id="B61">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sakurazawa</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Yoshida</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Munekata</surname>
<given-names>N.</given-names>
</name>
</person-group> (<year>2004</year>). &#x201c;<article-title>Entertainment Feature of a Game Using Skin Conductance Response</article-title>,&#x201d; in <conf-name>Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology</conf-name>, <conf-loc>Singapore</conf-loc>, <conf-date>June 3&#x2013;5, 2004</conf-date>, <fpage>181</fpage>&#x2013;<lpage>186</lpage>. <pub-id pub-id-type="doi">10.1145/1067343.1067365</pub-id> </citation>
</ref>
<ref id="B50">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Soares</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Siqueira</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Miura</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Silva</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Jacobi</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Castanho</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Biofeedback Sensors in Electronic Games: A Practical Evaluation</article-title>,&#x201d; <conf-name>Brazilian Symposium on Computer Games and Digital Entertainment, SBGames 2017</conf-name>, <fpage>56</fpage>&#x2013;<lpage>65</lpage>. <pub-id pub-id-type="doi">10.1109/SBGames.2017.00015</pub-id> </citation>
</ref>
<ref id="B51">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Suhaimi</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Yuan</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Teo</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Mountstephens</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2018</year>). &#x201c;<article-title>Modeling the Affective Space of 360 Virtual Reality Videos Based on Arousal and Valence for Wearable EEG-Based VR Emotion Classification</article-title>,&#x201d; <conf-name>Proceedings&#x2013;2018 IEEE 14th International Colloquium on Signal Processing and its Applications</conf-name>, <fpage>167</fpage>&#x2013;<lpage>172</lpage>. </citation>
</ref>
<ref id="B52">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Suto</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Oniga</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Music Stimuli Recognition in Electroencephalogram Signal</article-title>. <source>ElAEE</source> <volume>24</volume>, <fpage>68</fpage>&#x2013;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.5755/j01.eie.24.4.21482</pub-id> </citation>
</ref>
<ref id="B53">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vachiratamporn</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Legaspi</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Moriyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Fukui</surname>
<given-names>K-I.</given-names>
</name>
<name>
<surname>Numao</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>An Analysis of Player Affect Transitions in Survival Horror Games</article-title>. <source>J.&#x20;Multimodal User Inter.</source> <volume>9</volume>, <fpage>43</fpage>&#x2013;<lpage>54</lpage>. <pub-id pub-id-type="doi">10.1007/s12193-014-0153-4</pub-id> </citation>
</ref>
<ref id="B54">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vamvakousis</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Ramirez</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>The EyeHarp: A Gaze-Controlled Digital Musical Instrument</article-title>. <source>Front. Psychol.</source> <volume>7</volume>, <fpage>906</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2016.00906</pub-id> </citation>
</ref>
<ref id="B55">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Vickers</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Istance</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Smalley</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2010</year>). &#x201c;<article-title>EyeGuitar: Making Rhythm Based Music Video Games Accessible Using Only Eye Movements</article-title>,&#x201d; <conf-name>Proceedings of the 7th International Conference on Advances in Computer Entertainment Technology (ACE &#x27;10)</conf-name>, <conf-loc>New York, NY, USA</conf-loc>, <publisher-name>Association for Computing Machinery</publisher-name>, <fpage>36</fpage>&#x2013;<lpage>39</lpage>. </citation>
</ref>
<ref id="B56">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Visconti</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Gaetani</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Zappatore</surname>
<given-names>G. A.</given-names>
</name>
<name>
<surname>Primiceri</surname>
<given-names>P.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Technical Features and Functionalities of Myo Armband: An Overview on Related Literature and Advanced Applications of Myoelectric Armbands Mainly Focused on Arm Prostheses</article-title>. <source>Int. J.&#x20;Smart Sensing Intell. Syst.</source> <volume>11</volume>, <fpage>1</fpage>&#x2013;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.21307/ijssis-2018-005</pub-id> </citation>
</ref>
<ref id="B57">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Vujaklija</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Prahm</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Sturma</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Aszmann</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Kayali</surname>
<given-names>F.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Increasing Motivation, Effort and Performance through Game-Based Rehabilitation for Upper Limb Myoelectric Prosthesis Control</article-title>,&#x201d; <conf-name>International Conference on Virtual Rehabilitation, ICVR</conf-name>, <fpage>1</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1109/ICVR.2017.8007517</pub-id> </citation>
</ref>
<ref id="B58">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>F.</given-names>
</name>
</person-group> (<year>2018</year>). &#x201c;<article-title>Combining EEG and VR Technology to Assess Fear of Heights</article-title>,&#x201d; <conf-name>Proceedings&#x2013;9th International Conference on Information Technology in Medicine and Education, ITME 2018</conf-name>, <fpage>110</fpage>&#x2013;<lpage>114</lpage>. </citation>
</ref>
<ref id="B59">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Xue</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Kolen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Aghdaie</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Zaman</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Dynamic Difficulty Adjustment for Maximized Engagement in Digital Games</article-title>,&#x201d; <conf-name>26th International World Wide Web Conference 2017</conf-name>, <conf-loc>Perth, Australia</conf-loc>, (<publisher-name>WWW 2017 Companion</publisher-name>), <fpage>465</fpage>&#x2013;<lpage>471</lpage>. <pub-id pub-id-type="doi">10.1145/3041021.3054170</pub-id> </citation>
</ref>
</ref-list>
</back>
</article>