<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2021.773603</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Effects of Gaze Fixation on the Performance of a Motor Imagery-Based Brain-Computer Interface</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Meng</surname> <given-names>Jianjun</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x2020;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/810475/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Wu</surname> <given-names>Zehan</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x2020;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/278108/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Li</surname> <given-names>Songwei</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1487522/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Zhu</surname> <given-names>Xiangyang</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/278070/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Mechanical Engineering, Shanghai Jiao Tong University</institution>, <addr-line>Shanghai</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Neurosurgery, Huashan Hospital, Fudan University</institution>, <addr-line>Shanghai</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Bin He, Carnegie Mellon University, United States</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Xiaogang Chen, Chinese Academy of Medical Sciences and Peking Union Medical College, China; Bradley Jay Edelman, Max Planck Institute of Neurobiology (MPIN), Germany</p></fn>
<corresp id="c001">&#x002A;Correspondence: Jianjun Meng, <email>mengjianjunxs008@sjtu.edu.cn</email></corresp>
<fn fn-type="equal" id="fn001"><p><sup>&#x2020;</sup>These authors have contributed equally to this work and share first authorship</p></fn>
<fn fn-type="other" id="fn004"><p>This article was submitted to Brain-Computer Interfaces, a section of the journal Frontiers in Human Neuroscience</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>24</day>
<month>01</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>15</volume>
<elocation-id>773603</elocation-id>
<history>
<date date-type="received">
<day>10</day>
<month>09</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>08</day>
<month>12</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2022 Meng, Wu, Li and Zhu.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Meng, Wu, Li and Zhu</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Motor imagery-based brain-computer interfaces (BCIs) have previously been studied without controlling subjects&#x2019; gaze fixation position. The effect of gaze fixation and covert attention on the behavioral performance of BCI is still unknown. This study designed a gaze-fixation-controlled experiment in which subjects performed a secondary task of gaze fixation while performing the primary task of motor imagination. Subjects&#x2019; performance was analyzed according to the relationship between the motor imagery target and the gaze fixation position, resulting in three BCI control conditions, i.e., congruent, incongruent, and center cross trials. A group of fourteen subjects was recruited. The group-average performances of the three conditions did not show statistically significant differences in terms of BCI control accuracy, feedback duration, or trajectory length. Further analysis of gaze shift response time revealed a significantly shorter response time for congruent trials than for incongruent trials. Meanwhile, the parieto-occipital cortex also showed active neural activity for congruent and incongruent trials, as revealed by a contrast analysis of r-square values and a lateralization index. However, the lateralization index computed from the parietal and occipital areas was not correlated with BCI behavioral performance. Subjects&#x2019; BCI behavioral performance was not affected by the position of gaze fixation or covert attention, indicating that motor imagery-based BCI could be used freely in robotic arm control without sacrificing performance.</p>
</abstract>
<kwd-group>
<kwd>brain-computer interface (BCI)</kwd>
<kwd>electroencephalography (EEG)</kwd>
<kwd>motor imagery</kwd>
<kwd>gaze fixation</kwd>
<kwd>covert attention</kwd>
</kwd-group>
<contract-sponsor id="cn001">National Key Research and Development Program of China<named-content content-type="fundref-id">10.13039/501100012166</named-content></contract-sponsor>
<contract-sponsor id="cn002">State Key Laboratory of Mechanical System and Vibration<named-content content-type="fundref-id">10.13039/501100011415</named-content></contract-sponsor>
<counts>
<fig-count count="12"/>
<table-count count="0"/>
<equation-count count="2"/>
<ref-count count="46"/>
<page-count count="16"/>
<word-count count="10605"/>
</counts>
</article-meta>
</front>
<body>
<sec id="S1" sec-type="intro">
<title>Introduction</title>
<p>Brain-computer interface (BCI) technology has attracted widespread attention in both research and clinical applications. It has opened doors to improving the quality of life of patients who suffer from neurological disorders such as spinal cord injury and amyotrophic lateral sclerosis (<xref ref-type="bibr" rid="B3">Bouton et al., 2016</xref>; <xref ref-type="bibr" rid="B34">Soekadar et al., 2016</xref>; <xref ref-type="bibr" rid="B7">Chaudhary et al., 2017</xref>; <xref ref-type="bibr" rid="B25">Moses et al., 2021</xref>; <xref ref-type="bibr" rid="B41">Willett et al., 2021</xref>). Motor imagination (MI), the mental rehearsal of limb movement, is a commonly used strategy for building a noninvasive BCI (<xref ref-type="bibr" rid="B28">Pfurtscheller and Neuper, 2001</xref>; <xref ref-type="bibr" rid="B16">He et al., 2020</xref>). In addition, MI-based BCI has demonstrated promising applications in operating assistive devices such as a wheelchair (<xref ref-type="bibr" rid="B18">Long et al., 2012</xref>; <xref ref-type="bibr" rid="B36">Tonin et al., 2019</xref>) and a robotic arm (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B12">Edelman et al., 2019</xref>), rehabilitating stroke patients (<xref ref-type="bibr" rid="B2">Biasiucci et al., 2018</xref>), etc.</p>
<p>Motor imagination is an endogenous mental process; it provides a gaze-independent means of control (<xref ref-type="bibr" rid="B43">Wolpaw et al., 2002</xref>). One clear example is that the classification of left-hand vs. right-hand MI with eyes closed can be comparable to that with eyes open (<xref ref-type="bibr" rid="B4">Brandl et al., 2016</xref>). A typical eyes-open MI paradigm requires subjects to focus on the center of a cross during the control period, and feedback prompts the decoded result at the end of each trial (<xref ref-type="bibr" rid="B28">Pfurtscheller and Neuper, 2001</xref>). Many previous studies take advantage of this conventional design (<xref ref-type="bibr" rid="B38">Van Gerven and Jensen, 2009</xref>; <xref ref-type="bibr" rid="B5">Brunner et al., 2011</xref>; <xref ref-type="bibr" rid="B45">Yao et al., 2013</xref>; <xref ref-type="bibr" rid="B1">Ang et al., 2014</xref>; <xref ref-type="bibr" rid="B46">Zhang et al., 2015</xref>). However, many studies provide richer feedback, e.g., the dynamic process of the control is continuously visible (<xref ref-type="bibr" rid="B42">Wolpaw and McFarland, 2004</xref>; <xref ref-type="bibr" rid="B39">Wander et al., 2013</xref>; <xref ref-type="bibr" rid="B40">Wei et al., 2013</xref>; <xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B12">Edelman et al., 2019</xref>). This dynamic control process is essential for real applications such as operating a robotic arm or a wheelchair, since subjects need to interact with the environment in real time (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B12">Edelman et al., 2019</xref>).</p>
<p>In these real applications, subjects might not have a visible center cross to focus on, but they still hold the targets in mind. In this sense, it is interesting to investigate the efficacy of gaze fixation during MI-based BCI use; for example, whether there is any performance difference between gaze fixation on the center cross and gaze fixation on the indicated target. Furthermore, additional neural activities arise in the process of robotic arm control; e.g., subjects have to covertly pay attention to the movements of the robotic arm if they choose to fix their gaze on the target, or vice versa. These simultaneous neural activities occur not only in static-target reach-and-grasp (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>) but also in continuous cursor tracking (<xref ref-type="bibr" rid="B12">Edelman et al., 2019</xref>). Therefore, it is also interesting to examine the influence of covert attention on MI-based BCI performance.</p>
<p>Eye movement and gaze fixation points have been utilized as additional features for improving the performance of MI-based BCI (<xref ref-type="bibr" rid="B14">Frisoli et al., 2012</xref>; <xref ref-type="bibr" rid="B8">Cheng et al., 2020</xref>). However, to our knowledge, research on motor imagination while rigorously controlling the position of subjects&#x2019; gaze is scarce. In this study, gaze fixation served as a secondary task alongside the primary MI task. First, we aimed to investigate the effect of gaze fixation at different locations while performing motor imagery tasks. Second, we wanted to explore the neural activity induced by the secondary task and its influence on the primary task.</p>
</sec>
<sec id="S2" sec-type="materials|methods">
<title>Materials and Methods</title>
<p>Fourteen subjects (1 female; all right-handed; average age 22.9 &#x00B1; 4.6 years; range, 20&#x2013;38) were recruited for a single session of an online BCI cursor-control experiment. One subject was an experienced BCI user, and two others had several sessions of BCI practice during the development of the BCI program. All the other subjects were na&#x00EF;ve BCI subjects. Additionally, all of the subjects were na&#x00EF;ve to the dual tasks before participating in this study. All procedures and protocols were approved by the Institutional Review Board of Shanghai Jiao Tong University. Written informed consent was obtained from all participants before the experiment.</p>
<sec id="S2.SS1">
<title>Experimental Setup</title>
<p>A 64-channel g.HIamp system (g.tec Medical Engineering, Austria) and suitably sized g.GAMA caps with 64 active electrodes were used to record EEG signals in an acoustically and magnetically shielded room. EEG signals were recorded at a sampling rate of 1200 Hz. A bandpass filter from 0.1 to 100 Hz and a 50 Hz notch filter were applied to the raw EEG signals. The electrodes on the left earlobe and forehead were chosen as the reference and ground, respectively. The impedances of all electrodes were maintained below 20 k&#x03A9;, as recommended by the manufacturer. A Gazepoint GP3 eye tracker, sampling at 60 Hz, was set up to track eye movement during the BCI task. A chin rest was used to stabilize subjects&#x2019; head position (see <xref ref-type="fig" rid="F1">Figure 1B</xref>). The data from the eye tracker were recorded and synchronized with BCI2000 key events through a customized MATLAB script.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p><bold>(A)</bold> A single trial structure of a motor imagery based BCI with gaze shift and fixation. <bold>(B)</bold> The experimental setup of motor imagery based BCI with eye tracking. A chin rest was used to secure the position of a subject&#x2019;s head. <bold>(C)</bold> Three control conditions resulting from the congruence between a motor imagery task and a gaze shift and fixation position.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g001.tif"/>
</fig>
</sec>
<sec id="S2.SS2">
<title>Experimental Design and Protocol</title>
<p>Each subject sat on a comfortable chair facing the center of a 24.5-inch LCD monitor. Before the start of each experimental session, the native nine-point calibration of the eye tracker was performed. The visual cues and feedback were displayed on the monitor. The distance between a subject and the monitor was set to approximately 80 cm. Each participant completed one session of an online BCI control task. Each session consisted of 10 runs of task blocks (around 5 min each), and each run included 30 trials of cursor control tasks. The task was a typical left vs. right cursor control task, but subjects were additionally required to control and fixate their gaze at a particular position during the task. Before starting the experiment, each subject was allowed to practice one run to become familiar with the tasks.</p>
<p>The trial structure, used in our previous research (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>), is shown in <xref ref-type="fig" rid="F1">Figure 1A</xref>. Each trial started with a blank screen lasting 2 s, which also served as the inter-trial interval. At the end of the blank screen, a yellow square serving as the target cue appeared on either the left or the right side of the screen, and a gray bar serving as the incorrect target appeared on the opposite side. At the same time, a white cross appeared at the center of the screen, with a left arrow, a right arrow, or no arrow overlaying the cross&#x2019;s horizontal bar. The yellow target cue was displayed for two and a half seconds so that the subject could prepare for the primary motor imagination task, while the white cross instructed the secondary task: the cross, with or without an arrow, indicated where the subjects should orient and fixate their gaze (see <xref ref-type="fig" rid="F1">Figure 1C</xref>). Subjects were asked to shift their gaze quickly toward the indicated position (left arrow: the center of the left target; right arrow: the center of the right target; no arrow: the center of the cross) and hold their gaze there for the entire trial. They then performed motor imagination using repetitive imagined movement of the left or right hand, corresponding to the left or right target cue. At the end of the cue period, a round pink cursor appeared in the center of the screen, overlaying the white cross. Subjects controlled the cursor moving toward the left or right target until it hit the correct target (hit trial) or the incorrect target (miss trial), or until 6 s elapsed without hitting either target (abort trial). The cursor was then frozen for 1 s to inform subjects of the trial result. Note that in many typical BCI studies the system classifies each trial as either a hit or a miss; in this study, because a minimum moving distance was required, trials in which subjects failed to cover the required distance were classified as abort trials. All subjects were instructed to perform kinesthetic motor imagination from a first-person perspective (<xref ref-type="bibr" rid="B27">Neuper et al., 2005</xref>).</p>
<p>The target cues and the arrow on the center cross in each run were both assigned in a block-randomized way. Thus, the numbers of left-hand and right-hand target cues and of arrowed crosses were balanced accurately. Since the target center and the gaze fixation position could coincide or differ, three different BCI control conditions resulted (see <xref ref-type="fig" rid="F2">Figure 2</xref>): the target center and the gaze center overlaid (congruent trials); the gaze center on the cross center with the target on the peripheral side (center cross trials); and the gaze center on the opposite side of the target (incongruent trials).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p>Overview of motor imagery tasks and gaze shift and fixation tasks. According to the congruence between a motor imagery task and a gaze shift and fixation position, there were three different control conditions, i.e., congruent trials, center cross trials and incongruent trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g002.tif"/>
</fig>
</sec>
<sec id="S2.SS3">
<title>Online Signal Processing and Performance Evaluation</title>
<p>Online signal processing similar to our previous work (<xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>) was used in this study. The high-alpha (10&#x2013;14 Hz) power difference between channels C3 and C4 was used to control cursor movement. First, a small Laplacian filter (<xref ref-type="bibr" rid="B20">McFarland et al., 1997</xref>) was used to remove the common surrounding noise of channels C3 and C4. The power spectra of these two channels were estimated using an autoregressive (AR) method (<xref ref-type="bibr" rid="B19">McFarland and Wolpaw, 2008</xref>). A sliding window of 400 ms was used to update the power spectrum continuously with a step size of 40 ms, balancing responsiveness and computational efficiency. Second, each value of the power difference between the two channels was stored in a buffer of 30 s length, and the mean and standard deviation were calculated from the buffered data; the output was thus a normalized value of the power difference using this mean and standard deviation. Finally, the output signal was projected into the velocity of the cursor movement. The center cross was programmed into the control sequence of a BCI2000 cursor task module (<xref ref-type="bibr" rid="B32">Schalk et al., 2004</xref>). The entire online signal processing procedure is illustrated in <xref ref-type="fig" rid="F3">Figure 3A</xref>. Note that a certain time period was needed to train the normalizer, since input data had to accumulate in the buffer before any output could be produced; the cursor therefore did not move during the first trial of a BCI session. Subjects were still instructed to shift their gaze toward the prompted position and start their motor imagination after perceiving the target cue, even though there was no cursor-movement feedback during this first trial.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p><bold>(A)</bold> Online signal processing flow chart and <bold>(B)</bold> schematic of the offline data analysis.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g003.tif"/>
</fig>
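<p>The per-step control computation can be sketched in Python as follows. This is a simplified offline illustration: the study estimated band power with an autoregressive model, whereas this sketch substitutes a plain FFT periodogram, and the small Laplacian spatial filtering of C3/C4 is assumed to have been applied already.</p>

```python
import numpy as np
from collections import deque

FS = 1200                    # EEG sampling rate (Hz)
WINDOW = int(0.4 * FS)       # 400 ms sliding window
BUFFER_LEN = int(30 / 0.04)  # 30 s buffer, one value per 40 ms step

def alpha_power(window, fs=FS, band=(10.0, 14.0)):
    """High-alpha band power of one (Laplacian-filtered) channel window.
    An FFT periodogram stands in for the paper's AR estimator."""
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].sum())

class Normalizer:
    """Z-score the C3 - C4 power difference against a 30 s rolling buffer,
    mimicking the online normalizer described above."""
    def __init__(self, maxlen=BUFFER_LEN):
        self.buf = deque(maxlen=maxlen)

    def step(self, diff):
        self.buf.append(diff)
        mu, sd = np.mean(self.buf), np.std(self.buf)
        return 0.0 if sd == 0 else (diff - mu) / sd

# One 40 ms control step: the normalized difference drives cursor velocity
norm = Normalizer()
rng = np.random.default_rng(0)
c3_win, c4_win = rng.standard_normal((2, WINDOW))
velocity = norm.step(alpha_power(c3_win) - alpha_power(c4_win))
```

<p>The first step returns zero, which matches the text: until the buffer has accumulated data, the normalizer cannot produce a meaningful output and the cursor stays still.</p>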
<p>Percent valid correct (PVC) (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>) was used as an online measure of BCI behavioral performance. PVC was calculated as the number of hits in each run divided by the total number of hits and misses; abort trials were not counted in this metric. Because abort trials were excluded from PVC, we also calculated the accuracy (ACC), defined as the number of hits divided by the total number of trials in each run. Since the cursor was controlled in a velocity-based fashion, the average duration and trajectory length of hit trials provided additional measures of the efficiency of BCI control. The mean duration and trajectory length of hit trials in each run were averaged over all runs of a session for each subject, and the grand average over subjects was examined for each BCI control condition. Overall, BCI behavioral performance was evaluated in terms of PVC, ACC, feedback duration, and trajectory length; together, these measures give a more comprehensive picture of subjects&#x2019; BCI control than a single ACC value.</p>
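<p>The two accuracy metrics defined above reduce to simple ratios; a minimal sketch with a hypothetical run (the trial counts are illustrative, not the study&#x2019;s data):</p>

```python
def pvc(hits, misses):
    """Percent valid correct: hits over valid (hit + miss) trials;
    abort trials are excluded, as defined above."""
    return hits / (hits + misses)

def acc(hits, total_trials):
    """Accuracy: hits over all trials in a run, abort trials included."""
    return hits / total_trials

# Hypothetical run of 30 trials: 18 hits, 6 misses, 6 aborts
run_pvc = pvc(18, 6)    # 18 / 24 = 0.75
run_acc = acc(18, 30)   # 18 / 30 = 0.60
```

<p>Note that PVC is always at least as large as ACC, since its denominator omits abort trials.</p>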
</sec>
<sec id="S2.SS4">
<title>Calculation of Response Time of Gaze Shift and Reorientation</title>
<p>Besides the BCI behavioral performance, the behavior of gaze shift and reorientation was also measured. The response time of an individual&#x2019;s gaze shift and reorientation was obtained with the following approach. At the beginning of each trial, a target cue appeared and lasted two and a half seconds. The gaze trajectory during the target cue period was captured and analyzed for each trial and each subject. However, the gaze trajectory could be noisy due to small gaze movements. We therefore thresholded the horizontal gaze positions, rounding the values toward three categorical numbers (0, left side; 0.5, center cross; 1, right side) using the following equation (1).</p>
<disp-formula id="S2.E1"><label>(1)</label><mml:math id="M1"><mml:mrow><mml:mrow><mml:mi>Gaze</mml:mi><mml:mi mathvariant="normal">_</mml:mi><mml:mpadded width="+3.3pt"><mml:mi>Xadjusted</mml:mi></mml:mpadded></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:mrow><mml:mn>0.5</mml:mn><mml:mo>&#x002A;</mml:mo><mml:mi>round</mml:mi></mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi mathvariant="normal">X</mml:mi><mml:mo>-</mml:mo><mml:mn>0.5</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x002A;</mml:mo><mml:mn>3</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mn>0.5</mml:mn></mml:mrow></mml:mrow></mml:math></disp-formula>
<p>X is the raw horizontal gaze position; its value ranges from 0 to 1, representing the position as a proportion of the screen width. Gaze_Xadjusted is the projected value after thresholding. A response time could then be obtained at the rising or falling edge of the adjusted curve. Note that we used the time of the last rising or falling edge as the response time when the gaze shift and reorientation were correct: in some trials subjects might shift their gaze toward the incorrect target at first, realize the error, and shift back to the correct target. To capture the response time of such trials, we used the last rising or falling edge. The response time was invalid if the gaze shift and reorientation were wrong throughout the trial. The schematic of the offline data analysis is illustrated in <xref ref-type="fig" rid="F3">Figure 3B</xref>.</p>
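<p>Equation (1) and the last-edge rule can be implemented directly. The sketch below is an offline illustration in Python; the clip guarding gaze samples at the exact screen edges is an added assumption not spelled out in the text.</p>

```python
import numpy as np

EYE_FS = 60  # eye-tracker sampling rate (Hz)

def adjust_gaze(x):
    """Equation (1): quantize horizontal gaze position X (0..1, as a
    proportion of screen width) to 0 (left), 0.5 (center) or 1 (right).
    Clipping guards the exact screen edges (an added assumption)."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.clip(np.round((x - 0.5) * 3), -1, 1) + 0.5

def response_time(x, target, fs=EYE_FS):
    """Time (s) of the last rising/falling edge after which the adjusted
    gaze stays on the target; None if the trial ends off-target (invalid)."""
    adj = adjust_gaze(x)
    if adj[-1] != target:
        return None
    off_target = np.nonzero(adj != target)[0]
    last_edge = off_target[-1] + 1 if off_target.size else 0
    return last_edge / fs

# Gaze holds center, briefly errs left, then settles on the right target
trace = [0.5] * 30 + [0.1] * 6 + [0.9] * 24  # 1 s of samples at 60 Hz
rt = response_time(trace, target=1.0)        # last edge at sample 36
```

<p>The example trace reproduces the error-correction case described above: the last edge, not the first, defines the response time.</p>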
</sec>
<sec id="S2.SS5">
<title>Calculation of R-square Value and Event-Related De-synchronization/Synchronization</title>
<p>R-square (<italic>r</italic><sup>2</sup>) value, commonly used in BCI studies (<xref ref-type="bibr" rid="B42">Wolpaw and McFarland, 2004</xref>; <xref ref-type="bibr" rid="B31">Ramos-Murguialday et al., 2012</xref>; <xref ref-type="bibr" rid="B26">Nakanishi et al., 2017</xref>), was used to quantify the strength of each electrode to discriminate the left vs. right-hand imagination task. The r-square value was calculated at each electrode by the following equation (2):</p>
<disp-formula id="S2.E2"><label>(2)</label><mml:math id="M2"><mml:mrow><mml:mpadded width="+3.3pt"><mml:msup><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mpadded><mml:mo rspace="10.8pt">=</mml:mo><mml:mfrac><mml:mrow><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>v</mml:mi><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mi>y</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:mi>v</mml:mi><mml:mi>a</mml:mi><mml:mi>r</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mi>x</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>v</mml:mi><mml:mi>a</mml:mi><mml:mi>r</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mi>y</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mrow></mml:math></disp-formula>
<p>The r-square value is the squared correlation coefficient for a single bivariate distribution computed from two sets of univariate data. Variable x is the measurement under condition one or condition two, and y is the value assigned to the corresponding condition, e.g., y = +1 for condition one and <italic>y</italic> = &#x2212;1 for condition two. Here the measurement x is the power spectrum estimated for a particular channel at a specified frequency. Condition one corresponded to the right-hand imagination task, and condition two to the left-hand imagination task.</p>
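<p>Equation (2) can be computed directly as the squared Pearson correlation between a channel&#x2019;s band-power samples and the condition labels. The sketch below uses toy power values (not the study&#x2019;s data) to show the two extremes of discriminability.</p>

```python
import numpy as np

def r_square(power_right, power_left):
    """Equation (2): squared correlation between a channel's band power x
    and the assigned label y (+1 right-hand MI, -1 left-hand MI)."""
    x = np.concatenate([power_right, power_left])
    y = np.concatenate([np.ones(len(power_right)), -np.ones(len(power_left))])
    return float(np.corrcoef(x, y)[0, 1] ** 2)

# A channel whose power separates the two tasks well scores near 1;
# one with identical power distributions scores near 0.
discriminative = r_square([2.0, 2.1, 1.9, 2.2], [0.9, 1.0, 1.1, 0.8])
uninformative = r_square([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

<p>Applying this per channel over the mu band yields the r-square topographies described below.</p>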
<p>A topography of r-square values was then generated to show how strongly the electrodes correlate with the discrimination of the imagination tasks. In the offline analysis, the r-square values were calculated from all trials, including hit, miss, and abort trials, in the mu-rhythm frequency band used for online control. The r-square values were calculated for each subject and each session; the grand-average r-square values and their topography were then illustrated to show the strength of discrimination for each of the three BCI control conditions. Event-related desynchronization/synchronization (ERD/ERS) was then calculated to characterize the dynamic change of band power relative to a baseline period in the electrodes used for online control (<xref ref-type="bibr" rid="B15">Graimann et al., 2002</xref>).</p>
<p>Additionally, the r-square values of each subject were contrasted between the different control conditions, i.e., congruent vs. incongruent trials, congruent vs. center cross trials, and incongruent vs. center cross trials. Based on the contrasted r-square values, brain electrodes/regions that differentiated the control conditions could be identified. We then calculated the ERD/ERS dynamic processes in the electrodes that emerged from this contrast analysis. ERD has been suggested as a signature of task-relevant active cortical activity, whereas ERS represents an inactive or inhibited cortical network (<xref ref-type="bibr" rid="B29">Pfurtscheller and Neuper, 2006</xref>). Thus, these metrics might be correlated with the task-relevant or inhibited cortical network to a certain degree. For each subject, the last 1.5 s of the inter-trial interval was selected as the baseline period. The time course of ERD/ERS was calculated from the beginning of the inter-trial interval to 3 s after the feedback began; 3 s was chosen since the average feedback duration was a little above 3 s. All trials in a session were used to calculate ERD/ERS, and a grand average over subjects was obtained for each BCI control condition, i.e., congruent, incongruent, and center cross trials.</p>
<p>Besides the commonly used ERD/ERS, the task-independent ERD/ERS lateralization index (<xref ref-type="bibr" rid="B37">Van Ede et al., 2011</xref>; <xref ref-type="bibr" rid="B33">Shu et al., 2018</xref>), which measures the average difference between contralateral and ipsilateral ERD for each task, was calculated as well. In general, all of the electrodes identified in the contrast analysis of the r-square values clustered in either the left or the right hemisphere. For each motor imagination task, the contralateral ERD of the identified electrodes in the corresponding hemisphere was first averaged spatially, and the corresponding average ipsilateral ERD was then subtracted from it. Last, the values for the two motor imagination tasks were averaged, yielding the task-independent ERD/ERS lateralization index. This metric has been shown to be sensitive in detecting neurophysiological change in both healthy populations and patients (<xref ref-type="bibr" rid="B37">Van Ede et al., 2011</xref>; <xref ref-type="bibr" rid="B33">Shu et al., 2018</xref>). Finally, the task-independent ERD/ERS lateralization index was contrasted between the different BCI control conditions.</p>
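<p>The index described above can be sketched as follows. The dictionary layout, key names, and toy ERD values are illustrative assumptions, not the study&#x2019;s data or code.</p>

```python
import numpy as np

def lateralization_index(erd_left_hemi, erd_right_hemi):
    """Task-independent ERD/ERS lateralization index: for each MI task,
    contralateral minus ipsilateral ERD (each already averaged over the
    identified electrodes of its hemisphere), then averaged over tasks.
    Keys 'right_mi'/'left_mi' are illustrative names."""
    # Right-hand MI: the left hemisphere is contralateral
    right_task = erd_left_hemi["right_mi"] - erd_right_hemi["right_mi"]
    # Left-hand MI: the right hemisphere is contralateral
    left_task = erd_right_hemi["left_mi"] - erd_left_hemi["left_mi"]
    return (right_task + left_task) / 2.0

# Toy ERD values (%) at one time point: stronger contralateral ERD
# (more negative) yields a negative lateralization index.
left_hemi = {"right_mi": np.array([-30.0]), "left_mi": np.array([-5.0])}
right_hemi = {"right_mi": np.array([-10.0]), "left_mi": np.array([-25.0])}
index = lateralization_index(left_hemi, right_hemi)
```

<p>Because both tasks enter symmetrically, the index does not depend on which hand was imagined, which is why it is called task-independent.</p>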
</sec>
<sec id="S2.SS6">
<title>Statistical Analysis</title>
<p>Statistical analysis was performed in Matlab R2019a and RStudio. The experiment adopted a randomized complete block design (<xref ref-type="bibr" rid="B24">Montgomery, 2017</xref>), with subjects as the block factor. The number of trials per condition was kept the same, i.e., 10 out of 30 trials per condition, and the order of the conditions was kept comparable by the randomization procedure. Therefore, within each subject, the treatment conditions could be compared under relatively homogeneous conditions, and factors such as fatigue or familiarity with the task would not affect any one condition differently.</p>
<p>Given the limited number of subjects, the nonparametric Kruskal-Wallis test was used to evaluate the effect of the three BCI control conditions on BCI behavioral performance, with the dependent variables of PVC, ACC, feedback duration, and trajectory length. If the Kruskal-Wallis test was significant, a <italic>post-hoc</italic> analysis would be performed to determine which groups differed from one another. Similarly, Wilcoxon&#x2019;s signed-rank test was used to compare the response time of gaze shift and reorientation between congruent and incongruent trials (the response time in center cross trials was always zero if the subjects responded correctly, so it was not considered).</p>
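<p>The two nonparametric tests can be sketched with SciPy as follows, on hypothetical per-subject values (not the study&#x2019;s data); the simulated response times deliberately make incongruent trials slower to illustrate a paired effect.</p>

```python
import numpy as np
from scipy.stats import kruskal, wilcoxon

rng = np.random.default_rng(7)
N_SUBJECTS = 14

# Hypothetical per-subject PVC values for the three control conditions
pvc_congruent = rng.uniform(0.60, 0.90, N_SUBJECTS)
pvc_incongruent = rng.uniform(0.60, 0.90, N_SUBJECTS)
pvc_center = rng.uniform(0.60, 0.90, N_SUBJECTS)
h_stat, p_kw = kruskal(pvc_congruent, pvc_incongruent, pvc_center)

# Paired comparison of gaze-shift response times (congruent vs. incongruent),
# simulated so that incongruent trials are consistently slower
rt_congruent = rng.uniform(0.2, 0.5, N_SUBJECTS)
rt_incongruent = rt_congruent + rng.uniform(0.01, 0.2, N_SUBJECTS)
w_stat, p_wilcoxon = wilcoxon(rt_congruent, rt_incongruent)
```

<p>Because every simulated subject is slower in the incongruent condition, the signed-rank test yields a very small p-value; with real data the outcome depends, of course, on the measurements.</p>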
<p>An electrode-wise analysis of r-square values was performed and compared among the different BCI control conditions. We performed a 3 (BCI control conditions) &#x00D7; 63 (channels) repeated measures ANOVA to determine whether the r-square values differed significantly across BCI control conditions and channels. If the 3 &#x00D7; 63 repeated measures ANOVA was significant in the main factors or the interaction, a <italic>post-hoc</italic> analysis would be performed to determine which levels of the main factors differed significantly. Specifically, if the main factor of BCI control conditions was significant, paired <italic>t</italic>-tests with Bonferroni correction would be performed. On the other hand, if the main factor of channels was significant, paired <italic>t</italic>-tests with false discovery rate (FDR) correction would be performed, since we had 63 channels to compare. If the interaction between the main factors was significant, it might indicate that the r-square values of the channels differed from each other depending on the BCI control conditions. Therefore, contrast analyses of r-square values between paired control conditions would be performed, resulting in three paired comparisons, with FDR correction for multiple comparisons.</p>
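The channel-wise <italic>post-hoc</italic> step can be illustrated as follows: paired t-tests across simulated subjects for each of 63 channels, followed by a hand-rolled Benjamini-Hochberg FDR step. The data are synthetic, and the "channel 10" effect is an arbitrary injected difference used only to exercise the procedure.

```python
import numpy as np
from scipy import stats

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR: boolean mask of rejected hypotheses."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        last = np.max(np.nonzero(below)[0])   # largest rank passing BH
        reject[order[: last + 1]] = True       # reject all smaller p-values
    return reject

# Simulated r-square-like values: 14 subjects x 63 channels for two
# paired conditions, with one channel carrying a reliable difference.
rng = np.random.default_rng(1)
cond_a = rng.normal(0.0, 1.0, (14, 63))
cond_b = cond_a + rng.normal(0.0, 0.3, (14, 63))
cond_b[:, 10] += 1.0                           # injected effect
t_vals, p_vals = stats.ttest_rel(cond_a, cond_b, axis=0)
significant = fdr_bh(p_vals, alpha=0.05)
```

FDR control is much less conservative than Bonferroni for 63 simultaneous channel tests, which is why it is the natural choice here.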
<p>Based on the analysis of the r-square values, brain areas having a significant correlation with the motor imagination tasks could be identified. Then, a brain region-wise analysis of the lateralization process could be performed. Specifically, an N (brain regions identified in the analysis of r-square values) &#x00D7; 3 (BCI control conditions) &#x00D7; 15 (time points uniformly distributed from 0 to 7.5 s) repeated measures ANOVA would be performed to determine whether the lateralization processes differed significantly across brain regions and BCI control conditions. Similarly, if any main factor was significant, paired <italic>t</italic>-tests with Bonferroni correction would be performed. Otherwise, if the interaction between the main factors was significant, it might indicate that the lateralization processes differed in each brain region depending on the BCI control conditions. Therefore, contrast analyses of the lateralization process between paired BCI control conditions in each brain area would be performed, resulting in three paired comparisons. A cluster-based permutation test was used to identify significant clusters of time periods in the lateralization process between paired BCI control conditions.</p>
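A one-dimensional cluster-based permutation test of the kind named above can be sketched as follows. This is a generic sign-flip implementation under assumed defaults (cluster-forming threshold, number of permutations, cluster mass as the statistic); the study's exact settings are not specified here, and the example data are synthetic.

```python
import numpy as np

def cluster_permutation_test(a, b, t_thresh=2.0, n_perm=500, seed=0):
    """Paired 1-D cluster-based permutation test (sign-flip null).

    a, b: (n_subjects, n_timepoints) arrays of a measure such as the
    lateralization index. Returns the largest observed cluster mass
    and its permutation p-value.
    """
    rng = np.random.default_rng(seed)
    diff = a - b
    n = diff.shape[0]

    def max_cluster_mass(d):
        # Pointwise paired t-values, then sum contiguous supra-threshold runs.
        t = d.mean(axis=0) / (d.std(axis=0, ddof=1) / np.sqrt(n))
        best = run = 0.0
        for v in np.abs(t):
            run = run + v if v > t_thresh else 0.0
            best = max(best, run)
        return best

    observed = max_cluster_mass(diff)
    null = np.empty(n_perm)
    for i in range(n_perm):
        flips = rng.choice([-1.0, 1.0], size=(n, 1))  # flip whole subjects
        null[i] = max_cluster_mass(diff * flips)
    return observed, (np.sum(null >= observed) + 1) / (n_perm + 1)

# Synthetic example: 14 subjects x 15 time points, with an effect
# injected into the last 7 time points of condition b.
rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, (14, 15))
b = a + rng.normal(0.0, 0.5, (14, 15))
b[:, 8:] += 1.5
obs_mass, p_cluster = cluster_permutation_test(a, b)
```

Because the maximum cluster mass over the whole time axis is the permutation statistic, this test controls the family-wise error rate over time points without a pointwise correction.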
</sec>
</sec>
<sec id="S3" sec-type="results">
<title>Results</title>
<sec id="S3.SS1">
<title>Brain-Computer Interface Behavioral Performance</title>
<p>Percent valid correct and ACC defined in the methods were calculated separately for the three BCI conditions, i.e., congruent, incongruent, and center cross trials. The individual and group average violin plots were shown in <xref ref-type="fig" rid="F4">Figures 4A,B</xref>. Additionally, the PVC and ACC with respect to the left-side (stimulus-Left) and right-side (stimulus-Right) targets were investigated as well; the corresponding results were illustrated in <xref ref-type="fig" rid="F4">Figure 4</xref>. The average PVCs &#x00B1; standard error of the mean (SEM) for congruent trials, incongruent trials, and center cross trials were 80.59 &#x00B1; 5.13, 87.44 &#x00B1; 3.51, and 87.48 &#x00B1; 3.61%, respectively. The average ACCs &#x00B1; SEM for congruent trials, incongruent trials, and center cross trials were 49.35 &#x00B1; 7.37, 53.09 &#x00B1; 6.10, and 51.24 &#x00B1; 6.99%, respectively. The Kruskal-Wallis test was performed between paired control conditions, resulting in three paired comparisons (center cross vs. congruent, center cross vs. incongruent, and congruent vs. incongruent). Bonferroni&#x2019;s <italic>post-hoc</italic> test was used to correct for multiple comparisons. The statistical analysis revealed no significant difference in behavioral performance between the control conditions. Furthermore, left-target and right-target trials were analyzed separately to see whether there was any difference between them. The average PVCs &#x00B1; SEM for left-target and right-target trials were 85.18 &#x00B1; 3.89 and 83.86 &#x00B1; 3.65%, respectively. The average ACCs &#x00B1; SEM for left-target and right-target trials were 52.08 &#x00B1; 6.93 and 50.48 &#x00B1; 6.00%, respectively. There was no significant difference between left-target and right-target trials in PVC or ACC.</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption><p>BCI behavioral performances of separate control conditions. Group average results of <bold>(A)</bold> PVC, <bold>(B)</bold> ACC, <bold>(C)</bold> feedback duration of single trials, and <bold>(D)</bold> trajectory length of single trials. The <italic>x</italic>-axis labels for all four panels are the same, representing the categorization of trials in terms of the motor imagery task (left vs. right) and the congruency between the motor imagery task and the gaze fixation task. Violin plots: shaded areas represent the kernel density estimate of the data, white circles represent the median, and gray bars represent the interquartile range.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g004.tif"/>
</fig>
<p>Individual and group average violin plots of hit trials&#x2019; duration and trajectory length were shown in <xref ref-type="fig" rid="F4">Figures 4C,D</xref>. The group average durations of the three BCI control conditions were 3.21 &#x00B1; 0.19, 3.56 &#x00B1; 0.15, and 3.59 &#x00B1; 0.13 s, respectively. The group average trajectory lengths of the three BCI control conditions were 0.40 &#x00B1; 0.02, 0.45 &#x00B1; 0.02, and 0.44 &#x00B1; 0.02 units of the screen width, respectively. The statistical analysis revealed no significant difference in behavioral performance between the control conditions. The average durations &#x00B1; SEM for left-target and right-target trials were 3.39 &#x00B1; 0.16 and 3.43 &#x00B1; 0.13 s, respectively. The average trajectory lengths &#x00B1; SEM for left-target and right-target trials were 0.43 &#x00B1; 0.02 and 0.43 &#x00B1; 0.01 units of the screen width, respectively. There was no significant difference between left-target and right-target trials in feedback duration or trajectory length.</p>
</sec>
<sec id="S3.SS2">
<title>Behavioral Performance of Gaze Shift and Reorientation</title>
<p>Subjects&#x2019; behavioral performance of gaze orientation and fixation was analyzed and displayed in <xref ref-type="fig" rid="F5">Figure 5</xref>. Overall, the group average accuracy was 95.1 &#x00B1; 2.4%, which means the majority of subjects could complete the gaze orientation and fixation task with very high accuracy. The chance level of gaze orientation and fixation was 33.33%. Each subject&#x2019;s behavioral performance of gaze orientation and fixation in individual runs was shown in the violin plot of <xref ref-type="fig" rid="F5">Figure 5B</xref>. The horizontal and vertical positions of a particular subject&#x2019;s gaze endpoint during the feedback period were shown in the scatter plot of <xref ref-type="fig" rid="F5">Figure 5C</xref>. An orange line represents a cursor&#x2019;s trajectory in the feedback period. Representative examples of a congruent trial, an incongruent trial, and a center cross trial were depicted in the left, middle, and right columns of <xref ref-type="fig" rid="F5">Figure 5C</xref>. The subject had 100% accuracy of gaze shift and reorientation for this particular run.</p>
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption><p>Gaze shift and fixation performance. <bold>(A)</bold> Individual accuracy &#x00B1; standard error of gaze shift and fixation. The red line represents a mean value of fourteen subjects. <bold>(B)</bold> Violin plot of all of subjects&#x2019; performance in terms of individual runs. <bold>(C)</bold> An example of gaze fixation for a particular subject in a randomly selected run. Blue dots represent gaze fixation points, orange lines demonstrate the trajectories of a cursor&#x2019;s movement. Left column: incongruent trials, middle column: center cross trials, right column: congruent trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g005.tif"/>
</fig>
<p>The individual&#x2019;s average response time of gaze shift and reorientation was shown in <xref ref-type="fig" rid="F6">Figure 6A</xref>, and the grand average was shown in <xref ref-type="fig" rid="F6">Figure 6B</xref>. Because the gaze was positioned at the center cross at the beginning of each trial, only the response times of the congruent and incongruent conditions were available. A red line in <xref ref-type="fig" rid="F6">Figure 6A</xref> means the response time of an incongruent trial was longer than the response time of a congruent trial for that subject, while a blue line means the opposite. Eleven of the fourteen subjects showed an increase in response time for the incongruent trials. The average response times of congruent and incongruent trials were 0.57 &#x00B1; 0.09 and 0.65 &#x00B1; 0.09 s, respectively. Wilcoxon&#x2019;s signed-rank test showed a significant difference between the response times of congruent and incongruent trials. This difference indicated that the majority of subjects took longer to shift and reorient their gaze to the correct target in the incongruent trials, although there was variation among subjects.</p>
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption><p>Response time of the gaze shift and fixation task. <bold>(A)</bold> The average response time of congruent trials vs. incongruent trials for each individual; this distribution was used for Wilcoxon&#x2019;s signed-rank test. Red lines indicate that the response time of incongruent trials was longer than that of congruent trials; blue lines show that the opposite holds. <bold>(B)</bold> Box plot of the group average response time of congruent trials vs. incongruent trials (&#x002A; means <italic>p</italic> &#x003C; 0.05). <bold>(C)</bold> An example of thresholding the gaze shift and fixation for a particular subject in a randomly selected run. Green lines represent the gaze fixation trajectory in the pre-feedback period; blue lines demonstrate the rectified trajectories after thresholding; red lines indicate the correct value for an indicated gaze fixation cue. Left column: incongruent trials, middle column: center cross trials, right column: congruent trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g006.tif"/>
</fig>
</sec>
<sec id="S3.SS3">
<title>Brain-Computer Interface R-square Value Analysis in Three Conditions</title>
<p>First, the electrode-wise r-square values were calculated and compared among the different BCI control conditions. As planned in the statistical analysis, a 3 (BCI control conditions) &#x00D7; 63 (channels) repeated measures ANOVA was conducted to determine whether the r-square values differed significantly across channels and BCI control conditions. BCI control conditions did not show a significant effect on r-square values, <italic>F</italic>(2,26) = 2.79, <italic>p</italic> = 0.08, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.02. However, channels had a significant effect on r-square values, <italic>F</italic>(62, 806) = 3.84, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.15. Furthermore, the interaction between BCI control conditions and channels was significant, <italic>F</italic>(124,1612) = 3.55, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.07, which indicated that channels located in different brain regions might have a significant effect on r-square values, but that this effect depended on the BCI control condition. Because we did not find a significant difference for the main factor of BCI control conditions, all of the trials were pooled together to get a channel-frequency map of R-square values regardless of condition; the result was plotted in <xref ref-type="fig" rid="F7">Figure 7A</xref>. This map indicated that channels C3, C4, and CP6, in the high alpha frequency band (10&#x2013;14 Hz), were the most important for discriminating the left and right motor imagery tasks. The topography of R-square values shown in <xref ref-type="fig" rid="F7">Figure 7B</xref> also showed that channels C3, C4, and CP6 had higher R-square values. This was consistent with the prior knowledge of hand-related motor imagery tasks.</p>
<fig id="F7" position="float">
<label>FIGURE 7</label>
<caption><p>R-square value topography maps and statistics. <bold>(A)</bold> Channel-frequency map of group average R-square values over fourteen subjects. <bold>(B)</bold> Topography of the pooled group average R-square values over fourteen subjects. <bold>(C)</bold> Topographies of group average R-square values over fourteen subjects with respect to congruent trials, incongruent trials, and center cross trials. The color bar for all three topographies of this row is the same in order to allow a fair comparison. <bold>(D)</bold> Contrast analysis of R-square values between congruent trials and incongruent trials, between congruent trials and center cross trials, and between incongruent trials and center cross trials. The color bar for all three <italic>p</italic>-value maps is the same (<sup>&#x002A;</sup> means <italic>p</italic> &#x003C; 0.05, <sup>&#x002A;&#x002A;</sup> means <italic>p</italic> &#x003C; 0.01).</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g007.tif"/>
</fig>
<p>Since a significant interaction effect between BCI control conditions and channels was found, we first visually explored the group average R-square topography for each BCI control condition. Accordingly, grand average topographies of R-square values were calculated and plotted for congruent trials, incongruent trials, and center cross trials, respectively, in <xref ref-type="fig" rid="F7">Figure 7C</xref>. The topography of R-square values for the center cross condition, the typical condition in conventional experiments, showed a concentration of activities around channels C3, Cp5, C4, and Cp4. This concentration around the sensorimotor area was consistent with many previous studies (<xref ref-type="bibr" rid="B42">Wolpaw and McFarland, 2004</xref>; <xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>). The topographies of R-square values for the congruent and incongruent trials, however, showed a shift toward the posterior parietal area.</p>
<p>Further contrast analysis was performed between each pair of conditions because the overall ANOVA indicated that the impact of channels depended on the BCI control condition. The contrast analyses of R-square values between each pair of conditions were shown in <xref ref-type="fig" rid="F7">Figure 7D</xref>. Since we had 63 channels in total, Bonferroni correction would be too conservative to find any significant channels, so FDR correction was used for multiple comparisons. The permutation test with FDR correction showed significant differences in R-square values in the frontal area, including channels F1, Fz, F2, Fc1, Fcz, Fc2, C1, and Cz, when comparing the congruent trials with the incongruent trials. In addition, significant differences in R-square values appeared in the parietal occipital area, including P7, P5, P3, P1, Po7, Po3, O1, Po4, Po8, O2, O10, and Po10, when comparing the congruent trials with the center cross trials. The contrast analysis of the incongruent trials vs. the center cross trials highlighted both the frontal area and the parietal occipital area, including the frontal channels F1, Fz, and Fc2, and the parietal occipital channels Tp7, P7, P5, P3, Po7, Po3, Po4, P8, Po8, O2, O10, and Po10.</p>
</sec>
<sec id="S3.SS4">
<title>Event-Related De-synchronization/Synchronization Modulation Analysis in Three Conditions</title>
<p>According to the pooled group average topographies of R-square values, channels C3 and C4 played significant roles in discriminating the left- and right-hand tasks. Thus, the time course of high alpha band ERD/ERS values over channels C3 and C4 was calculated and illustrated in <xref ref-type="fig" rid="F8">Figure 8</xref> for the left and right targets (upper and lower rows) and for congruent trials, incongruent trials, and center cross trials (left, middle, and right columns), respectively. Channel C3 exhibited significant ERS during the feedback period of the left target&#x2019;s cursor control, which was marked by a gray horizontal bar, regardless of control conditions. The small blue square indicated the starting time point of the feedback period. On the other hand, channel C4 showed less clear ERD or ERS relative to the baseline for the left target&#x2019;s cursor control. From the figure, we could see that a separation of ERD/ERS activities between channels C3 and C4 occurred a few hundred milliseconds after the target cue appeared. However, channels C3 and C4 both showed inseparable ERS during the feedback period of the right target&#x2019;s cursor control.</p>
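The ERD/ERS time courses plotted in Figure 8 can be computed with the standard percent-power-change convention (negative = ERD, positive = ERS). The sketch below assumes that convention; the baseline window in the toy check is illustrative, not the study's actual reference interval.

```python
import numpy as np

def erd_ers_percent(band_power, baseline):
    """ERD/ERS as percent band power change relative to a baseline window.

    band_power: (n_trials, n_times) array of alpha-band power;
    baseline: slice of time samples used as the reference interval.
    """
    mean_power = band_power.mean(axis=0)   # average over trials
    ref = mean_power[baseline].mean()      # baseline power level
    return (mean_power - ref) / ref * 100.0

# Toy check: band power drops from 2.0 to 1.0 after sample 5,
# i.e., a 50% ERD relative to the first five samples.
power = np.full((20, 10), 2.0)
power[:, 5:] = 1.0
curve = erd_ers_percent(power, slice(0, 5))
```

Averaging power over trials before normalizing (rather than normalizing single trials) keeps the estimate stable when single-trial power is noisy.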
<fig id="F8" position="float">
<label>FIGURE 8</label>
<caption><p>ERD/ERS values and statistics. <bold>(A)</bold> The group average of time-varying ERD/ERS values for the left-hand task (the upper row) and <bold>(B)</bold> the right-hand task (the bottom row) over fourteen subjects. The columns separate the results for congruent trials, incongruent trials, and center cross trials, respectively. The target cue appeared at the end of the 2nd second, which was indicated by a short black bar. The cursor feedback began at second 4.5, which was marked by a short blue bar. The light gray bar represents the feedback period.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g008.tif"/>
</fig>
<p>According to the contrast analysis of R-square values between paired BCI control conditions, the frontal and parietal occipital areas may play roles in the current experimental design. Therefore, the task-independent ERD/ERS lateralization index was calculated for the frontal, sensorimotor, and parietal occipital regions separately. We performed a 3 (BCI control conditions) &#x00D7; 15 (time points uniformly distributed from 0 to 7.5 s) &#x00D7; 3 (brain regions identified in the analysis of r-square values) repeated measures ANOVA as planned in the statistical analysis. The main factor of BCI control conditions did not show a significant effect on the alpha ERD/ERS lateralization indexes, <italic>F</italic>(2,26) = 3.19, <italic>p</italic> = 0.06, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.03. Similarly, brain regions did not show a significant effect on the alpha ERD/ERS lateralization indexes, <italic>F</italic>(2,26) = 0.95, <italic>p</italic> = 0.40, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.01. However, time did show a significant effect on the alpha ERD/ERS lateralization indexes, <italic>F</italic>(14,182) = 6.29, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.03. Furthermore, the interaction between BCI control conditions and time [<italic>F</italic>(28,364) = 3.92, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.06], the interaction between BCI control conditions and brain regions [<italic>F</italic>(4,52) = 5.78, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.06], and the interaction among BCI control conditions, time, and brain regions [<italic>F</italic>(56,728) = 2.89, <italic>p</italic> &#x003C; 0.001, &#x03B7;<sub><italic>ges</italic></sub><sup>2</sup> = 0.06] were all significant. This indicated that the alpha ERD/ERS lateralization process might differ significantly across time periods, with the difference depending on the BCI control conditions and brain regions.</p>
<p>Following the statistical analysis results, first, the alpha ERD/ERS lateralization index, i.e., the difference in the average ERD/ERS activities over channels F1, Fz, F2, Fc1, Fcz, Fc2, C1, and Cz between right-hand trials and left-hand trials, was calculated and plotted over the frontal area in <xref ref-type="fig" rid="F9">Figure 9</xref>. The cluster-based permutation test identified one significant cluster (<italic>p</italic> &#x003C; 0.05), extending more than 1 s after the feedback began, when comparing the congruent trials with the center cross trials. Second, the alpha ERD/ERS lateralization index over the sensorimotor cortex was calculated and illustrated in <xref ref-type="fig" rid="F10">Figure 10</xref>. It measured the average difference between contralateral and ipsilateral ERD/ERS activities, and was therefore not affected by the task type (left-hand or right-hand trials). The contrast of the average ERD/ERS activities over the left cortex (Fc3, C5, C3, C1, and Cp3) between right-hand trials and left-hand trials, and the contrast of the average ERD/ERS activities over the right cortex (Fc4, C6, C4, C2, and Cp4) between left-hand trials and right-hand trials, were averaged to obtain a single value of the alpha ERD/ERS lateralization index. The cluster-based permutation test did not identify any significant cluster in any paired comparison.</p>
<fig id="F9" position="float">
<label>FIGURE 9</label>
<caption><p>Group average alpha ERD/ERS task difference across time in the frontal area. Shaded areas represent the standard error of the alpha ERD/ERS task difference. The target cue appeared at the end of the 2nd second, which was indicated by a short black bar. The cursor feedback began at second 4.5, which was marked by a short blue bar. The light gray bar represents the feedback period. The dark gray bar demonstrates the period of significant difference. Contrast analysis of the alpha ERD/ERS task difference process in the frontal area between <bold>(A)</bold> congruent trials and incongruent trials, <bold>(B)</bold> congruent trials and center cross trials, and <bold>(C)</bold> incongruent trials and center cross trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g009.tif"/>
</fig>
<fig id="F10" position="float">
<label>FIGURE 10</label>
<caption><p>Group average alpha lateralization index across time in the sensorimotor area. Shaded areas represent the standard error of the lateralization index. The target cue appeared at the end of the 2nd second, which was indicated by a short black bar. The cursor feedback began at second 4.5, which was marked by a short blue bar. The light gray bar represents the feedback period. Contrast analysis of the lateralization process in the sensorimotor area between <bold>(A)</bold> congruent trials and incongruent trials, <bold>(B)</bold> congruent trials and center cross trials, and <bold>(C)</bold> incongruent trials and center cross trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g010.tif"/>
</fig>
<p>Finally, the alpha ERD/ERS lateralization index over the parietal occipital cortex was calculated and illustrated in <xref ref-type="fig" rid="F11">Figure 11</xref>. The contrast of the average ERD/ERS activities over the left parietal occipital cortex (P5, P3, Po3, and Po7, the most significant channels in the contrast analyses of r-square values) between right-hand trials and left-hand trials, and the contrast of the average ERD/ERS activities over the right parietal occipital cortex (Po4, Po8, O2, and Po10, the most significant channels in the contrast analyses of r-square values) between left-hand trials and right-hand trials, were averaged to obtain a single value of the alpha ERD/ERS lateralization index. The cluster-based permutation test identified two significant clusters: one for the comparison between congruent trials and incongruent trials, and the other for the comparison between congruent trials and center cross trials. In both comparisons, the substantial separation in the lateralization index began about 1 s before the feedback cursor appeared and lasted for the entire feedback period we analyzed.</p>
<fig id="F11" position="float">
<label>FIGURE 11</label>
<caption><p>Group average alpha lateralization index across time in the parietal occipital area. Shaded areas represent the standard error of the lateralization index. The target cue appeared at the end of the 2nd second, which was indicated by a short black bar. The cursor feedback began at second 4.5, which was marked by a short blue bar. The light gray bar represents the feedback period. The dark gray bar demonstrates the period of significant difference. Contrast analysis of the lateralization process in the parietal occipital area between <bold>(A)</bold> congruent trials and incongruent trials, <bold>(B)</bold> congruent trials and center cross trials, and <bold>(C)</bold> incongruent trials and center cross trials.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g011.tif"/>
</fig>
</sec>
<sec id="S3.SS5">
<title>Correlation Analysis of Lateralization Index in Three Brain Areas</title>
<p>The correlations between the lateralization indexes over the frontal, sensorimotor, and parietal occipital areas and PVC were analyzed separately. The results were shown in <xref ref-type="fig" rid="F12">Figure 12</xref>. Although significant clusters were found between different conditions in the frontal and parietal occipital regions, no significant correlation was found between the lateralization indexes over these two areas and PVC. However, a significant linear correlation between the lateralization index over the sensorimotor area and PVC was found (<italic>p</italic> &#x003C; 0.01): a more negative lateralization index corresponded to a higher online classification accuracy in terms of PVC.</p>
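A per-subject linear correlation analysis of this kind can be sketched as follows. The values are synthetic draws constructed so that a more negative lateralization index goes with a higher PVC; they illustrate the analysis, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic per-subject sensorimotor lateralization indexes and PVCs (%).
lat_index = rng.normal(-10.0, 5.0, 14)
pvc = 80.0 - 0.8 * lat_index + rng.normal(0.0, 2.0, 14)

r, p = stats.pearsonr(lat_index, pvc)             # linear correlation
slope, intercept = np.polyfit(lat_index, pvc, 1)  # regression line
```

With the built-in negative relationship, the Pearson correlation and the fitted slope both come out negative, mirroring the direction of the effect reported for the sensorimotor area.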
<fig id="F12" position="float">
<label>FIGURE 12</label>
<caption><p>Linear regression analyses between lateralization indexes and PVCs. <bold>(A)</bold> Frontal area. <bold>(B)</bold> Sensorimotor area. <bold>(C)</bold> Parietal occipital area. A significant linear correlation was only shown between lateralization indexes and PVCs in sensorimotor area.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-15-773603-g012.tif"/>
</fig>
</sec>
</sec>
<sec id="S4" sec-type="discussion">
<title>Discussion</title>
<p>Previously, gaze orientations were not controlled rigorously; subjects might have used different strategies to set up their gaze orientations in the experiment (<xref ref-type="bibr" rid="B43">Wolpaw et al., 2002</xref>; <xref ref-type="bibr" rid="B42">Wolpaw and McFarland, 2004</xref>; <xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>; <xref ref-type="bibr" rid="B16">He et al., 2020</xref>). We set out to investigate whether the BCI behavioral performances varied under different gaze orientations. Thus, three conditions of gaze orientation and fixation were designed in the current study. Upon the appearance of the target cue, subjects were required to perform the motor imagery and gaze reorientation tasks simultaneously. Their behavioral gaze shift and reorientation performance showed a high completion rate and accuracy (see <xref ref-type="fig" rid="F5">Figure 5</xref>). This high completion rate meant that subjects could follow the instructions successfully. It excluded the trivial possibility that the non-significant difference between the BCI control conditions was due to an inability to complete the dual task under any BCI control condition.</p>
<p>Furthermore, the BCI behavioral performances did not show any significant difference in terms of PVC, ACC, time duration of control, or trajectory length. This might indicate that gaze orientation or position does not significantly affect BCI behavioral performance, regardless of whether it is congruent with the target position. Thus, the results were consistent with the convention that a motor imagery-based BCI is a gaze-independent system.</p>
<p>However, the response time of gaze shift and reorientation for congruent trials was significantly shorter than for incongruent trials (see <xref ref-type="fig" rid="F6">Figure 6</xref>). This response time result was reasonable, since the subjects might need a longer time to process the inconsistent information and make a correct judgment to follow the instruction. This was also corroborated by the contrast analyses between congruent trials and incongruent trials and between incongruent trials and center cross trials. The significant difference in alpha R-square values over the frontal areas indicated that frontal regions might play a functional role in processing conflicting information in the incongruent trials (<xref ref-type="bibr" rid="B13">Ehlis et al., 2005</xref>; <xref ref-type="bibr" rid="B9">Cohen and Ridderinkhof, 2013</xref>). Due to the experimental design, the subjects were required to fixate their gaze at the center of the cross; thus, the response time of gaze shift and reorientation was not available for the center cross trials.</p>
<p>Note that we explored the possibility of performing a motor imagery (MI) task and an SSVEP task simultaneously in our previous work (<xref ref-type="bibr" rid="B11">Edelman et al., 2018</xref>). The results of that study showed that subjects could perform two typical BCI tasks without significantly interfering with each other. The direct evidence was that the BCI performance of either single modality (MI or SSVEP) in multi-tasking (MI+SSVEP) did not decrease compared to its corresponding performance in a single task (MI only or SSVEP only), which required significantly less mental effort. That work demonstrated a particular case of combining a motor imagery BCI with another SSVEP BCI modality. A direct hypothesis following the previous experiment was that gaze fixation might be a secondary task that did not interfere with the motor imagery task. Because most visual stimulus-dependent BCI paradigms rely on gaze fixation on the attended target, the results of the current study build a foundation for expanding the multi-tasking modality.</p>
<p>The previous work in <xref ref-type="bibr" rid="B11">Edelman et al. (2018)</xref> found a significant difference between the congruent trials and incongruent trials (see <xref ref-type="fig" rid="F6">Figure 6C</xref> in <xref ref-type="bibr" rid="B11">Edelman et al., 2018</xref>). However, the experimental results in this study did not show statistical significance among the different conditions; that is, the results did not support the hypothesis that there was a significant difference between the congruent trials and incongruent trials. The reason might be that the 1D task in this study was considerably easier than the 2D task in the previous study, so there were few conflicts for visual resources between the gaze shift and reorientation task and the motor imagery task.</p>
<p>The ERD/ERS process over the sensorimotor area was consistent with previous studies (<xref ref-type="bibr" rid="B23">Meng et al., 2016</xref>; <xref ref-type="bibr" rid="B16">He et al., 2020</xref>). In particular, the ERS was pronounced in channel C3, and ERD/ERS was almost at the baseline level in channel C4 for the left-target trials. However, ERS was substantial in both channels C3 and C4 for the right-target trials, and their difference was inseparable (see <xref ref-type="fig" rid="F8">Figure 8</xref>). This ERD/ERS result was largely consistent with our previous studies (<xref ref-type="bibr" rid="B21">Meng and He, 2019</xref>). The analysis of the ERD/ERS lateralization index showed a similar negative level regardless of the control conditions (see <xref ref-type="fig" rid="F10">Figure 10</xref>). This demonstrated that contralateral alpha activities were smaller than ipsilateral alpha activities over the sensorimotor area when motor imagination was used as the control strategy.</p>
<p>The contrast analyses between congruent and center cross trials, and between incongruent and center cross trials, showed that the posterior parietal occipital cortex was significantly activated. This activation of the parietal occipital region could be interpreted as follows: when subjects maintained their gaze at the indicated position, they also had to covertly attend to the movement of the cursor so that it could successfully hit the correct target. Many studies have shown that covert attention modulates the alpha rhythm of the posterior parietal occipital cortex (<xref ref-type="bibr" rid="B44">Worden et al., 2000</xref>; <xref ref-type="bibr" rid="B17">Kelly et al., 2006</xref>; <xref ref-type="bibr" rid="B35">Thut et al., 2006</xref>). Attention to either location or motion direction activated the parietal and occipital areas (<xref ref-type="bibr" rid="B10">Corbetta and Shulman, 2002</xref>). One of our previous studies also showed that covert attention modulation could serve as a supplementary control dimension to complete 3D cursor control when combined with motor imagery (<xref ref-type="bibr" rid="B22">Meng et al., 2018</xref>).</p>
<p>Furthermore, the analysis of ERD/ERS lateralization over the posterior parietal occipital area showed a pronounced positive level for congruent trials and a pronounced negative level for incongruent trials. The contrast analysis between congruent and incongruent trials revealed a significant period starting around 1 s before the feedback cue appeared and lasting for the entire analyzed feedback period. This significant period indicated that a strong lateralization effect occurred in both congruent and incongruent trials. Additionally, the early lateralization before the cue appeared might suggest that covert attention was deployed before the feedback cursor appeared. The increase of the lateralization index after the cursor appeared suggests the following hypothesis: covert attention to moving objects might generate stronger lateralization than covert attention to static or nonspecific objects. However, we do not have experimental data to support this speculation.</p>
<p>Moreover, the lateralization directions of congruent and incongruent trials were opposite, which was a consequence of the calculation method. In the incongruent trials, subjects fixated their gaze at the position opposite to the indicated target; thus, the cursor moved away from the gaze center if the motor imagination controlled the cursor correctly. Since the direction of covert attention was congruent with the target, the lateralization index should be negative due to contralateral ERD and ipsilateral ERS. On the contrary, in the congruent trials subjects maintained their gaze at the center of the indicated target; thus, the cursor moved toward the gaze center if the motor imagination controlled the cursor correctly. The lateralization index was positive due to contralateral ERD and ipsilateral ERS relative to the direction of covert attention, since that direction was incongruent with the target (i.e., for a left target, covert attention was on the right side).</p>
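The sign flips described above follow directly from a normalized hemispheric power difference. The paper's exact lateralization-index formula is not reproduced in this excerpt; the sketch below assumes a common normalized-difference definition over alpha-band power, chosen so that contralateral ERD with ipsilateral ERS yields a negative index, matching the sign convention in the text.

```python
import numpy as np

def lateralization_index(p_contra, p_ipsi):
    """Normalized hemispheric difference of alpha-band power.

    Assumed definition (not taken from the study):
        LI = (P_contra - P_ipsi) / (P_contra + P_ipsi),
    with 'contra'/'ipsi' taken relative to the attended side.
    Contralateral ERD (a power drop) with ipsilateral ERS then gives
    LI < 0; the opposite pattern gives LI > 0.
    """
    p_contra = np.asarray(p_contra, dtype=float)
    p_ipsi = np.asarray(p_ipsi, dtype=float)
    return (p_contra - p_ipsi) / (p_contra + p_ipsi)
```

Under this convention, flipping which hemisphere counts as "contralateral" (as happens when covert attention points away from the target) flips the sign of the index without any change in the underlying ERD/ERS pattern.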
<p>The correlation analysis of the lateralization index with PVC from different brain areas showed a significant linear correlation only between the sensorimotor area and the PVC. Although the parietal occipital area was significantly activated during control, it had little effect on the accuracy of cursor control. This lack of correlation was reasonable, since the cursor was supposed to be controlled by motor imagination, which should activate the sensorimotor area. Thus, the neural activities in the parietal occipital area did not influence the BCI performance, even though covert attention strongly activated that area.</p>
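The area-wise analysis described here amounts to a Pearson correlation between each subject's lateralization index and that subject's PVC. A minimal sketch follows; the function name and the assumption that one aggregated index value per subject is used are illustrative, not drawn from the paper's methods.

```python
import numpy as np
from scipy.stats import pearsonr

def li_pvc_correlation(li_per_subject, pvc_per_subject):
    """Linear correlation between subjects' lateralization index (for one
    brain area) and their percent valid correct (PVC).

    li_per_subject  : one aggregated LI value per subject (assumed input)
    pvc_per_subject : PVC (in %) per subject
    Returns Pearson's r and its two-sided p-value.
    """
    li = np.asarray(li_per_subject, dtype=float)
    pvc = np.asarray(pvc_per_subject, dtype=float)
    r, p = pearsonr(li, pvc)
    return r, p
```

Running this once per brain area (sensorimotor, parietal occipital) and comparing the p-values reproduces the kind of contrast reported above: a significant r for the sensorimotor area and a non-significant one elsewhere would indicate that only sensorimotor lateralization tracked behavioral performance.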
<sec id="S4.SS1">
<title>Limitations and Future Work</title>
<p>In this study, the subject group included only one female and a few subjects with previous BCI experience. Previous studies showed that gender (<xref ref-type="bibr" rid="B6">Cantillo-Negrete et al., 2014</xref>) and subjects&#x2019; experience (<xref ref-type="bibr" rid="B30">Pillette et al., 2021</xref>) might influence the study results. First, because gender and experience were both between-subject factors, they would not significantly affect the within-subject factors under investigation. Second, only a few subjects had been exposed to a BCI paradigm before, and all subjects were na&#x00EF;ve to the dual tasks before participating in this study. Additionally, to reduce the potential influence of familiarity with the dual tasks, all subjects were given a practice run to familiarize themselves with them. Last, we used a randomized block design in our paradigm to further alleviate the potential influence of familiarity and fatigue, since the sequence of the BCI control tasks might affect BCI performance. For these reasons, and considering the limited number of subjects, the experienced subjects were not excluded from this study.</p>
<p>In real life, a user would more likely control a robotic arm or wheelchair to perform 2D tasks. However, a 2D experiment would be much more complex than the 1D experiment of this study. First, the numbers of congruent and incongruent trials would differ considerably, requiring more trials per session, which would be challenging for the subjects. Second, there might be differences among the incongruent trials themselves, making the comparison among conditions less homogeneous than in the current study. Thus, we did not conduct a 2D experiment in this study, but it would be an interesting exploration for future work.</p>
</sec>
</sec>
<sec id="S5" sec-type="conclusion">
<title>Conclusion</title>
<p>In this study, motor imagination of the left hand vs. the right hand demonstrated comparable BCI behavioral performance under three control conditions of gaze shift and fixation. A group of fourteen subjects achieved a PVC of over 80%. Further analyses of individuals&#x2019; response times reveal that subjects responded faster in congruent trials than in incongruent trials. During feedback control, covert attention to the cursor&#x2019;s movement induces lateralized alpha activities over the parietal occipital area. This lateralization displays a significant deviation from the baseline level when comparing the congruent trials to the incongruent trials, and when comparing the congruent trials to the center cross trials. The lateralization starts about 1 s before the feedback begins, indicating that covert attention is paid to the cursor before it moves. Nevertheless, neither gaze shift and fixation on different positions nor covert attention to the cursor&#x2019;s movement affects the BCI behavioral performance. These independent brain activities might be advantageous for real BCI applications such as controlling a robotic arm or a wheelchair.</p>
</sec>
<sec id="S6" sec-type="data-availability">
<title>Data Availability Statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors under a material transfer agreement with Shanghai Jiao Tong University, without undue reservation.</p>
</sec>
<sec id="S7">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by The Institutional Review Board of Shanghai Jiao Tong University. The patients/participants provided their written informed consent to participate in this study.</p>
</sec>
<sec id="S8">
<title>Author Contributions</title>
<p>JM wrote the first draft of the manuscript. ZW revised the draft of the manuscript. JM, ZW, SL, and XZ edited the manuscript. JM and XZ conceived and designed the experimental paradigm and wrote the manuscript. JM, ZW, and SL performed the research and analyzed the data. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The handling editor declared a past collaboration with one of the authors, JM.</p>
</sec>
<sec id="pudiscl1" sec-type="disclaimer">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<sec id="S9" sec-type="funding-information">
<title>Funding</title>
<p>This work is supported in part by the National Key R&#x0026;D Program of China (Grant No. 2020YFC207800), Shanghai Pujiang Program (Grant No. 20PJ1408000), the National Natural Science Foundation of China (Grant No. 52175023), and State Key Laboratory of Mechanical System and Vibration (SJTU, Grant No. MSVZD202013).</p>
</sec>
<ack>
<p>The authors would like to thank the volunteers who took part in the experiments.</p>
</ack>
<sec id="S11" sec-type="supplementary-material">
<title>Supplementary Material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/fnhum.2021.773603/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/fnhum.2021.773603/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Video_1.MP4" id="VS1" mimetype="video/mp4" xmlns:xlink="http://www.w3.org/1999/xlink">
<label>Supplementary Video 1</label>
<caption><p>The video demonstrates the first ten trials of a particular subject in a run. The first trial was an abort trial (center cross condition), followed by a miss trial in the second (incongruent condition). The subject then hit the target in every trial from trial three to trial six (trial three was in the congruent condition). Trials seven and eight were both miss trials, and trials nine and ten were both hit trials.</p></caption>
</supplementary-material>
</sec>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ang</surname> <given-names>K. K.</given-names></name> <name><surname>Guan</surname> <given-names>C.</given-names></name> <name><surname>Phua</surname> <given-names>K. S.</given-names></name> <name><surname>Wang</surname> <given-names>C.</given-names></name> <name><surname>Zhou</surname> <given-names>L.</given-names></name> <name><surname>Tang</surname> <given-names>K. Y.</given-names></name><etal/></person-group> (<year>2014</year>). <article-title>Brain-computer interface-based robotic end effector system for wrist and hand rehabilitation: results of a three-armed randomized controlled trial for chronic stroke.</article-title> <source><italic>Front. Neuroeng.</italic></source> <volume>7</volume>:<issue>30</issue>. <pub-id pub-id-type="doi">10.3389/fneng.2014.00030</pub-id> <pub-id pub-id-type="pmid">25120465</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Biasiucci</surname> <given-names>A.</given-names></name> <name><surname>Leeb</surname> <given-names>R.</given-names></name> <name><surname>Iturrate</surname> <given-names>I.</given-names></name> <name><surname>Perdikis</surname> <given-names>S.</given-names></name> <name><surname>Al-Khodairy</surname> <given-names>A.</given-names></name> <name><surname>Corbet</surname> <given-names>T.</given-names></name><etal/></person-group> (<year>2018</year>). <article-title>Brain-actuated functional electrical stimulation elicits lasting arm motor recovery after stroke.</article-title> <source><italic>Nat. Commun.</italic></source> <volume>9</volume>:<issue>2421</issue>. <pub-id pub-id-type="doi">10.1038/s41467-018-04673-z</pub-id> <pub-id pub-id-type="pmid">29925890</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bouton</surname> <given-names>C. E.</given-names></name> <name><surname>Shaikhouni</surname> <given-names>A.</given-names></name> <name><surname>Annetta</surname> <given-names>N. V.</given-names></name> <name><surname>Bockbrader</surname> <given-names>M. A.</given-names></name> <name><surname>Friedenberg</surname> <given-names>D. A.</given-names></name> <name><surname>Nielson</surname> <given-names>D. M.</given-names></name><etal/></person-group> (<year>2016</year>). <article-title>Restoring cortical control of functional movement in a human with quadriplegia.</article-title> <source><italic>Nature</italic></source> <volume>533</volume> <fpage>247</fpage>&#x2013;<lpage>250</lpage>. <pub-id pub-id-type="doi">10.1038/nature17435</pub-id> <pub-id pub-id-type="pmid">27074513</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brandl</surname> <given-names>S.</given-names></name> <name><surname>Fr&#x00F8;lich</surname> <given-names>L.</given-names></name> <name><surname>H&#x00F6;hne</surname> <given-names>J.</given-names></name> <name><surname>M&#x00FC;ller</surname> <given-names>K. R.</given-names></name> <name><surname>Samek</surname> <given-names>W.</given-names></name></person-group> (<year>2016</year>). <article-title>Brain&#x2013;computer interfacing under distraction: an evaluation study.</article-title> <source><italic>J. Neural Eng.</italic></source> <volume>13</volume>:<issue>056012</issue>. <pub-id pub-id-type="doi">10.1088/1741-2560/13/5/056012</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brunner</surname> <given-names>C.</given-names></name> <name><surname>Allison</surname> <given-names>B. Z.</given-names></name> <name><surname>Altst&#x00E4;tter</surname> <given-names>C.</given-names></name> <name><surname>Neuper</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>A comparison of three brain&#x2013;computer interfaces based on event-related de-synchronization, steady state visual evoked potentials, or a hybrid approach using both signals.</article-title> <source><italic>J. Neural Eng.</italic></source> <volume>8</volume>:<issue>025010</issue>. <pub-id pub-id-type="doi">10.1088/1741-2560/8/2/025010</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cantillo-Negrete</surname> <given-names>J.</given-names></name> <name><surname>Gutierrez-Martinez</surname> <given-names>J.</given-names></name> <name><surname>Carino-Escobar</surname> <given-names>R. I.</given-names></name> <name><surname>Carrillo-Mora</surname> <given-names>P.</given-names></name> <name><surname>Elias-Vinas</surname> <given-names>D.</given-names></name></person-group> (<year>2014</year>). <article-title>An approach to improve the performance of subject-independent BCIs-based on motor imagery allocating subjects by gender.</article-title> <source><italic>Biomed. Eng. Online</italic></source> <volume>13</volume>:<issue>158</issue>. <pub-id pub-id-type="doi">10.1186/1475-925X-13-158</pub-id> <pub-id pub-id-type="pmid">25476924</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chaudhary</surname> <given-names>U.</given-names></name> <name><surname>Xia</surname> <given-names>B.</given-names></name> <name><surname>Silvoni</surname> <given-names>S.</given-names></name> <name><surname>Cohen</surname> <given-names>L. G.</given-names></name> <name><surname>Birbaumer</surname> <given-names>N.</given-names></name></person-group> (<year>2017</year>). <article-title>Brain&#x2013;computer interface&#x2013;based communication in the completely locked-in state.</article-title> <source><italic>PLoS Biol.</italic></source> <volume>15</volume>:<issue>e1002593</issue>. <pub-id pub-id-type="doi">10.1371/journal.pbio.1002593</pub-id> <pub-id pub-id-type="pmid">28141803</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cheng</surname> <given-names>S.</given-names></name> <name><surname>Wang</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>L.</given-names></name> <name><surname>Wei</surname> <given-names>Q.</given-names></name></person-group> (<year>2020</year>). <article-title>Motion imagery-BCI based on EEG and eye movement data fusion.</article-title> <source><italic>IEEE Trans. Neural Syst. Rehabil. Eng.</italic></source> <volume>28</volume> <fpage>2783</fpage>&#x2013;<lpage>2793</lpage>. <pub-id pub-id-type="doi">10.1109/TNSRE.2020.3048422</pub-id> <pub-id pub-id-type="pmid">33382658</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cohen</surname> <given-names>M. X.</given-names></name> <name><surname>Ridderinkhof</surname> <given-names>K. R.</given-names></name></person-group> (<year>2013</year>). <article-title>EEG source reconstruction reveals frontal-parietal dynamics of spatial conflict processing.</article-title> <source><italic>PLoS One</italic></source> <volume>8</volume>:<issue>e57293</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0057293</pub-id> <pub-id pub-id-type="pmid">23451201</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Corbetta</surname> <given-names>M.</given-names></name> <name><surname>Shulman</surname> <given-names>G. L.</given-names></name></person-group> (<year>2002</year>). <article-title>Control of goal-directed and stimulus-driven attention in the brain.</article-title> <source><italic>Nat. Rev. Neurosci.</italic></source> <volume>3</volume> <fpage>201</fpage>&#x2013;<lpage>215</lpage>. <pub-id pub-id-type="doi">10.1038/nrn755</pub-id> <pub-id pub-id-type="pmid">11994752</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Edelman</surname> <given-names>B. J.</given-names></name> <name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Gulachek</surname> <given-names>N.</given-names></name> <name><surname>Cline</surname> <given-names>C. C.</given-names></name> <name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<year>2018</year>). <article-title>Exploring cognitive flexibility with a noninvasive BCI using simultaneous steady-state visual evoked potentials and sensorimotor rhythms.</article-title> <source><italic>IEEE Trans. Neural Syst. Rehabil. Eng.</italic></source> <volume>26</volume> <fpage>936</fpage>&#x2013;<lpage>947</lpage>. <pub-id pub-id-type="doi">10.1109/TNSRE.2018.2817924</pub-id> <pub-id pub-id-type="pmid">29752228</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Edelman</surname> <given-names>B. J.</given-names></name> <name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Suma</surname> <given-names>D.</given-names></name> <name><surname>Zurn</surname> <given-names>C.</given-names></name> <name><surname>Nagarajan</surname> <given-names>E.</given-names></name> <name><surname>Baxter</surname> <given-names>B. S.</given-names></name><etal/></person-group> (<year>2019</year>). <article-title>Noninvasive neuroimaging enhances continuous neural tracking for robotic device control.</article-title> <source><italic>Sci. Robot.</italic></source> <volume>4</volume>:<issue>eaaw6844</issue>. <pub-id pub-id-type="doi">10.1126/scirobotics.aaw6844</pub-id> <pub-id pub-id-type="pmid">31656937</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ehlis</surname> <given-names>A. C.</given-names></name> <name><surname>Herrmann</surname> <given-names>M. J.</given-names></name> <name><surname>Wagener</surname> <given-names>A.</given-names></name> <name><surname>Fallgatter</surname> <given-names>A. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Multi-channel near-infrared spectroscopy detects specific inferior-frontal activation during incongruent Stroop trials.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>69</volume> <fpage>315</fpage>&#x2013;<lpage>331</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2004.09.003</pub-id> <pub-id pub-id-type="pmid">15925033</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frisoli</surname> <given-names>A.</given-names></name> <name><surname>Loconsole</surname> <given-names>C.</given-names></name> <name><surname>Leonardis</surname> <given-names>D.</given-names></name> <name><surname>Banno</surname> <given-names>F.</given-names></name> <name><surname>Barsotti</surname> <given-names>M.</given-names></name> <name><surname>Chisari</surname> <given-names>C.</given-names></name><etal/></person-group> (<year>2012</year>). <article-title>A new gaze-BCI-driven control of an upper limb exoskeleton for rehabilitation in real-world tasks.</article-title> <source><italic>IEEE Trans. Syst. Man Cybernet. C Appl. Rev.</italic></source> <volume>42</volume> <fpage>1169</fpage>&#x2013;<lpage>1179</lpage>. <pub-id pub-id-type="doi">10.1109/TSMCC.2012.2226444</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Graimann</surname> <given-names>B.</given-names></name> <name><surname>Huggins</surname> <given-names>J. E.</given-names></name> <name><surname>Levine</surname> <given-names>S. P.</given-names></name> <name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name></person-group> (<year>2002</year>). <article-title>Visualization of significant ERD/ERS patterns in multichannel EEG and ECoG data.</article-title> <source><italic>Clin. Neurophysiol.</italic></source> <volume>113</volume> <fpage>43</fpage>&#x2013;<lpage>47</lpage>. <pub-id pub-id-type="doi">10.1016/S1388-2457(01)00697-6</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>He</surname> <given-names>B.</given-names></name> <name><surname>Yuan</surname> <given-names>H.</given-names></name> <name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Gao</surname> <given-names>S.</given-names></name></person-group> (<year>2020</year>). &#x201C;<article-title>Brain&#x2013;computer interfaces</article-title>,&#x201D; in <source><italic>Neural Engineering</italic></source>, <role>ed.</role> <person-group person-group-type="editor"><name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer</publisher-name>), <fpage>131</fpage>&#x2013;<lpage>183</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-43395-6_4</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kelly</surname> <given-names>S. P.</given-names></name> <name><surname>Lalor</surname> <given-names>E. C.</given-names></name> <name><surname>Reilly</surname> <given-names>R. B.</given-names></name> <name><surname>Foxe</surname> <given-names>J. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Increases in alpha oscillatory power reflect an active retinotopic mechanism for distracter suppression during sustained visuospatial attention.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>95</volume> <fpage>3844</fpage>&#x2013;<lpage>3851</lpage>. <pub-id pub-id-type="doi">10.1152/jn.01234.2005</pub-id> <pub-id pub-id-type="pmid">16571739</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Long</surname> <given-names>J.</given-names></name> <name><surname>Li</surname> <given-names>Y.</given-names></name> <name><surname>Wang</surname> <given-names>H.</given-names></name> <name><surname>Yu</surname> <given-names>T.</given-names></name> <name><surname>Pan</surname> <given-names>J.</given-names></name> <name><surname>Li</surname> <given-names>F.</given-names></name></person-group> (<year>2012</year>). <article-title>A hybrid brain computer interface to control the direction and speed of a simulated or real wheelchair.</article-title> <source><italic>IEEE Trans. Neural Syst. Rehabil. Eng.</italic></source> <volume>20</volume> <fpage>720</fpage>&#x2013;<lpage>729</lpage>. <pub-id pub-id-type="doi">10.1109/TNSRE.2012.2197221</pub-id> <pub-id pub-id-type="pmid">22692936</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McFarland</surname> <given-names>D. J.</given-names></name> <name><surname>Wolpaw</surname> <given-names>J. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Sensorimotor rhythm-based brain&#x2013;computer interface (BCI): model order selection for autoregressive spectral analysis.</article-title> <source><italic>J. Neural Eng.</italic></source> <volume>5</volume>:<issue>155</issue>. <pub-id pub-id-type="doi">10.1088/1741-2560/5/2/006</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McFarland</surname> <given-names>D. J.</given-names></name> <name><surname>McCane</surname> <given-names>L. M.</given-names></name> <name><surname>David</surname> <given-names>S. V.</given-names></name> <name><surname>Wolpaw</surname> <given-names>J. R.</given-names></name></person-group> (<year>1997</year>). <article-title>Spatial filter selection for EEG-based communication.</article-title> <source><italic>Electroencephalogr. Clin. Neurophysiol.</italic></source> <volume>103</volume> <fpage>386</fpage>&#x2013;<lpage>394</lpage>. <pub-id pub-id-type="doi">10.1016/S0013-4694(97)00022-2</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<year>2019</year>). <article-title>Exploring training effect in 42 human subjects using a noninvasive sensorimotor rhythm based online BCI.</article-title> <source><italic>Front. Hum. Neurosci.</italic></source> <volume>13</volume>:<issue>128</issue>. <pub-id pub-id-type="doi">10.3389/fnhum.2019.00128</pub-id> <pub-id pub-id-type="pmid">31057380</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Streitz</surname> <given-names>T.</given-names></name> <name><surname>Gulachek</surname> <given-names>N.</given-names></name> <name><surname>Suma</surname> <given-names>D.</given-names></name> <name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<year>2018</year>). <article-title>Three-dimensional brain&#x2013;computer interface control through simultaneous overt spatial attentional and motor imagery tasks.</article-title> <source><italic>IEEE Trans. Biomed. Eng.</italic></source> <volume>65</volume> <fpage>2417</fpage>&#x2013;<lpage>2427</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2018.2872855</pub-id> <pub-id pub-id-type="pmid">30281428</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>S.</given-names></name> <name><surname>Bekyo</surname> <given-names>A.</given-names></name> <name><surname>Olsoe</surname> <given-names>J.</given-names></name> <name><surname>Baxter</surname> <given-names>B.</given-names></name> <name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<year>2016</year>). <article-title>Noninvasive electroencephalogram based control of a robotic arm for reach and grasp tasks.</article-title> <source><italic>Sci. Rep.</italic></source> <volume>6</volume>:<issue>38565</issue>. <pub-id pub-id-type="doi">10.1038/srep38565</pub-id> <pub-id pub-id-type="pmid">27966546</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Montgomery</surname> <given-names>D. C.</given-names></name></person-group> (<year>2017</year>). <source><italic>Design and Analysis of Experiments.</italic></source> <publisher-loc>Hoboken, NJ</publisher-loc>: <publisher-name>John Wiley &#x0026; Sons</publisher-name>.</citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moses</surname> <given-names>D. A.</given-names></name> <name><surname>Metzger</surname> <given-names>S. L.</given-names></name> <name><surname>Liu</surname> <given-names>J. R.</given-names></name> <name><surname>Anumanchipalli</surname> <given-names>G. K.</given-names></name> <name><surname>Makin</surname> <given-names>J. G.</given-names></name> <name><surname>Sun</surname> <given-names>P. F.</given-names></name><etal/></person-group> (<year>2021</year>). <article-title>Neuroprosthesis for decoding speech in a paralyzed person with anarthria.</article-title> <source><italic>N. Engl. J. Med.</italic></source> <volume>385</volume> <fpage>217</fpage>&#x2013;<lpage>227</lpage>. <pub-id pub-id-type="doi">10.1056/NEJMoa2027540</pub-id> <pub-id pub-id-type="pmid">34260835</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakanishi</surname> <given-names>M.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Wang</surname> <given-names>Y. T.</given-names></name> <name><surname>Gao</surname> <given-names>X.</given-names></name> <name><surname>Jung</surname> <given-names>T. P.</given-names></name></person-group> (<year>2017</year>). <article-title>Enhancing detection of SSVEPs for a high-speed brain speller using task-related component analysis.</article-title> <source><italic>IEEE Trans. Biomed. Eng.</italic></source> <volume>65</volume> <fpage>104</fpage>&#x2013;<lpage>112</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2017.2694818</pub-id> <pub-id pub-id-type="pmid">28436836</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Neuper</surname> <given-names>C.</given-names></name> <name><surname>Scherer</surname> <given-names>R.</given-names></name> <name><surname>Reiner</surname> <given-names>M.</given-names></name> <name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name></person-group> (<year>2005</year>). <article-title>Imagery of motor actions: differential effects of kinesthetic and visual&#x2013;motor mode of imagery in single-trial EEG.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>25</volume> <fpage>668</fpage>&#x2013;<lpage>677</lpage>. <pub-id pub-id-type="doi">10.1016/j.cogbrainres.2005.08.014</pub-id> <pub-id pub-id-type="pmid">16236487</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name> <name><surname>Neuper</surname> <given-names>C.</given-names></name></person-group> (<year>2001</year>). <article-title>Motor imagery and direct brain-computer communication.</article-title> <source><italic>Proc. IEEE</italic></source> <volume>89</volume> <fpage>1123</fpage>&#x2013;<lpage>1134</lpage>. <pub-id pub-id-type="doi">10.1109/5.939829</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name> <name><surname>Neuper</surname> <given-names>C.</given-names></name></person-group> (<year>2006</year>). <article-title>Future prospects of ERD/ERS in the context of brain&#x2013;computer interface (BCI) developments.</article-title> <source><italic>Prog. Brain Res.</italic></source> <volume>159</volume> <fpage>433</fpage>&#x2013;<lpage>437</lpage>. <pub-id pub-id-type="doi">10.1016/S0079-6123(06)59028-4</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pillette</surname> <given-names>L.</given-names></name> <name><surname>Roc</surname> <given-names>A.</given-names></name> <name><surname>N&#x2019;kaoua</surname> <given-names>B.</given-names></name> <name><surname>Lotte</surname> <given-names>F.</given-names></name></person-group> (<year>2021</year>). <article-title>Experimenters&#x2019; influence on mental-imagery based brain-computer interface user training.</article-title> <source><italic>Int. J. Hum. Comput. Stud.</italic></source> <volume>149</volume>:<issue>102603</issue>. <pub-id pub-id-type="doi">10.1016/j.ijhcs.2021.102603</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ramos-Murguialday</surname> <given-names>A.</given-names></name> <name><surname>Sch&#x00FC;rholz</surname> <given-names>M.</given-names></name> <name><surname>Caggiano</surname> <given-names>V.</given-names></name> <name><surname>Wildgruber</surname> <given-names>M.</given-names></name> <name><surname>Caria</surname> <given-names>A.</given-names></name> <name><surname>Hammer</surname> <given-names>E. M.</given-names></name><etal/></person-group> (<year>2012</year>). <article-title>Proprioceptive feedback and brain computer interface (BCI) based neuroprostheses.</article-title> <source><italic>PLoS One</italic></source> <volume>7</volume>:<issue>e47048</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0047048</pub-id> <pub-id pub-id-type="pmid">23071707</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schalk</surname> <given-names>G.</given-names></name> <name><surname>McFarland</surname> <given-names>D. J.</given-names></name> <name><surname>Hinterberger</surname> <given-names>T.</given-names></name> <name><surname>Birbaumer</surname> <given-names>N.</given-names></name> <name><surname>Wolpaw</surname> <given-names>J. R.</given-names></name></person-group> (<year>2004</year>). <article-title>BCI2000: a general-purpose brain-computer interface (BCI) system.</article-title> <source><italic>IEEE Trans. Biomed. Eng.</italic></source> <volume>51</volume> <fpage>1034</fpage>&#x2013;<lpage>1043</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2004.827072</pub-id> <pub-id pub-id-type="pmid">15188875</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shu</surname> <given-names>X.</given-names></name> <name><surname>Chen</surname> <given-names>S.</given-names></name> <name><surname>Yao</surname> <given-names>L.</given-names></name> <name><surname>Sheng</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>D.</given-names></name> <name><surname>Jiang</surname> <given-names>N.</given-names></name><etal/></person-group> (<year>2018</year>). <article-title>Fast recognition of BCI-inefficient users using physiological features from EEG signals: a screening study of stroke patients.</article-title> <source><italic>Front. Neurosci.</italic></source> <volume>12</volume>:<issue>93</issue>. <pub-id pub-id-type="doi">10.3389/fnins.2018.00093</pub-id> <pub-id pub-id-type="pmid">29515363</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Soekadar</surname> <given-names>S. R.</given-names></name> <name><surname>Witkowski</surname> <given-names>M.</given-names></name> <name><surname>G&#x00F3;mez</surname> <given-names>C.</given-names></name> <name><surname>Opisso</surname> <given-names>E.</given-names></name> <name><surname>Medina</surname> <given-names>J.</given-names></name> <name><surname>Cortese</surname> <given-names>M.</given-names></name><etal/></person-group> (<year>2016</year>). <article-title>Hybrid EEG/EOG-based brain/neural hand exoskeleton restores fully independent daily living activities after quadriplegia.</article-title> <source><italic>Sci. Robot.</italic></source> <volume>1</volume>:<issue>eaag3296</issue>. <pub-id pub-id-type="doi">10.1126/scirobotics.aag3296</pub-id> <pub-id pub-id-type="pmid">33157855</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Thut</surname> <given-names>G.</given-names></name> <name><surname>Nietzel</surname> <given-names>A.</given-names></name> <name><surname>Brandt</surname> <given-names>S. A.</given-names></name> <name><surname>Pascual-Leone</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>&#x03B1;-Band electroencephalographic activity over occipital cortex indexes visuospatial attention bias and predicts visual target detection.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>26</volume> <fpage>9494</fpage>&#x2013;<lpage>9502</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0875-06.2006</pub-id> <pub-id pub-id-type="pmid">16971533</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tonin</surname> <given-names>L.</given-names></name> <name><surname>Bauer</surname> <given-names>F. C.</given-names></name> <name><surname>Mill&#x00E1;n</surname> <given-names>J. D. R.</given-names></name></person-group> (<year>2019</year>). <article-title>The role of the control framework for continuous teleoperation of a brain&#x2013;machine interface-driven mobile robot.</article-title> <source><italic>IEEE Trans. Robot.</italic></source> <volume>36</volume> <fpage>78</fpage>&#x2013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1109/TRO.2019.2943072</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van Ede</surname> <given-names>F.</given-names></name> <name><surname>De Lange</surname> <given-names>F.</given-names></name> <name><surname>Jensen</surname> <given-names>O.</given-names></name> <name><surname>Maris</surname> <given-names>E.</given-names></name></person-group> (<year>2011</year>). <article-title>Orienting attention to an upcoming tactile event involves a spatially and temporally specific modulation of sensorimotor alpha-and beta-band oscillations.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>31</volume> <fpage>2016</fpage>&#x2013;<lpage>2024</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.5630-10.2011</pub-id> <pub-id pub-id-type="pmid">21307240</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van Gerven</surname> <given-names>M.</given-names></name> <name><surname>Jensen</surname> <given-names>O.</given-names></name></person-group> (<year>2009</year>). <article-title>Attention modulations of posterior alpha as a control signal for two-dimensional brain&#x2013;computer interfaces.</article-title> <source><italic>J. Neurosci. Methods</italic></source> <volume>179</volume> <fpage>78</fpage>&#x2013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2009.01.016</pub-id> <pub-id pub-id-type="pmid">19428515</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wander</surname> <given-names>J. D.</given-names></name> <name><surname>Blakely</surname> <given-names>T.</given-names></name> <name><surname>Miller</surname> <given-names>K. J.</given-names></name> <name><surname>Weaver</surname> <given-names>K. E.</given-names></name> <name><surname>Johnson</surname> <given-names>L. A.</given-names></name> <name><surname>Olson</surname> <given-names>J. D.</given-names></name><etal/></person-group> (<year>2013</year>). <article-title>Distributed cortical adaptation during learning of a brain&#x2013;computer interface task.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>110</volume> <fpage>10818</fpage>&#x2013;<lpage>10823</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1221127110</pub-id> <pub-id pub-id-type="pmid">23754426</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wei</surname> <given-names>P.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name> <name><surname>Zhou</surname> <given-names>Y.</given-names></name> <name><surname>Wang</surname> <given-names>L.</given-names></name></person-group> (<year>2013</year>). <article-title>Performance of motor imagery brain-computer interface based on anodal transcranial direct current stimulation modulation.</article-title> <source><italic>IEEE Trans. Neural Syst. Rehabil. Eng.</italic></source> <volume>21</volume> <fpage>404</fpage>&#x2013;<lpage>415</lpage>. <pub-id pub-id-type="doi">10.1109/TNSRE.2013.2249111</pub-id> <pub-id pub-id-type="pmid">23475381</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Willett</surname> <given-names>F. R.</given-names></name> <name><surname>Avansino</surname> <given-names>D. T.</given-names></name> <name><surname>Hochberg</surname> <given-names>L. R.</given-names></name> <name><surname>Henderson</surname> <given-names>J. M.</given-names></name> <name><surname>Shenoy</surname> <given-names>K. V.</given-names></name></person-group> (<year>2021</year>). <article-title>High-performance brain-to-text communication via handwriting.</article-title> <source><italic>Nature</italic></source> <volume>593</volume> <fpage>249</fpage>&#x2013;<lpage>254</lpage>. <pub-id pub-id-type="doi">10.1038/s41586-021-03506-2</pub-id> <pub-id pub-id-type="pmid">33981047</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wolpaw</surname> <given-names>J. R.</given-names></name> <name><surname>McFarland</surname> <given-names>D. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>101</volume> <fpage>17849</fpage>&#x2013;<lpage>17854</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.0403504101</pub-id> <pub-id pub-id-type="pmid">15585584</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wolpaw</surname> <given-names>J. R.</given-names></name> <name><surname>Birbaumer</surname> <given-names>N.</given-names></name> <name><surname>McFarland</surname> <given-names>D. J.</given-names></name> <name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name> <name><surname>Vaughan</surname> <given-names>T. M.</given-names></name></person-group> (<year>2002</year>). <article-title>Brain&#x2013;computer interfaces for communication and control.</article-title> <source><italic>Clin. Neurophysiol.</italic></source> <volume>113</volume> <fpage>767</fpage>&#x2013;<lpage>791</lpage>. <pub-id pub-id-type="doi">10.1016/S1388-2457(02)00057-3</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Worden</surname> <given-names>M. S.</given-names></name> <name><surname>Foxe</surname> <given-names>J. J.</given-names></name> <name><surname>Wang</surname> <given-names>N.</given-names></name> <name><surname>Simpson</surname> <given-names>G. V.</given-names></name></person-group> (<year>2000</year>). <article-title>Anticipatory biasing of visuospatial attention indexed by retinotopically specific &#x03B1;-band electroencephalography increases over occipital cortex.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>20</volume>:<issue>RC63</issue>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.20-06-j0002.2000</pub-id> <pub-id pub-id-type="pmid">10704517</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yao</surname> <given-names>L.</given-names></name> <name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>D.</given-names></name> <name><surname>Sheng</surname> <given-names>X.</given-names></name> <name><surname>Zhu</surname> <given-names>X.</given-names></name></person-group> (<year>2013</year>). <article-title>Combining motor imagery with selective sensation toward a hybrid-modality BCI.</article-title> <source><italic>IEEE Trans. Biomed. Eng.</italic></source> <volume>61</volume> <fpage>2304</fpage>&#x2013;<lpage>2312</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2013.2287245</pub-id> <pub-id pub-id-type="pmid">24235291</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Zhou</surname> <given-names>G.</given-names></name> <name><surname>Jin</surname> <given-names>J.</given-names></name> <name><surname>Wang</surname> <given-names>X.</given-names></name> <name><surname>Cichocki</surname> <given-names>A.</given-names></name></person-group> (<year>2015</year>). <article-title>Optimizing spatial patterns with sparse filter bands for motor-imagery based brain&#x2013;computer interface.</article-title> <source><italic>J. Neurosci. Methods</italic></source> <volume>255</volume> <fpage>85</fpage>&#x2013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2015.08.004</pub-id> <pub-id pub-id-type="pmid">26277421</pub-id></citation></ref>
</ref-list>
</back>
</article>
