<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2023.1239071</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Visual motion detection thresholds can be reliably measured during walking and standing</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>DiBianca</surname> <given-names>Stephen</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/979084/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Jeka</surname> <given-names>John</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/549989/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Reimann</surname> <given-names>Hendrik</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/21780/overview"/>
</contrib>
</contrib-group>
<aff><institution>Coordination of Balance and Posture, Kinesiology and Applied Physiology, Biomechanics and Movement Science, University of Delaware</institution>, <addr-line>Newark, DE</addr-line>, <country>United States</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Jana Kimijanov&#x00E1;, Slovak Academy of Sciences (SAS), Slovakia</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Diane Elizabeth Adamo, Wayne State University, United States; Timothy Hullar, United States Department of Veterans Affairs, United States</p></fn>
<corresp id="c001">&#x002A;Correspondence: Stephen DiBianca, <email>deebs@udel.edu</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>09</day>
<month>11</month>
<year>2023</year>
</pub-date>
<pub-date pub-type="collection">
<year>2023</year>
</pub-date>
<volume>17</volume>
<elocation-id>1239071</elocation-id>
<history>
<date date-type="received">
<day>12</day>
<month>06</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>25</day>
<month>10</month>
<year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2023 DiBianca, Jeka and Reimann.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>DiBianca, Jeka and Reimann</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<sec>
<title>Introduction</title>
<p>In upright standing and walking, the motion of the body relative to the environment is estimated from a combination of visual, vestibular, and somatosensory cues. Associations between vestibular or somatosensory impairments and balance problems are well established, but less is known about whether visual motion detection thresholds affect upright balance control. Typically, visual motion threshold values are measured while sitting, with the head fixated to eliminate self-motion. In this study we investigated whether visual motion detection thresholds: (1) can be reliably measured during standing and walking in the presence of natural self-motion; and (2) differ between standing and walking.</p>
</sec>
<sec>
<title>Methods</title>
<p>Twenty-nine subjects stood and walked on a self-paced, instrumented treadmill inside a virtual visual environment projected on a large dome. Participants performed a two-alternative forced choice experiment in which they discriminated between a counterclockwise (&#x201C;left&#x201D;) and clockwise (&#x201C;right&#x201D;) rotation of a visual scene. A 6-down 1-up adaptive staircase algorithm was implemented to change the amplitude of the rotation. A psychometric fit to the participants&#x2019; binary responses provided an estimate for the detection threshold.</p>
</sec>
<sec>
<title>Results</title>
<p>We found strong correlations between the repeated measurements in both the walking (<italic>R</italic> = 0.84, <italic>p</italic> &#x003C; 0.001) and standing (<italic>R</italic> = 0.73, <italic>p</italic> &#x003C; 0.001) conditions, as well as good agreement between the repeated measures, as shown by Bland&#x2013;Altman plots. Average thresholds during walking (mean = 1.04&#x00B0;, SD = 0.43&#x00B0;) were significantly higher than during standing (mean = 0.73&#x00B0;, SD = 0.47&#x00B0;).</p>
</sec>
<sec>
<title>Conclusion</title>
<p>Visual motion detection thresholds can be reliably measured during both walking and standing, and thresholds are higher during walking.</p>
</sec>
</abstract>
<kwd-group>
<kwd>vision</kwd>
<kwd>walking</kwd>
<kwd>standing</kwd>
<kwd>visual motion detection</kwd>
<kwd>psychophysics</kwd>
<kwd>sensory threshold</kwd>
</kwd-group>
<counts>
<fig-count count="5"/>
<table-count count="0"/>
<equation-count count="1"/>
<ref-count count="50"/>
<page-count count="9"/>
<word-count count="6613"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Motor Neuroscience</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec id="S1" sec-type="intro">
<title>Introduction</title>
<p>Vision plays an important role in balance control for standing and walking by providing information about movement relative to the environment via optical flow (<xref ref-type="bibr" rid="B13">Gibson, 1958</xref>). Quantifying the capabilities of the human visual system is challenging, as particular properties such as contrast sensitivity (<xref ref-type="bibr" rid="B34">Owsley, 2003</xref>), depth perception (<xref ref-type="bibr" rid="B45">Walk and Gibson, 1961</xref>; <xref ref-type="bibr" rid="B5">Brenner and Smeets, 2018</xref>), and motion detection (<xref ref-type="bibr" rid="B3">Borst and Egelhaaf, 1989</xref>) may each affect different functional behaviors. The ability to detect self-motion from optical flow is expected to be most relevant for balance control, but visual motion detection thresholds are typically measured while sitting (<xref ref-type="bibr" rid="B47">Warren et al., 1989</xref>; <xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B44">Turano and Wong, 1992</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B12">Freeman et al., 2006</xref>, <xref ref-type="bibr" rid="B11">2008</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>), where self-motion is eliminated and balance is not an issue. Our motivation for this study was to investigate: (1) whether visual motion thresholds can be reliably measured during standing and walking; and (2) whether thresholds differ between these balance tasks when self-motion is not constrained.</p>
<p>When studies investigate the relationship between visual processing and fall risk, they typically assess aspects of visual function such as contrast sensitivity (<xref ref-type="bibr" rid="B29">Lord and Fitzpatrick, 2001</xref>; <xref ref-type="bibr" rid="B50">Wood et al., 2011</xref>), depth perception (<xref ref-type="bibr" rid="B8">Felson et al., 1989</xref>; <xref ref-type="bibr" rid="B28">Lord and Dayhew, 2001</xref>), or size of the visual field (<xref ref-type="bibr" rid="B18">Ivers et al., 1998</xref>; <xref ref-type="bibr" rid="B6">Broman et al., 2004</xref>). Visual acuity is meaningful for maneuvering around an environment and avoiding falls caused by tripping or hitting obstacles (<xref ref-type="bibr" rid="B6">Broman et al., 2004</xref>), as vision provides information about object size, location, and where to place the swing foot safely. Visual acuity relates to central, or focal, vision, which is capable of high spatial resolution and particularly useful for pattern and object recognition (<xref ref-type="bibr" rid="B25">Larson and Loschky, 2009</xref>). While visual acuity mostly concerns central vision, visual motion perception is more related to peripheral vision (<xref ref-type="bibr" rid="B31">Monaco et al., 2007</xref>). The illusion of self-motion in response to visual motion, known as &#x201C;vection,&#x201D; has been shown to be driven primarily by stimuli in the peripheral visual field (<xref ref-type="bibr" rid="B4">Brandt et al., 1973</xref>; <xref ref-type="bibr" rid="B40">Tarita-Nistor et al., 2008</xref>, <xref ref-type="bibr" rid="B41">2014</xref>). 
Optic flow can produce illusions of self-motion, and thus disturb upright balance in both standing (<xref ref-type="bibr" rid="B35">Peterka, 2002</xref>; <xref ref-type="bibr" rid="B23">Kiemel et al., 2006</xref>; <xref ref-type="bibr" rid="B20">Jeka et al., 2010</xref>) and walking (<xref ref-type="bibr" rid="B27">Logan et al., 2010</xref>; <xref ref-type="bibr" rid="B30">McAndrew et al., 2010</xref>; <xref ref-type="bibr" rid="B10">Franz et al., 2015</xref>; <xref ref-type="bibr" rid="B36">Reimann et al., 2018</xref>). To our knowledge, only one study has directly compared measures of visual acuity and motion perception in their relationship to control of upright balance. Analysis of data collected during the Salisbury Eye Evaluation (SEE Project) (<xref ref-type="bibr" rid="B11">Freeman et al., 2008</xref>) showed that, in a model including visual acuity, contrast sensitivity, visual field, and motion detection threshold, motion detection thresholds were associated with more than three times higher odds of failing a single-leg stance balance task, adjusted for age, sex, and race, compared to the other vision measures. A review by <xref ref-type="bibr" rid="B37">Saftari and Kwon (2018)</xref> highlights the general finding that decreased visual acuity is associated with increased risk for falls and hip fractures. Despite these findings, the authors emphasize that visual motion perception has been a critical omission in the fall-risk literature.</p>
<p>Here we tested the reliability of measuring a visual motion detection threshold for optic flow during standing and walking, tasks typically performed when investigating upright balance control. To our knowledge, visual motion detection tests have only been performed while sitting, with the head typically immobilized (<xref ref-type="bibr" rid="B47">Warren et al., 1989</xref>; <xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B44">Turano and Wong, 1992</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B12">Freeman et al., 2006</xref>, <xref ref-type="bibr" rid="B11">2008</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>). Here we measured visual motion detection thresholds during standing and walking, where body sway generates a natural background level of self-motion. A threshold measure characterizes the underlying mechanism of a sensory system (<xref ref-type="bibr" rid="B24">Kingdom and Prins, 2009</xref>). In the case of a visual motion detection test, this threshold quantifies how sensitive the visual system is in detecting movement in the environment. To calculate a threshold value, perceptual responses are recorded after exposing participants to optic flow stimuli with varying directions and maximum amplitudes. In this study, we used a common adaptive psychophysical method, in which the amplitude of the stimulus is increased or decreased depending on the history of responses. Our hypotheses are that (1) visual motion detection thresholds are correlated between repeated measures in both standing and walking, and (2) thresholds in walking differ from those in standing.</p>
</sec>
<sec id="S2" sec-type="materials|methods">
<title>Materials and methods</title>
<p>Twenty-nine healthy participants (14 female, 39 &#x00B1; 15 years old) between the ages of 23 and 67 were recruited for this experiment. Subjects provided informed verbal and written consent to participate. Subjects had no history of neurological disorders or visual diagnoses, and no surgical procedures involving the legs, spine, or head within 6 months of the protocol. Participants had normal or corrected-to-normal vision (glasses/contacts). The experiment was approved by the University of Delaware Institutional Review Board.</p>
<sec id="S2.SS1">
<title>Experimental protocol and setup</title>
<p>Participants stood and walked on a self-paced, tied-belt treadmill (Bertec, Columbus, OH, USA) surrounded by a virtual environment displayed on a large dome that occupied the subjects&#x2019; full visual field (<xref ref-type="fig" rid="F1">Figure 1</xref>). All participants started with a 15-min walking block to familiarize themselves with the environment, walking on the self-paced treadmill, and the two-alternative forced choice (2AFC) task. Ten trials of the 2AFC task were performed at this time. After the familiarization block, participants performed four blocks of the 2AFC task, alternating between standing and walking, with the order counter-balanced between participants: 15 participants walked first and 14 stood first. In the standing blocks, participants stood 2 m away from the center of the curved screen. Six reflective markers were placed on the subjects: two on the temples, two over the occipital condyles, and two on the posterior superior iliac spines. Marker positions were recorded using a Qualisys motion tracking system with 13 cameras at a sampling rate of 200 Hz. For self-paced control of the treadmill, a nonlinear proportional-derivative (PD) controller was implemented in LabVIEW (National Instruments Inc., Austin, TX, USA) to keep the midpoint of the two markers on the posterior superior iliac spines at the midline of the treadmill. With this controller, the treadmill speeds up and slows down with the subject to keep them centered on the treadmill. Visual perspective and position in the virtual world were linked to the midpoint between the two markers placed on the subjects&#x2019; temples and superimposed over the forward motion dictated by the treadmill speed. Subjects wore a safety harness as protection in the event of a fall, although none occurred.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p>Experimental setup depicting a participant walking in front of the virtual reality dome on the self-paced treadmill.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-17-1239071-g001.tif"/>
</fig>
</sec>
<sec id="S2.SS2">
<title>The virtual scene</title>
<p>The experimental setup is displayed in <xref ref-type="fig" rid="F1">Figure 1</xref>, showing a participant walking on the self-paced treadmill in the virtual environment, which was designed and implemented in Unity3d (Unity Technologies, San Francisco, CA, USA). The scene consisted of 1,000 cubes floating in front of a dark background, randomly distributed in a cylindrical tunnel along the anterior-posterior axis at radial distances of 14&#x2013;40 m from the central axis through the treadmill. Each cube was 1 &#x00D7; 1 &#x00D7; 1 m in size. A red sphere, serving as a focal point for participants, was linked to the midpoint between the two markers placed on the temples and positioned 50 m ahead in the virtual environment. Fog was displayed in the distance to obscure the end of the tunnel and create the perception of infinite distance. The anterior-posterior movement of the virtual scene matched the speed of the treadmill.</p>
</sec>
<sec id="S2.SS3">
<title>Two-alternative forced choice task</title>
<p>The 2AFC task presents participants with a rotation of the virtual environment around the anterior-posterior axis of the treadmill, in either a counter-clockwise (left) or clockwise (right) direction. Participants were instructed to use the red dot as a focal point, told that the cubes would rotate either counter-clockwise (&#x201C;left&#x201D;) or clockwise (&#x201C;right&#x201D;) around the red dot, and asked to verbally report the direction of motion as &#x201C;left&#x201D; or &#x201C;right.&#x201D; The stimulus waveform was a single cycle of a raised cosine in velocity with a frequency of 1 Hz; the screen rotated over 1 s by a variable amplitude (in degrees) determined by the adaptive staircase algorithm (see details below). The stimulus was manually triggered by the experimenter at an arbitrary time 1&#x2013;2 s after each response. A tone was played during the stimulus, and participants verbally reported the direction of motion as &#x201C;left&#x201D; or &#x201C;right&#x201D; once the tone ended. This methodology was used for both the standing and walking conditions. Each block consisted of 100 trials, where one trial is a single stimulus. After each response, the experimenter initiated the next trial. After every 25 trials the subject was given a brief break from the task, typically about 15 s, and indicated when ready to continue. In the walking trials, subjects kept walking normally during these breaks. After each block of 100 trials, subjects took longer breaks of at least 2 min, more if needed.</p>
</sec>
<sec id="S2.SS4">
<title>Adaptive staircase for stimulus amplitude</title>
<p>The amplitude of the stimulus was the maximum angle of rotation around the anterior-posterior axis, expressed in degrees. The stimulus was generated by the equation:</p>
<disp-formula id="S2.Ex1">
<mml:math id="M1">
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>A</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>t</mml:mi>
<mml:mo>-</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>sin</mml:mi>
<mml:mo>&#x2061;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>t</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
<mml:mo>&#x2062;</mml:mo>
<mml:mpadded width="+3.3pt">
<mml:mi mathvariant="normal">&#x03C0;</mml:mi>
</mml:mpadded>
</mml:mrow>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mi>f</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>&#x2062;</mml:mo>
<mml:mpadded width="+3.3pt">
<mml:mi mathvariant="normal">&#x03C0;</mml:mi>
</mml:mpadded>
</mml:mrow>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mi>f</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where <italic>A</italic> is the amplitude of the stimulus (degrees), <italic>t</italic> is the time vector, and <italic>f</italic> is the frequency (1 Hz). To clarify, we define the stimulus in terms of maximum angular amplitude, which co-varies with the velocity of the stimulus: an increase in amplitude results in an increase in stimulus velocity, and vice versa. These values can also be expressed as peak velocities by taking the first derivative of the stimulus waveform, which at 1 Hz amounts to multiplying the amplitude by a factor of 2. For example, an amplitude of 4&#x00B0; corresponds to a peak velocity of 8 degree/s. The amplitude was adjusted using an adaptive 6-down 1-up staircase algorithm for parameter estimation by sequential testing (<xref ref-type="bibr" rid="B42">Taylor and Creelman, 1967</xref>), based on work by <xref ref-type="bibr" rid="B21">Karmali et al. (2016)</xref>, who showed via Monte Carlo simulations that a 6-down 1-up staircase provides experimental threshold values closer to theoretical values than the more common 3-down 1-up algorithm. The amplitude decreased after six consecutive correct responses and increased after one incorrect response, until 100 trials were completed. <xref ref-type="fig" rid="F2">Figure 2</xref> shows an example from subject VMD04 during their second walking trial of the adaptive staircase protocol. The initial amplitude was always set to 4&#x00B0;.</p>
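<p>As a concrete check of the stimulus definition, the rotation profile can be sketched in a few lines of Python (the function name is ours; the actual stimulus was rendered in Unity3d):</p>

```python
import math

def stimulus_angle(t, A, f=1.0):
    """Rotation angle (degrees) at time t (seconds) for the stimulus
    theta(t) = A * (t - sin(2*pi*f*t) / (2*pi*f)). Its time derivative,
    A * (1 - cos(2*pi*f*t)), is a single raised-cosine velocity cycle:
    zero at t = 0 and t = 1/f, peaking at 2*A deg/s for f = 1 Hz."""
    return A * (t - math.sin(2.0 * math.pi * f * t) / (2.0 * math.pi * f))
```

<p>For an amplitude of 4&#x00B0; and f = 1 Hz, the rotation reaches 4&#x00B0; after 1 s with a peak velocity of 8 degree/s, matching the factor-of-2 conversion described above.</p>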
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p>Example stimulus amplitudes from participant VMD04 performing the two-alternative forced choice task while walking. The green circles represent correct answers after which the stimulus amplitude stayed the same. A green downward arrow represents a sixth correct response in a row, which leads to a decrease in stimulus amplitude, making the task more difficult. An upward red arrow represents an incorrect response, which leads to an increase in movement amplitude, making the task easier. This is a 6-down 1-up adaptive staircase.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-17-1239071-g002.tif"/>
</fig>
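<p>The 6-down 1-up update rule illustrated in Figure 2 can be summarized in a short sketch. The following Python version is illustrative only: the fixed step size, amplitude floor, and simulated-observer interface are our assumptions, not the published implementation.</p>

```python
def run_staircase(respond, n_trials=100, start_amp=4.0, step=0.5, floor_amp=0.05):
    """6-down 1-up adaptive staircase: the stimulus amplitude decreases
    after six consecutive correct responses and increases after any
    incorrect response. `respond(amp)` returns True for a correct
    response. The fixed step size and amplitude floor are illustrative
    assumptions, not the published parameters."""
    amp, run_len, history = start_amp, 0, []
    for _ in range(n_trials):
        correct = respond(amp)
        history.append((amp, correct))
        if correct:
            run_len += 1
            if run_len == 6:  # sixth correct in a row: make the task harder
                amp = max(floor_amp, amp - step)
                run_len = 0
        else:  # any error: make the task easier
            amp += step
            run_len = 0
    return history
```

<p>With this rule, the amplitude converges toward the level at which six consecutive correct responses and a single error are equally likely, which corresponds to the 0.5^(1/6) = 0.89 target probability used for the psychometric fit below.</p>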
</sec>
<sec id="S2.SS5">
<title>Psychometric fit</title>
<p>To obtain a visual motion detection threshold, we fit a psychometric curve to the 100 binary responses of the 2AFC task in each condition. The fit was performed with MATLAB&#x2019;s fitglm function, using a generalized linear model (GLM) with a probit link. The motion detection threshold was defined as the value corresponding to a target probability of <inline-formula><mml:math id="M2"><mml:mrow><mml:mroot><mml:mrow><mml:mn>0.5</mml:mn></mml:mrow><mml:mn>6</mml:mn></mml:mroot><mml:mtext>&#x00A0;=&#x00A0;0.89</mml:mtext></mml:mrow></mml:math></inline-formula> (<xref ref-type="bibr" rid="B42">Taylor and Creelman, 1967</xref>; <xref ref-type="bibr" rid="B26">Levitt, 1971</xref>; <xref ref-type="bibr" rid="B17">Hall, 1981</xref>; <xref ref-type="bibr" rid="B21">Karmali et al., 2016</xref>). <xref ref-type="fig" rid="F3">Figure 3</xref> displays the psychometric fits for participant VMD04. The threshold value is marked on the psychometric fit for walking trial two, the same trial as in the adaptive staircase data shown above in <xref ref-type="fig" rid="F2">Figure 2</xref>. The threshold is highlighted by the dashed red line and corresponds to the amplitude of rotation at which the subject responded &#x201C;right&#x201D; with 89% probability.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p>Example psychometric fit data from participant VMD04. Standing fits are represented by cyan dash-dotted lines and walking fits are represented by blue solid lines. The threshold value is highlighted by the dashed red line for walking trial two, taken from the staircase data shown in <xref ref-type="fig" rid="F2">Figure 2</xref>. The threshold value is defined as the angle in degrees at which the participant reported a rightward rotation with 89% probability. The amplitude of the rotation is displayed on the horizontal axis, with positive values corresponding to rightward rotations and negative values to leftward rotations. The probability of responding &#x201C;right&#x201D; is on the vertical axis.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-17-1239071-g003.tif"/>
</fig>
<p>The mean and standard deviation of the underlying normal distribution represent the bias and the slope of the psychometric fit. A bias value of 0 indicates an equal chance (50%) of left and right responses at 0 stimulus amplitude. The bias of the psychometric curve was set to 0 for all participants. The slope of the psychometric fit is determined by the standard deviation of the underlying Gaussian distribution, which characterizes the acuteness of detection, i.e., how accurately the visual system can detect the stimulus, visual motion (<xref ref-type="bibr" rid="B32">Morgan et al., 2011</xref>). Example fits for one participant (VMD04) are shown in <xref ref-type="fig" rid="F3">Figure 3</xref>. Here the slopes of both walking trials (dark blue) are shallower than the slopes of the standing trials (cyan), corresponding to a larger threshold, i.e., a less accurate ability to detect motion in the environment while walking.</p>
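<p>The fit itself was computed with MATLAB&#x2019;s fitglm. As an illustrative, dependency-free sketch of the same computation (function names, the search bracket, and the golden-section optimizer are ours), a probit psychometric curve with the bias fixed at zero can be fit by maximum likelihood over the single slope parameter, and the threshold read off at the 0.5^(1/6) = 0.89 target probability:</p>

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def phi_inv(p):
    """Inverse normal CDF by bisection; ample precision for this use."""
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if phi(mid) >= p:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def fit_threshold(amps, right, target=0.5 ** (1.0 / 6.0)):
    """Fit P('right') = Phi(beta * amplitude), a probit psychometric
    curve with the bias fixed at zero, by maximum likelihood over the
    slope beta (golden-section search; the likelihood is unimodal).
    Returns the amplitude where the fitted curve reaches the target
    probability, 0.5**(1/6) = 0.89 for a 6-down 1-up staircase."""
    def nll(beta):
        total = 0.0
        for x, r in zip(amps, right):
            p = min(max(phi(beta * x), 1e-12), 1.0 - 1e-12)
            total -= math.log(p) if r else math.log(1.0 - p)
        return total
    lo, hi = 1e-3, 1e3
    g = (math.sqrt(5.0) - 1.0) / 2.0
    for _ in range(120):
        a = hi - g * (hi - lo)
        b = lo + g * (hi - lo)
        if nll(a) > nll(b):
            lo = a
        else:
            hi = b
    beta = 0.5 * (lo + hi)
    return phi_inv(target) / beta
```

<p>With the bias fixed at zero, the threshold scales inversely with the fitted slope: steeper psychometric curves yield smaller thresholds, mirroring the slope interpretation above.</p>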
</sec>
<sec id="S2.SS6">
<title>Statistical analysis</title>
<p>The normality and homoscedasticity of the visual motion detection thresholds for both walking and standing, per block and per condition, were evaluated using a Shapiro&#x2013;Wilk test of normality and an <italic>F</italic>-test. While the walking threshold data met these assumptions, the standing thresholds were slightly skewed. For Hypothesis 1, on the agreement between threshold measurements obtained in trials one and two, Pearson&#x2019;s correlation coefficients were calculated between the two measurements for both the walking and standing conditions. We used Bland&#x2013;Altman plots to show the mean difference between the repeated measures and to construct limits of agreement (<xref ref-type="bibr" rid="B2">Bland and Altman, 1986</xref>). To test for effects of time and condition on the threshold measures, a two-way repeated-measures ANOVA was performed.</p>
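<p>The Bland&#x2013;Altman quantities reduce to a few lines; a minimal Python sketch (the function name is ours), using the conventional 1.96 SD limits of agreement:</p>

```python
import math

def bland_altman(x1, x2):
    """Bland-Altman summary for two repeated measurements per subject:
    returns the mean difference (bias) and the 95% limits of agreement,
    bias +/- 1.96 * SD of the per-subject differences."""
    diffs = [a - b for a, b in zip(x1, x2)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```
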
</sec>
</sec>
<sec id="S3" sec-type="results">
<title>Results</title>
<p>All subjects completed both the walking and standing conditions. At the individual level, 24 subjects had higher thresholds during walking than during standing. The average walking speed during the walking condition was 1.09 m/s (SD = 0.20 m/s).</p>
<sec id="S3.SS1">
<title>Test-retest reliability</title>
<p>Thresholds were obtained from walking block one (mean = 1.13&#x00B0;, SD = 0.45&#x00B0;), walking block two (mean = 0.97&#x00B0;, SD = 0.41&#x00B0;), standing block one (mean = 0.73&#x00B0;, SD = 0.41&#x00B0;), and standing block two (mean = 0.74&#x00B0;, SD = 0.54&#x00B0;). Both the walking and standing conditions showed a strong positive correlation between measurements one and two. The correlation coefficient between the walking measurements was 0.84 (<italic>p</italic> &#x003C; 0.001), and the correlation coefficient between the standing measurements was 0.73 (<italic>p</italic> &#x003C; 0.001). <xref ref-type="fig" rid="F4">Figure 4</xref> plots the threshold values from trials one and two against each other for both walking (<xref ref-type="fig" rid="F4">Figure 4A</xref>) and standing (<xref ref-type="fig" rid="F4">Figure 4B</xref>). Also shown in <xref ref-type="fig" rid="F4">Figure 4</xref> are the Bland&#x2013;Altman plots for walking (<xref ref-type="fig" rid="F4">Figure 4C</xref>) and standing (<xref ref-type="fig" rid="F4">Figure 4D</xref>). The mean difference between measurements one and two was 0.16&#x00B0; for walking and 0.01&#x00B0; for standing. A two-way repeated-measures ANOVA revealed no significant effect of time on the threshold measures [<italic>F</italic>(1,1) = 0.792, <italic>p</italic> = 0.375].</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption><p>Panel <bold>(A)</bold> shows the visual motion threshold values for walking, with block one on the horizontal and block two on the vertical axis. The red line is the diagonal, and the black line is the linear regression. Panel <bold>(B)</bold> shows the same for standing. Panel <bold>(C)</bold> shows the Bland&#x2013;Altman plot for walking, with the mean of the two blocks on the horizontal and the difference on the vertical axis. Solid horizontal lines indicate the mean difference and dashed horizontal lines mark two standard deviations from the mean. Panel <bold>(D)</bold> shows the same for standing.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-17-1239071-g004.tif"/>
</fig>
<p>We estimated the detection threshold by fitting the slope of the psychometric curve, i.e., the standard deviation of the underlying normal distribution, to the response data for each participant. Bias, the mean of the underlying normal distribution, can also be used as a fit parameter. A bias can occur if a participant habitually chose a particular side when unsure of the direction, or if the visual system itself had a skewed mapping (i.e., leftward movement more noticeable than rightward). Including the bias can lead to improper fits in rare cases (<xref ref-type="bibr" rid="B9">Fioravanti et al., 2021</xref>), and it has been found that participants could voluntarily shift their central bias, causing changes in threshold values while the slope parameter remained relatively unchanged (<xref ref-type="bibr" rid="B32">Morgan et al., 2011</xref>). Since we had no <italic>a priori</italic> reason to expect a bias, we constrained the mean of the psychometric curve to zero for all participants. To investigate possible effects of this choice, we also repeated our analysis fitting both slope and bias. With this choice, the correlation between the slope estimates decreased for both the walking trials (<italic>R</italic> = 0.57, <italic>p</italic> = 0.001) and standing trials (<italic>R</italic> = 0.67, <italic>p</italic> &#x003C; 0.001). Additionally, we asked whether the added bias parameter provides meaningful information about the motion perception system for each participant, or rather represents a superfluous parameter that leads to overfitting. To this end, we analyzed the repeatability of the bias estimates between the two trials, using the same approach as for the threshold estimate in the main analysis. 
We found weak correlations for walking (<italic>R</italic> = 0.35, <italic>p</italic> = 0.064) and standing (<italic>R</italic> = 0.38, <italic>p</italic> = 0.041), as shown in <xref ref-type="supplementary-material" rid="FS1">Supplementary Figure 1</xref>. Although the correlation reached significance for the standing condition, there is little consistency between the bias measures of the two repetitions in either walking or standing, indicating that adding bias as a parameter of the psychometric curve fit represents overfitting rather than meaningful information about the visual system.</p>
</sec>
<sec id="S3.SS2">
<title>Walking versus standing visual motion thresholds</title>
<p>We observed higher visual motion detection thresholds in walking than in standing. <xref ref-type="fig" rid="F5">Figure 5</xref> shows box and whisker plots of the visual motion detection thresholds for walking (dark blue) and standing (cyan). The two-way repeated-measures ANOVA revealed a significant effect of condition on the threshold measures [<italic>F</italic>(1,1) = 13.634, <italic>p</italic> &#x003C; 0.001]. Thresholds were significantly higher during walking than during standing (<italic>p</italic> &#x003C; 0.001), with an average threshold of 1.04&#x00B0; (SD = 0.43&#x00B0;) for walking and 0.73&#x00B0; (SD = 0.47&#x00B0;) for standing. Twenty-four of the 29 participants had higher thresholds on average during walking than standing.</p>
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption><p>Box and whisker plots of visual motion thresholds for walking versus standing. Boxes and whiskers show the median, quartiles, and inter-quartile ranges. Dots are data from individual participants; each dot is the average of the two repeated measures. Diamonds represent group means. Gray lines connect walking and standing measures from the same participant.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-17-1239071-g005.tif"/>
</fig>
</sec>
</sec>
<sec id="S4" sec-type="discussion">
<title>Discussion</title>
<p>Our study investigated the reliability of measuring a visual motion detection threshold during walking and standing and compared these measures between the two tasks. We used virtual reality to display a visual scene that rotated clockwise or counter-clockwise at different amplitudes and asked participants to report the direction of rotation. We found that a visual motion threshold can be reliably obtained during tasks involving body sway and movement, namely standing and walking. We also found evidence that visual motion detection thresholds are higher during walking than during standing, suggesting that visual information may be processed differently between the two tasks.</p>
<p>Visual motion detection thresholds can be reliably obtained during walking and standing. Strong correlations indicate good agreement between the two measures taken at different time points. Bland&#x2013;Altman plots in <xref ref-type="fig" rid="F4">Figure 4</xref> support this finding by indicating small average differences between the two measurements for both walking (0.16&#x00B0;) and standing (0.01&#x00B0;). Traditionally, visual motion detection tests are performed while sitting, with the head fixated to avoid any type of head movement (<xref ref-type="bibr" rid="B47">Warren et al., 1989</xref>; <xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B44">Turano and Wong, 1992</xref>; <xref ref-type="bibr" rid="B49">Wood and Bullimore, 1995</xref>; <xref ref-type="bibr" rid="B43">Tran et al., 1998</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>; <xref ref-type="bibr" rid="B12">Freeman et al., 2006</xref>, <xref ref-type="bibr" rid="B11">2008</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>). These tests have explored different aspects of motion detection and it is unclear which are most relevant for balance. 
For example, minimal displacement thresholds have been quantified using the translation of a random dot pattern (<xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>), motion coherence thresholds have been quantified by translating select percentages of those dots (<xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B43">Tran et al., 1998</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>), speed discrimination thresholds have been quantified by varying the speed of the object motion (<xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>), and heading direction thresholds have been quantified by varying optic flow patterns relative to a vertical object (<xref ref-type="bibr" rid="B48">Warren and Hannon, 1988</xref>; <xref ref-type="bibr" rid="B47">Warren et al., 1989</xref>). All of these studies implement psychophysical testing; however, there is no standardized approach, with different studies using methods such as the BEST PEST (<xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>), the method of constant stimuli (<xref ref-type="bibr" rid="B44">Turano and Wong, 1992</xref>; <xref ref-type="bibr" rid="B43">Tran et al., 1998</xref>), or adaptive staircases (<xref ref-type="bibr" rid="B49">Wood and Bullimore, 1995</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>; <xref ref-type="bibr" rid="B12">Freeman et al., 2006</xref>, <xref ref-type="bibr" rid="B11">2008</xref>). Furthermore, these studies have manipulated and controlled for various characteristics of the stimuli, such as luminance and contrast. 
Again, it is unclear which of these aspects of motion discrimination is most relevant to balance, which motivated our study to test whether motion discrimination can be measured during upright balance tasks such as standing and walking, with their inherent head movement. Immobilizing the head eliminates retinal slip caused by natural head sway during standing and walking. This may explain why motion discrimination values are considerably smaller while sitting (<xref ref-type="bibr" rid="B44">Turano and Wong, 1992</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>; <xref ref-type="bibr" rid="B12">Freeman et al., 2006</xref>, <xref ref-type="bibr" rid="B11">2008</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>), with reported values ranging from 0.009&#x00B0; to 0.121&#x00B0;. Heading direction thresholds obtained by <xref ref-type="bibr" rid="B48">Warren and Hannon (1988)</xref> and <xref ref-type="bibr" rid="B47">Warren et al. (1989)</xref> may be more relevant to upright balance control, since navigating through the environment via optic flow is important for locomotion. Thresholds calculated via heading direction are similar to those reported here, ranging from 1.1&#x00B0; to 1.9&#x00B0;. It must be noted, however, that the motion thresholds reported here cannot be directly compared to past literature due to methodological differences in psychophysical protocols and in stimulus conditions such as luminance, contrast, and type of motion. Here, we used a rotation of the virtual scene to measure motion detection. 
This type of stimulus was selected based on previous work investigating the role of vision and balance during walking in which visual perturbations have been implemented around the anterior/posterior axis to simulate the sensation of a fall and investigate medial/lateral balance control (<xref ref-type="bibr" rid="B30">McAndrew et al., 2010</xref>; <xref ref-type="bibr" rid="B10">Franz et al., 2015</xref>; <xref ref-type="bibr" rid="B36">Reimann et al., 2018</xref>).</p>
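The Bland-Altman agreement analysis applied to the repeated threshold measures reduces to a short computation: the mean difference between repetitions and its 95% limits of agreement. The following is a minimal sketch with a hypothetical helper, not the authors' analysis code.

```python
import numpy as np

def bland_altman(measure_1, measure_2):
    """Bland-Altman agreement statistics for two repeated measures.

    Returns the mean difference (bias between repetitions) and the 95%
    limits of agreement, mean difference +/- 1.96 SD of the differences,
    following Bland and Altman (1986).
    """
    m1 = np.asarray(measure_1, float)
    m2 = np.asarray(measure_2, float)
    diff = m1 - m2
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)  # sample SD of the paired differences
    return mean_diff, (mean_diff - 1.96 * sd_diff,
                       mean_diff + 1.96 * sd_diff)
```

A small mean difference with narrow limits of agreement, as reported for the standing thresholds (0.01&#x00B0;), indicates good repeatability of the measure.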
<p>The visual motion stimulus presented here is specified as a maximum angular amplitude per 1 s, so both the displacement and the velocity of the stimulus changed during the adaptive staircase protocol used to quantify a visual motion threshold. Visual motion processing has generally been viewed as comprising two distinct processes: first-order and second-order processing. First-order processing is defined by differences in luminance, while second-order processing is defined by differences in contrast, texture, or depth (<xref ref-type="bibr" rid="B38">Seiffert and Cavanagh, 1998</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B1">Baker and Mareschal, 2001</xref>). <xref ref-type="bibr" rid="B38">Seiffert and Cavanagh (1998)</xref> have shown that second-order processing is position based, supporting findings from <xref ref-type="bibr" rid="B33">Nakayama and Tyler (1981)</xref> that first-order processing is velocity based. Since both speed and amplitude were manipulated in this experiment, we expect that a combination of first-order and second-order visual processing was used to detect the movement of the stimuli. It remains undetermined which parameter is more useful for controlling upright balance. Notably, based on modeling work from <xref ref-type="bibr" rid="B22">Kiemel et al. (2002)</xref>, it has been suggested that the loss of accurate velocity estimates from a sensory modality is more problematic for upright postural control than the loss of position estimates (<xref ref-type="bibr" rid="B19">Jeka et al., 2004</xref>). <xref ref-type="bibr" rid="B39">Snowden and Kavanagh (2006)</xref> provide evidence that speed discrimination is hindered in older adults compared to healthy young adults. The loss of velocity-dependent information in visual motion processing could potentially be an underlying mechanism of hindered upright balance control. 
Since the stimulus presented here contains both a position and a speed component, the underlying neural mechanism cannot be attributed to one or the other.</p>
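The adaptive staircase mentioned above can be illustrated with a generic transformed up-down rule (Levitt, 1971). This is a minimal sketch; the 3-down/1-up rule, fixed step size, and reversal-based stopping criterion are illustrative assumptions, not the study's exact protocol.

```python
def staircase_3down_1up(initial_amplitude, step, n_reversals, respond):
    """Generic 3-down/1-up adaptive staircase (Levitt, 1971).

    Decreases stimulus amplitude after three consecutive correct
    responses and increases it after one incorrect response; this rule
    converges near the 79.4%-correct point of the psychometric curve.
    `respond(amplitude)` returns True for a correct direction judgment.
    The threshold estimate is the mean amplitude at the reversal points.
    """
    amplitude = initial_amplitude
    correct_streak = 0
    reversals = []
    last_direction = 0  # +1 after an increase, -1 after a decrease
    while len(reversals) < n_reversals:
        if respond(amplitude):
            correct_streak += 1
            if correct_streak == 3:
                correct_streak = 0
                if last_direction == 1:  # up followed by down: reversal
                    reversals.append(amplitude)
                last_direction = -1
                amplitude = max(amplitude - step, step)
        else:
            correct_streak = 0
            if last_direction == -1:  # down followed by up: reversal
                reversals.append(amplitude)
            last_direction = 1
            amplitude += step
    return sum(reversals) / len(reversals)
```

Because the staircase adapts the maximum angular amplitude per 1 s, each step changes both the displacement and the peak velocity of the rotation, which is why the resulting threshold cannot be attributed to position or speed processing alone.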
<p>Our motivation was to perform the psychometric tests in an ergonomic manner that would quantify visual motion thresholds during natural movements of the head and body while maintaining upright balance, as opposed to in a restrained position. Since participants were not restrained in any way, cues from the vestibular and proprioceptive systems arising from this natural motion may have influenced the threshold results. Even a small visual stimulus can provoke the illusion of a fall and trigger a balance response, which may add movement of the head and/or ankles and thereby cue the vestibular or proprioceptive system (<xref ref-type="bibr" rid="B36">Reimann et al., 2018</xref>). Some participants reported perceiving self-motion, rather than movement of the visual stimulus, when the stimulus amplitude became small. Such cues from other sensory systems may have influenced responses that could therefore not be considered purely visual. Head movements in standing are relatively small, but walking produces considerable head movement and might influence a person&#x2019;s ability to detect motion, depending on the direction in which the stimulus moves relative to the head. For the walking conditions, the visual stimulus was manually triggered by the experimenter with a 1&#x2013;2 s window between stimuli, regardless of the phase of the gait cycle. For example, the visual stimulus may have rotated to the right while the participant was swaying to the left or right, which would add to or reduce visual motion on the retina, respectively. Controlling the onset of the visual stimulus relative to the gait cycle may reduce the variability of the thresholds measured in the current investigation.</p>
<p>Beyond accurate estimates of visual detection thresholds, it may be beneficial to understand whether the influence of visual movement on upright balance changes during the gait cycle. For example, during double stance, more information is available from lower limb proprioceptors, which may lead to less reliance on vision. In contrast, during single stance, when the contralateral leg is in swing phase, visual cues may be more important for maintaining balance since the body has less contact with the ground. In fact, phase-dependent visual coupling has been observed during walking (<xref ref-type="bibr" rid="B27">Logan et al., 2010</xref>). One major mechanism for balance control is modulation of foot placement based on the state of the body at mid-stance (<xref ref-type="bibr" rid="B46">Wang and Srinivasan, 2014</xref>), and visual motion detection is likely used to estimate the body state (i.e., CoM position and velocity).</p>
<p>The influence of age on visual motion thresholds during standing and walking is understudied. Previous literature has indicated an increase in visual motion detection thresholds for older adults compared to young adults (<xref ref-type="bibr" rid="B47">Warren et al., 1989</xref>; <xref ref-type="bibr" rid="B14">Gilmore et al., 1992</xref>; <xref ref-type="bibr" rid="B43">Tran et al., 1998</xref>; <xref ref-type="bibr" rid="B15">Habak and Faubert, 2000</xref>; <xref ref-type="bibr" rid="B39">Snowden and Kavanagh, 2006</xref>; <xref ref-type="bibr" rid="B7">Conlon et al., 2017</xref>), but no studies have attempted to measure thresholds in older adults during standing and walking. Older adults are known to place more emphasis on vision (i.e., upweight it) while standing (<xref ref-type="bibr" rid="B16">Haibach et al., 2009</xref>) and walking (<xref ref-type="bibr" rid="B10">Franz et al., 2015</xref>). If motion thresholds are higher for older adults during standing and walking, this, combined with their greater reliance on vision for upright balance control, may contribute to their fall risk. <xref ref-type="bibr" rid="B37">Saftari and Kwon (2018)</xref> emphasize that, particularly in the aging literature, the relationship between visual detection thresholds and fall risk remains a critical knowledge gap.</p>
<p>Overall, our results indicate that a visual motion detection threshold can be reliably measured during walking and standing, and that thresholds are higher for walking than for standing. Research on the relationship between visual processing and fall risk has focused on aspects of visual acuity rather than on optic flow detection. The ability to reliably measure a visual motion detection threshold while standing and walking adds to our understanding of visual motion processing and balance control, particularly in populations with higher fall risk.</p>
</sec>
<sec id="S5" sec-type="data-availability">
<title>Data availability statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p>
</sec>
<sec id="S6" sec-type="ethics-statement">
<title>Ethics statement</title>
<p>The studies involving humans were approved by the University of Delaware IRB, Newark, DE. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.</p>
</sec>
<sec id="S7" sec-type="author-contributions">
<title>Author contributions</title>
<p>SD and HR contributed to the conception, implementation, and analysis of this study. SD ran the study and drafted the original manuscript. HR and JJ reviewed and edited the manuscript for publication. All authors contributed to manuscript revision, read, and approved the submitted version.</p>
</sec>
</body>
<back>
<ack><p>We would like to thank all the participants of this research study for their time and energy.</p>
</ack>
<sec id="S8" sec-type="COI-statement">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="S9" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<sec id="S10" sec-type="supplementary-material">
<title>Supplementary material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/fnhum.2023.1239071/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/fnhum.2023.1239071/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Image_1.TIFF" id="FS1" mimetype="image/tiff" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Baker</surname> <given-names>C. L.</given-names></name> <name><surname>Mareschal</surname> <given-names>I.</given-names></name></person-group> (<year>2001</year>). <article-title>Processing of second-order stimuli in the visual cortex.</article-title> <source><italic>Prog. Brain Res.</italic></source> <volume>134</volume> <fpage>171</fpage>&#x2013;<lpage>191</lpage>. <pub-id pub-id-type="doi">10.1016/s0079-6123(01)34013-x</pub-id> <pub-id pub-id-type="pmid">11702543</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bland</surname> <given-names>J.</given-names></name> <name><surname>Altman</surname> <given-names>D.</given-names></name></person-group> (<year>1986</year>). <article-title>Statistical methods for assessing agreement between two methods of clinical measurement.</article-title> <source><italic>Lancet</italic></source> <volume>1</volume> <fpage>307</fpage>&#x2013;<lpage>310</lpage>.</citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Borst</surname> <given-names>A.</given-names></name> <name><surname>Egelhaaf</surname> <given-names>M.</given-names></name></person-group> (<year>1989</year>). <article-title>Principles of visual motion detection.</article-title> <source><italic>Trends Neurosci.</italic></source> <volume>12</volume> <fpage>297</fpage>&#x2013;<lpage>306</lpage>. <pub-id pub-id-type="doi">10.1016/0166-2236(89)90010-6</pub-id> <pub-id pub-id-type="pmid">2475948</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brandt</surname> <given-names>T.</given-names></name> <name><surname>Dichgans</surname> <given-names>J.</given-names></name> <name><surname>Koenig</surname> <given-names>E.</given-names></name></person-group> (<year>1973</year>). <article-title>Differential effects of central versus peripheral vision on egocentric and exocentric motion perception.</article-title> <source><italic>Exp. Brain Res.</italic></source> <volume>16</volume> <fpage>476</fpage>&#x2013;<lpage>491</lpage>. <pub-id pub-id-type="doi">10.1007/BF00234474</pub-id> <pub-id pub-id-type="pmid">4695777</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brenner</surname> <given-names>E.</given-names></name> <name><surname>Smeets</surname> <given-names>J. B. J.</given-names></name></person-group> (<year>2018</year>). &#x201C;<article-title>Depth Perception</article-title>,&#x201D; in <source><italic>Stevens&#x2019; Handbook of Experimental Psychology and Cognitive Neuroscience</italic></source>, <role>ed.</role> <person-group person-group-type="editor"><name><surname>Wixted</surname> <given-names>J. T.</given-names></name></person-group> (<publisher-loc>Hoboken, NJ</publisher-loc>: <publisher-name>John Wiley &#x0026; Sons, Inc</publisher-name>), <pub-id pub-id-type="doi">10.1002/9781119170174.epcn209</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Broman</surname> <given-names>A. T.</given-names></name> <name><surname>West</surname> <given-names>S. K.</given-names></name> <name><surname>Munoz</surname> <given-names>B.</given-names></name> <name><surname>Bandeen-Roche</surname> <given-names>K.</given-names></name> <name><surname>Rubin</surname> <given-names>G. S.</given-names></name> <name><surname>Turano</surname> <given-names>K. A.</given-names></name></person-group> (<year>2004</year>). <article-title>Divided visual attention as a predictor of bumping while walking: The Salisbury Eye Evaluation.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>45</volume>:<issue>2955</issue>. <pub-id pub-id-type="doi">10.1167/iovs.04-0219</pub-id> <pub-id pub-id-type="pmid">15326107</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Conlon</surname> <given-names>E. G.</given-names></name> <name><surname>Power</surname> <given-names>G. F.</given-names></name> <name><surname>Hine</surname> <given-names>T. J.</given-names></name> <name><surname>Rahaley</surname> <given-names>N.</given-names></name></person-group> (<year>2017</year>). <article-title>The impact of older age and sex on motion discrimination.</article-title> <source><italic>Exp. Aging Res.</italic></source> <volume>43</volume> <fpage>55</fpage>&#x2013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1080/0361073X.2017.1258226</pub-id> <pub-id pub-id-type="pmid">28067609</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Felson</surname> <given-names>D. T.</given-names></name> <name><surname>Anderson</surname> <given-names>J. J.</given-names></name> <name><surname>Hannan</surname> <given-names>M. T.</given-names></name> <name><surname>Milton</surname> <given-names>R. C.</given-names></name> <name><surname>Wilson</surname> <given-names>P. W. F.</given-names></name> <name><surname>Kiel</surname> <given-names>D. P.</given-names></name></person-group> (<year>1989</year>). <article-title>Impaired Vision and Hip Fracture: The Framingham Study.</article-title> <source><italic>J. Am. Geriatr. Soc.</italic></source> <volume>37</volume> <fpage>495</fpage>&#x2013;<lpage>500</lpage>. <pub-id pub-id-type="doi">10.1111/j.1532-5415.1989.tb05678.x</pub-id> <pub-id pub-id-type="pmid">2715555</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fioravanti</surname> <given-names>C.</given-names></name> <name><surname>Braun</surname> <given-names>C.</given-names></name> <name><surname>Lindner</surname> <given-names>A.</given-names></name> <name><surname>Ruiz</surname> <given-names>S.</given-names></name> <name><surname>Sitaram</surname> <given-names>R.</given-names></name> <name><surname>Kajal</surname> <given-names>D.</given-names></name></person-group> (<year>2021</year>). <article-title>A new adaptive procedure for estimating perceptual thresholds: The effects of observer bias and its correction.</article-title> <source><italic>J. Ment. Health Disord.</italic></source> <volume>1</volume> <fpage>45</fpage>&#x2013;<lpage>58</lpage>.</citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Franz</surname> <given-names>J. R.</given-names></name> <name><surname>Francis</surname> <given-names>C. A.</given-names></name> <name><surname>Allen</surname> <given-names>M. S.</given-names></name> <name><surname>O&#x2019;Connor</surname> <given-names>S. M.</given-names></name> <name><surname>Thelen</surname> <given-names>D. G.</given-names></name></person-group> (<year>2015</year>). <article-title>Advanced age brings a greater reliance on visual feedback to maintain balance during walking.</article-title> <source><italic>Hum. Move. Sci.</italic></source> <volume>40</volume> <fpage>381</fpage>&#x2013;<lpage>392</lpage>. <pub-id pub-id-type="doi">10.1016/j.humov.2015.01.012</pub-id> <pub-id pub-id-type="pmid">25687664</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>E. E.</given-names></name> <name><surname>Broman</surname> <given-names>A. T.</given-names></name> <name><surname>Turano</surname> <given-names>K. A.</given-names></name> <name><surname>West</surname> <given-names>S. K.</given-names></name></person-group> (<year>2008</year>). <article-title>Motion-detection threshold and measures of balance in older adults: The SEE project.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>49</volume> <issue>5257</issue>. <pub-id pub-id-type="doi">10.1167/iovs.07-1106</pub-id> <pub-id pub-id-type="pmid">18719087</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>E. E.</given-names></name> <name><surname>Munoz</surname> <given-names>B.</given-names></name> <name><surname>Turano</surname> <given-names>K. A.</given-names></name> <name><surname>West</surname> <given-names>S. K.</given-names></name></person-group> (<year>2006</year>). <article-title>Dynamic measures of visual function and their relationship to self-report of visual functioning.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>47</volume>:<issue>4762</issue>. <pub-id pub-id-type="doi">10.1167/iovs.06-0436</pub-id> <pub-id pub-id-type="pmid">17065485</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gibson</surname> <given-names>J. J.</given-names></name></person-group> (<year>1958</year>). <article-title>Visually controlled locomotion and visual orientation in animals.</article-title> <source><italic>Br. J. Psychol.</italic></source> <volume>49</volume> <fpage>182</fpage>&#x2013;<lpage>194</lpage>. <pub-id pub-id-type="doi">10.1111/j.2044-8295.1958.tb00656.x</pub-id> <pub-id pub-id-type="pmid">13572790</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gilmore</surname> <given-names>G. C.</given-names></name> <name><surname>Wenk</surname> <given-names>H. E.</given-names></name> <name><surname>Naylor</surname> <given-names>L. A.</given-names></name> <name><surname>Stuve</surname> <given-names>T. A.</given-names></name></person-group> (<year>1992</year>). <article-title>Motion perception and aging.</article-title> <source><italic>Psychol. Aging</italic></source> <volume>7</volume> <fpage>654</fpage>&#x2013;<lpage>660</lpage>.</citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Habak</surname> <given-names>C.</given-names></name> <name><surname>Faubert</surname> <given-names>J.</given-names></name></person-group> (<year>2000</year>). <article-title>Larger effect of aging on the perception of higher-order stimuli.</article-title> <source><italic>Vis. Res.</italic></source> <volume>40</volume> <fpage>943</fpage>&#x2013;<lpage>950</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(99)00235-7</pub-id> <pub-id pub-id-type="pmid">10720665</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haibach</surname> <given-names>P.</given-names></name> <name><surname>Slobounov</surname> <given-names>S.</given-names></name> <name><surname>Newell</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Egomotion and vection in young and elderly adults.</article-title> <source><italic>Gerontology</italic></source> <volume>55</volume> <fpage>637</fpage>&#x2013;<lpage>643</lpage>. <pub-id pub-id-type="doi">10.1159/000235816</pub-id> <pub-id pub-id-type="pmid">19707011</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hall</surname> <given-names>J. L.</given-names></name></person-group> (<year>1981</year>). <article-title>Hybrid adaptive procedure for estimation of psychometric functions.</article-title> <source><italic>J. Acoust. Soc. Am.</italic></source> <volume>69</volume> <fpage>1763</fpage>&#x2013;<lpage>1769</lpage>. <pub-id pub-id-type="doi">10.1121/1.385912</pub-id> <pub-id pub-id-type="pmid">7240589</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ivers</surname> <given-names>R.</given-names></name> <name><surname>Cumming</surname> <given-names>R.</given-names></name> <name><surname>Mitchell</surname> <given-names>P.</given-names></name> <name><surname>Attebo</surname> <given-names>K.</given-names></name></person-group> (<year>1998</year>). <article-title>Visual impairment and falls in older adults: the Blue Mountains Eye Study.</article-title> <source><italic>J. Am. Geriatr. Soc.</italic></source> <volume>46</volume> <fpage>58</fpage>&#x2013;<lpage>64</lpage>. <pub-id pub-id-type="doi">10.1111/j.1532-5415.1998.tb01014.x</pub-id> <pub-id pub-id-type="pmid">9434666</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeka</surname> <given-names>J.</given-names></name> <name><surname>Kiemel</surname> <given-names>T.</given-names></name> <name><surname>Creath</surname> <given-names>R.</given-names></name> <name><surname>Horak</surname> <given-names>F.</given-names></name> <name><surname>Peterka</surname> <given-names>R.</given-names></name></person-group> (<year>2004</year>). <article-title>Controlling human upright posture: velocity information is more accurate than position or acceleration.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>92</volume> <fpage>2368</fpage>&#x2013;<lpage>2379</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00983.2003</pub-id> <pub-id pub-id-type="pmid">15140910</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeka</surname> <given-names>J. J.</given-names></name> <name><surname>Allison</surname> <given-names>L. K.</given-names></name> <name><surname>Kiemel</surname> <given-names>T.</given-names></name></person-group> (<year>2010</year>). <article-title>The dynamics of visual reweighting in healthy and fall-prone older adults.</article-title> <source><italic>J. Motor Behav.</italic></source> <volume>42</volume> <fpage>197</fpage>&#x2013;<lpage>208</lpage>. <pub-id pub-id-type="doi">10.1080/00222895.2010.481693</pub-id> <pub-id pub-id-type="pmid">20501430</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Karmali</surname> <given-names>F.</given-names></name> <name><surname>Chaudhuri</surname> <given-names>S. E.</given-names></name> <name><surname>Yi</surname> <given-names>Y.</given-names></name> <name><surname>Merfeld</surname> <given-names>D. M.</given-names></name></person-group> (<year>2016</year>). <article-title>Determining thresholds using adaptive procedures and psychometric fits: evaluating efficiency using theory, simulations, and human experiments.</article-title> <source><italic>Exp. Brain Res.</italic></source> <volume>234</volume> <fpage>773</fpage>&#x2013;<lpage>789</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-015-4501-8</pub-id> <pub-id pub-id-type="pmid">26645306</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kiemel</surname> <given-names>T.</given-names></name> <name><surname>Oie</surname> <given-names>K. S.</given-names></name> <name><surname>Jeka</surname> <given-names>J. J.</given-names></name></person-group> (<year>2002</year>). <article-title>Multisensory fusion and the stochastic structure of postural sway.</article-title> <source><italic>Biol. Cybern.</italic></source> <volume>87</volume> <fpage>262</fpage>&#x2013;<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1007/s00422-002-0333-2</pub-id> <pub-id pub-id-type="pmid">12386742</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kiemel</surname> <given-names>T.</given-names></name> <name><surname>Oie</surname> <given-names>K. S.</given-names></name> <name><surname>Jeka</surname> <given-names>J. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Slow dynamics of postural sway are in the feedback loop.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>95</volume> <fpage>1410</fpage>&#x2013;<lpage>1418</lpage>. <pub-id pub-id-type="doi">10.1152/jn.01144.2004</pub-id> <pub-id pub-id-type="pmid">16192341</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kingdom</surname> <given-names>F.</given-names></name> <name><surname>Prins</surname> <given-names>N.</given-names></name></person-group> (<year>2009</year>). <source><italic>Psychophysics: A Practical Introduction.</italic></source> <publisher-loc>London</publisher-loc>: <publisher-name>Elsevier</publisher-name>.</citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Larson</surname> <given-names>A. M.</given-names></name> <name><surname>Loschky</surname> <given-names>L. C.</given-names></name></person-group> (<year>2009</year>). <article-title>The contributions of central versus peripheral vision to scene gist recognition.</article-title> <source><italic>J. Vis.</italic></source> <volume>9</volume>:<issue>6</issue>. <pub-id pub-id-type="doi">10.1167/9.10.6</pub-id> <pub-id pub-id-type="pmid">19810787</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Levitt</surname> <given-names>H.</given-names></name></person-group> (<year>1971</year>). <article-title>Transformed up-down methods in psychoacoustics.</article-title> <source><italic>J. Acoust. Soc. Am.</italic></source> <volume>49</volume> <fpage>467</fpage>&#x2013;<lpage>477</lpage>. <pub-id pub-id-type="doi">10.1121/1.1912375</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Logan</surname> <given-names>D.</given-names></name> <name><surname>Kiemel</surname> <given-names>T.</given-names></name> <name><surname>Dominici</surname> <given-names>N.</given-names></name> <name><surname>Cappellini</surname> <given-names>G.</given-names></name> <name><surname>Ivanenko</surname> <given-names>Y.</given-names></name> <name><surname>Lacquaniti</surname> <given-names>F.</given-names></name><etal/></person-group> (<year>2010</year>). <article-title>The many roles of vision during walking.</article-title> <source><italic>Exp. Brain Res.</italic></source> <volume>206</volume> <fpage>337</fpage>&#x2013;<lpage>350</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-010-2414-0</pub-id> <pub-id pub-id-type="pmid">20852990</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lord</surname> <given-names>S.</given-names></name> <name><surname>Dayhew</surname> <given-names>J.</given-names></name></person-group> (<year>2001</year>). <article-title>Visual risk factors for falls in older people.</article-title> <source><italic>J. Am. Geriatr. Soc.</italic></source> <volume>49</volume> <fpage>508</fpage>&#x2013;<lpage>515</lpage>. <pub-id pub-id-type="doi">10.1046/j.1532-5415.2001.49107.x</pub-id> <pub-id pub-id-type="pmid">11380741</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lord</surname> <given-names>S. R.</given-names></name> <name><surname>Fitzpatrick</surname> <given-names>R. C.</given-names></name></person-group> (<year>2001</year>). <article-title>Choice stepping reaction time: a composite measure of falls risk in older people.</article-title> <source><italic>J. Gerontol. Ser. A Biol. Sci. Med. Sci.</italic></source> <volume>56</volume> <fpage>M627</fpage>&#x2013;<lpage>M632</lpage>. <pub-id pub-id-type="doi">10.1093/gerona/56.10.M627</pub-id> <pub-id pub-id-type="pmid">11584035</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McAndrew</surname> <given-names>P. M.</given-names></name> <name><surname>Dingwell</surname> <given-names>J. B.</given-names></name> <name><surname>Wilken</surname> <given-names>J. M.</given-names></name></person-group> (<year>2010</year>). <article-title>Walking variability during continuous pseudo-random oscillations of the support surface and visual field.</article-title> <source><italic>J. Biomech.</italic></source> <volume>43</volume> <fpage>1470</fpage>&#x2013;<lpage>1475</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiomech.2010.02.003</pub-id> <pub-id pub-id-type="pmid">20346453</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Monaco</surname> <given-names>W. A.</given-names></name> <name><surname>Kalb</surname> <given-names>J. T.</given-names></name> <name><surname>Johnson</surname> <given-names>C. A.</given-names></name></person-group> (<year>2007</year>). <source><italic>Motion Detection in the Far Peripheral Visual Field.</italic></source> <publisher-loc>Fort Belvoir, VA</publisher-loc>: <publisher-name>Defense Technical Information Center</publisher-name>.</citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morgan</surname> <given-names>M.</given-names></name> <name><surname>Dillenburger</surname> <given-names>B.</given-names></name> <name><surname>Raphael</surname> <given-names>S.</given-names></name> <name><surname>Solomon</surname> <given-names>J.</given-names></name></person-group> (<year>2011</year>). <article-title>Observers can voluntarily shift their psychometric functions without losing sensitivity.</article-title> <source><italic>Atten. Percept. Psychophys.</italic></source> <volume>74</volume> <fpage>185</fpage>&#x2013;<lpage>193</lpage>. <pub-id pub-id-type="doi">10.3758/s13414-011-0222-7</pub-id> <pub-id pub-id-type="pmid">22033949</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakayama</surname> <given-names>K.</given-names></name> <name><surname>Tyler</surname> <given-names>C.</given-names></name></person-group> (<year>1981</year>). <article-title>Psychophysical isolation of movement sensitivity by removal of familiar position cues.</article-title> <source><italic>Vis. Res.</italic></source> <volume>21</volume> <fpage>427</fpage>&#x2013;<lpage>433</lpage>.</citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Owsley</surname> <given-names>C.</given-names></name></person-group> (<year>2003</year>). <article-title>Contrast sensitivity.</article-title> <source><italic>Ophthalmol. Clin. North Am.</italic></source> <volume>16</volume> <fpage>171</fpage>&#x2013;<lpage>177</lpage>. <pub-id pub-id-type="doi">10.1016/S0896-1549(03)00003-8</pub-id> <pub-id pub-id-type="pmid">12809156</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peterka</surname> <given-names>R. J.</given-names></name></person-group> (<year>2002</year>). <article-title>Sensorimotor integration in human postural control.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>88</volume> <fpage>1097</fpage>&#x2013;<lpage>1118</lpage>. <pub-id pub-id-type="doi">10.1152/jn.2002.88.3.1097</pub-id> <pub-id pub-id-type="pmid">12205132</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Reimann</surname> <given-names>H.</given-names></name> <name><surname>Fettrow</surname> <given-names>T.</given-names></name> <name><surname>Thompson</surname> <given-names>E. D.</given-names></name> <name><surname>Jeka</surname> <given-names>J. J.</given-names></name></person-group> (<year>2018</year>). <article-title>Neural control of balance during walking.</article-title> <source><italic>Front. Physiol.</italic></source> <volume>9</volume>:<issue>1271</issue>. <pub-id pub-id-type="doi">10.3389/fphys.2018.01271</pub-id> <pub-id pub-id-type="pmid">30271354</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saftari</surname> <given-names>L. N.</given-names></name> <name><surname>Kwon</surname> <given-names>O.-S.</given-names></name></person-group> (<year>2018</year>). <article-title>Ageing vision and falls: a review.</article-title> <source><italic>J. Physiol. Anthropol.</italic></source> <volume>37</volume>:<issue>11</issue>. <pub-id pub-id-type="doi">10.1186/s40101-018-0170-1</pub-id> <pub-id pub-id-type="pmid">29685171</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seiffert</surname> <given-names>A. E.</given-names></name> <name><surname>Cavanagh</surname> <given-names>P.</given-names></name></person-group> (<year>1998</year>). <article-title>Position displacement, not velocity, is the cue to motion detection of second-order stimuli.</article-title> <source><italic>Vis. Res.</italic></source> <volume>38</volume> <fpage>3569</fpage>&#x2013;<lpage>3582</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(98)00035-2</pub-id> <pub-id pub-id-type="pmid">9893790</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Snowden</surname> <given-names>R. J.</given-names></name> <name><surname>Kavanagh</surname> <given-names>E.</given-names></name></person-group> (<year>2006</year>). <article-title>Motion perception in the ageing visual system: minimum motion, motion coherence, and speed discrimination thresholds.</article-title> <source><italic>Perception</italic></source> <volume>35</volume> <fpage>9</fpage>&#x2013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1068/p5399</pub-id> <pub-id pub-id-type="pmid">16491704</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tarita-Nistor</surname> <given-names>L.</given-names></name> <name><surname>Gonzalez</surname> <given-names>E. G.</given-names></name> <name><surname>Markowitz</surname> <given-names>S. N.</given-names></name> <name><surname>Lillakas</surname> <given-names>L.</given-names></name> <name><surname>Steinbach</surname> <given-names>M. J.</given-names></name></person-group> (<year>2008</year>). <article-title>Increased role of peripheral vision in self-induced motion in patients with age-related macular degeneration.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>49</volume>:<issue>3253</issue>. <pub-id pub-id-type="doi">10.1167/iovs.07-1290</pub-id> <pub-id pub-id-type="pmid">18390642</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tarita-Nistor</surname> <given-names>L.</given-names></name> <name><surname>Hadavi</surname> <given-names>S.</given-names></name> <name><surname>Steinbach</surname> <given-names>M. J.</given-names></name> <name><surname>Markowitz</surname> <given-names>S. N.</given-names></name> <name><surname>Gonz&#x00E1;lez</surname> <given-names>E. G.</given-names></name></person-group> (<year>2014</year>). <article-title>Vection in patients with glaucoma.</article-title> <source><italic>Optom. Vis. Sci.</italic></source> <volume>91</volume> <fpage>556</fpage>&#x2013;<lpage>563</lpage>. <pub-id pub-id-type="doi">10.1097/OPX.0000000000000233</pub-id> <pub-id pub-id-type="pmid">24681830</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Taylor</surname> <given-names>M. M.</given-names></name> <name><surname>Creelman</surname> <given-names>C. D.</given-names></name></person-group> (<year>1967</year>). <article-title>PEST: Efficient estimates on probability functions.</article-title> <source><italic>J. Acoust. Soc. Am.</italic></source> <volume>41</volume> <fpage>782</fpage>&#x2013;<lpage>787</lpage>. <pub-id pub-id-type="doi">10.1121/1.1910407</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tran</surname> <given-names>D. B.</given-names></name> <name><surname>Silverman</surname> <given-names>S. E.</given-names></name> <name><surname>Zimmerman</surname> <given-names>K.</given-names></name> <name><surname>Feldon</surname> <given-names>S. E.</given-names></name></person-group> (<year>1998</year>). <article-title>Age-related deterioration of motion perception and detection.</article-title> <source><italic>Graefe Arch. Clin. Exp. Ophthalmol.</italic></source> <volume>236</volume> <fpage>269</fpage>&#x2013;<lpage>273</lpage>. <pub-id pub-id-type="doi">10.1007/s004170050076</pub-id> <pub-id pub-id-type="pmid">9561359</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Turano</surname> <given-names>K.</given-names></name> <name><surname>Wong</surname> <given-names>X.</given-names></name></person-group> (<year>1992</year>). <article-title>Motion thresholds in retinitis pigmentosa.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>33</volume> <fpage>2411</fpage>&#x2013;<lpage>2422</lpage>.</citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walk</surname> <given-names>R. D.</given-names></name> <name><surname>Gibson</surname> <given-names>E. J.</given-names></name></person-group> (<year>1961</year>). <article-title>A comparative and analytical study of visual depth perception.</article-title> <source><italic>Psychol. Monogr.</italic></source> <volume>75</volume> <fpage>1</fpage>&#x2013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1037/h0093827</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Srinivasan</surname> <given-names>M.</given-names></name></person-group> (<year>2014</year>). <article-title>Stepping in the direction of the fall: the next foot placement can be predicted from current upper body state in steady-state walking.</article-title> <source><italic>Biol. Lett.</italic></source> <volume>10</volume>:<issue>20140405</issue>. <pub-id pub-id-type="doi">10.1098/rsbl.2014.0405</pub-id> <pub-id pub-id-type="pmid">25252834</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Warren</surname> <given-names>W. H.</given-names></name> <name><surname>Blackwell</surname> <given-names>A. W.</given-names></name> <name><surname>Morris</surname> <given-names>M. W.</given-names></name></person-group> (<year>1989</year>). <article-title>Age differences in perceiving the direction of self-motion from optical flow.</article-title> <source><italic>J. Gerontol.</italic></source> <volume>44</volume> <fpage>147</fpage>&#x2013;<lpage>153</lpage>. <pub-id pub-id-type="doi">10.1093/geronj/44.5.P147</pub-id> <pub-id pub-id-type="pmid">2768773</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Warren</surname> <given-names>W. H.</given-names></name> <name><surname>Hannon</surname> <given-names>D. J.</given-names></name></person-group> (<year>1988</year>). <article-title>Direction of self-motion is perceived from optical flow.</article-title> <source><italic>Nature</italic></source> <volume>336</volume> <fpage>162</fpage>&#x2013;<lpage>163</lpage>. <pub-id pub-id-type="doi">10.1038/336162a0</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wood</surname> <given-names>J. M.</given-names></name> <name><surname>Bullimore</surname> <given-names>M. A.</given-names></name></person-group> (<year>1995</year>). <article-title>Changes in the lower displacement limit for motion with age.</article-title> <source><italic>Ophthalmic Physiol. Opt.</italic></source> <volume>15</volume> <fpage>31</fpage>&#x2013;<lpage>36</lpage>.</citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wood</surname> <given-names>J. M.</given-names></name> <name><surname>Lacherez</surname> <given-names>P.</given-names></name> <name><surname>Black</surname> <given-names>A. A.</given-names></name> <name><surname>Cole</surname> <given-names>M. H.</given-names></name> <name><surname>Boon</surname> <given-names>M. Y.</given-names></name> <name><surname>Kerr</surname> <given-names>G. K.</given-names></name></person-group> (<year>2011</year>). <article-title>Risk of falls, injurious falls, and other injuries resulting from visual impairment among older adults with age-related macular degeneration.</article-title> <source><italic>Invest. Ophthalmol. Vis. Sci.</italic></source> <volume>52</volume>:<issue>5088</issue>. <pub-id pub-id-type="doi">10.1167/iovs.10-6644</pub-id> <pub-id pub-id-type="pmid">21474773</pub-id></citation></ref>
</ref-list>
</back>
</article>