<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2024.1356680</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Human Neuroscience</subject>
<subj-group>
<subject>Mini Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Interpersonal eye-tracking reveals the dynamics of interacting minds</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Wohltjen</surname> <given-names>Sophie</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1498953/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Wheatley</surname> <given-names>Thalia</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/49304/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/funding-acquisition/"/>
<role content-type="https://credit.niso.org/contributor-roles/resources/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, University of Wisconsin&#x02013;Madison</institution>, <addr-line>Madison, WI</addr-line>, <country>United States</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Psychological and Brain Sciences, Consortium for Interacting Minds, Dartmouth College</institution>, <addr-line>Hanover, NH</addr-line>, <country>United States</country></aff>
<aff id="aff3"><sup>3</sup><institution>Santa Fe Institute</institution>, <addr-line>Santa Fe, NM</addr-line>, <country>United States</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Ilanit Gordon, Bar-Ilan University, Israel</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Kuan-Hua Chen, University of Nebraska Medical Center, United States</p>
<p>Takahiko Koike, RIKEN Center for Brain Science (CBS), Japan</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Sophie Wohltjen <email>wohltjen&#x00040;wisc.edu</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>12</day>
<month>03</month>
<year>2024</year>
</pub-date>
<pub-date pub-type="collection">
<year>2024</year>
</pub-date>
<volume>18</volume>
<elocation-id>1356680</elocation-id>
<history>
<date date-type="received">
<day>16</day>
<month>12</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>20</day>
<month>02</month>
<year>2024</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2024 Wohltjen and Wheatley.</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Wohltjen and Wheatley</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>The human eye is a rich source of information about where, when, and how we attend. Our gaze paths indicate where and what captures our attention, while changes in pupil size can signal surprise, revealing our expectations. Similarly, the pattern of our blinks suggests levels of alertness and when our attention shifts between external engagement and internal thought. During interactions with others, these cues reveal how we coordinate and share our mental states. To leverage these insights effectively, we need accurate, timely methods to observe these cues as they naturally unfold. Advances in eye-tracking technology now enable real-time observation of these cues, shedding light on mutual cognitive processes that foster shared understanding, collaborative thought, and social connection. This brief review highlights these advances and the new opportunities they present for future research.</p></abstract>
<kwd-group>
<kwd>eye-tracking</kwd>
<kwd>social attention</kwd>
<kwd>gaze</kwd>
<kwd>blinking</kwd>
<kwd>pupillometry</kwd>
<kwd>social interaction</kwd>
<kwd>naturalistic experimental design</kwd>
</kwd-group>
<counts>
<fig-count count="1"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="111"/>
<page-count count="9"/>
<word-count count="7496"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Interacting Minds and Brains</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>1 Introduction</title>
<p>Social interactions form the bedrock of human experience, shaping our emotions, beliefs, and behaviors while fostering a sense of belonging and purpose (Cacioppo and Patrick, <xref ref-type="bibr" rid="B9">2008</xref>). But the mechanics of this crucial behavior remain a mystery: How does the brain activity of one individual influence the brain activity of another, enabling the transfer of thoughts, feelings, and beliefs? Moreover, how do interacting minds create new ideas that cannot be traced back to either individual alone? Only recently has science begun to tackle these fundamental questions, utilizing new methods that can trace the complex dynamics of minds that adapt to each other (Risko et al., <xref ref-type="bibr" rid="B93">2016</xref>). Still far from mainstream (Schilbach et al., <xref ref-type="bibr" rid="B98">2013</xref>; Schilbach, <xref ref-type="bibr" rid="B97">2016</xref>), the science of interacting minds is a growing field. This growth is driven by the realization that relying solely on static models and single-participant studies has constrained our understanding of the human mind (Wheatley et al., <xref ref-type="bibr" rid="B106">2023</xref>).</p>
<p>Tackling these core questions of mind-to-mind influence presents significant challenges. The seemingly effortless and common nature of interaction masks its underlying complexity (Garrod and Pickering, <xref ref-type="bibr" rid="B31">2004</xref>). Even the simplest interaction involves multiple communication channels, leading to the continuous reshaping of thought (Shockley et al., <xref ref-type="bibr" rid="B101">2009</xref>). Commensurate with this complexity, many methodologies struggle to capture the ongoing, reciprocal dynamics. The restrictive environments of neuroimaging machines and concerns of motion artifact (Fargier et al., <xref ref-type="bibr" rid="B27">2018</xref>) make scanning interacting minds challenging (Pinti et al., <xref ref-type="bibr" rid="B84">2020</xref>), while behavioral studies can be laborious and challenging to scale. Other physiological signals, such as heart rate, offer valuable insights into the synchrony between people, a phenomenon that may be amplified by mutual attention (for example, individuals listening to the same story may experience synchronized heart rate variations; P&#x000E9;rez et al., <xref ref-type="bibr" rid="B82">2021</xref>). Nonetheless, these indicators often fall short in providing detailed insights into how individuals coordinate their attention with one another from one moment to the next.</p>
<p>One technique has increasingly surmounted these challenges: eye-tracking. While eye-tracking is not a neuroimaging method in the traditional sense, pupil dilations under consistent lighting are tightly correlated with activity in the brain&#x00027;s locus coeruleus (Rajkowski, <xref ref-type="bibr" rid="B89">1993</xref>; Aston-Jones et al., <xref ref-type="bibr" rid="B4">1994</xref>), the neural hub integral to attention. Fluctuations in pupil size track the release of norepinephrine, providing a temporally sensitive measure of when attention is modulated (Joshi et al., <xref ref-type="bibr" rid="B52">2016</xref>). Further, gaze direction and blink rate offer their own insights into what people find interesting and how they shift their attention between internal thought and the external world. This one technique thus produces multiple sources of information about the mind that are temporally sensitive and can be monitored passively without affecting the unfolding of natural responses.</p>
<p>Interpersonal eye-tracking&#x02014;eye tracking with two or more individuals&#x02014;captures the moment-by-moment attentional dynamics as people interact. This technique is already bearing fruit. For example, research has demonstrated its use in detecting the emergence of mutual understanding. When people attend in the same way, their pupil dilations synchronize, providing a visual cue of minds in sync (Kang and Wheatley, <xref ref-type="bibr" rid="B55">2017</xref>; Nencheva et al., <xref ref-type="bibr" rid="B80">2021</xref>). Similarly, correspondence between people&#x00027;s gaze trajectories, blink rates, and eye contact provides additional cues that reveal how minds interact (Richardson et al., <xref ref-type="bibr" rid="B92">2007</xref>; Nakano, <xref ref-type="bibr" rid="B76">2015</xref>; Capozzi and Ristic, <xref ref-type="bibr" rid="B12">2022</xref>; Mayrand et al., <xref ref-type="bibr" rid="B69">2023</xref>). With the emergence of wearable devices, such as eye-tracking glasses, we can more easily monitor these cues as they unfold naturally during social interactions in ways that are portable across diverse settings, demand minimal setup, and are scalable to larger groups. Coupled with new, innovative analytical techniques, these recent advances have made eye-tracking a portable, inexpensive, temporally precise, and efficient tool for addressing fundamental questions about the bidirectional neural influence of interaction and how these processes may differ in populations that find communication challenging (e.g., Autism Spectrum Conditions). In this mini review, we briefly describe the evolution of this technique and its promise for deepening our understanding of human sociality.</p>
</sec>
<sec id="s2">
<title>2 Major advances in eye-tracking</title>
<p>The first eye-trackers were designed as stationary machines, with a participant&#x00027;s head stabilized by a chin rest or bite-bar, restricting movement and field of view (Hartridge and Thomson, <xref ref-type="bibr" rid="B38">1948</xref>; Mackworth and Mackworth, <xref ref-type="bibr" rid="B64">1958</xref>; P&#x00142;u&#x0017C;yczka and Warszawski, <xref ref-type="bibr" rid="B85">2018</xref>). Later, head-mounted eye-tracking cameras were developed (Mackworth and Thomas, <xref ref-type="bibr" rid="B65">1962</xref>) but remained burdensome, restrictive, and required prolonged and frequent calibration, making them unsuitable for the study of social interaction (Hornof and Halverson, <xref ref-type="bibr" rid="B48">2002</xref>). In recent years, eye-tracking technology has witnessed a rapid evolution. In this section, we will highlight some of these recent developments and explore how they have transformed interpersonal eye-tracking into an indispensable resource for understanding social interaction.</p>
<p>Recent technological progress has enabled the eye-tracking of dyads and groups without disrupting their complex exchange of communicative signals. For example, software innovations now automate calibration (e.g., Kassner et al., <xref ref-type="bibr" rid="B57">2014</xref>), synchronize data from multiple devices (e.g., openSync; Razavi et al., <xref ref-type="bibr" rid="B91">2022</xref>) and simplify the analysis of eye-tracking data collected in naturalistic settings (e.g., iMotions, <xref ref-type="bibr" rid="B50">2022</xref>). These breakthroughs streamline device setup, eliminate the need for intrusive recalibrations, and facilitate analysis of gaze and pupil data in real-time. Packages developed within Python (e.g., PyTrack; Ghose et al., <xref ref-type="bibr" rid="B33">2020</xref>), Matlab (e.g., PuPl; Kinley and Levy, <xref ref-type="bibr" rid="B59">2022</xref>; CHAP; Hershman et al., <xref ref-type="bibr" rid="B43">2019</xref>), and R (e.g., gazeR; Geller et al., <xref ref-type="bibr" rid="B32">2020</xref>) also streamline preprocessing and analysis of eye movement and pupillometry data. Eye-tracking glasses, designed to be worn like regular glasses, afford a wider range of motion, allowing natural facial expressions (Valtakari et al., <xref ref-type="bibr" rid="B104">2021</xref>) and gestures, such as the frequent head nods that regulate interaction (McClave, <xref ref-type="bibr" rid="B71">2000</xref>).</p>
<p>New analysis methods for continuous data are better equipped to handle non-linearity and non-stationarity, making them invaluable for quantifying the real-time interplay between the eyes of interacting dyads and groups. For example, Dynamic Time Warping (Berndt and Clifford, <xref ref-type="bibr" rid="B6">1994</xref>) is a non-linear method often used in speech recognition software for aligning time-shifted signals. This method is useful for capturing alignment in social interactions in a way that accounts for noisy, high-resolution data, leader-follower dynamics, or other natural features of social interactions where alignment is present but not precisely time-locked. Recent research has employed this method to measure synchrony between two continuous pupillary time series (a measure of shared attention&#x02014;Kang and Wheatley, <xref ref-type="bibr" rid="B55">2017</xref>; Nencheva et al., <xref ref-type="bibr" rid="B80">2021</xref>; Fink et al., <xref ref-type="bibr" rid="B28">2023</xref>) as people interact (see Section 3 for a detailed description of this phenomenon). Cross-recurrence quantification analysis (Zbilut et al., <xref ref-type="bibr" rid="B110">1998</xref>) quantifies the shared dynamics between two systems, determining lag and identifying leaders and followers during interactions via their gaze behavior (Fusaroli et al., <xref ref-type="bibr" rid="B30">2014</xref>). Advanced methods now allow scientists to analyze the interactions between multiple individuals&#x00027; eye-tracking data. Multi-Level Vector Auto-Regression (Epskamp et al., <xref ref-type="bibr" rid="B25">2024</xref>) estimates multiple networks of relationships between time series variables, where variables are nodes in the network and edges represent correlations between variables. 
This method has been used to quantify the relationships between gaze fixation duration and dispersion (Moulder et al., <xref ref-type="bibr" rid="B73">2022</xref>) as well as eye contact, pupil size, and pupillary synchrony (Wohltjen and Wheatley, <xref ref-type="bibr" rid="B109">2021</xref>). Other advanced methods include cross-correlation and reverse correlation (Brinkman et al., <xref ref-type="bibr" rid="B8">2017</xref>), Detrended Fluctuation Analysis (Peng et al., <xref ref-type="bibr" rid="B81">1994</xref>), and deconvolution (Wierda et al., <xref ref-type="bibr" rid="B107">2012</xref>), with the number of analysis techniques continually increasing.</p>
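To make the logic of Dynamic Time Warping concrete, the following is a minimal Python sketch (our illustration, not code from any toolbox cited above; function and variable names are hypothetical) of the classic dynamic-programming alignment between two pupil-size series. A lower cumulative cost means the two trajectories trace similar shapes even when one lags the other.

```python
import numpy as np

def dtw_distance(x, y):
    """Cumulative cost of the best DTW alignment between two 1-D series."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # stretch y
                                 cost[i, j - 1],      # stretch x
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

t = np.linspace(0, 2 * np.pi, 100)
pupil_a = np.sin(t)
pupil_b = np.sin(t - 0.3)  # same waveform, lagged by a fixed offset
print(dtw_distance(pupil_a, pupil_b))        # small cost: shapes align
print(dtw_distance(pupil_a, np.cos(3 * t)))  # larger cost: shapes differ
```

Because the warping path can absorb a lag, the first comparison yields a small cost despite the time shift, which is exactly why DTW suits leader-follower dynamics where alignment is present but not time-locked.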
<p>Eye-tracking software is continually improving and increasingly open-source, making synchronization of recordings from different eye-trackers more efficient. When analyzing the correspondence between multiple eye-tracking time series, many methods exist that account for the non-linearity of eye-tracking data and leverage the multiple measurements the device captures. Together, these advances have made it relatively simple to collect and analyze eye-tracking data from multiple interacting people across diverse settings.</p>
</sec>
<sec id="s3">
<title>3 What can we learn from interpersonal eye-tracking?</title>
<p>Modern eye-trackers capture several physiological correlates of social attention, including gaze trajectories, pupil dilations, and blink behavior (<xref ref-type="fig" rid="F1">Figure 1</xref>). In this section, we outline how interpersonal eye-tracking leverages these signals to reveal the coordinated dynamics of interacting minds.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Schematic of three forms of social attention that can be measured using interpersonal eye-tracking. Gaze, pupil size, and eye blinks each reveal unique information about how people dynamically coordinate their attention with each other.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnhum-18-1356680-g0001.tif"/>
</fig>
<sec>
<title>3.1 Interpersonal gaze: coordinating the &#x0201C;where&#x0201D; of attention</title>
<p>The focus of one&#x00027;s gaze has long been understood as a &#x0201C;spotlight&#x0201D; of attention (Posner et al., <xref ref-type="bibr" rid="B86">1980</xref>), revealing what a person finds most informative about a scene. During social interaction, gaze is often concentrated on the interaction partner&#x00027;s eyes and mouth (Rogers et al., <xref ref-type="bibr" rid="B94">2018</xref>). By virtue of this association, gaze also communicates the focus of one&#x00027;s attention to others (Argyle and Dean, <xref ref-type="bibr" rid="B2">1965</xref>; Gobel et al., <xref ref-type="bibr" rid="B34">2015</xref>).</p>
<p>This dual function of gaze&#x02014;perception and communication&#x02014;results in well-established social gaze patterns such as joint attention (Gobel et al., <xref ref-type="bibr" rid="B34">2015</xref>). In joint attention, one person uses their gaze to <italic>communicate</italic> where to look, at which point their partner follows their gaze so that both people are <italic>perceiving</italic> the same thing (Scaife and Bruner, <xref ref-type="bibr" rid="B96">1975</xref>). Joint attention emerges early in life and is critical for the development of neural systems that support social cognition (Mundy, <xref ref-type="bibr" rid="B74">2017</xref>). Another common gaze pattern, mutual gaze (or eye contact), involves a simultaneous process of signaling and perceiving by both parties, embodying a dual role of giving and receiving within the same action (Heron, <xref ref-type="bibr" rid="B41">1970</xref>). Interpersonal eye-tracking provides a unique opportunity to measure both the perceptual and communicative functions of gaze, simultaneously, as they unfold during real interactions. In this vein, a recent study demonstrated that eye contact made people more attentive to the gaze patterns of their conversation partners (Mayrand et al., <xref ref-type="bibr" rid="B69">2023</xref>) and predicted greater across-brain coherence in social brain structures (Hirsch et al., <xref ref-type="bibr" rid="B46">2017</xref>; Dravida et al., <xref ref-type="bibr" rid="B22">2020</xref>). Coupled gaze patterns between conversation partners are associated with the amount of knowledge the partners share about the conversation topic at hand (Richardson et al., <xref ref-type="bibr" rid="B92">2007</xref>). Thus, attention to another&#x00027;s gaze and the coupling of those gaze patterns appear to indicate engagement and shared understanding.</p>
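In practice, gaze patterns such as mutual gaze are often scored from time-aligned area-of-interest (AOI) streams recorded by each partner's tracker. As a toy illustration (our sketch; variable names are hypothetical, and a real pipeline must first map each wearable tracker's gaze onto the partner's face), eye contact can be quantified as the fraction of samples in which both partners fixate each other's eye region simultaneously:

```python
import numpy as np

def mutual_gaze_ratio(a_on_b_eyes, b_on_a_eyes):
    """Fraction of time-aligned samples in which both partners
    fixate each other's eye region at the same moment (eye contact)."""
    both = np.logical_and(a_on_b_eyes, b_on_a_eyes)
    return both.mean()

# Boolean AOI streams: True = looking at the partner's eyes on that sample
a = np.array([1, 1, 0, 1, 0, 1], dtype=bool)
b = np.array([1, 0, 0, 1, 1, 1], dtype=bool)
print(mutual_gaze_ratio(a, b))  # 0.5
```

The same boolean streams can also be cross-correlated at a lag to ask who looks first, separating the communicative (signaling) from the perceptual (following) role of gaze.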
</sec>
<sec>
<title>3.2 Interpersonal pupil dilations: tracking when minds are &#x0201C;in sync&#x0201D;</title>
<p>Under consistent lighting, pupil dilations are tightly correlated with the release of norepinephrine in the brain&#x00027;s locus coeruleus (Rajkowski, <xref ref-type="bibr" rid="B89">1993</xref>; Aston-Jones et al., <xref ref-type="bibr" rid="B4">1994</xref>; Joshi et al., <xref ref-type="bibr" rid="B52">2016</xref>), the neural hub integral to attention. When presented with a periodic visual (Naber et al., <xref ref-type="bibr" rid="B75">2013</xref>) or auditory (Fink et al., <xref ref-type="bibr" rid="B29">2018</xref>) stimulus, the pupil dilates and constricts in tandem with the stimulus presentation, which is thought to be the product of prediction (Schwiedrzik and Sudmann, <xref ref-type="bibr" rid="B99">2020</xref>). Pupillary entrainment can also occur with more complex and naturally varying stimuli such as music (Kang et al., <xref ref-type="bibr" rid="B56">2014</xref>; Kang and Banaji, <xref ref-type="bibr" rid="B53">2020</xref>) or speech (Kang and Wheatley, <xref ref-type="bibr" rid="B55">2017</xref>).</p>
<p>When people attend to the same dynamic stimulus in the same way, their pupil dilations synchronize, providing a visual indicator of synchronized minds (Kang and Wheatley, <xref ref-type="bibr" rid="B54">2015</xref>; Nencheva et al., <xref ref-type="bibr" rid="B80">2021</xref>; Fink et al., <xref ref-type="bibr" rid="B28">2023</xref>). For example, when two people listen to the same story, their pupils may constrict and dilate in synchrony, indicating that they are similarly anticipating, moment by moment, what will happen next. Synchronized attention, often referred to as &#x0201C;shared attention,&#x0201D; has wide-ranging social benefits including social verification (i.e., the sense that one&#x00027;s subjective reality is validated by virtue of it being shared; Hardin and Higgins, <xref ref-type="bibr" rid="B37">1996</xref>; Echterhoff et al., <xref ref-type="bibr" rid="B23">2009</xref>), heightened perspective-taking (Smith and Mackie, <xref ref-type="bibr" rid="B102">2016</xref>), better memory (Eskenazi et al., <xref ref-type="bibr" rid="B26">2013</xref>; He et al., <xref ref-type="bibr" rid="B40">2014</xref>), and a feeling of social connection (Cheong et al., <xref ref-type="bibr" rid="B14">2023</xref>). This form of synchrony does not require interaction; it is simply a dynamic measurement of how similarly people attend to the same stimulus. It can be measured even when individuals are eye-tracked on separate occasions and are unable to see each other, ruling out pupil mimicry as an underlying cause (Prochazkova et al., <xref ref-type="bibr" rid="B87">2018</xref>).</p>
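As a concrete illustration of how such synchrony can be quantified, the sketch below (a deliberate simplification of ours, not the pipeline of the studies cited above, which often use warping-based alignment instead) computes a sliding-window Pearson correlation between two simulated pupil traces driven by the same stimulus:

```python
import numpy as np

def windowed_synchrony(pupil_a, pupil_b, win=50, step=10):
    """Sliding-window Pearson correlation between two equally
    sampled pupil-size series; returns one r value per window."""
    rs = []
    for start in range(0, len(pupil_a) - win + 1, step):
        a = pupil_a[start:start + win]
        b = pupil_b[start:start + win]
        rs.append(np.corrcoef(a, b)[0, 1])
    return np.array(rs)

rng = np.random.default_rng(0)
stimulus = np.sin(np.linspace(0, 8 * np.pi, 500))       # shared "story" dynamics
listener_1 = stimulus + 0.3 * rng.standard_normal(500)  # noisy pupil trace
listener_2 = stimulus + 0.3 * rng.standard_normal(500)
stranger = rng.standard_normal(500)                     # unrelated trace

print(windowed_synchrony(listener_1, listener_2).mean())  # high: shared attention
print(windowed_synchrony(listener_1, stranger).mean())    # near zero
```

Windowing matters here: it yields a moment-by-moment synchrony trace rather than one summary number, which is what allows rises and falls in shared attention to be related to events such as eye contact.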
<p>Recent research investigating pupillary synchrony has uncovered new insights about shared attention. For example, comparing the similarity of toddlers&#x00027; pupillary dilations as they listened to a story told with child-directed vs. adult-directed speech intonation, Nencheva et al. (<xref ref-type="bibr" rid="B80">2021</xref>) found that toddlers had more similar pupillary dilation patterns when hearing child-directed speech, suggesting that it helped entrain their attention. In a conversation study using pupillary synchrony as a metric of shared attention, researchers found that eye contact marked when shared attention rose and fell. Specifically, eye contact occurred as interpersonal pupillary synchrony peaked, at which point synchrony progressively declined until eye contact was broken (Wohltjen and Wheatley, <xref ref-type="bibr" rid="B109">2021</xref>). This suggests that eye contact may communicate high shared attention but also may disrupt that shared focus, possibly to allow for the emergence of independent thinking necessary for conversation to evolve. This may help explain why conversations that are more engaging tend to have more eye contact (Argyle and Dean, <xref ref-type="bibr" rid="B2">1965</xref>; Mazur et al., <xref ref-type="bibr" rid="B70">1980</xref>; Jarick and Bencic, <xref ref-type="bibr" rid="B51">2019</xref>; Dravida et al., <xref ref-type="bibr" rid="B22">2020</xref>).</p>
</sec>
<sec>
<title>3.3 Interpersonal blink rate: marking changes in cognitive states</title>
<p>When measuring continuous gaze and pupil size, the signal will often be momentarily lost. These moments, caused by blinks, are commonly discarded from eye-tracking analyses yet their timing and frequency are non-random and offer their own clues about the mind (Hershman et al., <xref ref-type="bibr" rid="B42">2018</xref>).</p>
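Because blinks appear in the record as brief signal dropouts, a first pass at recovering them is simply to find runs of missing samples of plausible blink duration. A minimal sketch (ours; the length thresholds are illustrative and depend on the tracker's sampling rate):

```python
import numpy as np

def blink_events(pupil, min_len=3, max_len=30):
    """Return (start, end) sample indices of NaN runs whose length
    is blink-like; shorter or longer gaps are treated as tracker noise."""
    missing = np.isnan(pupil).astype(int)
    edges = np.diff(missing, prepend=0, append=0)  # +1 marks a run start, -1 a run end
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    return [(int(s), int(e)) for s, e in zip(starts, ends)
            if min_len <= e - s <= max_len]

trace = np.ones(100)
trace[20:28] = np.nan  # an 8-sample gap: plausible blink
trace[60] = np.nan     # single dropped sample: ignored
print(blink_events(trace))  # [(20, 28)]
```

The resulting event times can then be compared across partners, e.g., to test whether one person's blinks cluster shortly after the other's.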
<p>People spontaneously blink every 3 s on average, more often than is necessary for lubricating the eyes (Doane, <xref ref-type="bibr" rid="B21">1980</xref>). Furthermore, the rate of spontaneous eye blinking varies with cognitive states. It changes when chunking information (Stern et al., <xref ref-type="bibr" rid="B103">1984</xref>) or when attending to the rhythmic sequence of presented tones (Huber et al., <xref ref-type="bibr" rid="B49">2022</xref>). Blink rate decreases as attentional demands grow and increases with boredom (Maffei and Angrilli, <xref ref-type="bibr" rid="B66">2018</xref>). However, blink rate also increases with indications of engagement, such as arousal (Stern et al., <xref ref-type="bibr" rid="B103">1984</xref>; Bentivoglio et al., <xref ref-type="bibr" rid="B5">1997</xref>) and attentional switching (Rac-Lubashevsky et al., <xref ref-type="bibr" rid="B88">2017</xref>). Although it seems paradoxical to blink more when bored <italic>and</italic> when engaged, these findings are explained by the role of blinking in the various tasks in which it has been measured. Blinks are related to increased default mode, hippocampal, and cerebellar activity and decreased dorsal and ventral attention network activity. This suggests that blinks may facilitate the transition between outward and inward states of focus (Nakano et al., <xref ref-type="bibr" rid="B77">2013</xref>; Nakano, <xref ref-type="bibr" rid="B76">2015</xref>). As a result, people might blink more frequently when they feel bored due to periodic disengagement, oscillating between focusing on the external environment and their internal thoughts. Additionally, increased blinking occurs in activities requiring regular alternation between external and internal attention, like when participating in conversation (Bentivoglio et al., <xref ref-type="bibr" rid="B5">1997</xref>).</p>
<p>Scientists have discovered intriguing patterns in how people coordinate their eye blinks during interactions. Cummins (<xref ref-type="bibr" rid="B16">2012</xref>) observed that individuals strategically adjust their blink rates during conversations based on their partners&#x00027; gaze direction, indicating shifts between internal and external attention. Moreover, researchers have found that people tend to synchronize their blinks with others during problem-solving tasks (Hoffmann et al., <xref ref-type="bibr" rid="B47">2023</xref>) and conversations (Nakano and Kitazawa, <xref ref-type="bibr" rid="B78">2010</xref>; Gupta et al., <xref ref-type="bibr" rid="B35">2019</xref>), reflecting mutual transitions between cognitive states. Similarly, Nakano and Miyazaki (<xref ref-type="bibr" rid="B79">2019</xref>) noted that people who found videos engaging blinked in sync, suggesting shared processing of the content. These studies demonstrate how blinks can signify when interaction partners collectively shift between cognitive states.</p>
<p>Interpersonal eye-tracking records multiple dynamic features that each yield unique insights about how attention is dynamically coordinated and communicated when minds interact (Richardson et al., <xref ref-type="bibr" rid="B92">2007</xref>; Nakano, <xref ref-type="bibr" rid="B76">2015</xref>; Capozzi and Ristic, <xref ref-type="bibr" rid="B12">2022</xref>; Mayrand et al., <xref ref-type="bibr" rid="B69">2023</xref>). Blinking, eye gaze, and pupillary synchrony each reflect dissociable aspects of social attention (see <xref ref-type="fig" rid="F1">Figure 1</xref>). However, these components likely complement and dynamically interact with each other to support social engagement. For example, when pupillary synchrony between conversation partners peaks, eye contact occurs. Coincident with the onset of eye contact, pupillary synchrony declines until eye contact breaks (Wohltjen and Wheatley, <xref ref-type="bibr" rid="B109">2021</xref>). This precise temporal relationship between gaze and pupillary synchrony highlights the importance of combining these measures to shed light on how these components work together as an integrated system.</p>
</sec>
</sec>
<sec id="s4">
<title>4 Future directions in interpersonal eye-tracking</title>
<p>As interpersonal eye-tracking technology continues to advance, many long-standing questions about interacting minds are newly tractable. In this section, we discuss some of these open research areas, highlighting the untapped potential of interpersonal eye-tracking for the future of social scientific research.</p>
<sec>
<title>4.1 Testing existing theories in ecologically-valid scenarios</title>
<p>Social interaction is immensely complex and difficult to study in controlled laboratory conditions. The ecological validity afforded by interpersonal eye-tracking allows researchers to test how human minds naturally coordinate their attention in ways that afford the sharing and creation of knowledge (Kingstone et al., <xref ref-type="bibr" rid="B58">2003</xref>; Risko et al., <xref ref-type="bibr" rid="B93">2016</xref>). This additional ecological validity is instrumental in discerning the generalizability of psychological theories developed in tightly controlled conditions.</p>
<p>Three recent examples from eye-tracking research highlight potential discrepancies between controlled and naturalistic paradigms. First, a substantial body of research utilizing static images of faces has consistently shown that East Asian participants tend to avoid looking at the eye region more than their Western counterparts (Blais et al., <xref ref-type="bibr" rid="B7">2008</xref>; Akechi et al., <xref ref-type="bibr" rid="B1">2013</xref>; Senju et al., <xref ref-type="bibr" rid="B100">2013</xref>). However, when examined within the context of live social interactions, a striking reversal of this pattern emerges (Haensel et al., <xref ref-type="bibr" rid="B36">2022</xref>). Second, interpersonal eye-tracking has shown that people engage in significantly less mutual gaze than traditional non-naturalistic paradigms would predict. This behavior likely stems from individuals&#x00027; reluctance to appear as though they are fixating on their interaction partner, a concern that is absent when looking at static images (Laidlaw et al., <xref ref-type="bibr" rid="B60">2016</xref>; Macdonald and Tatler, <xref ref-type="bibr" rid="B63">2018</xref>). Third, interpersonal eye-tracking has revealed how social context can change gaze behavior. For example, when pairs of participants were assigned roles in a collaborative task (e.g., &#x0201C;chef&#x0201D; and &#x0201C;gatherer&#x0201D; for baking), they looked at each other more and aligned their gaze faster than pairs who were not assigned roles (Macdonald and Tatler, <xref ref-type="bibr" rid="B63">2018</xref>), suggesting that social roles may help coordinate attention. In the domain of autism research, there has been ongoing debate regarding how autistic individuals use nonverbal cues, such as eye contact. Interpersonal eye-tracking has been proposed to mitigate this lack of consensus by placing experiments in more ecologically valid, interactive scenarios (Laskowitz et al., <xref ref-type="bibr" rid="B61">2021</xref>). 
With its potential for portability and ease of use, interpersonal eye-tracking offers new opportunities to test existing psychological theories and generate new insights.</p>
<p>Interpersonal eye-tracking also affords direct, side-by-side comparisons of attentional dynamics between tightly controlled lab conditions and more ecologically valid (but less controlled) contexts. For example, Wohltjen et al. (<xref ref-type="bibr" rid="B108">2023</xref>) directly compared an individual&#x00027;s attention to rigidly spaced single tones with how well that individual shares attention with a storyteller. Individuals whose pupils tended to synchronize with the tones were also more likely to synchronize their pupil dilations with those of the storyteller. This suggests that the tendency to synchronize one&#x00027;s attention is a reliable individual difference that varies in the human population, manifests across levels of complexity (from highly structured to continuously varying dynamics), and predicts synchrony between minds.</p>
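<p>To make the notion of pupillary synchrony concrete, one common way to quantify it is a lag-resolved correlation between two pupil-diameter time series. The sketch below is a minimal illustration of that general approach (z-scored cross-correlation), not the specific analysis pipeline used by Wohltjen et al. (2023); the function name and the <monospace>max_lag</monospace> parameter are illustrative assumptions.</p>

```python
import numpy as np

def pupil_synchrony(pupil_a, pupil_b, max_lag=30):
    """Cross-correlate two z-scored pupil-diameter series (NumPy arrays,
    same length and sampling rate). Returns the peak correlation and the
    lag (in samples) at which it occurs; a negative peak lag indicates
    that pupil_b trails pupil_a."""
    a = (pupil_a - pupil_a.mean()) / pupil_a.std()
    b = (pupil_b - pupil_b.mean()) / pupil_b.std()
    lags = list(range(-max_lag, max_lag + 1))
    corrs = []
    for lag in lags:
        if lag < 0:
            # compare a's earlier samples with b's later samples
            r = np.corrcoef(a[:lag], b[-lag:])[0, 1]
        elif lag > 0:
            # compare a's later samples with b's earlier samples
            r = np.corrcoef(a[lag:], b[:-lag])[0, 1]
        else:
            r = np.corrcoef(a, b)[0, 1]
        corrs.append(r)
    best = int(np.argmax(corrs))
    return corrs[best], lags[best]
```

<p>A peak near lag zero would indicate near-simultaneous coupling, while an offset peak would suggest that one partner&#x00027;s pupil consistently leads the other&#x00027;s.</p>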
</sec>
<sec>
<title>4.2 Tracking moment-to-moment fluctuations in coupled attention</title>
<p>In social interactions, behaviors are dynamic, constantly adjusting to the evolving needs of one&#x00027;s partner and the surrounding context. Techniques that sample a behavior sparsely in time, aggregate over time, or record from a single individual in a noninteractive setting miss important information relevant to interaction. An illustration of this issue can be found in the synchrony literature, which has long emphasized the benefits of synchrony for successful communication, shared understanding, and many other positive outcomes (Wheatley et al., <xref ref-type="bibr" rid="B105">2012</xref>; Hasson and Frith, <xref ref-type="bibr" rid="B39">2016</xref>; Launay et al., <xref ref-type="bibr" rid="B62">2016</xref>; Mogan et al., <xref ref-type="bibr" rid="B72">2017</xref>). Recent research using interpersonal eye-tracking suggests that synchrony is not always beneficial. Rather, intermittently <italic>breaking</italic> synchrony appears to be equally important (Dahan et al., <xref ref-type="bibr" rid="B17">2016</xref>; Wohltjen and Wheatley, <xref ref-type="bibr" rid="B109">2021</xref>; Ravreby et al., <xref ref-type="bibr" rid="B90">2022</xref>). Mayo and Gordon (<xref ref-type="bibr" rid="B68">2020</xref>) suggest that the tendencies to synchronize with one another and to act independently both exist during social interaction, and that flexibly moving between these two states is the hallmark of a truly adaptive social system. It is possible that several conversational mechanisms prompt this mental state-switching, such as topic changes (Egbert, <xref ref-type="bibr" rid="B24">1997</xref>), turn taking (David Mortensen, <xref ref-type="bibr" rid="B18">2011</xref>), and segments of conversation that communicate complete thoughts, or Turn Construction Units (Sacks et al., <xref ref-type="bibr" rid="B95">1978</xref>; Clayman, <xref ref-type="bibr" rid="B15">2012</xref>). 
Future work should investigate how fluctuations of pupillary synchrony, eye blinks and other nonverbal cues help coordinate the coupling-decoupling dynamics between minds that optimize the goals of social interaction.</p>
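<p>Such coupling-decoupling dynamics can be made visible with a time-resolved synchrony measure. The sketch below assumes two equal-length, preprocessed signals (e.g., pupil diameter from two interacting partners) and uses a sliding-window Pearson correlation; the window and step sizes are illustrative choices, not values drawn from the cited studies.</p>

```python
import numpy as np

def windowed_synchrony(x, y, win=60, step=10):
    """Sliding-window Pearson correlation between two equal-length
    NumPy time series. The resulting trace rises during periods of
    coupling and falls (or goes negative) during decoupling."""
    trace = []
    for start in range(0, len(x) - win + 1, step):
        xs = x[start:start + win]
        ys = y[start:start + win]
        trace.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(trace)
```

<p>Applied to an interaction, the troughs of such a trace would mark candidate moments of broken synchrony that could then be aligned with conversational events like topic changes or turn boundaries.</p>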
</sec>
<sec>
<title>4.3 Tracking the coordinated dynamics of groups</title>
<p>Interpersonal eye-tracking research has traditionally concentrated on dyadic interactions, but the introduction of cost-effective wearable eye-tracking devices has ushered in new possibilities for exploring the intricacies of social interactions within both small and larger groups (Pfeiffer et al., <xref ref-type="bibr" rid="B83">2013</xref>; Ca&#x000F1;igueral and Hamilton, <xref ref-type="bibr" rid="B10">2019</xref>; Mayrand et al., <xref ref-type="bibr" rid="B69">2023</xref>). For example, when studying group dynamics, wearable devices allow for the spontaneous head and body movements that naturally occur when interacting with multiple people, such as turning one&#x00027;s head to orient to the current speaker. Recent studies employing wearable or mobile eye-tracking technology in group settings have demonstrated a nuanced interplay of gaze direction, shared attention, and the exchange of nonverbal communication cues (Capozzi et al., <xref ref-type="bibr" rid="B11">2019</xref>; Maran et al., <xref ref-type="bibr" rid="B67">2021</xref>; Capozzi and Ristic, <xref ref-type="bibr" rid="B12">2022</xref>). These studies suggest that interactions are not only about where individuals look but also about the timing and duration of their gaze shifts. Moreover, these studies have highlighted the profound role of gaze as a potent social tool that contributes to the establishment of rapport (Mayrand et al., <xref ref-type="bibr" rid="B69">2023</xref>), the facilitation of group cohesion (Capozzi and Ristic, <xref ref-type="bibr" rid="B12">2022</xref>), and the negotiation of social hierarchies (Capozzi et al., <xref ref-type="bibr" rid="B11">2019</xref>). By increasingly extending the scope of interpersonal eye-tracking research beyond dyads, we stand to gain a more comprehensive understanding of the full spectrum of human social dynamics.</p>
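<p>As a concrete illustration of how group-level shared attention might be quantified from wearable eye-tracking output, the sketch below counts episodes in which every group member fixates the same labeled target. The data format (per-person sequences of gaze-target labels sampled at a common rate) and the minimum-duration threshold are simplifying assumptions, not the coding scheme used in the studies cited above.</p>

```python
def shared_attention_episodes(gaze_targets, min_len=3):
    """Given a list of per-person gaze-target label sequences (e.g.,
    'A', 'B', 'object'), return (target, duration) tuples for every
    episode in which all members looked at the same target for at
    least min_len consecutive samples."""
    episodes = []
    run_target, run_len = None, 0
    for frame in zip(*gaze_targets):
        # a frame counts as shared only if every member's label matches
        target = frame[0] if all(t == frame[0] for t in frame) else None
        if target is not None and target == run_target:
            run_len += 1
        else:
            if run_target is not None and run_len >= min_len:
                episodes.append((run_target, run_len))
            run_target = target
            run_len = 1 if target is not None else 0
    if run_target is not None and run_len >= min_len:
        episodes.append((run_target, run_len))
    return episodes
```

<p>Extending such a measure with timing information (when episodes begin relative to speech turns, for instance) would capture the interplay of gaze direction and shared attention that these group studies describe.</p>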
</sec>
</sec>
<sec sec-type="discussion" id="s5">
<title>5 Discussion</title>
<p>Social interaction is a remarkably intricate process that involves the integration of numerous continuous streams of information. Our comprehension of this crucial behavior has been limited by historical and methodological constraints that have made it challenging to study more than one individual at a time. However, recent advances in eye-tracking technology have revolutionized our ability to measure interactions with high temporal precision, in natural social settings, and in ways that are scalable from dyads to larger groups. The term &#x0201C;eye tracking&#x0201D; belies the wealth of data these devices capture. Parameters such as gaze direction, pupillary dynamics, and blinks each offer unique insights into the human mind. In combination, these metrics shed light on how minds work together to facilitate the sharing and co-creation of thought.</p>
<p>Interpersonal eye-tracking provides exciting opportunities for clinical and educational applications. For example, the relatively low cost, ease of setup, and ability to capture attentional dynamics unobtrusively during interaction make interpersonal eye-tracking a promising clinical tool for studying communication difficulties in neurodiverse populations, such as people with Autism Spectrum Conditions (Laskowitz et al., <xref ref-type="bibr" rid="B61">2021</xref>). A recent meta-analysis found that pupil responses in ASC have longer latencies (de Vries et al., <xref ref-type="bibr" rid="B19">2021</xref>), with implications for coordination dynamics in turn-taking. Further, gaze patterns are also diagnostic of ASC from infancy (Zwaigenbaum et al., <xref ref-type="bibr" rid="B111">2005</xref>; Chawarska et al., <xref ref-type="bibr" rid="B13">2012</xref>), with implications for how gaze regulates social interaction (Ca&#x000F1;igueral and Hamilton, <xref ref-type="bibr" rid="B10">2019</xref>). Eye-tracking research has also shown that people with aphasia have language comprehension deficits that are partially explained by difficulties in dynamically allocating attention (Heuer and Hallowell, <xref ref-type="bibr" rid="B45">2015</xref>). By using eye-tracking to pinpoint the moments in natural social interaction that increase attention demands, we can learn how social interaction may be adjusted to aid people with attentional difficulties. Interpersonal eye-tracking also has clear implications for understanding how teacher-student and peer-to-peer interactions scaffold learning (e.g., Dikker et al., <xref ref-type="bibr" rid="B20">2017</xref>). We are excited by the accelerating pace of eye-tracking research in naturalistic social interactions, which promises to extend our understanding of these and other important domains.</p>
<p>It is important to note that all methods have limitations and eye-tracking is no exception. For instance, changes in pupil size can signal activation in the locus coeruleus, associated with increased attention (Aston-Jones and Cohen, <xref ref-type="bibr" rid="B3">2005</xref>). Yet, this measurement does not clarify which cognitive function benefits from this attentional &#x0201C;gain&#x0201D; (Aston-Jones and Cohen, <xref ref-type="bibr" rid="B3">2005</xref>). Pinpointing a particular mental process or semantic representation would require incorporating other behavioral assessments, establishing comparison conditions, or using techniques with higher spatial resolution, such as fMRI. Another challenge arises with the use of wearable eye-tracking technology. While these devices mitigate the issue of signal loss caused by natural head movements, they cannot eliminate it entirely. The freedom of head movement that wearable devices allow can also complicate the interpretation of gaze patterns (such as fixations, saccades, or smooth pursuit; Hessels et al., <xref ref-type="bibr" rid="B44">2020</xref>) because gaze is tracked in relation to the head&#x00027;s position.</p>
<p>Despite these challenges, both wearable and stationary eye-tracking technologies offer valuable insights into how people coordinate their attention in real time. The continuous recording of pupil dilations, gaze direction, and blink rate sheds light on the ways that minds mutually adapt, facilitating the exchange of knowledge, shared understanding, and social bonding. By capturing the attentional dynamics of interacting minds, interpersonal eye-tracking offers a unique window into the mechanisms that scaffold social interaction.</p>
</sec>
<sec sec-type="author-contributions" id="s6">
<title>Author contributions</title>
<p>SW: Conceptualization, Writing &#x02013; original draft, Writing &#x02013; review &#x00026; editing. TW: Conceptualization, Funding acquisition, Resources, Writing &#x02013; original draft, Writing &#x02013; review &#x00026; editing.</p>
</sec>
</body>
<back>
<sec sec-type="funding-information" id="s7">
<title>Funding</title>
<p>The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s8">
<title>Publisher&#x00027;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Akechi</surname> <given-names>H.</given-names></name> <name><surname>Senju</surname> <given-names>A.</given-names></name> <name><surname>Uibo</surname> <given-names>H.</given-names></name> <name><surname>Kikuchi</surname> <given-names>Y.</given-names></name> <name><surname>Hasegawa</surname> <given-names>T.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Attention to eye contact in the West and East: autonomic responses and evaluative ratings</article-title>. <source>PLoS ONE</source> <volume>8</volume>:<fpage>e59312</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0059312</pub-id></citation>
</ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Argyle</surname> <given-names>M.</given-names></name> <name><surname>Dean</surname> <given-names>J.</given-names></name></person-group> (<year>1965</year>). <article-title>Eye-contact, distance and affiliation</article-title>. <source>Sociometry</source> <volume>28</volume>, <fpage>289</fpage>&#x02013;<lpage>304</lpage>. <pub-id pub-id-type="doi">10.2307/2786027</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aston-Jones</surname> <given-names>G.</given-names></name> <name><surname>Cohen</surname> <given-names>J. D.</given-names></name></person-group> (<year>2005</year>). <article-title>An integrative theory of locus coeruleus-norepinephrine function: adaptive gain and optimal performance</article-title>. <source>Annu. Rev. Neurosci.</source> <volume>28</volume>, <fpage>403</fpage>&#x02013;<lpage>450</lpage>. <pub-id pub-id-type="doi">10.1146/annurev.neuro.28.061604.135709</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aston-Jones</surname> <given-names>G.</given-names></name> <name><surname>Rajkowski</surname> <given-names>J.</given-names></name> <name><surname>Kubiak</surname> <given-names>P.</given-names></name> <name><surname>Alexinsky</surname> <given-names>T.</given-names></name></person-group> (<year>1994</year>). <article-title>Locus coeruleus neurons in monkey are selectively activated by attended cues in a vigilance task</article-title>. <source>J. Neurosci.</source> <volume>14</volume>, <fpage>4467</fpage>&#x02013;<lpage>4480</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.14-07-04467.1994</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bentivoglio</surname> <given-names>A. R.</given-names></name> <name><surname>Bressman</surname> <given-names>S. B.</given-names></name> <name><surname>Cassetta</surname> <given-names>E.</given-names></name> <name><surname>Carretta</surname> <given-names>D.</given-names></name> <name><surname>Tonali</surname> <given-names>P.</given-names></name> <name><surname>Albanese</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>1997</year>). <article-title>Analysis of blink rate patterns in normal subjects</article-title>. <source>Mov. Disord.</source> <volume>12</volume>, <fpage>1028</fpage>&#x02013;<lpage>1034</lpage>. <pub-id pub-id-type="doi">10.1002/mds.870120629</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berndt</surname> <given-names>D. J.</given-names></name> <name><surname>Clifford</surname> <given-names>J.</given-names></name></person-group> (<year>1994</year>). <article-title>Using dynamic time warping to find patterns in time series</article-title>. <source>KDD Workshop</source> <volume>10</volume>, <fpage>359</fpage>&#x02013;<lpage>370</lpage>.</citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blais</surname> <given-names>C.</given-names></name> <name><surname>Jack</surname> <given-names>R. E.</given-names></name> <name><surname>Scheepers</surname> <given-names>C.</given-names></name> <name><surname>Fiset</surname> <given-names>D.</given-names></name> <name><surname>Caldara</surname> <given-names>R.</given-names></name></person-group> (<year>2008</year>). <article-title>Culture shapes how we look at faces</article-title>. <source>PLoS ONE</source> <volume>3</volume>:<fpage>e3022</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0003022</pub-id></citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brinkman</surname> <given-names>L.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name> <name><surname>Dotsch</surname> <given-names>R.</given-names></name></person-group> (<year>2017</year>). <article-title>Visualising mental representations: a primer on noise-based reverse correlation in social psychology</article-title>. <source>Eur. Rev. Soc. Psychol.</source> <volume>28</volume>, <fpage>333</fpage>&#x02013;<lpage>361</lpage>. <pub-id pub-id-type="doi">10.1080/10463283.2017.1381469</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Cacioppo</surname> <given-names>J. T.</given-names></name> <name><surname>Patrick</surname> <given-names>W.</given-names></name></person-group> (<year>2008</year>). <source>Loneliness: Human Nature and the Need for Social Connection</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>W. W. Norton and Company</publisher-name>.</citation>
</ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ca&#x000F1;igueral</surname> <given-names>R.</given-names></name> <name><surname>Hamilton</surname> <given-names>A. F. C.</given-names></name></person-group> (<year>2019</year>). <article-title>The role of eye gaze during natural social interactions in typical and autistic people</article-title>. <source>Front. Psychol.</source> <volume>10</volume>:<fpage>560</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2019.00560</pub-id></citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Capozzi</surname> <given-names>F.</given-names></name> <name><surname>Beyan</surname> <given-names>C.</given-names></name> <name><surname>Pierro</surname> <given-names>A.</given-names></name> <name><surname>Koul</surname> <given-names>A.</given-names></name> <name><surname>Murino</surname> <given-names>V.</given-names></name> <name><surname>Livi</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Tracking the leader: gaze behavior in group interactions</article-title>. <source>iScience</source> <volume>16</volume>, <fpage>242</fpage>&#x02013;<lpage>249</lpage>. <pub-id pub-id-type="doi">10.1016/j.isci.2019.05.035</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Capozzi</surname> <given-names>F.</given-names></name> <name><surname>Ristic</surname> <given-names>J.</given-names></name></person-group> (<year>2022</year>). <article-title>Attentional gaze dynamics in group interactions</article-title>. <source>Vis. Cogn.</source> <volume>30</volume>, <fpage>135</fpage>&#x02013;<lpage>150</lpage>. <pub-id pub-id-type="doi">10.1080/13506285.2021.1925799</pub-id></citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chawarska</surname> <given-names>K.</given-names></name> <name><surname>Macari</surname> <given-names>S.</given-names></name> <name><surname>Shic</surname> <given-names>F.</given-names></name></person-group> (<year>2012</year>). <article-title>Context modulates attention to social scenes in toddlers with autism</article-title>. <source>J. Child Psychol. Psychiatry</source> <volume>53</volume>, <fpage>903</fpage>&#x02013;<lpage>913</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-7610.2012.02538.x</pub-id></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cheong</surname> <given-names>J. H.</given-names></name> <name><surname>Molani</surname> <given-names>Z.</given-names></name> <name><surname>Sadhukha</surname> <given-names>S.</given-names></name> <name><surname>Chang</surname> <given-names>L.</given-names></name></person-group> (<year>2023</year>). <article-title>Synchronized affect in shared experiences strengthens social connection</article-title>. <source>Commun. Biol.</source> <volume>6</volume>:<fpage>1099</fpage>. <pub-id pub-id-type="doi">10.1038/s42003-023-05461-2</pub-id></citation>
</ref>
<ref id="B15">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Clayman</surname> <given-names>S. E.</given-names></name></person-group> (<year>2012</year>). <article-title>&#x0201C;Turn-constructional units and the transition-relevance place,&#x0201D;</article-title> in <source>The Handbook of Conversation Analysis</source>, eds J. Sidnell and T. Stivers (<publisher-loc>Hoboken, NJ</publisher-loc>: <publisher-name>John Wiley and Sons, Ltd</publisher-name>), <fpage>151</fpage>&#x02013;<lpage>166</lpage>. <pub-id pub-id-type="doi">10.1002/9781118325001.ch8</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cummins</surname> <given-names>F.</given-names></name></person-group> (<year>2012</year>). <article-title>Gaze and blinking in dyadic conversation: a study in coordinated behaviour among individuals</article-title>. <source>Lang. Cogn. Processes</source> <volume>27</volume>, <fpage>1525</fpage>&#x02013;<lpage>1549</lpage>. <pub-id pub-id-type="doi">10.1080/01690965.2011.615220</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dahan</surname> <given-names>A.</given-names></name> <name><surname>Noy</surname> <given-names>L.</given-names></name> <name><surname>Hart</surname> <given-names>Y.</given-names></name> <name><surname>Mayo</surname> <given-names>A.</given-names></name> <name><surname>Alon</surname> <given-names>U.</given-names></name></person-group> (<year>2016</year>). <article-title>Exit from synchrony in joint improvised motion</article-title>. <source>PLoS ONE</source> <volume>11</volume>:<fpage>e0160747</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0160747</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>David Mortensen</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <source>Communication Theory</source>. <publisher-loc>Piscataway, NJ</publisher-loc>: <publisher-name>Transaction Publishers</publisher-name>.</citation>
</ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Vries</surname> <given-names>L.</given-names></name> <name><surname>Fouquaet</surname> <given-names>I.</given-names></name> <name><surname>Boets</surname> <given-names>B.</given-names></name> <name><surname>Naulaers</surname> <given-names>G.</given-names></name> <name><surname>Steyaert</surname> <given-names>J.</given-names></name></person-group> (<year>2021</year>). <article-title>Autism spectrum disorder and pupillometry: a systematic review and meta-analysis</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>120</volume>, <fpage>479</fpage>&#x02013;<lpage>508</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2020.09.032</pub-id></citation>
</ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dikker</surname> <given-names>S.</given-names></name> <name><surname>Wan</surname> <given-names>L.</given-names></name> <name><surname>Davidesco</surname> <given-names>I.</given-names></name> <name><surname>Kaggen</surname> <given-names>L.</given-names></name> <name><surname>Oostrik</surname> <given-names>M.</given-names></name> <name><surname>McClintock</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Brain-to-brain synchrony tracks real-world dynamic group interactions in the classroom</article-title>. <source>Curr. Biol.</source> <volume>27</volume>, <fpage>1375</fpage>&#x02013;<lpage>1380</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2017.04.002</pub-id></citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Doane</surname> <given-names>M. G.</given-names></name></person-group> (<year>1980</year>). <article-title>Interactions of eyelids and tears in corneal wetting and the dynamics of the normal human eyeblink</article-title>. <source>Am. J. Ophthalmol.</source> <volume>89</volume>, <fpage>507</fpage>&#x02013;<lpage>516</lpage>. <pub-id pub-id-type="doi">10.1016/0002-9394(80)90058-6</pub-id></citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dravida</surname> <given-names>S.</given-names></name> <name><surname>Noah</surname> <given-names>J. A.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Hirsch</surname> <given-names>J.</given-names></name></person-group> (<year>2020</year>). <article-title>Joint attention during live person-to-person contact activates rTPJ, including a sub-component associated with spontaneous eye-to-eye contact</article-title>. <source>Front. Hum. Neurosci.</source> <volume>14</volume>:<fpage>201</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2020.00201</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Echterhoff</surname> <given-names>G.</given-names></name> <name><surname>Higgins</surname> <given-names>E. T.</given-names></name> <name><surname>Levine</surname> <given-names>J. M.</given-names></name></person-group> (<year>2009</year>). <article-title>Shared reality: experiencing commonality with others&#x00027; inner states about the world</article-title>. <source>Perspect. Psychol. Sci.</source> <volume>4</volume>, <fpage>496</fpage>&#x02013;<lpage>521</lpage>. <pub-id pub-id-type="doi">10.1111/j.1745-6924.2009.01161.x</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Egbert</surname> <given-names>M. M.</given-names></name></person-group> (<year>1997</year>). <article-title>Schisming: the collaborative transformation from a single conversation to multiple conversations</article-title>. <source>Res. Lang. Soc. Interact.</source> <volume>30</volume>, <fpage>1</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1207/s15327973rlsi3001_1</pub-id></citation>
</ref>
<ref id="B25">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Epskamp</surname> <given-names>S.</given-names></name> <name><surname>Deserno</surname> <given-names>M. K.</given-names></name> <name><surname>Bringmann</surname> <given-names>L. F.</given-names></name></person-group> (<year>2024</year>). <source>mlVAR: Multi-level Vector Autoregression. R Package Version 0.5.2.</source> Available online at: <ext-link ext-link-type="uri" xlink:href="https://CRAN.R-project.org/package=mlVAR">https://CRAN.R-project.org/package=mlVAR</ext-link> (accessed February 27, 2024).</citation>
</ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eskenazi</surname> <given-names>T.</given-names></name> <name><surname>Doerrfeld</surname> <given-names>A.</given-names></name> <name><surname>Logan</surname> <given-names>G. D.</given-names></name> <name><surname>Knoblich</surname> <given-names>G.</given-names></name> <name><surname>Sebanz</surname> <given-names>N.</given-names></name></person-group> (<year>2013</year>). <article-title>Your words are my words: effects of acting together on encoding</article-title>. <source>Q. J. Exp. Psychol.</source> <volume>66</volume>, <fpage>1026</fpage>&#x02013;<lpage>1034</lpage>. <pub-id pub-id-type="doi">10.1080/17470218.2012.725058</pub-id></citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fargier</surname> <given-names>R.</given-names></name> <name><surname>B&#x000FC;rki</surname> <given-names>A.</given-names></name> <name><surname>Pinet</surname> <given-names>S.</given-names></name> <name><surname>Alario</surname> <given-names>F.-X.</given-names></name> <name><surname>Laganaro</surname> <given-names>M.</given-names></name></person-group> (<year>2018</year>). <article-title>Word onset phonetic properties and motor artifacts in speech production EEG recordings</article-title>. <source>Psychophysiology</source> <volume>55</volume>, <fpage>1</fpage>&#x02013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1111/psyp.12982</pub-id></citation>
</ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fink</surname> <given-names>L.</given-names></name> <name><surname>Simola</surname> <given-names>J.</given-names></name> <name><surname>Tavano</surname> <given-names>A.</given-names></name> <name><surname>Lange</surname> <given-names>E.</given-names></name> <name><surname>Wallot</surname> <given-names>S.</given-names></name> <name><surname>Laeng</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>2023</year>). <article-title>From pre-processing to advanced dynamic modeling of pupil data</article-title>. <source>Behav. Res. Methods</source>. <pub-id pub-id-type="doi">10.3758/s13428-023-02098-1</pub-id></citation>
</ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fink</surname> <given-names>L. K.</given-names></name> <name><surname>Hurley</surname> <given-names>B. K.</given-names></name> <name><surname>Geng</surname> <given-names>J. J.</given-names></name> <name><surname>Janata</surname> <given-names>P.</given-names></name></person-group> (<year>2018</year>). <article-title>A linear oscillator model predicts dynamic temporal attention and pupillary entrainment to rhythmic patterns</article-title>. <source>J. Eye Mov. Res.</source> <volume>11</volume>, <fpage>1</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.16910/jemr.11.2.12</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fusaroli</surname> <given-names>R.</given-names></name> <name><surname>Konvalinka</surname> <given-names>I.</given-names></name> <name><surname>Wallot</surname> <given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Analyzing social interactions: the promises and challenges of using cross recurrence quantification analysis</article-title>. <source>Transl. Recurrences</source>, <fpage>137</fpage>&#x02013;<lpage>155</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-319-09531-8_9</pub-id></citation>
</ref>
<ref id="B31">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Garrod</surname> <given-names>S.</given-names></name> <name><surname>Pickering</surname> <given-names>M. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Why is conversation so easy?</article-title> <source>Trends Cogn. Sci.</source> <volume>8</volume>, <fpage>8</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2003.10.016</pub-id></citation>
</ref>
<ref id="B32">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Geller</surname> <given-names>J.</given-names></name> <name><surname>Winn</surname> <given-names>M. B.</given-names></name> <name><surname>Mahr</surname> <given-names>T.</given-names></name> <name><surname>Mirman</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). <article-title>GazeR: a package for processing gaze position and pupil size data</article-title>. <source>Behav. Res. Methods</source> <volume>52</volume>, <fpage>2232</fpage>&#x02013;<lpage>2255</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-020-01374-8</pub-id></citation>
</ref>
<ref id="B33">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ghose</surname> <given-names>U.</given-names></name> <name><surname>Srinivasan</surname> <given-names>A. A.</given-names></name> <name><surname>Boyce</surname> <given-names>W. P.</given-names></name> <name><surname>Xu</surname> <given-names>H.</given-names></name> <name><surname>Chng</surname> <given-names>E. S.</given-names></name></person-group> (<year>2020</year>). <article-title>PyTrack: an end-to-end analysis toolkit for eye tracking</article-title>. <source>Behav. Res. Methods</source> <volume>52</volume>, <fpage>2588</fpage>&#x02013;<lpage>2603</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-020-01392-6</pub-id></citation>
</ref>
<ref id="B34">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gobel</surname> <given-names>M. S.</given-names></name> <name><surname>Kim</surname> <given-names>H. S.</given-names></name> <name><surname>Richardson</surname> <given-names>D. C.</given-names></name></person-group> (<year>2015</year>). <article-title>The dual function of social gaze</article-title>. <source>Cognition</source> <volume>136</volume>, <fpage>359</fpage>&#x02013;<lpage>364</lpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2014.11.040</pub-id></citation>
</ref>
<ref id="B35">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Gupta</surname> <given-names>A.</given-names></name> <name><surname>Strivens</surname> <given-names>F. L.</given-names></name> <name><surname>Tag</surname> <given-names>B.</given-names></name> <name><surname>Kunze</surname> <given-names>K.</given-names></name> <name><surname>Ward</surname> <given-names>J. A.</given-names></name></person-group> (<year>2019</year>). <article-title>&#x0201C;Blink as you sync: uncovering eye and nod synchrony in conversation using wearable sensing,&#x0201D;</article-title> in <source>Proceedings of the 2019 ACM International Symposium on Wearable Computers</source> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>ACM</publisher-name>), <fpage>66</fpage>&#x02013;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.1145/3341163.3347736</pub-id></citation>
</ref>
<ref id="B36">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haensel</surname> <given-names>J. X.</given-names></name> <name><surname>Smith</surname> <given-names>T. J.</given-names></name> <name><surname>Senju</surname> <given-names>A.</given-names></name></person-group> (<year>2022</year>). <article-title>Cultural differences in mutual gaze during face-to-face interactions: a dual head-mounted eye-tracking study</article-title>. <source>Vis. Cogn.</source> <volume>30</volume>, <fpage>100</fpage>&#x02013;<lpage>115</lpage>. <pub-id pub-id-type="doi">10.1080/13506285.2021.1928354</pub-id></citation>
</ref>
<ref id="B37">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Hardin</surname> <given-names>C. D.</given-names></name> <name><surname>Higgins</surname> <given-names>E. T.</given-names></name></person-group> (<year>1996</year>). <article-title>&#x0201C;Shared reality: how social verification makes the subjective objective,&#x0201D;</article-title> in <source>Handbook of Motivation and Cognition</source>, Vol. 3, ed. R. M. Sorrentino (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>The Guilford Press</publisher-name>), <fpage>28</fpage>&#x02013;<lpage>84</lpage>.</citation>
</ref>
<ref id="B38">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hartridge</surname> <given-names>H.</given-names></name> <name><surname>Thomson</surname> <given-names>L. C.</given-names></name></person-group> (<year>1948</year>). <article-title>Methods of investigating eye movements</article-title>. <source>Br. J. Ophthalmol.</source> <volume>32</volume>, <fpage>581</fpage>&#x02013;<lpage>591</lpage>. <pub-id pub-id-type="doi">10.1136/bjo.32.9.581</pub-id></citation>
</ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hasson</surname> <given-names>U.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>2016</year>). <article-title>Mirroring and beyond: coupled dynamics as a generalized framework for modelling social interactions</article-title>. <source>Philos. Trans. R. Soc. Lond. B: Biol. Sci.</source> <volume>371</volume>:<fpage>20150366</fpage>. <pub-id pub-id-type="doi">10.1098/rstb.2015.0366</pub-id></citation>
</ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>He</surname> <given-names>X.</given-names></name> <name><surname>Sebanz</surname> <given-names>N.</given-names></name> <name><surname>Sui</surname> <given-names>J.</given-names></name> <name><surname>Humphreys</surname> <given-names>G. W.</given-names></name></person-group> (<year>2014</year>). <article-title>Individualism-collectivism and interpersonal memory guidance of attention</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>54</volume>, <fpage>102</fpage>&#x02013;<lpage>114</lpage>. <pub-id pub-id-type="doi">10.1016/j.jesp.2014.04.010</pub-id></citation>
</ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Heron</surname> <given-names>J.</given-names></name></person-group> (<year>1970</year>). <article-title>The phenomenology of social encounter: the gaze</article-title>. <source>Philos. Phenomenol. Res.</source> <volume>31</volume>, <fpage>243</fpage>&#x02013;<lpage>264</lpage>. <pub-id pub-id-type="doi">10.2307/2105742</pub-id></citation>
</ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hershman</surname> <given-names>R.</given-names></name> <name><surname>Henik</surname> <given-names>A.</given-names></name> <name><surname>Cohen</surname> <given-names>N.</given-names></name></person-group> (<year>2018</year>). <article-title>A novel blink detection method based on pupillometry noise</article-title>. <source>Behav. Res. Methods</source> <volume>50</volume>, <fpage>107</fpage>&#x02013;<lpage>114</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-017-1008-1</pub-id></citation>
</ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hershman</surname> <given-names>R.</given-names></name> <name><surname>Henik</surname> <given-names>A.</given-names></name> <name><surname>Cohen</surname> <given-names>N.</given-names></name></person-group> (<year>2019</year>). <article-title>CHAP: open-source software for processing and analyzing pupillometry data</article-title>. <source>Behav. Res. Methods</source> <volume>51</volume>, <fpage>1059</fpage>&#x02013;<lpage>1074</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-018-01190-1</pub-id></citation>
</ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hessels</surname> <given-names>R. S.</given-names></name> <name><surname>Niehorster</surname> <given-names>D. C.</given-names></name> <name><surname>Holleman</surname> <given-names>G. A.</given-names></name> <name><surname>Benjamins</surname> <given-names>J. S.</given-names></name> <name><surname>Hooge</surname> <given-names>I. T. C.</given-names></name></person-group> (<year>2020</year>). <article-title>Wearable technology for &#x0201C;real-world research&#x0201D;: realistic or not?</article-title> <source>Perception</source> <volume>49</volume>, <fpage>611</fpage>&#x02013;<lpage>615</lpage>. <pub-id pub-id-type="doi">10.1177/0301006620928324</pub-id></citation>
</ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Heuer</surname> <given-names>S.</given-names></name> <name><surname>Hallowell</surname> <given-names>B.</given-names></name></person-group> (<year>2015</year>). <article-title>A novel eye-tracking method to assess attention allocation in individuals with and without aphasia using a dual-task paradigm</article-title>. <source>J. Commun. Disord.</source> <volume>55</volume>, <fpage>15</fpage>&#x02013;<lpage>30</lpage>. <pub-id pub-id-type="doi">10.1016/j.jcomdis.2015.01.005</pub-id></citation>
</ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hirsch</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Noah</surname> <given-names>J. A.</given-names></name> <name><surname>Ono</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>Frontal temporal and parietal systems synchronize within and across brains during live eye-to-eye contact</article-title>. <source>Neuroimage</source> <volume>157</volume>, <fpage>314</fpage>&#x02013;<lpage>330</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2017.06.018</pub-id></citation>
</ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hoffmann</surname> <given-names>A.</given-names></name> <name><surname>Schellhorn</surname> <given-names>A.-M.</given-names></name> <name><surname>Ritter</surname> <given-names>M.</given-names></name> <name><surname>Sachse</surname> <given-names>P.</given-names></name> <name><surname>Maran</surname> <given-names>T.</given-names></name></person-group> (<year>2023</year>). <article-title>Blink synchronization increases over time and predicts problem-solving performance in virtual teams</article-title>. <source>Small Group Res.</source> <fpage>1</fpage>&#x02013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1177/10464964231195618</pub-id></citation>
</ref>
<ref id="B48">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hornof</surname> <given-names>A. J.</given-names></name> <name><surname>Halverson</surname> <given-names>T.</given-names></name></person-group> (<year>2002</year>). <article-title>Cleaning up systematic error in eye-tracking data by using required fixation locations</article-title>. <source>Behav. Res. Methods Instrum. Comput.</source> <volume>34</volume>, <fpage>592</fpage>&#x02013;<lpage>604</lpage>. <pub-id pub-id-type="doi">10.3758/BF03195487</pub-id></citation>
</ref>
<ref id="B49">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huber</surname> <given-names>S. E.</given-names></name> <name><surname>Martini</surname> <given-names>M.</given-names></name> <name><surname>Sachse</surname> <given-names>P.</given-names></name></person-group> (<year>2022</year>). <article-title>Patterns of eye blinks are modulated by auditory input in humans</article-title>. <source>Cognition</source> <volume>221</volume>:<fpage>104982</fpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2021.104982</pub-id></citation>
</ref>
<ref id="B50">
<citation citation-type="book"><person-group person-group-type="author"><collab>iMotions</collab></person-group> (<year>2022</year>). <source>iMotions (9.3)</source>. <publisher-loc>Copenhagen</publisher-loc>: <publisher-name>iMotions A/S</publisher-name>.</citation>
</ref>
<ref id="B51">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jarick</surname> <given-names>M.</given-names></name> <name><surname>Bencic</surname> <given-names>R.</given-names></name></person-group> (<year>2019</year>). <article-title>Eye contact is a two-way street: arousal is elicited by the sending and receiving of eye gaze information</article-title>. <source>Front. Psychol.</source> <volume>10</volume>:<fpage>1262</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2019.01262</pub-id></citation>
</ref>
<ref id="B52">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Joshi</surname> <given-names>S.</given-names></name> <name><surname>Li</surname> <given-names>Y.</given-names></name> <name><surname>Kalwani</surname> <given-names>R. M.</given-names></name> <name><surname>Gold</surname> <given-names>J. I.</given-names></name></person-group> (<year>2016</year>). <article-title>Relationships between pupil diameter and neuronal activity in the locus coeruleus, colliculi, and cingulate cortex</article-title>. <source>Neuron</source> <volume>89</volume>, <fpage>221</fpage>&#x02013;<lpage>234</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2015.11.028</pub-id></citation>
</ref>
<ref id="B53">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kang</surname> <given-names>O.</given-names></name> <name><surname>Banaji</surname> <given-names>M. R.</given-names></name></person-group> (<year>2020</year>). <article-title>Pupillometric decoding of high-level musical imagery</article-title>. <source>Conscious. Cogn.</source> <volume>77</volume>:<fpage>102862</fpage>. <pub-id pub-id-type="doi">10.1016/j.concog.2019.102862</pub-id></citation>
</ref>
<ref id="B54">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kang</surname> <given-names>O.</given-names></name> <name><surname>Wheatley</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>Pupil dilation patterns reflect the contents of consciousness</article-title>. <source>Conscious. Cogn.</source> <volume>35</volume>, <fpage>128</fpage>&#x02013;<lpage>135</lpage>. <pub-id pub-id-type="doi">10.1016/j.concog.2015.05.001</pub-id></citation>
</ref>
<ref id="B55">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kang</surname> <given-names>O.</given-names></name> <name><surname>Wheatley</surname> <given-names>T.</given-names></name></person-group> (<year>2017</year>). <article-title>Pupil dilation patterns spontaneously synchronize across individuals during shared attention</article-title>. <source>J. Exp. Psychol. Gen.</source> <volume>146</volume>, <fpage>569</fpage>&#x02013;<lpage>576</lpage>. <pub-id pub-id-type="doi">10.1037/xge0000271</pub-id></citation>
</ref>
<ref id="B56">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kang</surname> <given-names>O. E.</given-names></name> <name><surname>Huffer</surname> <given-names>K. E.</given-names></name> <name><surname>Wheatley</surname> <given-names>T. P.</given-names></name></person-group> (<year>2014</year>). <article-title>Pupil dilation dynamics track attention to high-level information</article-title>. <source>PLoS ONE</source> <volume>9</volume>:<fpage>e102463</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0102463</pub-id></citation>
</ref>
<ref id="B57">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Kassner</surname> <given-names>M.</given-names></name> <name><surname>Patera</surname> <given-names>W.</given-names></name> <name><surname>Bulling</surname> <given-names>A.</given-names></name></person-group> (<year>2014</year>). <article-title>&#x0201C;Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction,&#x0201D;</article-title> in <source>Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication</source> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>ACM</publisher-name>), <fpage>1151</fpage>&#x02013;<lpage>1160</lpage>. <pub-id pub-id-type="doi">10.1145/2638728.2641695</pub-id></citation>
</ref>
<ref id="B58">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kingstone</surname> <given-names>A.</given-names></name> <name><surname>Smilek</surname> <given-names>D.</given-names></name> <name><surname>Ristic</surname> <given-names>J.</given-names></name> <name><surname>Kelland Friesen</surname> <given-names>C.</given-names></name> <name><surname>Eastwood</surname> <given-names>J. D.</given-names></name></person-group> (<year>2003</year>). <article-title>Attention, researchers! It is time to take a look at the real world</article-title>. <source>Curr. Dir. Psychol. Sci.</source> <volume>12</volume>, <fpage>176</fpage>&#x02013;<lpage>180</lpage>. <pub-id pub-id-type="doi">10.1111/1467-8721.01255</pub-id></citation>
</ref>
<ref id="B59">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kinley</surname> <given-names>I.</given-names></name> <name><surname>Levy</surname> <given-names>Y.</given-names></name></person-group> (<year>2022</year>). <article-title>PuPl: an open-source tool for processing pupillometry data</article-title>. <source>Behav. Res. Methods</source> <volume>54</volume>, <fpage>2046</fpage>&#x02013;<lpage>2069</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-021-01717-z</pub-id></citation>
</ref>
<ref id="B60">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Laidlaw</surname> <given-names>K. E. W.</given-names></name> <name><surname>Risko</surname> <given-names>E. F.</given-names></name> <name><surname>Kingstone</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>&#x0201C;Levels of complexity and the duality of gaze: how social attention changes from lab to life,&#x0201D;</article-title> in <source>Shared Representations: Sensorimotor Foundations of Social Life</source>, eds S. S. Obhi, and E. S. Cross (<publisher-loc>Cambridge</publisher-loc>: <publisher-name>Cambridge University Press</publisher-name>), <fpage>195</fpage>&#x02013;<lpage>215</lpage>. <pub-id pub-id-type="doi">10.1017/CBO9781107279353.011</pub-id></citation>
</ref>
<ref id="B61">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Laskowitz</surname> <given-names>S.</given-names></name> <name><surname>Griffin</surname> <given-names>J. W.</given-names></name> <name><surname>Geier</surname> <given-names>C. F.</given-names></name> <name><surname>Scherf</surname> <given-names>K. S.</given-names></name></person-group> (<year>2021</year>). <article-title>Cracking the code of live human social interactions in autism: a review of the eye-tracking literature</article-title>. <source>Proc. Mach. Learn. Res.</source> <volume>173</volume>, <fpage>242</fpage>&#x02013;<lpage>264</lpage>.</citation>
</ref>
<ref id="B62">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Launay</surname> <given-names>J.</given-names></name> <name><surname>Tarr</surname> <given-names>B.</given-names></name> <name><surname>Dunbar</surname> <given-names>R. I. M.</given-names></name></person-group> (<year>2016</year>). <article-title>Synchrony as an adaptive mechanism for large-scale human social bonding</article-title>. <source>Ethology</source> <volume>122</volume>, <fpage>779</fpage>&#x02013;<lpage>789</lpage>. <pub-id pub-id-type="doi">10.1111/eth.12528</pub-id></citation>
</ref>
<ref id="B63">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macdonald</surname> <given-names>R. G.</given-names></name> <name><surname>Tatler</surname> <given-names>B. W.</given-names></name></person-group> (<year>2018</year>). <article-title>Gaze in a real-world social interaction: a dual eye-tracking study</article-title>. <source>Q. J. Exp. Psychol.</source> <volume>71</volume>, <fpage>2162</fpage>&#x02013;<lpage>2173</lpage>. <pub-id pub-id-type="doi">10.1177/1747021817739221</pub-id><pub-id pub-id-type="pmid">30226438</pub-id></citation></ref>
<ref id="B64">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mackworth</surname> <given-names>J. F.</given-names></name> <name><surname>Mackworth</surname> <given-names>N. H.</given-names></name></person-group> (<year>1958</year>). <article-title>Eye fixations recorded on changing visual scenes by the television eye-marker</article-title>. <source>J. Opt. Soc. Am.</source> <volume>48</volume>, <fpage>439</fpage>&#x02013;<lpage>445</lpage>. <pub-id pub-id-type="doi">10.1364/JOSA.48.000439</pub-id><pub-id pub-id-type="pmid">13564324</pub-id></citation></ref>
<ref id="B65">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mackworth</surname> <given-names>N. H.</given-names></name> <name><surname>Thomas</surname> <given-names>E. L.</given-names></name></person-group> (<year>1962</year>). <article-title>Head-mounted eye-marker camera</article-title>. <source>J. Opt. Soc. Am.</source> <volume>52</volume>, <fpage>713</fpage>&#x02013;<lpage>716</lpage>. <pub-id pub-id-type="doi">10.1364/JOSA.52.000713</pub-id><pub-id pub-id-type="pmid">14467994</pub-id></citation></ref>
<ref id="B66">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maffei</surname> <given-names>A.</given-names></name> <name><surname>Angrilli</surname> <given-names>A.</given-names></name></person-group> (<year>2018</year>). <article-title>Spontaneous eye blink rate: an index of dopaminergic component of sustained attention and fatigue</article-title>. <source>Int. J. Psychophysiol.</source> <volume>123</volume>, <fpage>58</fpage>&#x02013;<lpage>63</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2017.11.009</pub-id><pub-id pub-id-type="pmid">29133149</pub-id></citation></ref>
<ref id="B67">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maran</surname> <given-names>T.</given-names></name> <name><surname>Furtner</surname> <given-names>M.</given-names></name> <name><surname>Liegl</surname> <given-names>S.</given-names></name> <name><surname>Ravet-Brown</surname> <given-names>T.</given-names></name> <name><surname>Haraped</surname> <given-names>L.</given-names></name> <name><surname>Sachse</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Visual attention in real-world conversation: gaze patterns are modulated by communication and group size</article-title>. <source>Appl. Psychol.</source> <volume>70</volume>, <fpage>1602</fpage>&#x02013;<lpage>1627</lpage>. <pub-id pub-id-type="doi">10.1111/apps.12291</pub-id></citation>
</ref>
<ref id="B68">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mayo</surname> <given-names>O.</given-names></name> <name><surname>Gordon</surname> <given-names>I.</given-names></name></person-group> (<year>2020</year>). <article-title>In and out of synchrony&#x02013;behavioral and physiological dynamics of dyadic interpersonal coordination</article-title>. <source>Psychophysiology</source> <volume>57</volume>:<fpage>e13574</fpage>. <pub-id pub-id-type="doi">10.1111/psyp.13574</pub-id><pub-id pub-id-type="pmid">32221984</pub-id></citation></ref>
<ref id="B69">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mayrand</surname> <given-names>F.</given-names></name> <name><surname>Capozzi</surname> <given-names>F.</given-names></name> <name><surname>Ristic</surname> <given-names>J.</given-names></name></person-group> (<year>2023</year>). <article-title>A dual mobile eye tracking study on natural eye contact during live interactions</article-title>. <source>Sci. Rep.</source> <volume>13</volume>:<fpage>11385</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-023-38346-9</pub-id><pub-id pub-id-type="pmid">37452135</pub-id></citation></ref>
<ref id="B70">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mazur</surname> <given-names>A.</given-names></name> <name><surname>Rosa</surname> <given-names>E.</given-names></name> <name><surname>Faupel</surname> <given-names>M.</given-names></name> <name><surname>Heller</surname> <given-names>J.</given-names></name> <name><surname>Leen</surname> <given-names>R.</given-names></name> <name><surname>Thurman</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>1980</year>). <article-title>Physiological aspects of communication via mutual gaze</article-title>. <source>Am. J. Sociol.</source> <volume>86</volume>, <fpage>50</fpage>&#x02013;<lpage>74</lpage>. <pub-id pub-id-type="doi">10.1086/227202</pub-id><pub-id pub-id-type="pmid">7435770</pub-id></citation></ref>
<ref id="B71">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McClave</surname> <given-names>E. Z.</given-names></name></person-group> (<year>2000</year>). <article-title>Linguistic functions of head movements in the context of speech</article-title>. <source>J. Pragmat.</source> <volume>32</volume>, <fpage>855</fpage>&#x02013;<lpage>878</lpage>. <pub-id pub-id-type="doi">10.1016/S0378-2166(99)00079-X</pub-id></citation>
</ref>
<ref id="B72">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mogan</surname> <given-names>R.</given-names></name> <name><surname>Fischer</surname> <given-names>R.</given-names></name> <name><surname>Bulbulia</surname> <given-names>J. A.</given-names></name></person-group> (<year>2017</year>). <article-title>To be in synchrony or not? A meta-analysis of synchrony&#x00027;s effects on behavior, perception, cognition and affect</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>72</volume>, <fpage>13</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1016/j.jesp.2017.03.009</pub-id></citation>
</ref>
<ref id="B73">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Moulder</surname> <given-names>R. G.</given-names></name> <name><surname>Duran</surname> <given-names>N. D.</given-names></name> <name><surname>D&#x00027;Mello</surname> <given-names>S. K.</given-names></name></person-group> (<year>2022</year>). <article-title>&#x0201C;Assessing multimodal dynamics in multi-party collaborative interactions with multi-level vector autoregression,&#x0201D;</article-title> in <source>Proceedings of the 2022 International Conference on Multimodal Interaction</source> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>ACM</publisher-name>), <fpage>615</fpage>&#x02013;<lpage>625</lpage>. <pub-id pub-id-type="doi">10.1145/3536221.3556595</pub-id></citation>
</ref>
<ref id="B74">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mundy</surname> <given-names>P.</given-names></name></person-group> (<year>2017</year>). <article-title>A review of joint attention and social-cognitive brain systems in typical development and autism spectrum disorder</article-title>. <source>Eur. J. Neurosci.</source> <volume>47</volume>, <fpage>497</fpage>&#x02013;<lpage>514</lpage>. <pub-id pub-id-type="doi">10.1111/ejn.13720</pub-id><pub-id pub-id-type="pmid">28922520</pub-id></citation></ref>
<ref id="B75">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Naber</surname> <given-names>M.</given-names></name> <name><surname>Alvarez</surname> <given-names>G. A.</given-names></name> <name><surname>Nakayama</surname> <given-names>K.</given-names></name></person-group> (<year>2013</year>). <article-title>Tracking the allocation of attention using human pupillary oscillations</article-title>. <source>Front. Psychol.</source> <volume>4</volume>:<fpage>919</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2013.00919</pub-id><pub-id pub-id-type="pmid">24368904</pub-id></citation></ref>
<ref id="B76">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakano</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>Blink-related dynamic switching between internal and external orienting networks while viewing videos</article-title>. <source>Neurosci. Res.</source> <volume>96</volume>, <fpage>54</fpage>&#x02013;<lpage>58</lpage>. <pub-id pub-id-type="doi">10.1016/j.neures.2015.02.010</pub-id><pub-id pub-id-type="pmid">25828154</pub-id></citation></ref>
<ref id="B77">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakano</surname> <given-names>T.</given-names></name> <name><surname>Kato</surname> <given-names>M.</given-names></name> <name><surname>Morito</surname> <given-names>Y.</given-names></name> <name><surname>Itoi</surname> <given-names>S.</given-names></name> <name><surname>Kitazawa</surname> <given-names>S.</given-names></name></person-group> (<year>2013</year>). <article-title>Blink-related momentary activation of the default mode network while viewing videos</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A.</source> <volume>110</volume>, <fpage>702</fpage>&#x02013;<lpage>706</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1214804110</pub-id><pub-id pub-id-type="pmid">23267078</pub-id></citation></ref>
<ref id="B78">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakano</surname> <given-names>T.</given-names></name> <name><surname>Kitazawa</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Eyeblink entrainment at breakpoints of speech</article-title>. <source>Exp. Brain Res.</source> <volume>205</volume>, <fpage>577</fpage>&#x02013;<lpage>581</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-010-2387-z</pub-id><pub-id pub-id-type="pmid">20700731</pub-id></citation></ref>
<ref id="B79">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakano</surname> <given-names>T.</given-names></name> <name><surname>Miyazaki</surname> <given-names>Y.</given-names></name></person-group> (<year>2019</year>). <article-title>Blink synchronization is an indicator of interest while viewing videos</article-title>. <source>Int. J. Psychophysiol.</source> <volume>135</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2018.10.012</pub-id><pub-id pub-id-type="pmid">30428333</pub-id></citation></ref>
<ref id="B80">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nencheva</surname> <given-names>M. L.</given-names></name> <name><surname>Piazza</surname> <given-names>E. A.</given-names></name> <name><surname>Lew-Williams</surname> <given-names>C.</given-names></name></person-group> (<year>2021</year>). <article-title>The moment-to-moment pitch dynamics of child-directed speech shape toddlers&#x00027; attention and learning</article-title>. <source>Dev. Sci.</source> <volume>24</volume>:<fpage>e12997</fpage>. <pub-id pub-id-type="doi">10.1111/desc.12997</pub-id><pub-id pub-id-type="pmid">32441385</pub-id></citation></ref>
<ref id="B81">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peng</surname> <given-names>C. K.</given-names></name> <name><surname>Buldyrev</surname> <given-names>S. V.</given-names></name> <name><surname>Havlin</surname> <given-names>S.</given-names></name> <name><surname>Simons</surname> <given-names>M.</given-names></name> <name><surname>Stanley</surname> <given-names>H. E.</given-names></name> <name><surname>Goldberger</surname> <given-names>A. L.</given-names></name> <etal/></person-group>. (<year>1994</year>). <article-title>Mosaic organization of DNA nucleotides</article-title>. <source>Phys. Rev. E Stat. Phys. Plasmas Fluids Relat. Interdiscip. Topics</source> <volume>49</volume>, <fpage>1685</fpage>&#x02013;<lpage>1689</lpage>. <pub-id pub-id-type="doi">10.1103/PhysRevE.49.1685</pub-id><pub-id pub-id-type="pmid">9961383</pub-id></citation></ref>
<ref id="B82">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>P&#x000E9;rez</surname> <given-names>P.</given-names></name> <name><surname>Madsen</surname> <given-names>J.</given-names></name> <name><surname>Banellis</surname> <given-names>L.</given-names></name> <name><surname>T&#x000FC;rker</surname> <given-names>B.</given-names></name> <name><surname>Raimondo</surname> <given-names>F.</given-names></name> <name><surname>Perlbarg</surname> <given-names>V.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Conscious processing of narrative stimuli synchronizes heart rate between individuals</article-title>. <source>Cell Rep.</source> <volume>36</volume>:<fpage>109692</fpage>. <pub-id pub-id-type="doi">10.1016/j.celrep.2021.109692</pub-id><pub-id pub-id-type="pmid">34525363</pub-id></citation></ref>
<ref id="B83">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pfeiffer</surname> <given-names>U. J.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name> <name><surname>Schilbach</surname> <given-names>L.</given-names></name></person-group> (<year>2013</year>). <article-title>From gaze cueing to dual eye-tracking: novel approaches to investigate the neural correlates of gaze in social interaction</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>37</volume>, <fpage>2516</fpage>&#x02013;<lpage>2528</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2013.07.017</pub-id><pub-id pub-id-type="pmid">23928088</pub-id></citation></ref>
<ref id="B84">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pinti</surname> <given-names>P.</given-names></name> <name><surname>Tachtsidis</surname> <given-names>I.</given-names></name> <name><surname>Hamilton</surname> <given-names>A.</given-names></name> <name><surname>Hirsch</surname> <given-names>J.</given-names></name> <name><surname>Aichelburg</surname> <given-names>C.</given-names></name> <name><surname>Gilbert</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>The present and future use of functional near-infrared spectroscopy (fNIRS) for cognitive neuroscience</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1464</volume>, <fpage>5</fpage>&#x02013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1111/nyas.13948</pub-id><pub-id pub-id-type="pmid">30085354</pub-id></citation></ref>
<ref id="B85">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>P&#x00142;u&#x0017C;yczka</surname> <given-names>M.</given-names></name> <name><surname>Warszawski</surname> <given-names>U.</given-names></name></person-group> (<year>2018</year>). <article-title>The first hundred years: a history of eye tracking as a research method</article-title>. <source>Appl. Linguist. Pap.</source> <volume>4</volume>, <fpage>101</fpage>&#x02013;<lpage>116</lpage>. <pub-id pub-id-type="doi">10.32612/uw.25449354.2018.4.pp.101-116</pub-id></citation>
</ref>
<ref id="B86">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Posner</surname> <given-names>M. I.</given-names></name> <name><surname>Snyder</surname> <given-names>C. R.</given-names></name> <name><surname>Davidson</surname> <given-names>B. J.</given-names></name></person-group> (<year>1980</year>). <article-title>Attention and the detection of signals</article-title>. <source>J. Exp. Psychol. Gen.</source> <volume>109</volume>, <fpage>160</fpage>&#x02013;<lpage>174</lpage>. <pub-id pub-id-type="doi">10.1037/0096-3445.109.2.160</pub-id></citation>
</ref>
<ref id="B87">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Prochazkova</surname> <given-names>E.</given-names></name> <name><surname>Prochazkova</surname> <given-names>L.</given-names></name> <name><surname>Giffin</surname> <given-names>M. R.</given-names></name> <name><surname>Scholte</surname> <given-names>H. S.</given-names></name> <name><surname>De Dreu</surname> <given-names>C. K. W.</given-names></name> <name><surname>Kret</surname> <given-names>M. E.</given-names></name></person-group> (<year>2018</year>). <article-title>Pupil mimicry promotes trust through the theory-of-mind network</article-title>. <source>Proc. Natl. Acad. Sci. USA.</source> <volume>115</volume>, <fpage>E7265</fpage>&#x02013;<lpage>E7274</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1803916115</pub-id><pub-id pub-id-type="pmid">30012623</pub-id></citation></ref>
<ref id="B88">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rac-Lubashevsky</surname> <given-names>R.</given-names></name> <name><surname>Slagter</surname> <given-names>H. A.</given-names></name> <name><surname>Kessler</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>Tracking real-time changes in working memory updating and gating with the event-based eye-blink rate</article-title>. <source>Sci. Rep.</source> <volume>7</volume>:<fpage>2547</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-017-02942-3</pub-id><pub-id pub-id-type="pmid">28566762</pub-id></citation></ref>
<ref id="B89">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Rajkowski</surname> <given-names>J.</given-names></name></person-group> (<year>1993</year>). <article-title>Correlations between locus coeruleus (LC) neural activity, pupil diameter and behavior in monkey support a role of LC in attention</article-title>. <source>Soc. Neurosci.</source> Abstract, Washington, DC. Available online at: <ext-link ext-link-type="uri" xlink:href="https://ci.nii.ac.jp/naid/10021384962/">https://ci.nii.ac.jp/naid/10021384962/</ext-link> (accessed February 27, 2024).</citation>
</ref>
<ref id="B90">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ravreby</surname> <given-names>I.</given-names></name> <name><surname>Shilat</surname> <given-names>Y.</given-names></name> <name><surname>Yeshurun</surname> <given-names>Y.</given-names></name></person-group> (<year>2022</year>). <article-title>Liking as a balance between synchronization, complexity, and novelty</article-title>. <source>Sci. Rep.</source> <volume>12</volume>, <fpage>1</fpage>&#x02013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1038/s41598-022-06610-z</pub-id><pub-id pub-id-type="pmid">35210459</pub-id></citation></ref>
<ref id="B91">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Razavi</surname> <given-names>M.</given-names></name> <name><surname>Janfaza</surname> <given-names>V.</given-names></name> <name><surname>Yamauchi</surname> <given-names>T.</given-names></name> <name><surname>Leontyev</surname> <given-names>A.</given-names></name> <name><surname>Longmire-Monford</surname> <given-names>S.</given-names></name> <name><surname>Orr</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>OpenSync: an open-source platform for synchronizing multiple measures in neuroscience experiments</article-title>. <source>J. Neurosci. Methods</source> <volume>369</volume>:<fpage>109458</fpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2021.109458</pub-id><pub-id pub-id-type="pmid">34968624</pub-id></citation></ref>
<ref id="B92">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Richardson</surname> <given-names>D. C.</given-names></name> <name><surname>Dale</surname> <given-names>R.</given-names></name> <name><surname>Kirkham</surname> <given-names>N. Z.</given-names></name></person-group> (<year>2007</year>). <article-title>The art of conversation is coordination</article-title>. <source>Psychol. Sci.</source> <volume>18</volume>, <fpage>407</fpage>&#x02013;<lpage>413</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2007.01914.x</pub-id></citation>
</ref>
<ref id="B93">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Risko</surname> <given-names>E. F.</given-names></name> <name><surname>Richardson</surname> <given-names>D. C.</given-names></name> <name><surname>Kingstone</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>Breaking the fourth wall of cognitive science: real-world social attention and the dual function of gaze</article-title>. <source>Curr. Dir. Psychol. Sci.</source> <volume>25</volume>, <fpage>70</fpage>&#x02013;<lpage>74</lpage>. <pub-id pub-id-type="doi">10.1177/0963721415617806</pub-id></citation>
</ref>
<ref id="B94">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rogers</surname> <given-names>S. L.</given-names></name> <name><surname>Speelman</surname> <given-names>C. P.</given-names></name> <name><surname>Guidetti</surname> <given-names>O.</given-names></name> <name><surname>Longmuir</surname> <given-names>M.</given-names></name></person-group> (<year>2018</year>). <article-title>Using dual eye tracking to uncover personal gaze patterns during social interaction</article-title>. <source>Sci. Rep.</source> <volume>8</volume>:<fpage>4271</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-018-22726-7</pub-id><pub-id pub-id-type="pmid">29523822</pub-id></citation></ref>
<ref id="B95">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Sacks</surname> <given-names>H.</given-names></name> <name><surname>Schegloff</surname> <given-names>E. A.</given-names></name> <name><surname>Jefferson</surname> <given-names>G.</given-names></name></person-group> (<year>1978</year>). <article-title>A simplest systematics for the organization of turn taking for conversation</article-title>. <source>Studies in the Organization of Conversational Interaction</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://www.sciencedirect.com/science/article/pii/B9780126235500500082">https://www.sciencedirect.com/science/article/pii/B9780126235500500082</ext-link> (accessed February 27, 2024).</citation>
</ref>
<ref id="B96">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scaife</surname> <given-names>M.</given-names></name> <name><surname>Bruner</surname> <given-names>J. S.</given-names></name></person-group> (<year>1975</year>). <article-title>The capacity for joint visual attention in the infant</article-title>. <source>Nature</source> <volume>253</volume>, <fpage>265</fpage>&#x02013;<lpage>266</lpage>. <pub-id pub-id-type="doi">10.1038/253265a0</pub-id><pub-id pub-id-type="pmid">1113842</pub-id></citation></ref>
<ref id="B97">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schilbach</surname> <given-names>L.</given-names></name></person-group> (<year>2016</year>). <article-title>Towards a second-person neuropsychiatry</article-title>. <source>Philos. Trans. R. Soc. Lond. B: Biol. Sci.</source> <volume>371</volume>:<fpage>20150081</fpage>. <pub-id pub-id-type="doi">10.1098/rstb.2015.0081</pub-id></citation>
</ref>
<ref id="B98">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Timmermans</surname> <given-names>B.</given-names></name> <name><surname>Reddy</surname> <given-names>V.</given-names></name> <name><surname>Costall</surname> <given-names>A.</given-names></name> <name><surname>Bente</surname> <given-names>G.</given-names></name> <name><surname>Schlicht</surname> <given-names>T.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Toward a second-person neuroscience</article-title>. <source>Behav. Brain Sci.</source> <volume>36</volume>, <fpage>393</fpage>&#x02013;<lpage>414</lpage>. <pub-id pub-id-type="doi">10.1017/S0140525X12000660</pub-id></citation>
</ref>
<ref id="B99">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schwiedrzik</surname> <given-names>C. M.</given-names></name> <name><surname>Sudmann</surname> <given-names>S. S.</given-names></name></person-group> (<year>2020</year>). <article-title>Pupil diameter tracks statistical structure in the environment to increase visual sensitivity</article-title>. <source>J. Neurosci.</source> <volume>40</volume>, <fpage>4565</fpage>&#x02013;<lpage>4575</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0216-20.2020</pub-id><pub-id pub-id-type="pmid">32371603</pub-id></citation></ref>
<ref id="B100">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Senju</surname> <given-names>A.</given-names></name> <name><surname>Vernetti</surname> <given-names>A.</given-names></name> <name><surname>Kikuchi</surname> <given-names>Y.</given-names></name> <name><surname>Akechi</surname> <given-names>H.</given-names></name> <name><surname>Hasegawa</surname> <given-names>T.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Cultural background modulates how we look at other persons&#x00027; gaze</article-title>. <source>Int. J. Behav. Dev.</source> <volume>37</volume>, <fpage>131</fpage>&#x02013;<lpage>136</lpage>. <pub-id pub-id-type="doi">10.1177/0165025412465360</pub-id><pub-id pub-id-type="pmid">23585703</pub-id></citation></ref>
<ref id="B101">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shockley</surname> <given-names>K.</given-names></name> <name><surname>Richardson</surname> <given-names>D. C.</given-names></name> <name><surname>Dale</surname> <given-names>R.</given-names></name></person-group> (<year>2009</year>). <article-title>Conversation and coordinative structures</article-title>. <source>Top. Cogn. Sci.</source> <volume>1</volume>, <fpage>305</fpage>&#x02013;<lpage>319</lpage>. <pub-id pub-id-type="doi">10.1111/j.1756-8765.2009.01021.x</pub-id><pub-id pub-id-type="pmid">25164935</pub-id></citation></ref>
<ref id="B102">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>E. R.</given-names></name> <name><surname>Mackie</surname> <given-names>D. M.</given-names></name></person-group> (<year>2016</year>). <article-title>Representation and incorporation of close others&#x00027; responses: the RICOR model of social influence</article-title>. <source>Pers. Soc. Psychol. Rev.</source> <volume>20</volume>, <fpage>311</fpage>&#x02013;<lpage>331</lpage>. <pub-id pub-id-type="doi">10.1177/1088868315598256</pub-id><pub-id pub-id-type="pmid">26238964</pub-id></citation></ref>
<ref id="B103">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stern</surname> <given-names>J. A.</given-names></name> <name><surname>Walrath</surname> <given-names>L. C.</given-names></name> <name><surname>Goldstein</surname> <given-names>R.</given-names></name></person-group> (<year>1984</year>). <article-title>The endogenous eyeblink</article-title>. <source>Psychophysiology</source> <volume>21</volume>, <fpage>22</fpage>&#x02013;<lpage>33</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.1984.tb02312.x</pub-id></citation>
</ref>
<ref id="B104">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Valtakari</surname> <given-names>N. V.</given-names></name> <name><surname>Hooge</surname> <given-names>I. T. C.</given-names></name> <name><surname>Viktorsson</surname> <given-names>C.</given-names></name> <name><surname>Nystr&#x000F6;m</surname> <given-names>P.</given-names></name> <name><surname>Falck-Ytter</surname> <given-names>T.</given-names></name> <name><surname>Hessels</surname> <given-names>R. S.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Eye tracking in human interaction: possibilities and limitations</article-title>. <source>Behav. Res. Methods</source> <volume>53</volume>, <fpage>1592</fpage>&#x02013;<lpage>1608</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-020-01517-x</pub-id><pub-id pub-id-type="pmid">33409984</pub-id></citation></ref>
<ref id="B105">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wheatley</surname> <given-names>T.</given-names></name> <name><surname>Kang</surname> <given-names>O.</given-names></name> <name><surname>Parkinson</surname> <given-names>C.</given-names></name> <name><surname>Looser</surname> <given-names>C. E.</given-names></name></person-group> (<year>2012</year>). <article-title>From mind perception to mental connection: synchrony as a mechanism for social understanding</article-title>. <source>Soc. Personal. Psychol. Compass</source> <volume>6</volume>, <fpage>589</fpage>&#x02013;<lpage>606</lpage>. <pub-id pub-id-type="doi">10.1111/j.1751-9004.2012.00450.x</pub-id></citation>
</ref>
<ref id="B106">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wheatley</surname> <given-names>T.</given-names></name> <name><surname>Thornton</surname> <given-names>M.</given-names></name> <name><surname>Stolk</surname> <given-names>A.</given-names></name> <name><surname>Chang</surname> <given-names>L. J.</given-names></name></person-group> (<year>2023</year>). <article-title>The emerging science of interacting minds</article-title>. <source>Perspect. Psychol. Sci.</source>, <fpage>1</fpage>&#x02013;<lpage>19</lpage>. <pub-id pub-id-type="doi">10.1177/17456916231200177</pub-id><pub-id pub-id-type="pmid">38096443</pub-id></citation></ref>
<ref id="B107">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wierda</surname> <given-names>S. M.</given-names></name> <name><surname>van Rijn</surname> <given-names>H.</given-names></name> <name><surname>Taatgen</surname> <given-names>N. A.</given-names></name> <name><surname>Martens</surname> <given-names>S.</given-names></name></person-group> (<year>2012</year>). <article-title>Pupil dilation deconvolution reveals the dynamics of attention at high temporal resolution</article-title>. <source>Proc. Natl. Acad. Sci. USA.</source> <volume>109</volume>, <fpage>8456</fpage>&#x02013;<lpage>8460</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1201858109</pub-id><pub-id pub-id-type="pmid">22586101</pub-id></citation></ref>
<ref id="B108">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wohltjen</surname> <given-names>S.</given-names></name> <name><surname>Toth</surname> <given-names>B.</given-names></name> <name><surname>Boncz</surname> <given-names>A.</given-names></name> <name><surname>Wheatley</surname> <given-names>T.</given-names></name></person-group> (<year>2023</year>). <article-title>Synchrony to a beat predicts synchrony with other minds</article-title>. <source>Sci. Rep.</source> <volume>13</volume>:<fpage>3591</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-023-29776-6</pub-id><pub-id pub-id-type="pmid">36869056</pub-id></citation></ref>
<ref id="B109">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wohltjen</surname> <given-names>S.</given-names></name> <name><surname>Wheatley</surname> <given-names>T.</given-names></name></person-group> (<year>2021</year>). <article-title>Eye contact marks the rise and fall of shared attention in conversation</article-title>. <source>Proc. Natl. Acad. Sci. USA.</source> <volume>118</volume>:<fpage>e2106645118</fpage>. <pub-id pub-id-type="doi">10.1073/pnas.2106645118</pub-id><pub-id pub-id-type="pmid">34504001</pub-id></citation></ref>
<ref id="B110">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zbilut</surname> <given-names>J. P.</given-names></name> <name><surname>Giuliani</surname> <given-names>A.</given-names></name> <name><surname>Webber</surname> <given-names>C. L.</given-names></name></person-group> (<year>1998</year>). <article-title>Detecting deterministic signals in exceptionally noisy environments using cross-recurrence quantification</article-title>. <source>Phys. Lett. A</source> <volume>246</volume>, <fpage>122</fpage>&#x02013;<lpage>128</lpage>. <pub-id pub-id-type="doi">10.1016/S0375-9601(98)00457-5</pub-id></citation>
</ref>
<ref id="B111">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zwaigenbaum</surname> <given-names>L.</given-names></name> <name><surname>Bryson</surname> <given-names>S.</given-names></name> <name><surname>Rogers</surname> <given-names>T.</given-names></name> <name><surname>Roberts</surname> <given-names>W.</given-names></name> <name><surname>Brian</surname> <given-names>J.</given-names></name> <name><surname>Szatmari</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2005</year>). <article-title>Behavioral manifestations of autism in the first year of life</article-title>. <source>Int. J. Dev. Neurosci.</source> <volume>23</volume>, <fpage>143</fpage>&#x02013;<lpage>152</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijdevneu.2004.05.001</pub-id><pub-id pub-id-type="pmid">15749241</pub-id></citation></ref>
</ref-list>
</back>
</article>