<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="review-article" dtd-version="2.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2022.1028150</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Fading boundaries between the physical and the social world: Insights and novel techniques from the intersection of these two fields</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author"><name><surname>Dapor</surname><given-names>Cecilia</given-names></name>
<uri xlink:href="https://loop.frontiersin.org/people/1656113/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes"><name><surname>Sperandio</surname><given-names>Irene</given-names></name><xref rid="c001" ref-type="corresp"><sup>&#x002A;</sup></xref><xref rid="fn0001" ref-type="author-notes"><sup>&#x2020;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/81899/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes"><name><surname>Meconi</surname><given-names>Federica</given-names></name><xref rid="c002" ref-type="corresp"><sup>&#x002A;</sup></xref><xref rid="fn0002" ref-type="author-notes"><sup>&#x2020;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1034049/overview"/>
</contrib>
</contrib-group>
<aff><institution>Department of Psychology and Cognitive Science, University of Trento</institution>, <addr-line>Rovereto</addr-line>, <country>Italy</country></aff>
<author-notes>
<fn id="fn0002" fn-type="edited-by">
<p>Edited by: Antonino Raffone, Sapienza University of Rome, Italy</p>
</fn>
<fn id="fn0003" fn-type="edited-by">
<p>Reviewed by: Peter J. Marshall, Temple University, United States; Luca Simione, Institute of Cognitive Sciences and Technologies, Italy</p>
</fn>
<corresp id="c001">&#x002A;Correspondence: Irene Sperandio, &#x02709; <email>irene.sperandio@unitn.it</email></corresp>
<corresp id="c002">Federica Meconi, &#x02709; <email>federica.meconi@unitn.it</email>
</corresp>
<fn id="fn0001" fn-type="equal">
<p><sup>&#x2020;</sup>These authors have contributed equally to this work and share senior authorship</p>
</fn>
<fn id="fn0004" fn-type="other">
<p>This article was submitted to Consciousness Research, a section of the journal Frontiers in Psychology</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>13</day>
<month>02</month>
<year>2023</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>13</volume>
<elocation-id>1028150</elocation-id>
<history>
<date date-type="received">
<day>25</day>
<month>08</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>12</day>
<month>12</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2023 Dapor, Sperandio and Meconi.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Dapor, Sperandio and Meconi</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>This review focuses on the subtle interactions between sensory input and social cognition in visual perception. We suggest that body indices, such as gait and posture, can mediate such interactions. Recent trends in cognitive research are trying to overcome approaches that define perception as stimulus-centered and are pointing toward a more embodied agent-dependent perspective. According to this view, perception is a constructive process in which sensory inputs and motivational systems contribute to building an image of the external world. A key notion emerging from new theories on perception is that the body plays a critical role in shaping our perception. Depending on our arm&#x2019;s length, height and capacity of movement, we create our own image of the world based on a continuous compromise between sensory inputs and expected behavior. We use our bodies as natural &#x201C;rulers&#x201D; to measure both the physical and the social world around us. We point out the necessity of an integrative approach in cognitive research that takes into account the interplay between social and perceptual dimensions. To this end, we review long-established and novel techniques aimed at measuring bodily states and movements, and their perception, with the assumption that only by combining the study of visual perception and social cognition can we deepen our understanding of both fields.</p>
</abstract>
<kwd-group>
<kwd>social cognition</kwd>
<kwd>visual perception</kwd>
<kwd>embodiment</kwd>
<kwd>body indexes</kwd>
<kwd>posture</kwd>
<kwd>empathy</kwd>
<kwd>social body</kwd>
</kwd-group>
<counts>
<fig-count count="0"/>
<table-count count="3"/>
<equation-count count="0"/>
<ref-count count="132"/>
<page-count count="22"/>
<word-count count="18700"/>
</counts>
</article-meta>
</front>
<body>
<sec id="sec1">
<label>1.</label>
<title>General introduction</title>
<p>New insights into how traditional neuroscientific methodologies and instruments can adapt to emerging models of social cognition and perception are long overdue. The vast majority of studies on perception have so far concentrated on dissecting bottom-up processes, such as feature categorization and grouping. However, higher cognition seems to be rooted in our sensory representation of the world and to be modeled by our physicality. Therefore, we need research instruments able to tap into the deep embedding of our cognition in bodily processes. After all, the interface between the environment, whether physical or social, and its perception at our end comes down to our body.</p>
<p>Accordingly, this article focuses on the subtle interactions between sensory input and social cognition in visual perception and proposes body indices, such as gait and posture, as critical mediators of such interactions.</p>
<p>We structure our perspective in three sections: the first outlines different approaches in the study of perception and highlights the deep interplay between top-down and perceptual processes; the second focuses on embodied sensory aspects of social cognition and their reliance on visuospatial behavioral mechanisms; the third presents a selection of studies in which bodily states (physiological, postural, and kinematic) are examined for their role in conveying social information to the viewer.</p>
<p>The key idea behind this work is that sociality is a part of &#x2013; and not apart from &#x2013; our biological being: we have preconscious dispositions toward perceiving social cues, namely other people&#x2019;s bodies and their expressions, and the way we perceive the (social) environment is shaped by our own body and its expressions. Accumulating evidence from different lines of research supports this notion, revealing that our perceptual experience is not a mere copy of external appearances, but rather the result of a compromise between the physical environment&#x2019;s layout and people&#x2019;s social needs and expectations (e.g., <xref ref-type="bibr" rid="ref7">Balcetis and Dunning, 2009</xref>; <xref ref-type="bibr" rid="ref36">Firestone and Scholl, 2016</xref>; <xref ref-type="bibr" rid="ref80">Niedenthal and Wood, 2019</xref>; <xref ref-type="bibr" rid="ref101">Sato et al., 2019</xref>; <xref ref-type="bibr" rid="ref120">Valenti and Firestone, 2019</xref>; <xref ref-type="bibr" rid="ref114">Sun et al., 2021</xref>).</p>
<p>Building upon compelling evidence showing an overlap between spatial and social behavior in the perception of distances (<xref ref-type="bibr" rid="ref53">Henderson et al., 2006</xref>; <xref ref-type="bibr" rid="ref8">Bar-Anan et al., 2007</xref>; <xref ref-type="bibr" rid="ref125">Yamakawa et al., 2009</xref>; <xref ref-type="bibr" rid="ref124">Xiao and Van Bavel, 2012</xref>; <xref ref-type="bibr" rid="ref88">Parkinson and Wheatley, 2013</xref>; <xref ref-type="bibr" rid="ref115">Takahashi et al., 2013</xref>; <xref ref-type="bibr" rid="ref55">Hamilton et al., 2014</xref>; <xref ref-type="bibr" rid="ref102">Schiano Lomoriello et al., 2018</xref>; <xref ref-type="bibr" rid="ref101">Sato et al., 2019</xref>; <xref ref-type="bibr" rid="ref60">Kroczek et al., 2020</xref>), and on studies on the preconscious and automatic processing of social visual stimuli (<xref ref-type="bibr" rid="ref77">Morris et al., 2001</xref>; <xref ref-type="bibr" rid="ref90">Pegna et al., 2005</xref>; <xref ref-type="bibr" rid="ref116">Tamietto and de Gelder, 2008</xref>; <xref ref-type="bibr" rid="ref25">de Gelder, 2009</xref>; <xref ref-type="bibr" rid="ref27">de Gelder et al., 2010</xref>; <xref ref-type="bibr" rid="ref130">Zhou et al., 2019</xref>), we claim that, in the process of understanding others, social top-down and visual bottom-up processes join forces to prompt an adaptive response to the social stimulus. Moreover, we will show that the exploration of our physical and social surroundings is self-centered, or, more precisely, body-centered. In other words, our body shape and appearance strongly affect the way we move and conduct ourselves toward physical and social entities (<xref ref-type="bibr" rid="ref127">Yee et al., 2009</xref>; <xref ref-type="bibr" rid="ref67">Linkenauger et al., 2010</xref>; <xref ref-type="bibr" rid="ref122">van der Hoort et al., 2011</xref>), revealing a fundamental role of our body in shaping our perception and our sociality.</p>
<p>In short, the main claim of this review is that our body is a critical mediator of the interplay between social cognition and visual perception; the body is indeed the interface at the basis of our interaction with the environment. We can trace the effects of this mediation even in the rescaling effects it exerts on our sight: understanding the deep influence of our bodies and bodily processes on our visual perception is a crucial step toward also framing their role in shaping our attitude toward, and interpretation of, social relationships. We observe other people&#x2019;s postures and movements in order to understand their emotions or intentions, and the effective recognition of others&#x2019; psychological states is accompanied by the instinctive simulation of their expressions in our own body. Theories of embodied cognition (<xref ref-type="bibr" rid="ref45">Goldstone and Barsalou, 1998</xref>; <xref ref-type="bibr" rid="ref79">Niedenthal et al., 2005</xref>; <xref ref-type="bibr" rid="ref92">Proffitt, 2006</xref>; <xref ref-type="bibr" rid="ref43">Gallese, 2009</xref>) claim that we use sensorimotor models originating from our proprioception and body schema (see definition in Box 2) to infer the psychological state of others. However, the literature often fails to specify the exact bodily indices observed and simulated during social processes. We want to help fill this gap by presenting and promoting novel techniques that integrate the study of social cognition with physiological and visual measures, with the aim of encouraging integrative and multimodal approaches. To sum up, we strongly encourage the re-integration of the whole body in the study of social behavior and visual perception, following Nakayama&#x2019;s claim that &#x201C;vision science is going social&#x201D; (<xref ref-type="bibr" rid="ref400">Nakayama, 2011</xref>).</p>
</sec>
<sec id="sec2">
<label>2.</label>
<title>Reconstructing visual perception</title>
<p>The concept of perception has evolved throughout history. The last century has seen different theories succeed one another in an attempt to give a systematic explanation of how humans perceive the world surrounding them. Since the first definitions of perceptual processes, a differentiation between early sensory systems and higher cognitive processes has emerged. Although a reciprocal influence between these different levels of processing is well-established in the literature, an unsolved issue concerns the stage at which these two levels start interacting. For instance, traditional views assume a hierarchical structure in which the external input is initially processed by sensory cortices and only later re-elaborated and interpreted by higher cognition. In contrast, alternative approaches (e.g., New Look, enactivism, ecological psychology, 4E approaches) argue for an early intervention of motivational systems, which modulate our perception by increasing the saliency of some objects on the basis of our current needs or desires. Within this theoretical framework, perceptual processes have a constructive nature whereby external information is gathered and shaped according to the demands of our inner states (<xref ref-type="bibr" rid="ref6">Balcetis and Dunning, 2006</xref>).</p>
<sec id="sec3">
<label>2.1.</label>
<title>New look and other perspectives in perception: Visual perception as a constructive process</title>
<p>According to the New Look theory, perception is far from being a neutral and objective representation of the external world. Instead, what we see appears to be the result of a compromise between autochthonous and behavioral determinants (<xref ref-type="bibr" rid="ref18">Bruner and Goodman, 1947</xref>). While <italic>autochthonous</italic> refers to the electrochemical signals generated by the sensory end organs, <italic>behavioral</italic> includes learning and motivation, social needs and attitudes, and basic physiological needs, such as hunger and thirst. These two determinants build a perceptual hypothesis that undergoes a selection process driven by our needs or requirements. Objects that comply with the selective criteria become &#x201C;more vivid, have greater clarity or greater brightness or greater apparent size&#x201D; (<xref ref-type="bibr" rid="ref18">Bruner and Goodman, 1947</xref>). In a classic experiment, <xref ref-type="bibr" rid="ref18">Bruner and Goodman (1947)</xref> showed that a physical entity, like a metal disk, invested with social value (e.g., a coin) is perceived as bigger than the same shape without any utility (i.e., a simple metal disk). The authors postulated that the greater the need for a socially valued object, the more marked the reshaping of the perceived entity would be. In agreement with this hypothesis, they demonstrated that poorer children were more likely to perceive the coins as bigger, compared to an age-matched group of wealthier individuals (<xref ref-type="bibr" rid="ref18">Bruner and Goodman, 1947</xref>).</p>
<p>Despite initial criticism, this theory has recently regained attention thanks to scholars who have developed rigorous methodological approaches and demonstrated how semantic knowledge (e.g., stereotypes) can shape &#x2013; at a very early stage &#x2013; the way we process sensory inputs (<xref ref-type="bibr" rid="ref83">Otten et al., 2017</xref>).</p>
<p>Two additional influential theories, which date back to the second half of the last century and originated in opposition to cognitivism, are enactivism and ecological psychology. The latter was pioneered by <xref ref-type="bibr" rid="ref44">Gibson (2014)</xref> and has its foundations in his perspective on visual perception. The most relevant point of this framework is the concept of affordances (see Box 1), a notion that has been exploited in different fields and transposed from the physical environment of inanimate objects to the social affordances offered by another person. Another fundamental aspect of ecological psychology is the reintroduction of the body as a reference point for our perceptions, and the concept of perception as an act of information pickup. Gibson revolutionized the whole field of visual perception, introducing new models in which our visual sensory system is grounded in the idiosyncrasies of our body rather than in the external stimulus.</p>
<p>While one can easily consider ecological psychology as a new theory of perception, the redefinition of the organism-environment relationship imposed by this theory has important implications for cognition (<xref ref-type="bibr" rid="ref91">Popova and R&#x0105;czaszek-Leonardi, 2020</xref>). A more radical elaboration of the centrality of our body in cognitive processes has been put forward by the enactivist approach, largely founded on the ideas of <xref ref-type="bibr" rid="ref500">Varela et al. (1991)</xref>. Similar to ecological psychology, this theory strongly highlights the importance of the relationship between agent and environment, with the assumption that cognition and perception emerge together and are so closely connected that they cannot be considered separately. The objectivity of perception is deconstructed in favor of the inevitability of the subjective state in the global landscape formed by the individual immersed in their surroundings (<xref ref-type="bibr" rid="ref41">Fuchs, 2020</xref>; <xref ref-type="bibr" rid="ref52">Heft, 2020</xref>; <xref ref-type="bibr" rid="ref28">de Pinedo Garc&#x00ED;a, 2020</xref>; <xref ref-type="bibr" rid="ref91">Popova and R&#x0105;czaszek-Leonardi, 2020</xref>; <xref ref-type="bibr" rid="ref97">Read and Szokolszky, 2020</xref>).</p>
<p>This account of perception also conforms to the increasingly popular theory of predictive coding (PC), which has its origin in the model of visual perception formulated by <xref ref-type="bibr" rid="ref96">Rao and Ballard (1999)</xref>. The authors described visual perception not only as a feedforward loop from lower- to higher-order visual cortical areas, but also as a cycle of prediction and error-correction. The new ideas that emerged from this initial assertion are based on the inferential nature of the brain (<xref ref-type="bibr" rid="ref39">Friston, 2018</xref>). In a nutshell, the PC framework states that our neural networks constantly predict sensory aspects of the environment based on the statistical regularities of the natural world. This prediction generates a perceptual model which is compared with the incoming sensory inputs and corrected in case of mismatch (<xref ref-type="bibr" rid="ref96">Rao and Ballard, 1999</xref>; <xref ref-type="bibr" rid="ref40">Friston and Kiebel, 2009</xref>). One can easily see the parallelism with Bayesian modeling in statistics, whereby a hypothetical, a priori model of a certain phenomenon is continuously updated on the basis of newly collected data. What the system &#x2013; in this case, the brain &#x2013; should focus its energy on is minimizing the prediction error (<xref ref-type="bibr" rid="ref40">Friston and Kiebel, 2009</xref>; <xref ref-type="bibr" rid="ref39">Friston, 2018</xref>). This inferential view of our cognition and sensorium, also described as the Bayesian brain hypothesis, has gained growing consensus and has become one of the most dominant models in cognitive neuroscience. The PC approach has been applied in multiple fields of science. A relevant example is its application to explaining interoceptive awareness or accuracy (<xref ref-type="bibr" rid="ref109">Seth et al., 2012</xref>; <xref ref-type="bibr" rid="ref3">Ainley et al., 2016</xref>). In the study of <xref ref-type="bibr" rid="ref3">Ainley et al. (2016)</xref>, the sensitivity to internal bodily information is explained as the ability to adapt a &#x201C;prior&#x201D; representation of the body state to the actual interoceptive sensation, by continuously adjusting and minimizing the prediction errors. If these predictive dynamics hold true for interoceptive inputs as well, and not only for what we expect to perceive from the external environment, then the model can be extended to partially explain our emotional responses (<xref ref-type="bibr" rid="ref108">Seth, 2013</xref>; <xref ref-type="bibr" rid="ref9">Barrett and Satpute, 2019</xref>). In fact, emotions can be seen as predictions of reactions to an event based on past experiences that generated models of behavior. They can represent the active inference of how to respond to a particular situation, and be created from memory with the possibility of future adjustments. In a more articulated manner, <xref ref-type="bibr" rid="ref108">Seth (2013)</xref> integrates the interoceptive predictive coding hypothesis into a coherent understanding of emotional content as an &#x201C;active top-down inference of the causes of interoceptive signals.&#x201D; Interoception is no longer considered a one-directional collection of bodily sensations from peripheral receptors to the central processor, but rather a process shaped by top-down inferences and influences along with bottom-up error-correction processes. In summary, the predictive coding account of emotions defines them as active inferences on the causes of physiological changes, both at internal and external representational levels (<xref ref-type="bibr" rid="ref108">Seth, 2013</xref>; <xref ref-type="bibr" rid="ref9">Barrett and Satpute, 2019</xref>).</p>
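The prediction-and-correction cycle described above can be illustrated with a minimal numerical sketch (our own illustrative example, not drawn from the cited literature): an internal estimate is repeatedly nudged toward the incoming signal by a fraction of the prediction error, so that the error shrinks over successive updates. The function name and the learning-rate value are hypothetical.

```python
# Minimal illustrative sketch of a predictive-coding-style update rule.
# A perceiver holds a "prior" estimate of a sensory quantity and corrects
# it by a fraction of the prediction error, as in simple error-minimization
# and Bayesian-updating schemes. All names and values are hypothetical.

def update_estimate(prior, observation, learning_rate=0.2):
    """Move the internal estimate toward the observation by a fraction
    of the prediction error (observation - prior)."""
    prediction_error = observation - prior
    return prior + learning_rate * prediction_error

# A stable sensory input of 10.0 observed repeatedly: the estimate
# converges and the prediction error shrinks toward zero.
estimate = 0.0
for _ in range(30):
    estimate = update_estimate(estimate, 10.0)

print(round(estimate, 3))  # close to 10.0
```

With a constant input, the residual error decays geometrically (by a factor of 1 minus the learning rate per update), which is the sense in which the system "minimizes prediction error" over time.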
<p>Again, the &#x201C;strange inversion&#x201D; brought about by this theoretical account goes in the same direction as the other frameworks described so far, which point out the inferential, constructive, and active functioning of the perceiving organism, whilst leaving behind the idea of the brain &#x201C;as a glorious stimulus&#x2013;response link&#x201D; (<xref ref-type="bibr" rid="ref39">Friston, 2018</xref>). Following this discussion, it is probable that the &#x201C;naive realism&#x201D; supported by traditional approaches, whereby what we see, smell and feel is a faithful representation of what is &#x201C;out there,&#x201D; will be progressively replaced by a view of our visual system as grounded in action capabilities and social influences (<xref ref-type="bibr" rid="ref94">Proffitt and Linkenauger, 2013</xref>; <xref ref-type="bibr" rid="ref93">Proffitt and Baer, 2020</xref>). The current review aims at pushing cognitive research in this direction, highlighting the deep influences that our inner states and social needs exert on perceptual processing. In the ensuing sections, we will argue that the visual salience of an object is given by its affordances (see definition in Box 1) to satisfy our physiological or social needs. In other words, we project onto the objects surrounding us different degrees of desirability &#x2013; based on our bodily and psychological states &#x2013; and perceive them accordingly, inevitably binding our perception to our action possibilities.</p>
<boxed-text id="box1" position="float">
<p><bold>BOX 1 Definition of affordances.</bold></p>
<p>The term was introduced by James Gibson within his ecological approach to the study of vision, and indicates the totality of actions that an object allows one to perform. The affordances of an object can be defined as the potential actions elicited by the sight of that object. For example, seeing an apple suggests the actions of eating it, grabbing it, or moving it. A chair &#x201C;affords&#x201D; being sat on, and so forth.</p>
</boxed-text>
</sec>
<sec id="sec4">
<label>2.2.</label>
<title>Our metabolic energies influence visual perception</title>
<p>Our action possibilities are delimited by the resources at our disposal and the costs of the actions we want to undertake. Using Proffitt&#x2019;s words, &#x201C;survival for any organism, including people, is a matter of resource management&#x201D; (<xref ref-type="bibr" rid="ref46">Gross and Proffitt, 2013</xref>). Our brain calculates the costs and opportunities associated with every movement, and this constant evaluation is carried out automatically and unconsciously, since conscious monitoring of it would overload our executive functions (<xref ref-type="bibr" rid="ref92">Proffitt, 2006</xref>; <xref ref-type="bibr" rid="ref46">Gross and Proffitt, 2013</xref>).</p>
<p>First and foremost, our ability to perform an action depends on our body characteristics: our body size determines what we can reach as well as what we can see (<xref ref-type="bibr" rid="ref66">Linkenauger and Proffitt, 2008</xref>; <xref ref-type="bibr" rid="ref113">Sugovic et al., 2016</xref>). The effect of body mass on perceived distance was assessed in an experiment by <xref ref-type="bibr" rid="ref113">Sugovic et al. (2016)</xref> in which normal-weight, overweight, and obese individuals were asked to estimate the same distance and report their beliefs concerning their body size. They found that perceived distance was mainly affected by physical body weight and that this effect was independent of personal beliefs. Specifically, they found that the heavier the person, the greater the estimation of distance (<xref ref-type="bibr" rid="ref113">Sugovic et al., 2016</xref>). The physiological state of our body also plays a major role in what we can perform and how. Being tired, out of training, or carrying a weight are all factors that can diminish our potential to perform actions (<xref ref-type="bibr" rid="ref92">Proffitt, 2006</xref>; <xref ref-type="bibr" rid="ref66">Linkenauger and Proffitt, 2008</xref>; <xref ref-type="bibr" rid="ref93">Proffitt and Baer, 2020</xref>). A reduction of our action possibilities translates into an adjustment of environmental perception. In fact, it has been shown that perceived distances increase when our energies are scarce and decrease when we are fit and well-trained (<xref ref-type="bibr" rid="ref128">Zadra et al., 2016</xref>; <xref ref-type="bibr" rid="ref93">Proffitt and Baer, 2020</xref>). In the same fashion, <xref ref-type="bibr" rid="ref13">Bhalla and Proffitt (1999)</xref> showed that the steepness of a hill was overestimated by participants asked to carry a backpack, by fatigued runners, and by those in poor physical and health conditions, compared to participants at full strength (e.g., not carrying a backpack, or fit and in good health). A further example of how sugar intake and fitness level affect visual estimates was provided by <xref ref-type="bibr" rid="ref128">Zadra et al. (2016)</xref>. The authors asked participants to judge distances after physical exercise. Prior to the physical activity, half of the participants received a carbohydrate supplement, whereas the other half received a placebo. They observed that those who received the supplement judged the distance to be shorter compared to the placebo group. They also found that perceived distance correlated with other measures of fitness, such as blood glucose, heart rate (HR), and caloric expenditure under physical fatigue, further confirming the influence of bioenergetic resources on perceptual processing (<xref ref-type="bibr" rid="ref128">Zadra et al., 2016</xref>). Interestingly, <xref ref-type="bibr" rid="ref21">Changizi and Hall (2001)</xref> demonstrated that thirsty people perceive ambiguous visual stimuli as more transparent than non-thirsty subjects, possibly due to the implicit association of transparency with water.</p>
</sec>
<sec id="sec5">
<label>2.3.</label>
<title>Tools extend our action possibilities, modifying the perception of our surroundings</title>
<p>Besides the dimensions and fitness of our body, another factor that can determine our potential to perform actions is the use of tools. In fact, tools can extend our reach into extrapersonal space, including farther objects in our area of manipulability. Objects within our reach are automatically perceived as candidates for potential actions, and this implies a different perceptual processing (for a review, see <xref ref-type="bibr" rid="ref17">Brockmole et al., 2013</xref>). Indeed, compared to objects we cannot touch or immediately interact with, they are visually scanned more attentively and with a detail-oriented processing style in order to enable appropriate action responses (<xref ref-type="bibr" rid="ref17">Brockmole et al., 2013</xref>). Hence, holding a tool that increases the area of our possible interaction with the surroundings immediately modifies the visual perception of what would otherwise have been beyond our reach. For example, when patients whose hemispatial neglect (a neuropsychological condition characterized by reduced awareness of visual stimuli in one side of the visual field, not accompanied by a sensory deficit) was limited to near space were provided with a stick and asked to perform a line bisection task in far space, the neglect expanded to include the area reachable with the tool (<xref ref-type="bibr" rid="ref12">Berti and Frassinetti, 2000</xref>). This demonstrates that the artificial extension of our reach remodels peripersonal space and the perception of far and near objects, suggesting once more that our chances of interaction with the environment have deep effects on how we perceive it.</p>
</sec>
<sec id="sec6">
<label>2.4.</label>
<title>Social baseline theory: Social resources can directly alter our visual perceptions</title>
<p>Metabolic energies not only weigh on our capacity to undertake any action, but also influence our social relationships. Supportive social networks allow distributing the efforts of any endeavor and protect the individual from potential dangers. It has even been argued that receiving help from other humans is a matter of survival and that the greatest human strength is other humans (<xref ref-type="bibr" rid="ref82">Oishi et al., 2013</xref>). Indeed, we are born and raised within a social environment that provides for our basic needs until we can take care of ourselves. Even then, we remain embedded in social networks for the rest of our lives (<xref ref-type="bibr" rid="ref46">Gross and Proffitt, 2013</xref>). Interestingly, similarly to what has been observed for metabolic resources, it has been demonstrated that social resources, too, influence our perception of the environment.</p>
<p>The Social Baseline Theory (SBT), formulated by <xref ref-type="bibr" rid="ref200">Beckes and Coan (2011)</xref>, describes interindividual differences in reacting to social support. According to this theoretical account, social support is considered as a default precondition of our actions and determines the baseline from which we calculate the amount of energy available (<xref ref-type="bibr" rid="ref200">Beckes and Coan, 2011</xref>; <xref ref-type="bibr" rid="ref24">Cole et al., 2013</xref>; <xref ref-type="bibr" rid="ref46">Gross and Proffitt, 2013</xref>). Nevertheless, investigations on the role of social resources in perception have been largely neglected in the literature, especially if compared to the wealth of evidence on the role of physiological states (<xref ref-type="bibr" rid="ref46">Gross and Proffitt, 2013</xref>). However, accumulating evidence indicates that the presence of a significant other acts as an empowering factor when facing difficulties or specific tasks. For example, <xref ref-type="bibr" rid="ref29">Doerrfeld et al. (2012)</xref> observed that when participants were asked to lift a box they tended to judge its weight as lighter if they knew they would receive help, compared to when they knew they would lift the box without any help. <xref ref-type="bibr" rid="ref103">Schnall et al. (2008)</xref> replicated Proffitt&#x2019;s study on slant perception where slant estimates varied as a function of the physical and health conditions of the observer (see subsection 2.2), but this time the social support factor was also considered. Participants could either be accompanied by a friend or imagine the presence of another person (friendly or not). In both cases, social support decreased the perception of the steepness of the hill. Other studies have tried to shed light on the ways in which social inclusion or exclusion impact our sensory systems. These studies revealed a wide range of effects on visual perception. 
For instance, the feeling of being understood is another factor that appears to influence distance and steepness perception. In an experiment by <xref ref-type="bibr" rid="ref82">Oishi et al. (2013)</xref>, participants were judged by an evaluator using a few adjectives chosen from a list from which the same participants had previously picked words to describe themselves. In the understanding condition, participants received the same judgments they themselves had chosen, while in the misunderstanding condition the evaluator judged the participants with words that largely differed from their self-assessment. After the evaluation session, participants performed a slant estimation task similar to the one used in Bhalla and Proffitt&#x2019;s (1999) study. Results showed that feeling understood decreased slope estimates compared to the misunderstanding condition (<xref ref-type="bibr" rid="ref82">Oishi et al., 2013</xref>).</p>
<p>To conclude, we can claim that vision supports our capacity for action by modulating the perceived size and distance of objects. Our potential to perform actions is, in turn, determined by our dispositional bioenergetic and social resources. Therefore, we see the environment according to the dispositions of our bodily states and of our social network.</p>
<p><xref rid="tab1" ref-type="table">Table 1</xref> provides a summary of the studies cited in this section.</p>
<boxed-text position="float" id="box2">
<p><bold>BOX 2 Definitions of proprioception and body schema.</bold></p>
<p>Proprioception is the capacity to perceive and detect the position of our own body in space, as well as the state of activation of our muscles, without the support of sight. The collection of combined signals from sensory receptors in the muscles, skin, and joints allows us to be aware of our limbs&#x2019; positions and movements, and it is - in this sense - a fundamental aspect of motor control. In fact, proprioceptive information is integrated into our body schema, which combines peripheral inputs with central (brain) processes in order to guide the execution of any action or movement. The body schema has been defined as the body representation for action. Indeed, it can extend to incorporate any tool we are holding, enabling the sophistication of human tool-use abilities.</p>
</boxed-text>
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption>
<p>Summary table of the studies cited in section 2.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Study</th>
<th align="left" valign="top">Sample</th>
<th align="left" valign="top">Stimuli</th>
<th align="left" valign="top">Design</th>
<th align="left" valign="top">Results</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref114">Sun et al. (2021)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;18 (13 females, mean age&#x2009;=&#x2009;22.9)</td>
<td align="left" valign="top">Target rectangles placed on top of different stimuli</td>
<td align="left" valign="top">The different stimuli were selected to induce positive (toy squirrels), negative (toy rats), or neutral (wooden blocks) emotions. Participants were asked to estimate the distance to and the size of the targets</td>
<td align="left" valign="top">Participants perceived the target object on top of the toy rats (induced negative stimulus) as significantly closer and larger than the same target object resting on top of the toy squirrels (induced positive stimulus), suggesting that there is a perceptual bias even within reachable personal space</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3"><xref ref-type="bibr" rid="ref7">Balcetis and Dunning (2009)</xref></td>
<td align="left" valign="top"><italic>Experiment 1: N</italic> =&#x2009;63</td>
<td align="left" valign="top">Ambiguous figures</td>
<td align="left" valign="top">Taste-testing of a desirable and a non-desirable beverage. The choice of the beverage/food was based on seeing one of the two possible interpretations of the ambiguous stimulus</td>
<td align="left" valign="top">Participants&#x2019; desire to obtain the desirable reward influenced their interpretation of the ambiguous figure</td>
</tr>
<tr>
<td align="left" valign="top"><italic>Experiment 2: N</italic> =&#x2009;43</td>
<td align="left" valign="top"><italic>Experiment 1:</italic> Number-letter figure, B / 13.</td>
<td/>
<td/>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 2</italic>: Seal-horse figure</td>
<td/>
<td/>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref7">Balcetis and Dunning (2009)</xref>
</td>
<td align="left" valign="top"><italic>Experiment 1</italic>: <italic>N</italic> =&#x2009;90</td>
<td align="left" valign="top">Desirable objects<italic>Experiment 1</italic>: A bottle of water</td>
<td align="left" valign="top"><italic>Experiment 1.</italic> Participants were given either a salty snack or a glass of water and then were asked to estimate the distance between them and a bottle of water</td>
<td align="left" valign="top">Perceptions of distance depend in part on the desirability of the perceived object&#x2014;which depends, in turn, on its capacity to satisfy a visceral or intrapsychic need</td>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 2a</italic>: <italic>N</italic> =&#x2009;123</td>
<td align="left" valign="top"><italic>Experiment 2a</italic>: A 100 dollar bill</td>
<td><italic>Experiment 2a</italic>. Participants were offered the chance of winning a 100 dollar bill in a card game or only a candy and then asked to estimate the distance from the bill</td>
<td/>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 2b: N</italic> =&#x2009;89</td>
<td align="left" valign="top"><italic>Experiment 2b:</italic> A survey containing self-relevant feedback</td>
<td align="left" valign="top"><italic>Experiment 2b.</italic> A survey on sense of humour was ostensibly graded by the experimenter, and participants were given either positive or negative feedback; the survey was then placed at a distance and participants were asked to judge how far away it was</td>
<td/>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref128">Zadra et al. (2016)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;8 (3 females, mean age 26.38)</td>
<td align="left" valign="top">Walkable distances</td>
<td align="left" valign="top">A host of physiological measures were recorded as participants engaged in exercise on 2 occasions: once while provided with a carbohydrate supplement and once with a placebo. Distance estimates were made before and after exercise on both occasions</td>
<td align="left" valign="top">The carbohydrate manipulation caused decreased distance estimates relative to the placebo condition. Individual differences in physiological measures that are associated with physical fitness predicted distance estimates both before and after the experimental manipulations</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref18">Bruner and Goodman (1947)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;30 (mean age&#x2009;=&#x2009;10)</td>
<td align="left" valign="top">Disk of different sizes. The disks could either be neutral metal items or coins of different value</td>
<td align="left" valign="top">The children were asked to estimate the size of the metal disk/coin by adjusting the diameter of a circle of light projected on a screen</td>
<td align="left" valign="top">Coin size was overestimated, while size estimates for the neutral disks were closer to reality. The higher the coin value, the greater the overestimation. Poorer children overestimated the size of the coins more than wealthier children did</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref21">Changizi and Hall (2001)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;74</td>
<td align="left" valign="top">Visual stimuli &#x201C;definitely,&#x201D; &#x201C;ambiguously,&#x201D; or &#x201C;definitely not&#x201D; transparent</td>
<td align="left" valign="top">Participants had to judge the stimulus as &#x201C;transparent&#x201D; or &#x201C;not transparent&#x201D; in two conditions: after having eaten a bag of chips (thirsty group) or after having drunk water (non-thirsty group)</td>
<td align="left" valign="top">The thirsty group showed a greater inclination to judge the ambiguous stimuli as transparent</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="2"><xref ref-type="bibr" rid="ref13">Bhalla and Proffitt (1999)</xref>
</td>
<td align="left" valign="top"><italic>Experiment 1. N</italic> =&#x2009;130 (65 females)</td>
<td align="left" valign="top">Slant of a hill</td>
<td align="left" valign="top">Participants were asked to judge how steep the hill was in three ways: verbally, visually, and haptically</td>
<td align="left" valign="top">All the experiments demonstrated an effect of load, fatigue, or fitness on the slant estimates, but only for verbal and visual assessments</td>
</tr>
<tr>
<td/>
<td/>
<td align="left" valign="top"><italic>Experiment 1.</italic> Participants performed the task in two conditions: wearing a backpack and without it.</td>
<td align="left" valign="top"><italic>Experiment 1.</italic> Wearing the backpack influenced the slant estimates, making the hill look steeper.</td>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 2</italic>. <italic>N</italic> =&#x2009;40 (20 females)</td>
<td/>
<td align="left" valign="top"><italic>Experiment 2.</italic> Participants had to give slant estimates of a hill before a long run (45 to 75&#x2009;min) and of another hill after the run</td>
<td align="left" valign="top"><italic>Experiment 2.</italic> The hill estimates were higher after the exhausting run</td>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 3</italic>. <italic>N</italic> =&#x2009;74 (35 females)</td>
<td/>
<td align="left" valign="top"><italic>Experiment 3.</italic> Different fitness measures (heart rate in different conditions and body mass index) were taken from the participants before or after asking them to judge the slant of the hill</td>
<td align="left" valign="top"><italic>Experiment 3.</italic> A higher level of fitness corresponded to a lesser overestimation of the hill&#x2019;s slant</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref113">Sugovic et al. (2016)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;66 (30 female, mean age&#x2009;=&#x2009;24.4)</td>
<td align="left" valign="top">A cone presented on the sidewalk (4 target distances)</td>
<td align="left" valign="top">After making four distance estimates, participants completed a survey. Along with demographic questions, the survey asked them to indicate their height, weight, and an evaluative measure of body size</td>
<td align="left" valign="top">A person&#x2019;s body weight influenced perceived distance: Those who weighed more than others perceived distances to be farther</td>
</tr>
<tr>
<td align="left" valign="top" colspan="5">Studies involving social manipulation</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref29">Doerrfeld et al. (2012)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;43 (mean age&#x2009;=&#x2009;22.5)</td>
<td align="left" valign="top">Box weight</td>
<td align="left" valign="top">Participants were asked to judge the weight of a box filled with potatoes in two conditions: alone or with someone else</td>
<td align="left" valign="top">Participants in the joint condition judged the weight as lighter than in the solo condition</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref300">Harber et al. (2011)</xref>
</td>
<td align="left" valign="top"><italic>Experiment 1</italic>. <italic>N</italic> =&#x2009;107 (63% female; mean age&#x2009;=&#x2009;20.8)</td>
<td align="left" valign="top"><italic>Experiment 1.</italic> Distance estimates at 3 measurement points</td>
<td align="left" valign="top"><italic>Experiment 1</italic>. Stimuli were either a live tarantula or a cat toy. Participants were primed with positive, negative or neutral self-worth conditions</td>
<td align="left" valign="top">Resources moderate the perception of physical threats (dangerous animals, hazardous heights), and are not limited to the implicit calculus of metabolic costs (i.e., how much physical effort a situation might demand, relative to one&#x2019;s physical resources)</td>
</tr>
<tr>
<td/>
<td align="left" valign="top"><italic>Experiment 2. N&#x2009;=</italic> 91 (64.9% female; mean age&#x2009;=&#x2009;20.18)</td>
<td align="left" valign="top"><italic>Experiment 2.</italic> Height estimates from the fourth floor of a building</td>
<td align="left" valign="top"><italic>Experiment 2.</italic> Participants were either left free to hold on to the handrail or were denied this possibility by having their hands tied behind their back. Self-esteem was manipulated</td>
<td/>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref24">Cole et al. (2013)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;48 (100% females)</td>
<td align="left" valign="top">A male confederate</td>
<td align="left" valign="top">Participants were shown a video of the male confederate in which he would appear threatening, disgusting or neutral. Then they were asked for distance estimates</td>
<td align="left" valign="top">Experimentally induced social signals of threat (but not disgust) led to perceived proximity</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref82">Oishi et al. (2013)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;202 (112 female; undergraduate students)</td>
<td align="left" valign="top">Pain endurance (hand in icy water), slant perception, distance perception</td>
<td align="left" valign="top">Feelings of being understood or misunderstood were induced by judging participants with adjectives that either matched or mismatched their own self-descriptions</td>
<td align="left" valign="top">Participants in the understanding condition were able to put their hands in icy water for a longer period of time, perceived the target locations to be closer, and perceived the same hill to be less steep than those in the misunderstanding condition</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref103">Schnall et al. (2008)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;34 (19 female; mean age&#x2009;=&#x2009;19.94&#x2009;years)</td>
<td align="left" valign="top">Slant of a hill</td>
<td align="left" valign="top">Participants judged the hill slant verbally, visually, and haptically in different conditions: alone or with a friend</td>
<td align="left" valign="top">Participants with a friend, compared to those alone, saw the hill as less steep. The longer participants knew their friends, the less steep they estimated the hill to be, on both the verbal and visual measures</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
</sec>
<sec id="sec7">
<label>3.</label>
<title>Perceiving the physical and the social world</title>
<p>After having grounded our discussion in a constructive and embodied perspective of visual perception, we can now focus on its integration in models of social cognition. We will start by drawing a parallel between the research approaches used in visual perception and those used in social cognition. In both cases, we will conclude that adopting a multimodal integrative perspective can better represent the intertwined and complex nature of these processes. We will then present the embodied account of social cognition, which we believe to be the most accurate explanation of how we understand one another. Finally, we will provide empirical evidence for a common mechanism for mapping social and physical distances, a further confirmation of the tight link between social cognition and visuospatial perception mediated by the processing of body indices.</p>
<sec id="sec8">
<label>3.1.</label>
<title>A parallel between approaches in perception and social cognition</title>
<p>The issue of how we represent objects and events in our mind is, and has always been, a central theme in the cognitive sciences, and for a long time these representations were described as symbolic and amodal. In the field of perception, for example, the main assumption was that we construct abstract representations of external inputs through mechanisms of feature extraction and categorization (<xref ref-type="bibr" rid="ref65">Lindblom, 2020</xref>). Such a view takes inspiration from Fodor&#x2019;s modularity, according to which encapsulated perceptual modules in the brain transmit sensory information to higher processing levels that manipulate it in the form of symbolic representations (<xref ref-type="bibr" rid="ref37">Fodor, 1983</xref>). The same form of representation - amodal and disembodied - has also been used in the study of social cognition. According to traditional accounts, people process social information by means of categories, schemata, feature lists, semantic networks, and so forth (<xref ref-type="bibr" rid="ref62">Landau et al., 2010</xref>). However, despite their clarity and linearity, these theories do not account for the multimodal nature of perceptual and social experiences, in which high- and low-level cognitive processes strongly interact (<xref ref-type="bibr" rid="ref129">Zaki and Ochsner, 2012</xref>). More recently, models in which perception and cognition behave as coupled systems are gaining new ground. In the same way, alternative paradigms of social cognition stemming from theories of embodied cognition (<xref ref-type="bibr" rid="ref45">Goldstone and Barsalou, 1998</xref>; <xref ref-type="bibr" rid="ref79">Niedenthal et al., 2005</xref>; <xref ref-type="bibr" rid="ref43">Gallese, 2009</xref>) are challenging the idea of amodal representations of social information. 
We endorse the adoption of multimodal and integrative models of both perception and social cognition, confident that without acknowledging a common basis for perceptual and conceptual processing of physical and social events, our understanding of the brain and the mind would remain incomplete.</p>
</sec>
<sec id="sec9">
<label>3.2.</label>
<title>An embodied account of social cognition</title>
<p>Empathy is the ability to understand others&#x2019; inner states by explicitly inferring them from available contextual information or by internally simulating them (<xref ref-type="bibr" rid="ref129">Zaki and Ochsner, 2012</xref>; <xref ref-type="bibr" rid="ref107">Sessa et al., 2014</xref>; <xref ref-type="bibr" rid="ref72">Meconi et al., 2018</xref>). According to the embodied account of social cognition, we understand other people&#x2019;s mental states by reproducing them in ourselves (e.g., <xref ref-type="bibr" rid="ref79">Niedenthal et al., 2005</xref>). This is achieved by internally mimicking the same sensorimotor patterns observed in others, which recall specific psychological states we experienced in association with that physical expression (<xref ref-type="bibr" rid="ref42">Gallese, 2007</xref>, <xref ref-type="bibr" rid="ref43">2009</xref>). This was already shown in a study by <xref ref-type="bibr" rid="ref31">Duclos et al. (1989)</xref> in which participants were asked to mimic negative emotional expressions (fear, sadness, anger) by contracting specific muscles. In the first experiment, the expressions were limited to the face, while a second experiment involved full-body simulation. Participants were led to believe that the study concerned brain lateralization and that the muscle contractions served as a concurrent task whose function was to overload the cognitive system. Finally, they had to report their feelings throughout the experiments, choosing among different emotions and rating their intensity. Although participants were na&#x00EF;ve to the aims of the experiment, they reported higher intensity for the emotions they were mimicking at that moment, both for facial and full-body expressions, strengthening the idea that the activation of specific sensorimotor schemas elicits the corresponding embodied feelings.</p>
<p>The discovery of mirror neurons was a fundamental step in grounding embodied social cognition, because mirror neurons are considered one route to the development of our ability to understand others&#x2019; actions. Human mirror neurons appear to be widely distributed across the brain, with peaks of concentration in the premotor and somatosensory cortices (<xref ref-type="bibr" rid="ref42">Gallese, 2007</xref>; <xref ref-type="bibr" rid="ref34">Fabbri-Destro and Rizzolatti, 2008</xref>; <xref ref-type="bibr" rid="ref10">Bastiaansen et al., 2009</xref>; <xref ref-type="bibr" rid="ref43">Gallese, 2009</xref>; <xref ref-type="bibr" rid="ref56">Keysers et al., 2010</xref>; <xref ref-type="bibr" rid="ref78">Mukamel et al., 2010</xref>). The peculiarity of these neurons lies in the fact that they fire not only when we perform a specific action, but also when we see the same action performed by someone else. Traditionally, research on human mirror neurons has relied on fMRI to identify which areas become more active during the observation of another person, which has allowed the mapping of the neural circuitries that exhibit mirroring properties. Recently, <xref ref-type="bibr" rid="ref87">Paradiso et al. (2021)</xref> reviewed the literature across species to reveal which brain areas are involved in empathic reactions. Empathy is the ability to resonate with others&#x2019; inner states and to explicitly understand them (often referred to as affective and cognitive empathy, respectively). Numerous studies on animal models have shown converging results: the anterior cingulate cortex and the amygdala emerged as the main areas involved in empathy-related phenomena. The same authors also reviewed the literature on the role of analgesics in modulating prosocial behavior, which shows that reducing pain perception hinders the ability to empathize with the pain of others. 
In fact, the most recent trends in research on empathy focus on the mechanisms of empathy for pain (<xref ref-type="bibr" rid="ref100">R&#x00FC;tgen et al., 2015</xref>, <xref ref-type="bibr" rid="ref99">2018</xref>; <xref ref-type="bibr" rid="ref61">Lamm et al., 2019</xref>). This research direction has sought to provide mechanistic explanations for simulation models by selectively disrupting specific subprocesses - nociception in this case - with different techniques (e.g., tDCS, analgesics) to verify their involvement in cognitive processes such as empathy for pain (<xref ref-type="bibr" rid="ref15">Bonini et al., 2022</xref>; <xref ref-type="bibr" rid="ref69">Maggio et al., 2022</xref>). The empathic experience of others&#x2019; pain has been widely examined by R&#x00FC;tgen and Lamm, who conducted several studies on the role of our own nociception in the ability to recognize and understand others&#x2019; pain. In a 2015 fMRI experiment, the researchers manipulated participants&#x2019; nociception (i.e., the encoding of noxious stimuli) by inducing placebo analgesia in half of the participants. fMRI data were collected while painful electrical stimulation was delivered either to the participant or to another person present in the scanner room. Results showed reduced activation of the anterior insular and midcingulate cortices, areas typically involved in empathic responses to pain, in the group that received placebo analgesia compared to the control group, which did not receive any treatment. 
Along with other evidence (<xref ref-type="bibr" rid="ref100">R&#x00FC;tgen et al., 2015</xref>, <xref ref-type="bibr" rid="ref99">2018</xref>), these findings are in line with studies showing that both incidental (<xref ref-type="bibr" rid="ref38">Forkmann et al., 2015</xref>) and voluntary (<xref ref-type="bibr" rid="ref35">Fairhurst et al., 2012</xref>) reinstatement of autobiographical pain involves the partial recruitment of the brain areas that encoded nociceptive stimuli at the time of memory formation. Indeed, memories of autobiographical physical pain augment participants&#x2019; cognitive empathy for other individuals depicted in similar physically painful situations (<xref ref-type="bibr" rid="ref74">Meconi et al., 2021</xref>).</p>
<p>Neuromodulation and lesion studies are also well suited to pinpointing the networks underlying mechanisms of embodied social cognition. One such example is the experiment by <xref ref-type="bibr" rid="ref64">Lenzoni et al. (2020)</xref>, which demonstrated a causal relationship between body expression and emotion recognition. Using a matching task of faces and bodies, the authors measured social abilities in patients with myotonic dystrophy, a neuromuscular disease that induces strong sensorimotor limitations. The clinical population performed significantly worse than the group of healthy controls, demonstrating a causal role of visuomotor abilities in emotion recognition. In a review by <xref ref-type="bibr" rid="ref57">Keysers et al. (2018)</xref>, neuromodulation and lesion studies were presented as evidence for the fundamental role of the primary somatosensory cortex and the mirror neuron network (parieto-premotor areas) in understanding and predicting the actions of others, which is in turn connected with the ability to recognize their emotions.</p>
<p>The relevance of bodily states to our social judgments is also rooted in our language. For example, feelings of affection and love are usually described as warm, like the experience of a hug, while loneliness and social distance are typically associated with cold attributes (e.g., &#x201C;giving someone a cold shower,&#x201D; &#x201C;cold-hearted&#x201D;). <xref ref-type="bibr" rid="ref54">Ijzerman and Semin (2009)</xref> provided evidence for this deep interdependence between language, perception, and social behavior. The authors induced different temperature conditions by asking participants to rate the social proximity they felt with a known person of their choice while holding cold or hot beverages. As it turned out, the warm condition was associated with greater social proximity compared to the condition of holding a cold beverage.</p>
<p>Taken as a whole, these findings suggest that bodily experiences can play a preconscious and automatic role in shaping explicit awareness and in guiding our interactions with the world. We can even state that without embodying our own and others&#x2019; psychological states, we are denied the possibility of understanding them. Such a conclusion points again to the necessity of adopting an integrative approach for studying both perceptual and cognitive mechanisms. In the next subsection, we provide further evidence for the reliance of cognitive processes on perceptual ones, by showing that we recruit the same neural networks dedicated to visuospatial representations of distances to represent different degrees of social proximity.</p>
</sec>
<sec id="sec10">
<label>3.3.</label>
<title>When the social meets the spatial: Interpersonal distances</title>
<p>A large body of literature highlights that we use overlapping systems for assessing social proximity and physical distances. For instance, <xref ref-type="bibr" rid="ref8">Bar-Anan et al. (2007)</xref> used a Stroop-like task in which words indicating close or distant social affiliation (&#x201C;us&#x201D; or &#x201C;enemy&#x201D;) were positioned in closer or farther perspectives. Participants had to indicate whether the item&#x2019;s position on the screen was proximal or distal, independently of the meaning of the word. Words were classified faster when the psychological and spatial distances matched than when the two types of distance were incongruent. For example, when the word &#x201C;us&#x201D; was written in a close-up position in the scenario, the response time was shorter compared to the condition in which the same word (indicating social proximity) was positioned in the background of the scenario. The authors interpreted this finding in terms of a common mechanism for the processing of spatial and psychological distances (see definition in Box 3), which would explain the slower responses in the incongruent condition as due to the activation of incongruent representations along the same neural pathway.</p>
<p>Another important line of research supporting this view investigates interpersonal distance in social interaction. It is commonly known that we adjust our position in relation to our intimacy with the people around us (<xref ref-type="bibr" rid="ref47">Hall, 1963</xref>; <xref ref-type="bibr" rid="ref48">Hall et al., 1968</xref>; <xref ref-type="bibr" rid="ref64">Lenzoni et al., 2020</xref>). This effect has been named and described in multiple ways. For instance, <xref ref-type="bibr" rid="ref117">Teneggi et al. (2013)</xref> defined peripersonal space as a multisensory-motor interface between body and environment and showed that its shrinkage or extension depends on the presence of and interaction with others. In this vein, <xref ref-type="bibr" rid="ref105">Serino (2019)</xref> extensively reviewed the literature on peripersonal space, highlighting the stretchable nature of this multisensory space and its role in mediating body-environment interactions. Furthermore, the author claimed that this physiological construct has the psychological consequence of defining the boundaries between ourselves and the external world, enabling bodily self-location and consciousness (<xref ref-type="bibr" rid="ref106">Serino et al., 2013</xref>; <xref ref-type="bibr" rid="ref14">Blanke et al., 2015</xref>; <xref ref-type="bibr" rid="ref81">Noel et al., 2018</xref>). It has also been suggested that peripersonal space plays an important role in body&#x2013;body interactions with other people.</p>
<p>To study precisely these body&#x2013;body dynamics, <xref ref-type="bibr" rid="ref60">Kroczek et al. (2020)</xref> used Virtual Reality (VR) to manipulate interpersonal distance in social interactions. Participants had to interact with one of two virtual agents represented in the VR scenario. They were instructed to approach the agent and start interacting as soon as the agent looked up at them. The authors manipulated the interaction distance by delaying the moment at which the virtual agent would notice the participant. They found that the closer the participant had to get to the virtual agent in order to be noticed, the &#x201C;more arousing, less pleasant, and less natural&#x201D; the interaction felt. Perception of close distances was also accompanied by increased levels of skin conductance. These results are consistent with the principles of Proxemics, whereby personal space is organized into concentric areas that determine how at ease we feel being close to another person - and hence how readily we can empathize with them - based on our level of intimacy with that person (<xref ref-type="bibr" rid="ref102">Schiano Lomoriello et al., 2018</xref>).</p>
<p>Proxemics is not the only discipline that has dealt with concepts of personal distance. Construal Level Theory has also attempted to explain the relationship between social, physical, and temporal distances in terms of psychological dimensions. What is meant by psychological dimension is the level of specificity or abstraction by which information is represented, which ranges from a low-level (incidental and specific) representation of events near us to a high-level (general and prototypical) representation of farther events (<xref ref-type="bibr" rid="ref53">Henderson et al., 2006</xref>; <xref ref-type="bibr" rid="ref119">Trope and Liberman, 2010</xref>). The possibility of a shared mechanism for the perception of these different dimensions of distance has been corroborated by fMRI studies showing activation of the same neural network during the processing of social and physical distances. For example, in an fMRI study, <xref ref-type="bibr" rid="ref125">Yamakawa et al. (2009)</xref> investigated the role of the parietal cortex in analytic representations of egocentric mapping, which is employed for processing both physical and social relationships. The authors asked participants to perform two tasks. In the first task, participants had to evaluate their physical distance from neutral objects displayed on a screen. In the second task, participants were shown two faces and had to choose the one with which they felt more compatible. Hemodynamic responses were collected during both tasks and revealed a common activation in the parietal cortex. The social distance task was also linked with the activation of extended regions dedicated to social cognition processes, such as the fusiform gyri, the bilateral medial frontal cortices, the inferior frontal cortices, the insular cortices, the left basal ganglia, and the amygdala. 
Nevertheless, the overlap in the parietal cortex seems to confirm a common neural substrate for the evaluation of spatial and social distances, and indicates that this area is part of the network dedicated to the processing of social stimuli.</p>
<p>It has been argued that the parietal cortex organizes complex social information into a self-referred map of social distances, guiding our spatial behavior toward others (<xref ref-type="bibr" rid="ref1">Abraham et al., 2008</xref>; <xref ref-type="bibr" rid="ref126">Yamazaki et al., 2009</xref>; <xref ref-type="bibr" rid="ref88">Parkinson and Wheatley, 2013</xref>). This supports the idea that visuospatial perception and social cognition are interconnected processes, subserved by a common substrate in the brain. The reciprocal influence of these two kinds of distance is becoming increasingly evident in the literature. For example, <xref ref-type="bibr" rid="ref102">Schiano Lomoriello et al. (2018)</xref> demonstrated that when people are physically distant from us, we are less prone to empathize with them. In other words, the feeling of social distance or proximity is modulated by the physical distance between us and the other person. The effect also works the other way around, in that social inferences (e.g., categorization and stereotyping) can tweak our perception of the physical world, as demonstrated by <xref ref-type="bibr" rid="ref124">Xiao and Van Bavel (2012)</xref> in their three studies on collective identity and identity threat. The authors found that threatening social situations were judged as spatially closer than non-threatening ones, reinforcing the idea that distance perception serves the function of adjusting our behavior in relation to our social and physical environment. A final remark concerns the application of the rules of physical and social distance not only to our egocentric perspective but also to the interpretation of social scenes in which several agents are interacting. The study by <xref ref-type="bibr" rid="ref130">Zhou et al. (2019)</xref>, among others, demonstrated that closer interpersonal distances, more direct interpersonal angles, and more open postures are all visual cues of ongoing interaction in a group of people. 
This study, along with other experiments on how we interpret social scenes, is described in greater detail in section 4.2, which is dedicated to the observation of multiple agents.</p>
<p>The studies reviewed in this third section confirm that our body is the arena where we enact our own and other people&#x2019;s feelings, and the key to our complex social abilities. A summary of the critical studies in support of this concept is presented in <xref rid="tab2" ref-type="table">Table 2</xref>. We can now finally explore in more detail how we use our vision to understand others, by describing the body indexes, such as posture and movement, that inform us about others&#x2019; psychological states. This is the aim of the next section.</p>
<boxed-text position="float" id="box3">
<p><bold>BOX 3 Definition of psychological distance.</bold></p>
<p>This concept was first proposed by Trope and Liberman in their Construal Level Theory, where it was defined as the level of abstraction used to represent a phenomenon based on its temporal distance: greater distance corresponds to greater abstraction. The theory now includes three other categories of psychological distance: spatial, social, and hypothetical. As also demonstrated in this review, these four dimensions are strongly and systematically correlated with each other. Psychological distance is inevitably egocentric: its center is the self in the present, and it serves as a measure of the value attributed to the phenomenon of interest. Closer events/agents are perceived as more important and more likely to be acted on.</p>
</boxed-text>
<table-wrap position="float" id="tab2">
<label>Table 2</label>
<caption>
<p>Summary table of the studies cited in section 3.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Study</th>
<th align="left" valign="top">Sample</th>
<th align="left" valign="top">Stimuli</th>
<th align="left" valign="top">Design</th>
<th align="left" valign="top">Results and theoretical implications</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top" colspan="5"><italic>Evidence for social embodiment</italic></td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref31">Duclos et al. (1989)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;74 (43 females, undergraduate students)</td>
<td align="left" valign="top">Faces expressing sadness, anger, fear and disgust</td>
<td align="left" valign="top">Participants were induced to adopt expressions of fear, anger, disgust, and sadness by contracting or relaxing specific facial muscles and were then asked to rate their emotional state after performing a distractor task designed to disguise the experiment&#x2019;s goal</td>
<td align="left" valign="top">Although participants were na&#x00EF;ve to the aims of the experiment, they reported the highest rating of emotion in the condition in which they were expressing that emotion, giving strength to the idea that the activation of specific sensorimotor schemas elicits those embodied feelings</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref100">R&#x00FC;tgen et al. (2015)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;102 (70 females, mean age&#x2009;=&#x2009;25)</td>
<td align="left" valign="top">Painful stimulation delivered to the participant or to another person</td>
<td align="left" valign="top">The researchers induced placebo analgesia in half of the participants and tested all participants&#x2019; pain perception and empathy. fMRI activation and self-reported pain and empathy ratings were collected</td>
<td align="left" valign="top">Compared to the control group, participants with induced placebo analgesia showed reduced first-hand pain perception and reduced pain empathy, suggesting that pain empathy might be grounded in our own pain experiences</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref64">Lenzoni et al. (2020)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;66 (42 patients with myotonic dystrophy; 24 healthy controls)</td>
<td align="left" valign="top">FEAST-N, the Facial Emotion Matching subtest, and BEAST-N, the Body Emotion Matching subtest</td>
<td align="left" valign="top">Emotional recognition ability was assessed in patients with myotonic dystrophy and compared with healthy controls</td>
<td align="left" valign="top">The clinical population performed significantly worse than the healthy controls, demonstrating a causal role of visuomotor abilities in emotion recognition</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref54">Ijzerman and Semin (2009)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;33 in Exp. 1; 52 in Exp. 2; 39 in Exp. 3</td>
<td align="left" valign="top">Inclusion of Other in Self (IOS) scale</td>
<td align="left" valign="top">Participants were asked to hold cold or hot beverages and rate the social proximity they felt with a known person of their choice using the IOS scale</td>
<td align="left" valign="top">Participants judged the distance between themselves and a known person as shorter when they were asked to hold a warm beverage, demonstrating an association between warmth and feelings of social proximity</td>
</tr>
<tr>
<td align="left" valign="top" colspan="5"><italic>Evidence for overlaps between social and physical space</italic></td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref8">Bar-Anan et al. (2007)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;10 (6 females, undergraduate students)</td>
<td align="left" valign="top">Different words defining close or distant social affiliation depicted in different locations in a scenario</td>
<td align="left" valign="top">Participants had to indicate if the position of an item on the screen was proximal or distal, independently of the meaning of the word</td>
<td align="left" valign="top">Words were classified faster when psychological and spatial distances were matching, compared to when the two types of distances were incongruent</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref60">Kroczek et al. (2020)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;36 (18 females, mean age&#x2009;=&#x2009;21.75)</td>
<td align="left" valign="top">Virtual agents represented in a VR scenario</td>
<td align="left" valign="top">Participants were instructed to approach the agents in the VR and start interacting as soon as the agent looked up at them. The interpersonal distance was varied by manipulating the distance at which agents reacted to the participant&#x2019;s approach. Arousal, valence, and realism ratings were collected after each interaction on a 1&#x2013;100 point scale</td>
<td align="left" valign="top">Closer interpersonal distances were rated as more arousing, less pleasant, and less natural than longer distances</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref125">Yamakawa et al. (2009)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;24 (4 females, age range&#x2009;=&#x2009;19&#x2013;34&#x2009;years)</td>
<td align="left" valign="top">Two inanimate objects whose relative physical positions could be inferred by texture and lighting cues (i.e., physical distance task). Pictures of two faces (i.e., social distance task)</td>
<td align="left" valign="top">In the physical distance task, participants had to judge which object was closer to them. In the social distance task, participants had to choose which person they felt more compatible with. fMRI data was acquired during both tasks</td>
<td align="left" valign="top">Results showed that the parietal cortex was activated in both tasks, suggesting a common neural substrate for the estimation of physical and social distances</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref102">Schiano Lomoriello et al. (2018)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;34 (23 females, mean age&#x2009;=&#x2009;23)</td>
<td align="left" valign="top">Pictures of faces with neutral facial expression, receiving either a painful or a neutral stimulation. All faces were presented in the upright and inverted orientation and in two physical sizes, small and big, corresponding to a perceived far and close social distance</td>
<td align="left" valign="top">The perceived physical distance from the stimuli was manipulated through picture sizes. Participants were asked to assess the painfulness of the stimulation applied to each face presented. EEG data was collected during the task</td>
<td align="left" valign="top">ERP modulations compatible with an empathic reaction were observed only for the group exposed to face stimuli appearing to be at a close social distance from the participants, i.e., big-size pictures. This reaction was absent in the group exposed to smaller stimuli corresponding to faces perceived from a far social distance</td>
</tr>
<tr>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref130">Zhou et al. (2019)</xref>
</td>
<td align="left" valign="top"><italic>N</italic> =&#x2009;148 in total across 7 experiments (mean age&#x2009;=&#x2009;20)</td>
<td align="left" valign="top">Virtual avatars placed at different positions and with different face directions</td>
<td align="left" valign="top">Participants had to report if the avatars in the VR environment were interacting</td>
<td align="left" valign="top">Results showed that closer interpersonal distances, more direct interpersonal angles, and more open postures are all visual cues of ongoing interaction in a group of people</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
</sec>
<sec id="sec11">
<label>4.</label>
<title>Reuniting visual perception and social cognition: The social body in neuroscientific research</title>
<p>Our interaction with others is substantially mediated by the observation of their behavior. As described above, we understand others&#x2019; inner states by embodying their posture and expression, which elicit specific affective responses that we cognitively interpret and recognize (see subsection 3.2 for the embodied account of social cognition). In other words, it is by observing and mirroring the bodies of others that we gain insight into their inner states. The aim of this section is to provide the reader with an overview of the most recent techniques and to inspire new lines of research in visual social cognition. We report a summary of the techniques we describe in <xref rid="tab3" ref-type="table">Table 3</xref>. We will distinguish techniques that are used to examine the posture, movement, and gait of individuals from those that are used to inspect the interactions of multiple agents. We will end this methodological part by reporting some evidence demonstrating the social function of our vision, followed by a discussion on the importance of reintegrating the whole body in the study of emotion processing and social cognition.</p>
<table-wrap position="float" id="tab3">
<label>Table 3</label>
<caption>
<p>Summary table of the techniques presented in sections &#x201C;Measures of posture, movement, and gait&#x201D;, &#x201C;Measures of observed social interactions&#x201D;, and &#x201C;Measures of the observer&#x201D;.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="top" colspan="4">Measures of posture, movement, and gait</th>
</tr>
<tr>
<th/>
<th align="left" valign="top">Technique</th>
<th align="left" valign="top">Measures</th>
<th align="left" valign="top">Studies</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0001.tif"/>Image adapted from <ext-link xlink:href="https://commons.wikimedia.org/wiki/File:Nachuo_sogi.svg" ext-link-type="uri">https://commons.wikimedia.org/wiki/File:Nachuo_sogi.svg</ext-link></td>
<td align="left" valign="top">Stride and walk photogram analysis</td>
<td align="left" valign="top">Stride length and walking speed</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref30">Doumas et al. (2012)</xref>; <xref ref-type="bibr" rid="ref111">Sloman et al. (1982)</xref></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0002.tif"/>
</td>
<td align="left" valign="top">Electronic walkway</td>
<td align="left" valign="top">Stride length and walking speed</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref63">Lemke et al. (2000)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0003.tif"/></td>
<td align="left" valign="top">Pressure-sensitive shoe insoles</td>
<td align="left" valign="top">Stride length and walking speed</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref51">Hausdorff et al. (2004)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0004.tif"/></td>
<td align="left" valign="top">3D motion capture system (inertial motion sensors)</td>
<td align="left" valign="top">Diverse gait-based biomarkers</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref11">Belvederi Murri et al. (2020)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0005.tif"/></td>
<td align="left" valign="top">Wearable motion sensors</td>
<td align="left" valign="top">Gait parameters</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref5">Angelini et al. (2020)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0006.tif"/>Image adapted from <ext-link xlink:href="https://commons.wikimedia.org/wiki/File:AMTI_OPT464508_force_plate.png" ext-link-type="uri">https://commons.wikimedia.org/wiki/File:AMTI_OPT464508_force_plate.png</ext-link></td>
<td align="left" valign="top">Force platform</td>
<td align="left" valign="top">Balance</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref11">Belvederi Murri et al. (2020)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0007.tif"/>
<break/>
</td>
<td align="left" valign="top">Point-light walker stimuli</td>
<td align="left" valign="top">Recognition of biological motion, even when it expresses emotional movements. Preference for detecting social agents facing us</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref32">Edey et al. (2017)</xref>; <xref ref-type="bibr" rid="ref49">Han et al. (2021)</xref>; <xref ref-type="bibr" rid="ref76">Miller and Saygin (2013)</xref></td>
</tr>
<tr>
<td align="left" valign="top" colspan="4"><italic>Measures of observed social interactions</italic></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0008.tif"/>Image adapted from <ext-link xlink:href="https://commons.wikimedia.org/wiki/File:VRHeadset.png" ext-link-type="uri">https://commons.wikimedia.org/wiki/File:VRHeadset.png</ext-link></td>
<td align="left" valign="top">Virtual reality</td>
<td align="left" valign="top">Observing or engaging in virtual social interaction. Manipulation of body schema</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref60">Kroczek et al. (2020)</xref>; <xref ref-type="bibr" rid="ref127">Yee et al. (2009)</xref>; <xref ref-type="bibr" rid="ref130">Zhou et al. (2019)</xref></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0009.tif"/>Image adapted from <ext-link xlink:href="https://pxhere.com/en/photo/1327220" ext-link-type="uri">https://pxhere.com/en/photo/1327220</ext-link></td>
<td align="left" valign="top">Inversion effect</td>
<td align="left" valign="top">Configurational representation of the image in its typical display</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref85">Papeo and Abassi (2019)</xref>; <xref ref-type="bibr" rid="ref112">Stekelenburg and de Gelder (2004)</xref></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0010.tif"/>
<break/>
</td>
<td align="left" valign="top">Object-inferiority effect</td>
<td align="left" valign="top">Faster detection of an object when represented outside a configurational figure that typically contains it</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref84">Papeo (2020)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0011.tif"/>
<break/>
</td>
<td align="left" valign="top">Videos of synchronous movement</td>
<td align="left" valign="top">Observation of social coordination as measure of social relationship</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref68">Macpherson et al. (2020)</xref></td>
</tr>
<tr>
<td align="left" valign="top" colspan="4"><italic>Measures of the observer</italic></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0012.tif"/></td>
<td align="left" valign="top">Eye-tracker (eye movements and pupillometry)</td>
<td align="left" valign="top">Eye position, eye movements, pupil size. Saccades, fixation duration</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref59">Kret et al. (2013)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0013.tif"/>Image adapted from <ext-link xlink:href="https://www.flickr.com/photos/sparkfun/8677911665" ext-link-type="uri">https://www.flickr.com/photos/sparkfun/8677911665</ext-link> License <ext-link xlink:href="https://creativecommons.org/licenses/by/2.0/" ext-link-type="uri">CC-BY 2.0</ext-link></td>
<td align="left" valign="top">Electromyography</td>
<td align="left" valign="top">Muscle activity, micro-movements</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref59">Kret et al. (2013)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0014.tif"/><break/>Ebbinghaus illusionImage adapted from <ext-link xlink:href="https://www.google.com/search?q=ebbinghaus%20illusion&#x0026;tbm=isch&#x0026;tbs=il:cl&#x0026;hl=en&#x0026;sa=X&#x0026;ved=0CAAQ1vwEahcKEwiYmtWH-M78AhUAAAAAHQAAAAAQAw&#x0026;biw=1548&#x0026;bih=937#imgrc=SCJW-hJd0NjdXM" ext-link-type="uri">https://www.google.com/search?q=ebbinghaus%20illusion&#x0026;tbm=isch&#x0026;tbs=il:cl&#x0026;hl=en&#x0026;sa=X&#x0026;ved=0CAAQ1vwEahcKEwiYmtWH-M78AhUAAAAAHQAAAAAQAw&#x0026;biw=1548&#x0026;bih=937#imgrc=SCJW-hJd0NjdXM</ext-link></td>
<td align="left" valign="top">Visual illusions</td>
<td align="left" valign="top">Magnitude of the illusion</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref22">Chouinard et al. (2018)</xref>; <xref ref-type="bibr" rid="ref58">King et al. (2017)</xref></td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0015.tif"/>Image adapted from <ext-link xlink:href="https://commons.wikimedia.org/wiki/File:Binocular_rivalry_Experiment.png" ext-link-type="uri">https://commons.wikimedia.org/wiki/File:Binocular_rivalry_Experiment.png</ext-link>License <ext-link xlink:href="https://creativecommons.org/licenses/by/2.0/" ext-link-type="uri">CC-BY 4.0</ext-link></td>
<td align="left" valign="top">Binocular rivalry</td>
<td align="left" valign="top">Perceptual dominance of one of two stimuli competing to reach visual awareness</td>
<td align="left" valign="top">
<xref ref-type="bibr" rid="ref4">Anderson et al. (2011)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0016.tif"/>Image adapted from <ext-link xlink:href="https://picryl.com/media/goggles-8b0197" ext-link-type="uri">https://picryl.com/media/goggles-8b0197</ext-link></td>
<td align="left" valign="top">Skewed goggles</td>
<td align="left" valign="top">Effects of magnification of objects</td>
<td align="left" valign="top"><xref ref-type="bibr" rid="ref67">Linkenauger et al. (2010)</xref>
</td>
</tr>
<tr>
<td align="center" valign="top"><inline-graphic xlink:href="fpsyg-13-1028150-igr0017.tif"/><break/>Image adapted with permission from <xref ref-type="bibr" rid="ref122">van der Hoort et al. (2011)</xref> License <ext-link xlink:href="https://creativecommons.org/licenses/by/2.0/" ext-link-type="uri">CC-BY 4.0</ext-link></td>
<td align="left" valign="top">Body-swap illusion</td>
<td align="left" valign="top">Influence of our body size on the perception of our physical and social environment</td>
<td align="left" valign="top">
<xref ref-type="bibr" rid="ref122">van der Hoort et al. (2011)</xref>
</td>
</tr>
</tbody>
</table>
</table-wrap>
<sec id="sec12">
<label>4.1.</label>
<title>Measures of posture, movement, and gait</title>
<p>The social cues we extract from other people&#x2019;s bodies are linked to their posture, movement, and gait. Our emotions find expression not only by means of the facial muscles, but also in the way we position our limbs, shoulders, and spine. For example, the curvature of the shoulders reflects behaviors of either closure or openness to the world, either an avoidance or an approach attitude. Kinematics is another source of relevant information and can be decomposed into different indexes: balance, movement, and gait. Although the reliability of these body measures in predicting affective states is supported by an increasing number of studies, the tools and assessment methods to measure them remain limited or underdeveloped in empirical research. Here, we present a variety of instruments that can be used to quantify posture and movement.</p>
<sec id="sec13">
<label>4.1.1.</label>
<title>Posture and gait</title>
<p>One of the oldest and most immediate ways of assessing gait speed and its characteristics is by video-recording people walking and analyzing photograms of the strides. A pioneering study was conducted in the 1980s by <xref ref-type="bibr" rid="ref111">Sloman et al. (1982)</xref>, in which the authors assessed gait in adults with depression using this method. The analysis of mobility in this clinical population showed that depression is associated with specific motor symptoms, such as slower movements and worse balance compared to healthy controls (<xref ref-type="bibr" rid="ref30">Doumas et al., 2012</xref>; <xref ref-type="bibr" rid="ref11">Belvederi Murri et al., 2020</xref>). The use of <bold>electronic walkways</bold> can provide a more accurate measure of stride length and walking speed. <xref ref-type="bibr" rid="ref63">Lemke et al. (2000)</xref> used a combination of photogrammetry and an electronic walkway and confirmed the results found by Sloman, showing a reduced stride length in depressed patients. Another study on depression (<xref ref-type="bibr" rid="ref51">Hausdorff et al., 2004</xref>) adopted <bold>pressure-sensitive shoe insoles</bold> to check for variability in swing time, which was higher in the clinical population. To obtain indexes of posture along with walking characteristics, <bold>3D motion capture systems</bold> can give more detailed information about the head (e.g., position and movements), upper limb swing, and back curvature. For example, studies on depressed patients have shown a correlation between the severity of depression and thoracic curvature, supporting the idea that a slumped position can be associated with sadness and introversion (<xref ref-type="bibr" rid="ref11">Belvederi Murri et al., 2020</xref>). A 3D motion capture system was applied by <xref ref-type="bibr" rid="ref5">Angelini et al. (2020)</xref> to patients with multiple sclerosis (MS). 
They identified diverse gait-based biomarkers using inertial motion sensors with the goal of improving the assessment of progressive MS. The authors examined 15 gait measures and reported longer steps and stride duration, reduced regularity, and higher instability in the walk of people with MS compared to healthy controls. The use of <bold>wearable sensors</bold> for recording kinematics enables the collection of data outside the lab, the identification of a variety of gait parameters, and the detection of biomarkers specific to different clinical conditions.</p>
</sec>
<sec id="sec14">
<label>4.1.2.</label>
<title>Balance</title>
<p>Balance can be assessed by means of force platforms (also known as stepping platforms); in some cases, a balance exercise performed during the execution of a working memory task can reveal how cognitive load reduces balance skills (<xref ref-type="bibr" rid="ref30">Doumas et al., 2012</xref>; <xref ref-type="bibr" rid="ref11">Belvederi Murri et al., 2020</xref>). This dual-task approach can also be implemented during other movement assessments, as it is effective in detecting how cognitive load influences motor skills in clinical populations.</p>
</sec>
<sec id="sec15">
<label>4.1.3.</label>
<title>Observing moving bodies: Point-light walker stimuli</title>
<p>Gait and balance measurements can also be exploited to study our social vision: <xref ref-type="bibr" rid="ref32">Edey et al. (2017)</xref> measured the walking speed of participants and tested whether participants used their own kinematics as a reference to judge the affective states of point-light walker (PLW) stimuli. These visual stimuli are animations composed solely of points that were previously attached to the joints of a moving person and extracted from the video recording of the scene. Following the idea that our own kinematics influences our perception of emotional movements in others, the authors manipulated the speed and posture of the artificial walker in order to elicit anger, happiness, or sadness. As expected, there was a modulatory effect of the participants&#x2019; own movements on emotion recognition: people judged emotions conveyed at speeds similar to their own walking pace as less intense. In other words, participants who walked with greater speed rated high-velocity emotions (e.g., anger) as less intense relative to low-velocity emotions (e.g., sadness). This finding is in agreement with the theories of embodied social cognition reviewed above (see subsection 3.2). Point-light stimuli in motion have also been used to investigate the detection of biological motion in relation to measures of social cognition. Using this approach, it has been shown that higher scores in social cognition tests are linked to higher accuracy in biological motion detection (<xref ref-type="bibr" rid="ref76">Miller and Saygin, 2013</xref>). Moreover, PLW stimuli have been used to measure our promptness in seeing social agents facing toward us rather than facing away, as a function of their perceived social relevance. The level of social relevance was manipulated by changing the distance, speed, and size of the PLW, based on the assumption that people perceived as nearer, faster, and bigger have more social relevance than those perceived as farther, slower, and smaller, 
and that the likelihood of initiating an interaction with them therefore increases. PLW stimuli are particularly suited to measuring differences in seeing people facing toward us or away thanks to their ambiguous in-depth orientation (<xref ref-type="bibr" rid="ref49">Han et al., 2021</xref>). Findings from these studies show once again that social factors have a clear impact on visual processes, further supporting our hypothesis of a deep link between low-level feature detection and high-level social cognition.</p>
</sec>
</sec>
<sec id="sec16">
<label>4.2.</label>
<title>Measures of observed social interactions</title>
<p>Besides interactions with a single person, our social life is largely constituted by crowded situations in which multiple agents engage in complex interactions with each other. A relatively new branch of study focuses on the perception of the relations between social entities and proposes that our interpretation of social events draws upon a configurational recognition process. Recent findings suggest a specific sensitivity of the visual system for the spatial relationship between multiple social agents, such as the interpersonal distance and angle of a facing dyad (<xref ref-type="bibr" rid="ref84">Papeo, 2020</xref>). One of the requirements for a successful social interaction is, indeed, a face-to-face position between the agents. This implies that seeing two people facing each other makes us assume an ongoing interaction (<xref ref-type="bibr" rid="ref130">Zhou et al., 2019</xref>). Based on the data from their experiments with virtual reality (VR), <xref ref-type="bibr" rid="ref130">Zhou et al. (2019)</xref> created a computational model of the social interaction field, which they describe as the area surrounding each of us within which we can start interacting with other people. Similarly to a gravitational field, the social interaction field can inform us about the strength of the social interaction between two people based on their physical distance and positions in space.</p>
<sec id="sec17">
<label>4.2.1.</label>
<title>Facing dyads</title>
<p>We believe, akin to other authors (<xref ref-type="bibr" rid="ref84">Papeo, 2020</xref>), that our visual system is tuned for the recognition of social interactions around us, allowing fast detection of social groups. The aggregation of multiple elements (individuals) into a unitary piece of visual information (a group) can facilitate and speed up the representation of the crowded scene we are observing (<xref ref-type="bibr" rid="ref130">Zhou et al., 2019</xref>). By means of the inversion effect described above, <xref ref-type="bibr" rid="ref85">Papeo and Abassi (2019)</xref> demonstrated that facing dyads are processed as unitary perceptual objects. They found that the inversion effect was greater for facing dyads compared to non-facing ones. To recall the definition of this phenomenon, a greater inversion effect implies greater visual sensitivity, in this case supporting the hypothesis of a visual attunement for interacting agents. It should be noted that the two-body inversion effect is not observed for non-human or human-object dyads. Another signature of this visual grouping is the object-inferiority effect observed in visual search through a crowd: the dyad as a whole is detected faster than the single objects within it, but only when the agents are facing each other (<xref ref-type="bibr" rid="ref86">Papeo et al., 2019</xref>).</p>
</sec>
<sec id="sec18">
<label>4.2.2.</label>
<title>Synchronicity as a measure of social relation</title>
<p>We also extract social information about an ongoing interaction from observing the agents&#x2019; movements. In this case, it is the level of interpersonal coordination that informs us about the cooperative or hostile tone of a social interaction. Higher synchronization of the dyad&#x2019;s movements signals coalition rather than opposition (<xref ref-type="bibr" rid="ref68">Macpherson et al., 2020</xref>).</p>
<p><xref ref-type="bibr" rid="ref75">Miles et al. (2009)</xref> used stick figures and footstep sounds to simulate two people walking together with different gait patterns, and found that out-of-phase walking rhythms were associated with judgments of a weaker relationship. A similar study showed that social factors, such as skin color, can influence the perception of <bold>synchronous movements</bold> (<xref ref-type="bibr" rid="ref68">Macpherson et al., 2020</xref>).</p>
</sec>
</sec>
<sec id="sec19">
<label>4.3.</label>
<title>Measures of the observer</title>
<p>After having presented numerous techniques suitable for measuring body position and movement that can be observed in one or multiple social agents, we want to offer an overview of methodologies that can be used to analyze the observer&#x2019;s behavior.</p>
<p>For a comprehensive approach in visual social cognition, it is important to appropriately combine measures that quantify the observed social cues with measures describing the observer&#x2019;s state, at a cognitive, visual and physiological level.</p>
<sec id="sec20">
<label>4.3.1.</label>
<title>Eye-trackers and other physiological indexes</title>
<p>Since we are concentrating on the visual aspects of social cognition, studying the eye and the way we visually scan a social scene is almost imperative. Although the most immediate way to study gaze behavior is by means of <bold>eye-trackers</bold>, the range of methodologies is not limited to this one. Other important indicators of social cognition - ideally to be combined with gaze measurements - are those linked to the automatic mimicry involved in the process of emotion recognition. In this case, the focus is on the muscles and posture of the observer, and the mirroring reflexes can be recorded through the application of <bold>sensitive electrodes on expression muscles</bold>.</p>
<p><xref ref-type="bibr" rid="ref59">Kret et al. (2013)</xref> collected eye movements, pupil size, and facial muscle activity while participants performed an emotion discrimination task with full-body and face stimuli. They used full-body images from the BEAST (Bodily Expressive Action Stimulus Test, <xref ref-type="bibr" rid="ref26">de Gelder and Van den Stock, 2011</xref>) database. The database includes stimuli representing emotions in whole-body figures, and in this experiment they were presented in association with congruent or incongruent facial expressions. Eye movements were recorded with a wearable eye-tracking device, and their analysis revealed that participants looked longer at faces than at bodies, both for happy and for angry/fearful expressions, with longer fixation durations for negative body expressions compared to positive ones. Furthermore, negative emotion expressions correlated with activity in the observers&#x2019; corrugator, and happy expressions with activation of the zygomaticus. The corrugator was more responsive to bodies than to faces, whilst the reversed pattern was observed for the zygomaticus. These findings nicely dovetail with theories of embodied social cognition, supporting the preconscious activation of the expressive muscles matching the observed emotion. Other implicit indexes of emotional response are detectable by physiological measures, such as skin conductance and heart rate variability (for reviews on emotion measures see: <xref ref-type="bibr" rid="ref71">Mauss and Robinson, 2009</xref>; <xref ref-type="bibr" rid="ref33">Egger et al., 2019</xref>).</p>
</sec>
<sec id="sec21">
<label>4.3.2.</label>
<title>Measures of visual perception</title>
<p>Visual illusions can provide useful insights into the interplay between high- and low-order processes in the perception of an image. We will describe the application of this kind of tool in more detail in the section dedicated to clinical studies (section 5). Visual awareness can be assessed with dichoptic stimulation (i.e., the simultaneous presentation of different stimuli to the two eyes), which provokes the phenomenon of binocular rivalry, i.e., the alternation in the perception of the two different images presented to each eye.</p>
<p>In an experiment by <xref ref-type="bibr" rid="ref4">Anderson et al. (2011)</xref>, a binocular rivalry paradigm was used to examine the influence of a perceiver&#x2019;s affective state on visual awareness of the stimulus presented. The potential of this technique lies in the fact that the two visual inputs presented to the two eyes compete for perceptual dominance; the selection criteria are driven by top-down processes, and this allows us to determine how our internal state influences visual awareness. The authors first manipulated the participants&#x2019; affective states by showing them emotional images, and subsequently asked them to perform the binocular rivalry tasks. These consisted of a neutral stimulus (e.g., a house) presented in competition with a socially relevant stimulus (facial expressions), and participants had to report what they were seeing and for how long. Results confirmed the authors&#x2019; hypothesis that the affective state of the viewer biases the contents of visual awareness. In fact, the social stimuli always dominated perception, and this effect was maximized when participants were asked to watch a set of stimuli inducing unpleasant emotions. These findings show how binocular rivalry can be used to explore the process of sensory selection behind our conscious experience of the world, and support the role of top-down modulation in our visual perception.</p>
</sec>
<sec id="sec22">
<label>4.3.3.</label>
<title>Body schema manipulations and their effects on personality</title>
<p>We would also like to dedicate a short section to some methodologies used to investigate the influence of our body size and appearance on our perception of the physical and social environments. In an attempt to tackle this issue, previous research has relied upon illusions, generated either by magnifying or minimizing the objects in the visual field, or by inducing the sensation of having a shrunken or gigantified body. In one study, <xref ref-type="bibr" rid="ref67">Linkenauger et al. (2010)</xref> observed the rescaling effects induced by placing one&#x2019;s own hand close to objects whose apparent size was distorted by means of goggles. The presence of a personal body part canceled out the magnifying or minimizing effect of the illusion. In another study, <xref ref-type="bibr" rid="ref122">van der Hoort et al. (2011)</xref> generated a deeper manipulation of the body schema, referred to as the body-swap illusion, by touching a part of the participants&#x2019; body while showing them a video of the same tactile stimulation being performed on a mannequin of different dimensions. Although the retinal images remained identical, perceiving a different body size changed the estimates of size and distance of objects present in the scene (<xref ref-type="bibr" rid="ref122">van der Hoort et al., 2011</xref>).</p>
<p>Lastly, virtual avatars can alter our self-representation. <xref ref-type="bibr" rid="ref127">Yee et al. (2009)</xref> had people interact in virtual environments with avatars of different dimensions, and found that taller and more attractive avatars outperformed shorter ones in the online game &#x201C;World of Warcraft.&#x201D; The authors attributed the better performance to the increased self-esteem and confidence linked to the height and attractiveness of the characters, showing that an avatar&#x2019;s appearance can influence a user&#x2019;s behavior in an online environment. The effect also transferred outside the virtual environment: in a second experiment, a VR session in which participants had either a tall or a short avatar was followed by a face-to-face interaction during which a negotiation task was performed. It turned out that people who had embodied a tall avatar were more likely to act unfairly to gain more profit, and less prone to accept transactions against their interests, than participants with short avatars. These studies reveal the potential of VR not only as an observational technique but also as a promising intervention tool in clinical settings.</p>
</sec>
</sec>
<sec id="sec23">
<label>4.4.</label>
<title>Our eyes at the service of emotion recognition and social communication</title>
<p>The inextricable connection between vision and social cognition has biological plausibility in the human eye itself (<xref ref-type="bibr" rid="ref104">Schutt et al., 2015</xref>). Our eyes seem to have evolved to serve the fine and complex phenomenon of human communication and the development of social skills through our gaze-following abilities. These abilities are favored by certain characteristics specific to our species: the white sclera of our eyes and the high contrast between eye and facial skin coloration. Indeed, only in humans are the outline of the eyes and the position of the iris so clearly visible, conveying information on where others are looking (<xref ref-type="bibr" rid="ref92">Proffitt, 2006</xref>; <xref ref-type="bibr" rid="ref118">Tomasello et al., 2007</xref>). In a study by <xref ref-type="bibr" rid="ref118">Tomasello et al. (2007)</xref>, gaze-following behavior was studied in both great apes and human infants. This behavior is based on cues coming from head orientation or from eye direction. In this study, an experimenter sat in front of the ape or the child to be tested and looked up at the ceiling in different ways: with the eyes only, while keeping the head in a frontal position; by bending the neck backward and facing the ceiling with the eyes closed; or with face and eyes both looking up. Results showed a preference in infants for eye-direction cues independently of head orientation, while great apes relied mostly on head-direction cues, suggesting that humans are more attuned to the eyes than our closest primate relatives (the great apes) are (<xref ref-type="bibr" rid="ref118">Tomasello et al., 2007</xref>). The unique features of the human eye probably represent the key to the mechanisms of shared attention that are at the basis of the human propensity for cooperation and coordination (<xref ref-type="bibr" rid="ref110">Shepherd, 2010</xref>).</p>
<p>The dominance of the visual system in human communication is also evident in the automatic and fast identification of faces and bodies even in the most complex scenarios. As anticipated in subsection 3.3, we are able to detect conspecifics and evaluate the spatial relations between them in a very rapid and preconscious way. The preconscious nature of emotion processing guided by our vision has been investigated in different studies in which patients with lesions to the primary visual cortex could still perform tasks of emotion discrimination without visual awareness of the stimulus (<xref ref-type="bibr" rid="ref77">Morris et al., 2001</xref>; <xref ref-type="bibr" rid="ref90">Pegna et al., 2005</xref>; <xref ref-type="bibr" rid="ref116">Tamietto and de Gelder, 2008</xref>; <xref ref-type="bibr" rid="ref25">de Gelder, 2009</xref>). For instance, <xref ref-type="bibr" rid="ref90">Pegna et al. (2005)</xref> collected data from one patient who became cortically blind as a consequence of two strokes that destroyed his visual cortices bilaterally. Different visual discrimination tasks revealed a residual capacity to discriminate emotional social stimuli (expressive faces), whilst no sensitivity was observed for other kinds of stimuli (e.g., neutral faces, animals). Similar results were obtained by <xref ref-type="bibr" rid="ref77">Morris et al. (2001)</xref> in a patient with right hemianopia due to left occipital lobe damage. Taken together, these findings suggest a strong connection between visual inputs and subcortical structures, aimed at providing an automatic discrimination of salient, emotional stimuli.</p>
<p>In summary, the human eye does not serve vision alone, but social communication and emotion processing as well (<xref ref-type="bibr" rid="ref92">Proffitt, 2006</xref>). Once again, the distinction between cognitive and sensory processes becomes even more blurred, reinforcing those models that postulate early influences of our sociality on our perceptual systems.</p>
</sec>
<sec id="sec24">
<label>4.5.</label>
<title>Expressive bodies, not just faces: Reintegrating the whole body in the study of visual social cognition</title>
<p>Just a decade ago, less than 5% of the experimental literature had included the whole body as a stimulus in its design (<xref ref-type="bibr" rid="ref25">de Gelder, 2009</xref>). Since then, the situation has seen little change: <xref ref-type="bibr" rid="ref123">Witkower et al. (2021)</xref> also argued for the need for further investigation into how bodies convey social information. In their study of a culturally isolated population in Nicaragua, they showed that members of this society effectively recognized bodily expressions of sadness, anger, and fear, providing evidence for the universality of these bodily displays. Indeed, a growing body of evidence indicates that body expressions are recognized automatically and effectively, in the same specialized manner that characterizes the innate predisposition to face perception (for a review see <xref ref-type="bibr" rid="ref95">Quinn and Macrae, 2011</xref>).</p>
<p>Behavioral and physiological data have shown that emotion recognition relies considerably on the observation of body expressions. For instance, <xref ref-type="bibr" rid="ref59">Kret et al. (2013)</xref> presented participants with <italic>ad-hoc</italic> images of emotional body postures associated with congruent or incongruent facial expressions (fear or happiness). Results revealed that recognition of the emotion expressed by the face was influenced by the emotion expressed by the body. Response times increased with incongruent stimuli, while they decreased when face and body expressed the same feeling. <xref ref-type="bibr" rid="ref112">Stekelenburg and de Gelder (2004)</xref> examined the electrophysiological correlates of the inversion effect, a well-known phenomenon in face perception whereby people take longer to recognize faces presented upside-down than any other object presented in the same fashion. The effect is explained by assuming a configural representation for the identification of faces, which speeds their detection when they appear in the expected upright position but slows it down when they are inverted. Using EEG, the authors showed that the same ERP component, namely the N170, was evoked both by faces and by bodies presented upside-down, but not by pictures of inverted objects (e.g., shoes), suggesting a configural coding of body images similar to the one underlying face perception. Studies that investigated functional connectivity between brain areas active during the recognition of bodily expressions (<xref ref-type="bibr" rid="ref89">Peelen and Downing, 2005</xref>; <xref ref-type="bibr" rid="ref121">van de Riet et al., 2009</xref>) showed activation of the same areas that typically respond to face stimuli. These areas appeared to be only a part of the broader network involved in processing body stimuli, which also includes the superior temporal sulcus, the middle temporal/middle occipital gyrus, the superior occipital gyrus and the parieto-occipital sulcus.</p>
<p>These dedicated mechanisms behind body perception underscore the importance of recognizing bodily expressions in our everyday life.</p>
</sec>
</sec>
<sec id="sec25">
<label>5.</label>
<title>Insights from clinical studies</title>
<p>Finally, lesion and clinical studies are valuable for examining causal relationships between social and perceptual processes. Neuromuscular diseases, for instance, can provide insights into the relation between emotion recognition in others and impaired sensorimotor skills. One such example is the study by <xref ref-type="bibr" rid="ref64">Lenzoni et al. (2020)</xref> on myotonic dystrophy described in section 3.2. Along with motor impairments, clinical categories in which social deficits represent the major symptomatology can be studied to investigate the connection between social cognition and perceptual anomalies. Autism spectrum disorder and schizophrenia offer the unique opportunity to examine possible links between deficits in social abilities and altered visual perception, both of which are typically observed in these disorders (<xref ref-type="bibr" rid="ref19">Butler et al., 2008</xref>; <xref ref-type="bibr" rid="ref58">King et al., 2017</xref>; <xref ref-type="bibr" rid="ref98">Robertson and Baron-Cohen, 2017</xref>; <xref ref-type="bibr" rid="ref23">Chung and Son, 2020</xref>).</p>
<p>In a review by <xref ref-type="bibr" rid="ref58">King et al. (2017)</xref>, perceptual abnormalities in schizophrenia were reviewed through the analysis of studies on visual illusions and their effects in this clinical population. Perceptual illusions are widely used in vision studies because they allow researchers to disentangle the mechanisms underlying visual processing. For example, the Ebbinghaus illusion (in which a target item looks smaller or bigger through the effect of contextual cues) can be modulated by the effects of prior knowledge and culture on visual perception, which means that it is a distortion linked to top-down processing of visual inputs. From the literature reviewed in <xref ref-type="bibr" rid="ref58">King et al. (2017)</xref>, it emerged that it is this kind of high-level integration that seems to be systematically altered in people with schizophrenia. In fact, they tend to show a reduced susceptibility to high-level illusions, suggesting that abnormalities in visual perception might depend on deficits in the cognitive/perceptual communication at the basis of perceptual awareness. As the same authors suggest, it would be helpful to study these processes not only in isolation but also by applying converging techniques to investigate the reciprocal links between higher and lower processes with ecologically valid designs. In this way, it might be possible to explore in more depth the connections between inferential, top-down aspects of visual perception and the ability to recognize social cues from observing other people.</p>
<p>In a similar fashion, different perceptual styles in autism were examined in a study by <xref ref-type="bibr" rid="ref22">Chouinard et al. (2018)</xref>, in which sensory integration was again investigated by means of visual illusions. In this case, the Shepard illusion was tested in autistic and typically developing individuals while their eye movements were recorded. Contrary to the authors&#x2019; expectations, no difference was found in saccades and scene exploration between the two groups, although the clinical population experienced a weaker illusion than the healthy controls. These results can be explained by differences in high-level visual integration rather than by anomalies in earlier stages of perception (e.g., spatial exploration, saccade velocity and frequency). As for schizophrenia, the empirical data suggest that top-down inferences might be reduced in people with autism, leading to a more veridical perception of the world as it really is, which in turn leads to a diminished sensitivity to visual illusions. Once more, further research is needed to establish the relationship between these perceptual anomalies and deficits in higher-order social cognition.</p>
</sec>
<sec id="sec26">
<label>6.</label>
<title>Conclusion and future directions of research</title>
<p>Our review strives to encourage the application of multimodal integrative approaches in cognitive, social and affective neuroscience, and to inspire further research aimed at uncovering the intertwined connection between social cognition and visual perception.</p>
<p>We highlighted the tight relationship between visual perception and social cognition. Specifically, we aimed at unveiling the role of the body as the starting point for the construction of our perception, also when it comes to social perception. Firstly, we compared research modalities that can be adopted in the exploration of these constructs, and remarked on the necessity of moving from isolationist unimodal approaches to integrative multimodal perspectives. Subsequently, we presented abundant evidence for the rooting of social cognition in bodily expressions, as defined by theories of embodied social cognition. Finally, we described a common mechanism at the basis of specific aspects of social cognition and spatial behavior: an overlapping neural network for the perception of both physical and social distances. It appears that we recruit networks dedicated to the processing of physical distances to map our social environment, strengthening the case for the dependence of higher-order social abilities on lower representational systems closer to perceptual networks.</p>
<p>This visual-social interface is at play also in processes of emotion recognition, as we rely on visual cues collected through the scrutiny of the other&#x2019;s body. Again, the body acts as a middle ground where our vision and our social abilities can meet.</p>
<p>We also described some of the body features we observe to assess the other&#x2019;s state, and reviewed instruments and techniques useful for quantifying these indexes in research applications. We described different ways to measure posture, balance, and gait as meaningful indicators of emotional states. The interaction among multiple agents was also covered in the methodological section, with examples of studies that examined interpersonal distance and synchronization as cues to the quality of a relationship. Finally, we described different ways to analyze the observer&#x2019;s body, such as the detection of micro-movements underlying a first stage of emotional contagion, or the visual exploration of the stimuli by means of eye-tracking devices. The body is the key to any level of reciprocal understanding, and combining the study of bodily expressions with the analysis of visual behavior can be beneficial for the development of a detailed model of social cognition.</p>
<sec id="sec27">
<label>6.1.</label>
<title>Limitations</title>
<p>A major limitation of this review is that we were unable to cover the whole literature on the topics discussed here extensively. Due to space constraints, we sometimes failed to balance sources supporting a particular view against those opposing it. In this paragraph, we would like to at least introduce the reader to the ongoing debate on what we believe to be one of the core themes of the review: the embodied account of cognition.</p>
<p>Probably the main criticism of embodiment theories is that they disregard any mental construal of the perceived events. In an interesting paper by <xref ref-type="bibr" rid="ref16">Borg (2018)</xref>, the main argument against these theories rests on the absence of mental state attribution in action understanding, as we predict or explain the behavior of others by adopting what she refers to as &#x2018;smart behavior reading&#x2019;.</p>
<p>According to this model, action understanding depends directly on non-mentalistic interactive embodied practices (e.g., sensitivity to physical context and bodily motions) rather than on our ability to understand the minds of those we interact with. As such, the smart behavior reading account does not take into consideration the individuality of the observed person and all the information we might have about their personality or life circumstances, which would enable us to predict completely different outcomes of their actions. Hence, the slow, controlled and demanding characteristics required by a mentalistic interpretation of other people&#x2019;s behavior are &#x2018;deflated&#x2019; by the embodied accounts of social cognition in favor of fast, effortless and automatic behavior reading. Nonetheless, we also believe that mental state attribution cannot be completely disregarded by models of action understanding without generating gaps or errors in interpretation, and that only an integrative approach combining both smart behavior tracking and mental state attribution would enable successful action prediction.</p>
<p>Another critique of the embodied accounts comes from studies by <xref ref-type="bibr" rid="ref70">Mahon and Caramazza (2008)</xref> and <xref ref-type="bibr" rid="ref20">Caramazza et al. (2014)</xref>. Specifically, these authors argue against explaining previous electrophysiological research solely in light of an embodied view of action understanding processes. They claim that the empirical evidence provided by neuroimaging research can equally be used in support of disembodied views of conceptual representations, or at least does not necessarily rule them out. Motor and sensory activation during action representation can be seen as part of a cascade process that propagates through qualitatively different levels of processing. Nevertheless, Caramazza et al. also acknowledge the authenticity of sensory-motor activation during action observation or evocation, and propose instead a middle-ground theory that combines the abstract and symbolic levels of some concepts with the more embodied instantiation of online conceptual processing (<xref ref-type="bibr" rid="ref70">Mahon and Caramazza, 2008</xref>; <xref ref-type="bibr" rid="ref20">Caramazza et al., 2014</xref>).</p>
</sec>
<sec id="sec28">
<label>6.2.</label>
<title>Future directions</title>
<p>This review is neither the first nor the only one to point out the importance of the body in shaping cognitive processes (see for example, <xref ref-type="bibr" rid="ref50">Harris et al., 2015</xref>). Nonetheless, it strives to have both theoretical and practical implications. The summary of different methodologies and instruments measuring body indexes and psychological responses can be a useful source of information for researchers interested in conducting studies using the paradigms described here.</p>
<p>For instance, physiological measures may be included in studies of visual perception to explore whether the body&#x2019;s response to external visual stimuli precedes their conscious appraisal or vice versa, deepening our understanding of implicit and explicit processing of emotional stimuli. Moreover, recentering clinical investigations on body indices could be particularly relevant in those syndromes characterized by deficits in both social cognition and visual perception, such as autism or schizophrenia. Critically, in these syndromes verbal communication can be severely impaired or hardly accessible. Future studies may investigate the use of the body in patterns of interaction with the external world (e.g., postures, gait) as a way to access these clinical conditions. For example, the numerous studies on synchronization as an indirect measure of relational quality could inspire group or couple exercises aimed at eliciting cooperative behavior.</p>
<p>A clearer understanding of the interplay between visual perception and social cognition might also help the development of novel clinical treatments or cognitive trainings that take perceptual alterations into account in order to improve social abilities. For instance, differences in visual processing in individuals with autism (e.g., less semantic-oriented, more detail-oriented) have been identified in the literature and linked to altered abilities in social cognition, such as emotion recognition. A novel training approach might aim at reinforcing global visual processing, which in turn might lead to an improvement in emotion recognition in this population.</p>
<p>Again, systematic investigations of physiological measures linked to the observation of others&#x2019; bodies may represent a turning point in the study of empathy and the reciprocal understanding of others&#x2019; inner states. These investigations may represent an innovative way to assess empathy implicitly whilst overcoming issues associated with self-reported measures, both in healthy participants and in patients.</p>
<p>Finally, we hope that this review will further strengthen multi-level and multi-approach explanations of cognitive processes that attempt to promote the integration of embodiment theories with more traditional cognitivist approaches. We believe that these trends are already present in the literature, and that they will lead to a comprehensive framework and an integrated perspective on perception and cognition in the years to come.</p>
<p>In summary, we highlighted how the visual perception of our social and physical environments is mediated by bodily processes and expressions. The body is behind any possible interaction with our surroundings and our conspecifics. Observing other people&#x2019;s bodies informs us on their psychological state and allows emotional sharing and understanding. Our body determines the way we look at the world surrounding us.</p>
</sec>
</sec>
<sec id="sec50">
<title>Ethics statement</title>
<p>Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.</p>
</sec>
<sec id="sec29">
<title>Author contributions</title>
<p>CD, IS, and FM contributed equally to the literature search, literature discussion, and writing of the manuscript. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="sec100" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ack>
<p>The authors thank the Department of Psychology and Cognitive Science (DiPSCo) of the University of Trento for the financial support provided for the submission of this paper.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abraham</surname> <given-names>A.</given-names></name> <name><surname>Werning</surname> <given-names>M.</given-names></name> <name><surname>Rakoczy</surname> <given-names>H.</given-names></name> <name><surname>von Cramon</surname> <given-names>D. Y.</given-names></name> <name><surname>Schubotz</surname> <given-names>R. I.</given-names></name></person-group> (<year>2008</year>). <article-title>Minds, persons, and space: an fMRI investigation into the relational complexity of higher-order intentionality</article-title>. <source>Conscious. Cogn.</source> <volume>17</volume>, <fpage>438</fpage>&#x2013;<lpage>450</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.concog.2008.03.011</pub-id>, PMID: <pub-id pub-id-type="pmid">18406173</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ainley</surname> <given-names>V.</given-names></name> <name><surname>Apps</surname> <given-names>M. A. J.</given-names></name> <name><surname>Fotopoulou</surname> <given-names>A.</given-names></name> <name><surname>Tsakiris</surname> <given-names>M.</given-names></name></person-group> (<year>2016</year>). <article-title>Bodily precision: a predictive coding account of individual differences in interoceptive accuracy</article-title>. <source>Philos. Trans. R. Soc. B Biol. Sci.</source> <volume>371</volume>:<fpage>20160003</fpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.2016.0003</pub-id>, PMID: <pub-id pub-id-type="pmid">28080962</pub-id></citation></ref>
<ref id="ref4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anderson</surname> <given-names>E.</given-names></name> <name><surname>Siegel</surname> <given-names>E. H.</given-names></name> <name><surname>Barrett</surname> <given-names>L. F.</given-names></name></person-group> (<year>2011</year>). <article-title>What you feel influences what you see: the role of affective feelings in resolving binocular rivalry</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>47</volume>, <fpage>856</fpage>&#x2013;<lpage>860</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jesp.2011.02.009</pub-id>, PMID: <pub-id pub-id-type="pmid">21789027</pub-id></citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Angelini</surname> <given-names>L.</given-names></name> <name><surname>Hodgkinson</surname> <given-names>W.</given-names></name> <name><surname>Smith</surname> <given-names>C.</given-names></name> <name><surname>dd</surname> <given-names>J. M.</given-names></name> <name><surname>Sharrack</surname> <given-names>B.</given-names></name> <name><surname>Mazz&#x00E0;</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Wearable sensors can reliably quantify gait alterations associated with disability in people with progressive multiple sclerosis in a clinical setting</article-title>. <source>J. Neurol.</source> <volume>267</volume>, <fpage>2897</fpage>&#x2013;<lpage>2909</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00415-020-09928-8</pub-id>, PMID: <pub-id pub-id-type="pmid">32468119</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Balcetis</surname> <given-names>E.</given-names></name> <name><surname>Dunning</surname> <given-names>D.</given-names></name></person-group> (<year>2006</year>). <article-title>See what you want to see: motivational influences on visual perception</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>91</volume>, <fpage>612</fpage>&#x2013;<lpage>625</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.91.4.612</pub-id>, PMID: <pub-id pub-id-type="pmid">17014288</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Balcetis</surname> <given-names>E.</given-names></name> <name><surname>Dunning</surname> <given-names>D.</given-names></name></person-group> (<year>2009</year>). <article-title>Wishful seeing: more desired objects are seen as closer</article-title>. <source>Psychol. Sci.</source> <volume>21</volume>, <fpage>147</fpage>&#x2013;<lpage>152</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0956797609356283</pub-id>, PMID: <pub-id pub-id-type="pmid">20424036</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bar-Anan</surname> <given-names>Y.</given-names></name> <name><surname>Liberman</surname> <given-names>N.</given-names></name> <name><surname>Trope</surname> <given-names>Y.</given-names></name> <name><surname>Algom</surname> <given-names>D.</given-names></name></person-group> (<year>2007</year>). <article-title>Automatic processing of psychological distance: evidence from a Stroop task</article-title>. <source>J. Exp. Psychol. Gen.</source> <volume>136</volume>, <fpage>610</fpage>&#x2013;<lpage>622</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0096-3445.136.4.610</pub-id>, PMID: <pub-id pub-id-type="pmid">17999574</pub-id></citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barrett</surname> <given-names>L. F.</given-names></name> <name><surname>Satpute</surname> <given-names>A. B.</given-names></name></person-group> (<year>2019</year>). <article-title>Historical pitfalls and new directions in the neuroscience of emotion</article-title>. <source>Neurosci. Lett.</source> <volume>693</volume>, <fpage>9</fpage>&#x2013;<lpage>18</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neulet.2017.07.045</pub-id>, PMID: <pub-id pub-id-type="pmid">28756189</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bastiaansen</surname> <given-names>J. A. C. J.</given-names></name> <name><surname>Thioux</surname> <given-names>M.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>Evidence for mirror systems in emotions</article-title>. <source>Philos. Trans. R. Soc. Lond. Ser. B Biol. Sci.</source> <volume>364</volume>, <fpage>2391</fpage>&#x2013;<lpage>2404</lpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.2009.0058</pub-id>, PMID: <pub-id pub-id-type="pmid">19620110</pub-id></citation></ref>
<ref id="ref200"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beckes</surname> <given-names>L.</given-names></name> <name><surname>Coan</surname> <given-names>J. A.</given-names></name></person-group> (<year>2011</year>). <article-title>Social Baseline Theory: The Role of Social Proximity in Emotion and Economy of Action</article-title>. <source>Soc. Personal. Psychol. Compass</source> <volume>5</volume>, <fpage>976</fpage>&#x2013;<lpage>988</lpage>.</citation></ref>
<ref id="ref11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Belvederi Murri</surname> <given-names>M.</given-names></name> <name><surname>Triolo</surname> <given-names>F.</given-names></name> <name><surname>Coni</surname> <given-names>A.</given-names></name> <name><surname>Tacconi</surname> <given-names>C.</given-names></name> <name><surname>Nerozzi</surname> <given-names>E.</given-names></name> <name><surname>Escelsior</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Instrumental assessment of balance and gait in depression: a systematic review</article-title>. <source>Psychiatry Res.</source> <volume>284</volume>:<fpage>112687</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.psychres.2019.112687</pub-id>, PMID: <pub-id pub-id-type="pmid">31740213</pub-id></citation></ref>
<ref id="ref12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berti</surname> <given-names>A.</given-names></name> <name><surname>Frassinetti</surname> <given-names>F.</given-names></name></person-group> (<year>2000</year>). <article-title>When far becomes near: remapping of space by tool use</article-title>. <source>J. Cogn. Neurosci.</source> <volume>12</volume>, <fpage>415</fpage>&#x2013;<lpage>420</lpage>. doi: <pub-id pub-id-type="doi">10.1162/089892900562237</pub-id>, PMID: <pub-id pub-id-type="pmid">10931768</pub-id></citation></ref>
<ref id="ref13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bhalla</surname> <given-names>M.</given-names></name> <name><surname>Proffitt</surname> <given-names>D. R.</given-names></name></person-group> (<year>1999</year>). <article-title>Visual&#x2013;motor recalibration in geographical slant perception</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>25</volume>, <fpage>1076</fpage>&#x2013;<lpage>1096</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0096-1523.25.4.1076</pub-id>, PMID: <pub-id pub-id-type="pmid">10464946</pub-id></citation></ref>
<ref id="ref14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blanke</surname> <given-names>O.</given-names></name> <name><surname>Slater</surname> <given-names>M.</given-names></name> <name><surname>Serino</surname> <given-names>A.</given-names></name></person-group> (<year>2015</year>). <article-title>Behavioral, neural, and computational principles of bodily self-consciousness</article-title>. <source>Neuron</source> <volume>88</volume>, <fpage>145</fpage>&#x2013;<lpage>166</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuron.2015.09.029</pub-id>, PMID: <pub-id pub-id-type="pmid">26447578</pub-id></citation></ref>
<ref id="ref15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bonini</surname> <given-names>L.</given-names></name> <name><surname>Rotunno</surname> <given-names>C.</given-names></name> <name><surname>Arcuri</surname> <given-names>E.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name></person-group> (<year>2022</year>). <article-title>Mirror neurons 30 years later: implications and applications</article-title>. <source>Trends Cogn. Sci.</source> <volume>26</volume>, <fpage>767</fpage>&#x2013;<lpage>781</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2022.06.003</pub-id></citation></ref>
<ref id="ref16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Borg</surname> <given-names>E.</given-names></name></person-group> (<year>2018</year>). <article-title>On deflationary accounts of human action understanding</article-title>. <source>Rev. Philos. Psychol.</source> <volume>9</volume>, <fpage>503</fpage>&#x2013;<lpage>522</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s13164-018-0386-3</pub-id>, PMID: <pub-id pub-id-type="pmid">30220943</pub-id></citation></ref>
<ref id="ref17"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Brockmole</surname> <given-names>J. R.</given-names></name> <name><surname>Davoli</surname> <given-names>C. C.</given-names></name> <name><surname>Abrams</surname> <given-names>R. A.</given-names></name> <name><surname>Witt</surname> <given-names>J. K.</given-names></name></person-group> (<year>2013</year>). The World Within Reach: Effects of Hand Posture and Tool Use on Visual Cognition. <source>Curr. Dir. Psychol. Sci.</source> <volume>22</volume>, <fpage>38</fpage>&#x2013;<lpage>44</lpage> doi: <pub-id pub-id-type="doi">10.1177/0963721412465065</pub-id>.</citation></ref>
<ref id="ref18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bruner</surname> <given-names>J. S.</given-names></name> <name><surname>Goodman</surname> <given-names>C. C.</given-names></name></person-group> (<year>1947</year>). <article-title>Value and need as organizing factors in perception</article-title>. <source>J. Abnorm. Soc. Psychol.</source> <volume>42</volume>, <fpage>33</fpage>&#x2013;<lpage>44</lpage>. doi: <pub-id pub-id-type="doi">10.1037/h0058484</pub-id></citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Butler</surname> <given-names>P. D.</given-names></name> <name><surname>Silverstein</surname> <given-names>S. M.</given-names></name> <name><surname>Dakin</surname> <given-names>S. C.</given-names></name></person-group> (<year>2008</year>). <article-title>Visual perception and its impairment in schizophrenia</article-title>. <source>Biol. Psychiatry</source> <volume>64</volume>, <fpage>40</fpage>&#x2013;<lpage>47</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.biopsych.2008.03.023</pub-id>, PMID: <pub-id pub-id-type="pmid">18549875</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caramazza</surname> <given-names>A.</given-names></name> <name><surname>Anzellotti</surname> <given-names>S.</given-names></name> <name><surname>Strnad</surname> <given-names>L.</given-names></name> <name><surname>Lingnau</surname> <given-names>A.</given-names></name></person-group> (<year>2014</year>). <article-title>Embodied cognition and Mirror neurons: a critical assessment</article-title>. <source>Annu. Rev. Neurosci.</source> <volume>37</volume>, <fpage>1</fpage>&#x2013;<lpage>15</lpage>. doi: <pub-id pub-id-type="doi">10.1146/annurev-neuro-071013-013950</pub-id>, PMID: <pub-id pub-id-type="pmid">25032490</pub-id></citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Changizi</surname> <given-names>M. A.</given-names></name> <name><surname>Hall</surname> <given-names>W. G.</given-names></name></person-group> (<year>2001</year>). <article-title>Thirst modulates a perception</article-title>. <source>Perception</source> <volume>30</volume>, <fpage>1489</fpage>&#x2013;<lpage>1497</lpage>. doi: <pub-id pub-id-type="doi">10.1068/p3266</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chouinard</surname> <given-names>P. A.</given-names></name> <name><surname>Royals</surname> <given-names>K. A.</given-names></name> <name><surname>Landry</surname> <given-names>O.</given-names></name> <name><surname>Sperandio</surname> <given-names>I.</given-names></name></person-group> (<year>2018</year>). <article-title>The Shepard illusion is reduced in children with an autism Spectrum disorder because of perceptual rather than attentional mechanisms</article-title>. <source>Front. Psychol.</source> <volume>9</volume>:<fpage>2452</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2018.02452</pub-id>, PMID: <pub-id pub-id-type="pmid">30568622</pub-id></citation></ref>
<ref id="ref23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chung</surname> <given-names>S.</given-names></name> <name><surname>Son</surname> <given-names>J.-W.</given-names></name></person-group> (<year>2020</year>). <article-title>Visual perception in autism Spectrum disorder: a review of neuroimaging studies</article-title>. <source>J. Korean Acad. Child Adolesc. Psychiatry</source> <volume>31</volume>, <fpage>105</fpage>&#x2013;<lpage>120</lpage>. doi: <pub-id pub-id-type="doi">10.5765/jkacap.200018</pub-id>, PMID: <pub-id pub-id-type="pmid">32665755</pub-id></citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cole</surname> <given-names>S.</given-names></name> <name><surname>Balcetis</surname> <given-names>E.</given-names></name> <name><surname>Dunning</surname> <given-names>D.</given-names></name></person-group> (<year>2013</year>). <article-title>Affective Signals of Threat Increase Perceived Proximity</article-title>. <source>Psychol. Sci.</source> <volume>24</volume>, <fpage>34</fpage>&#x2013;<lpage>40</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0956797612446953</pub-id>, PMID: <pub-id pub-id-type="pmid">25825706</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2009</year>). <article-title>Why bodies? Twelve reasons for including bodily expressions in affective neuroscience</article-title>. <source>Philos. Trans. R. Soc. B Biol. Sci.</source> <volume>364</volume>, <fpage>3475</fpage>&#x2013;<lpage>3484</lpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.2009.0190</pub-id>, PMID: <pub-id pub-id-type="pmid">19884142</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>De Gelder</surname> <given-names>B.</given-names></name> <name><surname>Van den Stock</surname> <given-names>J.</given-names></name></person-group> (<year>2011</year>). <article-title>The bodily expressive action stimulus test (BEAST). Construction and validation of a stimulus basis for measuring perception of whole body expression of emotions</article-title>. <source>Front. Psychol.</source> <volume>2</volume>:<fpage>181</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2011.00181</pub-id>, PMID: <pub-id pub-id-type="pmid">21886632</pub-id></citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Gelder</surname> <given-names>B.</given-names></name> <name><surname>Van den Stock</surname> <given-names>J.</given-names></name> <name><surname>Meeren</surname> <given-names>H. K. M.</given-names></name> <name><surname>Sinke</surname> <given-names>C. B. A.</given-names></name> <name><surname>Kret</surname> <given-names>M. E.</given-names></name> <name><surname>Tamietto</surname> <given-names>M.</given-names></name></person-group> (<year>2010</year>). <article-title>Standing up for the body: Recent progress in uncovering the networks involved in the perception of bodies and bodily expressions</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>34</volume>, <fpage>513</fpage>&#x2013;<lpage>527</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neubiorev.2009.10.008</pub-id>, PMID: <pub-id pub-id-type="pmid">19857515</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Pinedo Garc&#x00ED;a</surname> <given-names>M.</given-names></name></person-group> (<year>2020</year>). <article-title>Ecological psychology and Enactivism: a normative way out from ontological dilemmas</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>1637</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.01637</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Doerrfeld</surname> <given-names>A.</given-names></name> <name><surname>Sebanz</surname> <given-names>N.</given-names></name> <name><surname>Shi</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>Expecting to lift a box together makes the load look lighter</article-title>. <source>Psychol. Res.</source> <volume>76</volume>, <fpage>467</fpage>&#x2013;<lpage>475</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00426-011-0398-4</pub-id></citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Doumas</surname> <given-names>M.</given-names></name> <name><surname>Smolders</surname> <given-names>C.</given-names></name> <name><surname>Brunfaut</surname> <given-names>E.</given-names></name> <name><surname>Bouckaert</surname> <given-names>F.</given-names></name> <name><surname>Krampe</surname> <given-names>R. T.</given-names></name></person-group> (<year>2012</year>). <article-title>Dual task performance of working memory and postural control in major depressive disorder</article-title>. <source>Neuropsychology</source> <volume>26</volume>, <fpage>110</fpage>&#x2013;<lpage>118</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0026181</pub-id>, PMID: <pub-id pub-id-type="pmid">22059649</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Duclos</surname> <given-names>S. E.</given-names></name> <name><surname>Laird</surname> <given-names>J. D.</given-names></name> <name><surname>Schneider</surname> <given-names>E.</given-names></name> <name><surname>Sexter</surname> <given-names>M.</given-names></name> <name><surname>Stern</surname> <given-names>L.</given-names></name> <name><surname>Van Lighten</surname> <given-names>O.</given-names></name></person-group> (<year>1989</year>). <article-title>Emotion-specific effects of facial expressions and postures on emotional experience</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>57</volume>, <fpage>100</fpage>&#x2013;<lpage>108</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.57.1.100</pub-id></citation></ref>
<ref id="ref32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Edey</surname> <given-names>R.</given-names></name> <name><surname>Yon</surname> <given-names>D.</given-names></name> <name><surname>Cook</surname> <given-names>J.</given-names></name> <name><surname>Dumontheil</surname> <given-names>I.</given-names></name> <name><surname>Press</surname> <given-names>C.</given-names></name></person-group> (<year>2017</year>). <article-title>Our own action kinematics predict the perceived affective states of others</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>43</volume>, <fpage>1263</fpage>&#x2013;<lpage>1268</lpage>. doi: <pub-id pub-id-type="doi">10.1037/xhp0000423</pub-id>, PMID: <pub-id pub-id-type="pmid">28639823</pub-id></citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Egger</surname> <given-names>M.</given-names></name> <name><surname>Ley</surname> <given-names>M.</given-names></name> <name><surname>Hanke</surname> <given-names>S.</given-names></name></person-group> (<year>2019</year>). <article-title>Emotion recognition from physiological signal analysis: a review</article-title>. <source>Electron. Notes Theor. Comput. Sci.</source> <volume>343</volume>, <fpage>35</fpage>&#x2013;<lpage>55</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.entcs.2019.04.009</pub-id></citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fabbri-Destro</surname> <given-names>M.</given-names></name> <name><surname>Rizzolatti</surname> <given-names>G.</given-names></name></person-group> (<year>2008</year>). <article-title>Mirror neurons and Mirror Systems in Monkeys and Humans</article-title>. <source>Physiology</source> <volume>23</volume>, <fpage>171</fpage>&#x2013;<lpage>179</lpage>. doi: <pub-id pub-id-type="doi">10.1152/physiol.00004.2008</pub-id></citation></ref>
<ref id="ref35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fairhurst</surname> <given-names>M.</given-names></name> <name><surname>Fairhurst</surname> <given-names>K.</given-names></name> <name><surname>Berna</surname> <given-names>C.</given-names></name> <name><surname>Tracey</surname> <given-names>I.</given-names></name></person-group> (<year>2012</year>). <article-title>An fMRI study exploring the overlap and differences between neural representations of physical and recalled pain</article-title>. <source>PLoS One</source> <volume>7</volume>:<fpage>e48711</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0048711</pub-id>, PMID: <pub-id pub-id-type="pmid">23119093</pub-id></citation></ref>
<ref id="ref36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Firestone</surname> <given-names>C.</given-names></name> <name><surname>Scholl</surname> <given-names>B. J.</given-names></name></person-group> (<year>2016</year>). <article-title>Cognition does not affect perception: evaluating the evidence for &#x201C;top-down&#x201D; effects</article-title>. <source>Behav. Brain Sci.</source> <volume>39</volume>:<fpage>e229</fpage>. doi: <pub-id pub-id-type="doi">10.1017/S0140525X15000965</pub-id></citation></ref>
<ref id="ref37"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Fodor</surname> <given-names>J. A.</given-names></name></person-group> (<year>1983</year>). <source>The modularity of mind.</source> <publisher-loc>Cambridge, Mass, London</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation></ref>
<ref id="ref38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Forkmann</surname> <given-names>K.</given-names></name> <name><surname>Wiech</surname> <given-names>K.</given-names></name> <name><surname>Sommer</surname> <given-names>T.</given-names></name> <name><surname>Bingel</surname> <given-names>U.</given-names></name></person-group> (<year>2015</year>). <article-title>Reinstatement of pain-related brain activation during the recognition of neutral images previously paired with nociceptive stimuli</article-title>. <source>Pain</source> <volume>156</volume>, <fpage>1501</fpage>&#x2013;<lpage>1510</lpage>. doi: <pub-id pub-id-type="doi">10.1097/j.pain.0000000000000194</pub-id>, PMID: <pub-id pub-id-type="pmid">25906345</pub-id></citation></ref>
<ref id="ref39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Friston</surname> <given-names>K.</given-names></name></person-group> (<year>2018</year>). <article-title>Does predictive coding have a future?</article-title> <source>Nat. Neurosci.</source> <volume>21</volume>, <fpage>1019</fpage>&#x2013;<lpage>1021</lpage>. doi: <pub-id pub-id-type="doi">10.1038/s41593-018-0200-7</pub-id></citation></ref>
<ref id="ref40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Friston</surname> <given-names>K.</given-names></name> <name><surname>Kiebel</surname> <given-names>S.</given-names></name></person-group> (<year>2009</year>). <article-title>Predictive coding under the free-energy principle</article-title>. <source>Philos. Trans. R. Soc. B Biol. Sci.</source> <volume>364</volume>, <fpage>1211</fpage>&#x2013;<lpage>1221</lpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.2008.0300</pub-id>, PMID: <pub-id pub-id-type="pmid">19528002</pub-id></citation></ref>
<ref id="ref41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fuchs</surname> <given-names>T.</given-names></name></person-group> (<year>2020</year>). <article-title>The circularity of the embodied mind</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>707</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.01707</pub-id>, PMID: <pub-id pub-id-type="pmid">32903365</pub-id></citation></ref>
<ref id="ref42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gallese</surname> <given-names>V.</given-names></name></person-group> (<year>2007</year>). <article-title>Before and below &#x2018;theory of mind&#x2019;: embodied simulation and the neural correlates of social cognition</article-title>. <source>Philos. Trans. R. Soc. B Biol. Sci.</source> <volume>362</volume>, <fpage>659</fpage>&#x2013;<lpage>669</lpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.2006.2002</pub-id>, PMID: <pub-id pub-id-type="pmid">17301027</pub-id></citation></ref>
<ref id="ref43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gallese</surname> <given-names>V.</given-names></name></person-group> (<year>2009</year>). <article-title>Mirror neurons, embodied simulation, and the neural basis of social identification</article-title>. <source>Psychoanal. Dialogues</source> <volume>19</volume>, <fpage>519</fpage>&#x2013;<lpage>536</lpage>. doi: <pub-id pub-id-type="doi">10.1080/10481880903231910</pub-id></citation></ref>
<ref id="ref44"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Gibson</surname> <given-names>J. J.</given-names></name></person-group> (<year>2014</year>). <source>The Ecological Approach to Visual Perception: Classic Edition 1st Edn.</source> <publisher-loc>New York</publisher-loc>: <publisher-name>Psychology Press.</publisher-name></citation></ref>
<ref id="ref45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Goldstone</surname> <given-names>R. L.</given-names></name> <name><surname>Barsalou</surname> <given-names>L. W.</given-names></name></person-group> (<year>1998</year>). <article-title>Reuniting perception and conception</article-title>. <source>Cognition</source> <volume>65</volume>, <fpage>231</fpage>&#x2013;<lpage>262</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S0010-0277(97)00047-4</pub-id>, PMID: <pub-id pub-id-type="pmid">9557384</pub-id></citation></ref>
<ref id="ref46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gross</surname> <given-names>E. B.</given-names></name> <name><surname>Proffitt</surname> <given-names>D.</given-names></name></person-group> (<year>2013</year>). <article-title>The economy of social resources and its influence on spatial perceptions</article-title>. <source>Front. Hum. Neurosci.</source> <volume>7</volume>:<fpage>772</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnhum.2013.00772</pub-id>, PMID: <pub-id pub-id-type="pmid">24312039</pub-id></citation></ref>
<ref id="ref47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hall</surname> <given-names>E. T.</given-names></name></person-group> (<year>1963</year>). <article-title>A system for the notation of Proxemic behavior</article-title>. <source>Am. Anthropol.</source> <volume>65</volume>, <fpage>1003</fpage>&#x2013;<lpage>1026</lpage>. Available at: <ext-link xlink:href="http://www.jstor.org/stable/668580" ext-link-type="uri">http://www.jstor.org/stable/668580</ext-link>. doi: <pub-id pub-id-type="doi">10.1525/aa.1963.65.5.02a00020</pub-id></citation></ref>
<ref id="ref48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hall</surname> <given-names>E. T.</given-names></name> <name><surname>Birdwhistell</surname> <given-names>R. L.</given-names></name> <name><surname>Bock</surname> <given-names>B.</given-names></name> <name><surname>Bohannan</surname> <given-names>P.</given-names></name> <name><surname>Diebold</surname> <given-names>A. R.</given-names></name> <name><surname>Durbin</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>1968</year>). <article-title>Proxemics [and comments and replies]</article-title>. <source>Curr. Anthropol.</source> <volume>9</volume>, <fpage>83</fpage>&#x2013;<lpage>108</lpage>. Available at: <ext-link xlink:href="http://www.jstor.org/stable/2740724" ext-link-type="uri">http://www.jstor.org/stable/2740724</ext-link>. doi: <pub-id pub-id-type="doi">10.1086/200975</pub-id></citation></ref>
<ref id="ref55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hamilton</surname> <given-names>A. F. de C.</given-names></name> <name><surname>Kessler</surname> <given-names>K.</given-names></name> <name><surname>Creem-Regehr</surname> <given-names>S. H.</given-names></name></person-group> (<year>2014</year>). <article-title>Perspective taking: building a neurocognitive framework for integrating the &#x201C;social&#x201D; and the &#x201C;spatial.&#x201D;</article-title> <source>Front. Hum. Neurosci.</source> <volume>8</volume>:<fpage>35</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnhum.2014.00403</pub-id></citation></ref>
<ref id="ref49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Han</surname> <given-names>Q.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Jiang</surname> <given-names>Y.</given-names></name> <name><surname>Bao</surname> <given-names>M.</given-names></name></person-group> (<year>2021</year>). <article-title>The relevance to social interaction modulates bistable biological-motion perception</article-title>. <source>Cognition</source> <volume>209</volume>:<fpage>104584</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cognition.2021.104584</pub-id>, PMID: <pub-id pub-id-type="pmid">33450439</pub-id></citation></ref>
<ref id="ref300"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harber</surname> <given-names>K. D.</given-names></name> <name><surname>Yeung</surname> <given-names>D.</given-names></name> <name><surname>Iacovelli</surname> <given-names>A.</given-names></name></person-group> (<year>2011</year>). <article-title>Psychosocial resources, threat, and the perception of distance and height: support for the resources and perception model</article-title>. <source>Emotion</source> <volume>11</volume>:<fpage>1080.</fpage>, PMID: <pub-id pub-id-type="pmid">33756399</pub-id></citation></ref>
<ref id="ref50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harris</surname> <given-names>L. R.</given-names></name> <name><surname>Carnevale</surname> <given-names>M. J.</given-names></name> <name><surname>D&#x2019;Amour</surname> <given-names>S.</given-names></name> <name><surname>Fraser</surname> <given-names>L. E.</given-names></name> <name><surname>Harrar</surname> <given-names>V.</given-names></name> <name><surname>Hoover</surname> <given-names>A. E. N.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>How our body influences our perception of the world</article-title>. <source>Front. Psychol.</source> <volume>6</volume>:<fpage>819</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2015.00819</pub-id>, PMID: <pub-id pub-id-type="pmid">26124739</pub-id></citation></ref>
<ref id="ref51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hausdorff</surname> <given-names>J. M.</given-names></name> <name><surname>Peng</surname> <given-names>C.-K.</given-names></name> <name><surname>Goldberger</surname> <given-names>A. L.</given-names></name> <name><surname>Stoll</surname> <given-names>A. L.</given-names></name></person-group> (<year>2004</year>). <article-title>Gait unsteadiness and fall risk in two affective disorders: a preliminary study</article-title>. <source>BMC Psychiatry</source> <volume>4</volume>:<fpage>39</fpage>. doi: <pub-id pub-id-type="doi">10.1186/1471-244X-4-39</pub-id>, PMID: <pub-id pub-id-type="pmid">15563372</pub-id></citation></ref>
<ref id="ref52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Heft</surname> <given-names>H.</given-names></name></person-group> (<year>2020</year>). <article-title>Ecological psychology and Enaction theory: divergent groundings</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>991</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.00991</pub-id>, PMID: <pub-id pub-id-type="pmid">32547449</pub-id></citation></ref>
<ref id="ref53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Henderson</surname> <given-names>M. D.</given-names></name> <name><surname>Fujita</surname> <given-names>K.</given-names></name> <name><surname>Trope</surname> <given-names>Y.</given-names></name> <name><surname>Liberman</surname> <given-names>N.</given-names></name></person-group> (<year>2006</year>). <article-title>Transcending the &#x201C;here&#x201D;: the effect of spatial distance on social judgment</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>91</volume>, <fpage>845</fpage>&#x2013;<lpage>856</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.91.5.845</pub-id>, PMID: <pub-id pub-id-type="pmid">17059305</pub-id></citation></ref>
<ref id="ref54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ijzerman</surname> <given-names>H.</given-names></name> <name><surname>Semin</surname> <given-names>G. R.</given-names></name></person-group> (<year>2009</year>). <article-title>The thermometer of social relations: mapping social proximity on temperature</article-title>. <source>Psychol. Sci.</source> <volume>20</volume>, <fpage>1214</fpage>&#x2013;<lpage>1220</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1467-9280.2009.02434.x</pub-id></citation></ref>
<ref id="ref56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keysers</surname> <given-names>C.</given-names></name> <name><surname>Kaas</surname> <given-names>J. H.</given-names></name> <name><surname>Gazzola</surname> <given-names>V.</given-names></name></person-group> (<year>2010</year>). <article-title>Somatosensation in social perception</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>11</volume>, <fpage>417</fpage>&#x2013;<lpage>428</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nrn2833</pub-id></citation></ref>
<ref id="ref57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keysers</surname> <given-names>C.</given-names></name> <name><surname>Paracampo</surname> <given-names>R.</given-names></name> <name><surname>Gazzola</surname> <given-names>V.</given-names></name></person-group> (<year>2018</year>). <article-title>What neuromodulation and lesion studies tell us about the function of the mirror neuron system and embodied cognition</article-title>. <source>Curr. Opin. Psychol.</source> <volume>24</volume>, <fpage>35</fpage>&#x2013;<lpage>40</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.copsyc.2018.04.001</pub-id></citation></ref>
<ref id="ref58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>King</surname> <given-names>D. J.</given-names></name> <name><surname>Hodgekins</surname> <given-names>J.</given-names></name> <name><surname>Chouinard</surname> <given-names>P. A.</given-names></name> <name><surname>Chouinard</surname> <given-names>V.-A.</given-names></name> <name><surname>Sperandio</surname> <given-names>I.</given-names></name></person-group> (<year>2017</year>). <article-title>A review of abnormalities in the perception of visual illusions in schizophrenia</article-title>. <source>Psychon. Bull. Rev.</source> <volume>24</volume>, <fpage>734</fpage>&#x2013;<lpage>751</lpage>. doi: <pub-id pub-id-type="doi">10.3758/s13423-016-1168-5</pub-id>, PMID: <pub-id pub-id-type="pmid">27730532</pub-id></citation></ref>
<ref id="ref59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kret</surname> <given-names>M. E.</given-names></name> <name><surname>Stekelenburg</surname> <given-names>J. J.</given-names></name> <name><surname>Roelofs</surname> <given-names>K.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2013</year>). <article-title>Perception of face and body expressions using electromyography pupillometry and gaze measures</article-title>. <source>Front. Psychol.</source> <volume>4</volume>:<fpage>28</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2013.00028</pub-id>, PMID: <pub-id pub-id-type="pmid">23403886</pub-id></citation></ref>
<ref id="ref60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kroczek</surname> <given-names>L. O. H.</given-names></name> <name><surname>Pfaller</surname> <given-names>M.</given-names></name> <name><surname>Lange</surname> <given-names>B.</given-names></name> <name><surname>M&#x00FC;ller</surname> <given-names>M.</given-names></name> <name><surname>M&#x00FC;hlberger</surname> <given-names>A.</given-names></name></person-group> (<year>2020</year>). <article-title>Interpersonal distance during real-time social interaction: insights from subjective experience, behavior, and physiology</article-title>. <source>Front. Psych.</source> <volume>11</volume>:<fpage>561</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyt.2020.00561</pub-id>, PMID: <pub-id pub-id-type="pmid">32595544</pub-id></citation></ref>
<ref id="ref61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lamm</surname> <given-names>C.</given-names></name> <name><surname>R&#x00FC;tgen</surname> <given-names>M.</given-names></name> <name><surname>Wagner</surname> <given-names>I. C.</given-names></name></person-group> (<year>2019</year>). <article-title>Imaging empathy and prosocial emotions</article-title>. <source>Neurosci. Lett.</source> <volume>693</volume>, <fpage>49</fpage>&#x2013;<lpage>53</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neulet.2017.06.054</pub-id></citation></ref>
<ref id="ref62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Landau</surname> <given-names>M. J.</given-names></name> <name><surname>Meier</surname> <given-names>B. P.</given-names></name> <name><surname>Keefer</surname> <given-names>L. A.</given-names></name></person-group> (<year>2010</year>). <article-title>A metaphor-enriched social cognition</article-title>. <source>Psychol. Bull.</source> <volume>136</volume>, <fpage>1045</fpage>&#x2013;<lpage>1067</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0020970</pub-id>, PMID: <pub-id pub-id-type="pmid">20822208</pub-id></citation></ref>
<ref id="ref63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lemke</surname> <given-names>M. R.</given-names></name> <name><surname>Wendorff</surname> <given-names>T.</given-names></name> <name><surname>Mieth</surname> <given-names>B.</given-names></name> <name><surname>Buhl</surname> <given-names>K.</given-names></name> <name><surname>Linnemann</surname> <given-names>M.</given-names></name></person-group> (<year>2000</year>). <article-title>Spatiotemporal gait patterns during over ground locomotion in major depression compared with healthy controls</article-title>. <source>J. Psychiatr. Res.</source> <volume>34</volume>, <fpage>277</fpage>&#x2013;<lpage>283</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S0022-3956(00)00017-0</pub-id>, PMID: <pub-id pub-id-type="pmid">11104839</pub-id></citation></ref>
<ref id="ref64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lenzoni</surname> <given-names>S.</given-names></name> <name><surname>Bozzoni</surname> <given-names>V.</given-names></name> <name><surname>Burgio</surname> <given-names>F.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name> <name><surname>Wennberg</surname> <given-names>A.</given-names></name> <name><surname>Botta</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Recognition of emotions conveyed by facial expression and body postures in myotonic dystrophy (DM)</article-title>. <source>Cortex</source> <volume>127</volume>, <fpage>58</fpage>&#x2013;<lpage>66</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cortex.2020.02.005</pub-id></citation></ref>
<ref id="ref65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lindblom</surname> <given-names>J.</given-names></name></person-group> (<year>2020</year>). <article-title>A radical reassessment of the body in social cognition</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>987</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.00987</pub-id>, PMID: <pub-id pub-id-type="pmid">32581915</pub-id></citation></ref>
<ref id="ref66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Linkenauger</surname> <given-names>S.</given-names></name> <name><surname>Proffitt</surname> <given-names>D.</given-names></name></person-group> (<year>2008</year>). <article-title>The effect of intention and bodily capabilities on the perception of size</article-title>. <source>J. Vis.</source> <volume>8</volume>:<fpage>620</fpage>. doi: <pub-id pub-id-type="doi">10.1167/8.6.620</pub-id></citation></ref>
<ref id="ref67"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Linkenauger</surname> <given-names>S. A.</given-names></name> <name><surname>Ramenzoni</surname> <given-names>V.</given-names></name> <name><surname>Proffitt</surname> <given-names>D. R.</given-names></name></person-group> (<year>2010</year>). <article-title>Illusory Shrinkage and Growth: Body-Based Rescaling Affects the Perception of Size</article-title>. <source>Psychol. Sci.</source> <volume>21</volume>, <fpage>1318</fpage>&#x2013;<lpage>1325</lpage> doi: <pub-id pub-id-type="doi">10.1177/0956797610380700</pub-id></citation></ref>
<ref id="ref68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macpherson</surname> <given-names>M. C.</given-names></name> <name><surname>Fay</surname> <given-names>N.</given-names></name> <name><surname>Miles</surname> <given-names>L. K.</given-names></name></person-group> (<year>2020</year>). <article-title>Seeing synchrony: a replication of the effects of task-irrelevant social information on perceptions of interpersonal coordination</article-title>. <source>Acta Psychol.</source> <volume>209</volume>:<fpage>103140</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.actpsy.2020.103140</pub-id></citation></ref>
<ref id="ref69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maggio</surname> <given-names>M. G.</given-names></name> <name><surname>Piazzitta</surname> <given-names>D.</given-names></name> <name><surname>Andaloro</surname> <given-names>A.</given-names></name> <name><surname>Latella</surname> <given-names>D.</given-names></name> <name><surname>Sciarrone</surname> <given-names>F.</given-names></name> <name><surname>Casella</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>Embodied cognition in neurodegenerative disorders: what do we know so far? A narrative review focusing on the mirror neuron system and clinical applications</article-title>. <source>J. Clin. Neurosci.</source> <volume>98</volume>, <fpage>66</fpage>&#x2013;<lpage>72</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jocn.2022.01.028</pub-id></citation></ref>
<ref id="ref70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mahon</surname> <given-names>B. Z.</given-names></name> <name><surname>Caramazza</surname> <given-names>A.</given-names></name></person-group> (<year>2008</year>). <article-title>A critical look at the embodied cognition hypothesis and a new proposal for grounding conceptual content</article-title>. <source>J. Physiol. Paris</source> <volume>102</volume>, <fpage>59</fpage>&#x2013;<lpage>70</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jphysparis.2008.03.004</pub-id>, PMID: <pub-id pub-id-type="pmid">18448316</pub-id></citation></ref>
<ref id="ref71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mauss</surname> <given-names>I. B.</given-names></name> <name><surname>Robinson</surname> <given-names>M. D.</given-names></name></person-group> (<year>2009</year>). <article-title>Measures of emotion: a review</article-title>. <source>Cogn. Emot.</source> <volume>23</volume>, <fpage>209</fpage>&#x2013;<lpage>237</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699930802204677</pub-id>, PMID: <pub-id pub-id-type="pmid">19809584</pub-id></citation></ref>
<ref id="ref72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meconi</surname> <given-names>F.</given-names></name> <name><surname>Doro</surname> <given-names>M.</given-names></name> <name><surname>Schiano Lomoriello</surname> <given-names>A.</given-names></name> <name><surname>Mastrella</surname> <given-names>G.</given-names></name> <name><surname>Sessa</surname> <given-names>P.</given-names></name></person-group> (<year>2018</year>). <article-title>Neural measures of the role of affective prosody in empathy for pain</article-title>. <source>Sci. Rep.</source> <volume>8</volume>:<fpage>291</fpage>. doi: <pub-id pub-id-type="doi">10.1038/s41598-017-18552-y</pub-id>, PMID: <pub-id pub-id-type="pmid">29321532</pub-id></citation></ref>
<ref id="ref74"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meconi</surname> <given-names>F.</given-names></name> <name><surname>Linde-Domingo</surname> <given-names>J. S.</given-names></name> <name><surname>Ferreira</surname> <given-names>C.</given-names></name> <name><surname>Michelmann</surname> <given-names>S.</given-names></name> <name><surname>Staresina</surname> <given-names>B.</given-names></name> <name><surname>Apperly</surname> <given-names>I. A.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>EEG and fMRI evidence for autobiographical memory reactivation in empathy</article-title>. <source>Hum. Brain Mapp.</source> <volume>42</volume>, <fpage>4448</fpage>&#x2013;<lpage>4464</lpage>. doi: <pub-id pub-id-type="doi">10.1002/hbm.25557</pub-id>, PMID: <pub-id pub-id-type="pmid">34121270</pub-id></citation></ref>
<ref id="ref75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Miles</surname> <given-names>L. K.</given-names></name> <name><surname>Nind</surname> <given-names>L. K.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2009</year>). <article-title>The rhythm of rapport: interpersonal synchrony and social perception</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>45</volume>, <fpage>585</fpage>&#x2013;<lpage>589</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jesp.2009.02.002</pub-id></citation></ref>
<ref id="ref76"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Miller</surname> <given-names>L. E.</given-names></name> <name><surname>Saygin</surname> <given-names>A. P.</given-names></name></person-group> (<year>2013</year>). <article-title>Individual differences in the perception of biological motion: links to social cognition and motor imagery</article-title>. <source>Cognition</source> <volume>128</volume>, <fpage>140</fpage>&#x2013;<lpage>148</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cognition.2013.03.013</pub-id>, PMID: <pub-id pub-id-type="pmid">23680791</pub-id></citation></ref>
<ref id="ref77"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morris</surname> <given-names>J. S.</given-names></name> <name><surname>DeGelder</surname> <given-names>B.</given-names></name> <name><surname>Weiskrantz</surname> <given-names>L.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2001</year>). <article-title>Differential extrageniculostriate and amygdala responses to presentation of emotional faces in a cortically blind field</article-title>. <source>Brain</source> <volume>124</volume>, <fpage>1241</fpage>&#x2013;<lpage>1252</lpage>. doi: <pub-id pub-id-type="doi">10.1093/brain/124.6.1241</pub-id></citation></ref>
<ref id="ref78"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mukamel</surname> <given-names>R.</given-names></name> <name><surname>Ekstrom</surname> <given-names>A. D.</given-names></name> <name><surname>Kaplan</surname> <given-names>J.</given-names></name> <name><surname>Iacoboni</surname> <given-names>M.</given-names></name> <name><surname>Fried</surname> <given-names>I.</given-names></name></person-group> (<year>2010</year>). <article-title>Single-neuron responses in humans during execution and observation of actions</article-title>. <source>Curr. Biol.</source> <volume>20</volume>, <fpage>750</fpage>&#x2013;<lpage>756</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cub.2010.02.045</pub-id>, PMID: <pub-id pub-id-type="pmid">20381353</pub-id></citation></ref>
<ref id="ref400"><citation citation-type="book"><person-group person-group-type="editor"><name><surname>Nakayama</surname> <given-names>K.</given-names></name></person-group> (<year>2011</year>). &#x201C;Introduction: Vision Going Social.&#x201D; in <source>The science of social vision. Vol. 7.</source> eds. R. B. Adams, R. B. Adams Jr, N. Ambady, K. Nakayama and S. Shimojo (<publisher-name>Oxford university press</publisher-name>).</citation></ref>
<ref id="ref79"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Niedenthal</surname> <given-names>P. M.</given-names></name> <name><surname>Barsalou</surname> <given-names>L. W.</given-names></name> <name><surname>Winkielman</surname> <given-names>P.</given-names></name> <name><surname>Krauth-Gruber</surname> <given-names>S.</given-names></name> <name><surname>Ric</surname> <given-names>F.</given-names></name></person-group> (<year>2005</year>). <article-title>Embodiment in attitudes, social perception, and emotion</article-title>. <source>Personal. Soc. Psychol. Rev.</source> <volume>9</volume>, <fpage>184</fpage>&#x2013;<lpage>211</lpage>. doi: <pub-id pub-id-type="doi">10.1207/s15327957pspr0903_1</pub-id>, PMID: <pub-id pub-id-type="pmid">16083360</pub-id></citation></ref>
<ref id="ref80"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Niedenthal</surname> <given-names>P.</given-names></name> <name><surname>Wood</surname> <given-names>A.</given-names></name></person-group> (<year>2019</year>). <article-title>Does emotion influence visual perception? Depends on how you look at it</article-title>. <source>Cogn. Emot.</source> <volume>33</volume>, <fpage>77</fpage>&#x2013;<lpage>84</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699931.2018.1561424</pub-id>, PMID: <pub-id pub-id-type="pmid">30636535</pub-id></citation></ref>
<ref id="ref81"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Noel</surname> <given-names>J.-P.</given-names></name> <name><surname>Blanke</surname> <given-names>O.</given-names></name> <name><surname>Serino</surname> <given-names>A.</given-names></name></person-group> (<year>2018</year>). <article-title>From multisensory integration in peripersonal space to bodily self-consciousness: from statistical regularities to statistical inference</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1426</volume>, <fpage>146</fpage>&#x2013;<lpage>165</lpage>. doi: <pub-id pub-id-type="doi">10.1111/nyas.13867</pub-id>, PMID: <pub-id pub-id-type="pmid">29876922</pub-id></citation></ref>
<ref id="ref82"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oishi</surname> <given-names>S.</given-names></name> <name><surname>Schiller</surname> <given-names>J.</given-names></name> <name><surname>Gross</surname> <given-names>E. B.</given-names></name></person-group> (<year>2013</year>). <article-title>Felt understanding and misunderstanding affect the perception of pain, slant, and distance</article-title>. <source>Soc. Psychol. Personal. Sci.</source> <volume>4</volume>, <fpage>259</fpage>&#x2013;<lpage>266</lpage>. doi: <pub-id pub-id-type="doi">10.1177/1948550612453469</pub-id></citation></ref>
<ref id="ref83"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Otten</surname> <given-names>M.</given-names></name> <name><surname>Seth</surname> <given-names>A. K.</given-names></name> <name><surname>Pinto</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>A social Bayesian brain: how social knowledge can shape visual perception</article-title>. <source>Brain Cogn.</source> <volume>112</volume>, <fpage>69</fpage>&#x2013;<lpage>77</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.bandc.2016.05.002</pub-id>, PMID: <pub-id pub-id-type="pmid">27221986</pub-id></citation></ref>
<ref id="ref84"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Papeo</surname> <given-names>L.</given-names></name></person-group> (<year>2020</year>). <article-title>Twos in human visual perception</article-title>. <source>Cortex</source> <volume>132</volume>, <fpage>473</fpage>&#x2013;<lpage>478</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cortex.2020.06.005</pub-id>, PMID: <pub-id pub-id-type="pmid">32698947</pub-id></citation></ref>
<ref id="ref85"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Papeo</surname> <given-names>L.</given-names></name> <name><surname>Abassi</surname> <given-names>E.</given-names></name></person-group> (<year>2019</year>). <article-title>Seeing social events: the visual specialization for dyadic human&#x2013;human interactions</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>45</volume>, <fpage>877</fpage>&#x2013;<lpage>888</lpage>. doi: <pub-id pub-id-type="doi">10.1037/xhp0000646</pub-id>, PMID: <pub-id pub-id-type="pmid">30998069</pub-id></citation></ref>
<ref id="ref86"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Papeo</surname> <given-names>L.</given-names></name> <name><surname>Goupil</surname> <given-names>N.</given-names></name> <name><surname>Soto-Faraco</surname> <given-names>S.</given-names></name></person-group> (<year>2019</year>). <article-title>Visual search for people among people</article-title>. <source>Psychol. Sci.</source> <volume>30</volume>, <fpage>1483</fpage>&#x2013;<lpage>1496</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0956797619867295</pub-id></citation></ref>
<ref id="ref87"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Paradiso</surname> <given-names>E.</given-names></name> <name><surname>Gazzola</surname> <given-names>V.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name></person-group> (<year>2021</year>). <article-title>Neural mechanisms necessary for empathy-related phenomena across species</article-title>. <source>Curr. Opin. Neurobiol.</source> <volume>68</volume>, <fpage>107</fpage>&#x2013;<lpage>115</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.conb.2021.02.005</pub-id>, PMID: <pub-id pub-id-type="pmid">33756399</pub-id></citation></ref>
<ref id="ref88"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Parkinson</surname> <given-names>C.</given-names></name> <name><surname>Wheatley</surname> <given-names>T.</given-names></name></person-group> (<year>2013</year>). <article-title>Old cortex, new contexts: re-purposing spatial perception for social cognition</article-title>. <source>Front. Hum. Neurosci.</source> <volume>7</volume>:<fpage>645</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnhum.2013.00645</pub-id></citation></ref>
<ref id="ref89"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peelen</surname> <given-names>M. V.</given-names></name> <name><surname>Downing</surname> <given-names>P. E.</given-names></name></person-group> (<year>2005</year>). <article-title>Selectivity for the human body in the fusiform gyrus</article-title>. <source>J. Neurophysiol.</source> <volume>93</volume>, <fpage>603</fpage>&#x2013;<lpage>608</lpage>. doi: <pub-id pub-id-type="doi">10.1152/jn.00513.2004</pub-id></citation></ref>
<ref id="ref90"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pegna</surname> <given-names>A. J.</given-names></name> <name><surname>Khateb</surname> <given-names>A.</given-names></name> <name><surname>Lazeyras</surname> <given-names>F.</given-names></name> <name><surname>Seghier</surname> <given-names>M. L.</given-names></name></person-group> (<year>2005</year>). <article-title>Discriminating emotional faces without primary visual cortices involves the right amygdala</article-title>. <source>Nat. Neurosci.</source> <volume>8</volume>, <fpage>24</fpage>&#x2013;<lpage>25</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nn1364</pub-id>, PMID: <pub-id pub-id-type="pmid">15592466</pub-id></citation></ref>
<ref id="ref91"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Popova</surname> <given-names>Y. B.</given-names></name> <name><surname>R&#x0105;czaszek-Leonardi</surname> <given-names>J.</given-names></name></person-group> (<year>2020</year>). <article-title>Enactivism and ecological psychology: the role of bodily experience in agency</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>841</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.539841</pub-id></citation></ref>
<ref id="ref92"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proffitt</surname> <given-names>D. R.</given-names></name></person-group> (<year>2006</year>). <article-title>Embodied perception and the economy of action</article-title>. <source>Perspect. Psychol. Sci.</source> <volume>1</volume>, <fpage>110</fpage>&#x2013;<lpage>122</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1745-6916.2006.00008.x</pub-id>, PMID: <pub-id pub-id-type="pmid">26151466</pub-id></citation></ref>
<ref id="ref93"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Proffitt</surname> <given-names>D.</given-names></name> <name><surname>Baer</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). <source>Perception: How Our Bodies Shape Our Minds</source> <publisher-name>Minds. St. Martin&#x2019;s Press</publisher-name>.</citation></ref>
<ref id="ref94"><citation citation-type="book"><person-group person-group-type="editor"><name><surname>Proffitt</surname> <given-names>D.</given-names></name> <name><surname>Linkenauger</surname> <given-names>S.</given-names></name></person-group> (<year>2013</year>). &#x201C;<article-title>Perception viewed as a phenotypic expression</article-title>&#x201D; in <source>Action science: Foundations of an emerging discipline</source>. eds. W. Prinz, M. Beisert, and A. Herwig (MIT Press), <fpage>171</fpage>&#x2013;<lpage>197</lpage>. </citation></ref>
<ref id="ref95"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Quinn</surname> <given-names>K. A.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2011</year>). <article-title>The face and person perception: insights from social cognition</article-title>. <source>Br. J. Psychol.</source> <volume>102</volume>, <fpage>849</fpage>&#x2013;<lpage>867</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.2044-8295.2011.02030.x</pub-id></citation></ref>
<ref id="ref96"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rao</surname> <given-names>R. P. N.</given-names></name> <name><surname>Ballard</surname> <given-names>D. H.</given-names></name></person-group> (<year>1999</year>). <article-title>Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects</article-title>. <source>Nat. Neurosci.</source> <volume>2</volume>, <fpage>79</fpage>&#x2013;<lpage>87</lpage>. doi: <pub-id pub-id-type="doi">10.1038/4580</pub-id>, PMID: <pub-id pub-id-type="pmid">10195184</pub-id></citation></ref>
<ref id="ref97"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Read</surname> <given-names>C.</given-names></name> <name><surname>Szokolszky</surname> <given-names>A.</given-names></name></person-group> (<year>2020</year>). <article-title>Ecological psychology and enactivism: perceptually-guided action vs. sensation-based enaction</article-title>. <source>Front. Psychol.</source> <volume>11</volume>:<fpage>1270</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2020.01270</pub-id>, PMID: <pub-id pub-id-type="pmid">32765330</pub-id></citation></ref>
<ref id="ref98"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Robertson</surname> <given-names>C. E.</given-names></name> <name><surname>Baron-Cohen</surname> <given-names>S.</given-names></name></person-group> (<year>2017</year>). <article-title>Sensory perception in autism</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>18</volume>, <fpage>671</fpage>&#x2013;<lpage>684</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nrn.2017.112</pub-id></citation></ref>
<ref id="ref99"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>R&#x00FC;tgen</surname> <given-names>M.</given-names></name> <name><surname>Seidel</surname> <given-names>E.-M.</given-names></name> <name><surname>Pletti</surname> <given-names>C.</given-names></name> <name><surname>Rie&#x010D;ansk&#x00FD;</surname> <given-names>I.</given-names></name> <name><surname>Gartus</surname> <given-names>A.</given-names></name> <name><surname>Eisenegger</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>Psychopharmacological modulation of event-related potentials suggests that first-hand pain and empathy for pain rely on similar opioidergic processes</article-title>. <source>Neuropsychologia</source> <volume>116</volume>, <fpage>5</fpage>&#x2013;<lpage>14</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2017.04.023</pub-id>, PMID: <pub-id pub-id-type="pmid">28438708</pub-id></citation></ref>
<ref id="ref100"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>R&#x00FC;tgen</surname> <given-names>M.</given-names></name> <name><surname>Seidel</surname> <given-names>E.-M.</given-names></name> <name><surname>Silani</surname> <given-names>G.</given-names></name> <name><surname>Rie&#x010D;ansk&#x00FD;</surname> <given-names>I.</given-names></name> <name><surname>Hummer</surname> <given-names>A.</given-names></name> <name><surname>Windischberger</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>Placebo analgesia and its opioidergic regulation suggest that empathy for pain is grounded in self pain</article-title>. <source>Proc. Natl. Acad. Sci.</source> <volume>112</volume>, <fpage>E5638</fpage>&#x2013;<lpage>E5646</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.1511269112</pub-id></citation></ref>
<ref id="ref101"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sato</surname> <given-names>A.</given-names></name> <name><surname>Matsuo</surname> <given-names>A.</given-names></name> <name><surname>Kitazaki</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>Social contingency modulates the perceived distance between self and other</article-title>. <source>Cognition</source> <volume>192</volume>:<fpage>104006</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cognition.2019.06.018</pub-id>, PMID: <pub-id pub-id-type="pmid">31229741</pub-id></citation></ref>
<ref id="ref102"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schiano Lomoriello</surname> <given-names>A.</given-names></name> <name><surname>Meconi</surname> <given-names>F.</given-names></name> <name><surname>Rinaldi</surname> <given-names>I.</given-names></name> <name><surname>Sessa</surname> <given-names>P.</given-names></name></person-group> (<year>2018</year>). <article-title>Out of sight out of mind: perceived physical distance between the observer and someone in pain shapes observer&#x2019;s neural empathic reactions</article-title>. <source>Front. Psychol.</source> <volume>9</volume>:<fpage>1824</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2018.01824</pub-id>, PMID: <pub-id pub-id-type="pmid">30364280</pub-id></citation></ref>
<ref id="ref103"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schnall</surname> <given-names>S.</given-names></name> <name><surname>Harber</surname> <given-names>K. D.</given-names></name> <name><surname>Stefanucci</surname> <given-names>J. K.</given-names></name> <name><surname>Proffitt</surname> <given-names>D. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Social support and the perception of geographical slant</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>44</volume>, <fpage>1246</fpage>&#x2013;<lpage>1255</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jesp.2008.04.011</pub-id>, PMID: <pub-id pub-id-type="pmid">22389520</pub-id></citation></ref>
<ref id="ref104"><citation citation-type="book"><person-group person-group-type="editor"><name><surname>Schutt</surname> <given-names>R. K.</given-names></name> <name><surname>Seidman</surname> <given-names>L. J.</given-names></name> <name><surname>Keshavan</surname> <given-names>M. S.</given-names></name></person-group> (Eds.). (<year>2015</year>). <source>Social neuroscience: Brain, mind, and society</source>. <publisher-name>Harvard University Press</publisher-name>.</citation></ref>
<ref id="ref105"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Serino</surname> <given-names>A.</given-names></name></person-group> (<year>2019</year>). <article-title>Peripersonal space (PPS) as a multisensory interface between the individual and the environment, defining the space of the self</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>99</volume>, <fpage>138</fpage>&#x2013;<lpage>159</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neubiorev.2019.01.016</pub-id>, PMID: <pub-id pub-id-type="pmid">30685486</pub-id></citation></ref>
<ref id="ref106"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Serino</surname> <given-names>A.</given-names></name> <name><surname>Alsmith</surname> <given-names>A.</given-names></name> <name><surname>Costantini</surname> <given-names>M.</given-names></name> <name><surname>Mandrigin</surname> <given-names>A.</given-names></name> <name><surname>Tajadura-Jimenez</surname> <given-names>A.</given-names></name> <name><surname>Lopez</surname> <given-names>C.</given-names></name></person-group> (<year>2013</year>). <article-title>Bodily ownership and self-location: components of bodily self-consciousness</article-title>. <source>Conscious. Cogn.</source> <volume>22</volume>, <fpage>1239</fpage>&#x2013;<lpage>1252</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.concog.2013.08.013</pub-id>, PMID: <pub-id pub-id-type="pmid">24025475</pub-id></citation></ref>
<ref id="ref107"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sessa</surname> <given-names>P.</given-names></name> <name><surname>Meconi</surname> <given-names>F.</given-names></name> <name><surname>Han</surname> <given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Double dissociation of neural responses supporting perceptual and cognitive components of social cognition: evidence from processing of others&#x2019; pain</article-title>. <source>Sci. Rep.</source> <volume>4</volume>:<fpage>7424</fpage>. doi: <pub-id pub-id-type="doi">10.1038/srep07424</pub-id>, PMID: <pub-id pub-id-type="pmid">25502570</pub-id></citation></ref>
<ref id="ref108"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seth</surname> <given-names>A. K.</given-names></name></person-group> (<year>2013</year>). <article-title>Interoceptive inference, emotion, and the embodied self</article-title>. <source>Trends Cogn. Sci.</source> <volume>17</volume>, <fpage>565</fpage>&#x2013;<lpage>573</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2013.09.007</pub-id>, PMID: <pub-id pub-id-type="pmid">24126130</pub-id></citation></ref>
<ref id="ref109"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seth</surname> <given-names>A.</given-names></name> <name><surname>Suzuki</surname> <given-names>K.</given-names></name> <name><surname>Critchley</surname> <given-names>H.</given-names></name></person-group> (<year>2012</year>). <article-title>An interoceptive predictive coding model of conscious presence</article-title>. <source>Front. Psychol.</source> <volume>2</volume>:<fpage>395</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2011.00395</pub-id>, PMID: <pub-id pub-id-type="pmid">22291673</pub-id></citation></ref>
<ref id="ref110"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shepherd</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Following gaze: gaze-following behavior as a window into social cognition</article-title>. <source>Front. Integr. Neurosci.</source> <volume>4</volume>:<fpage>5</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnint.2010.00005</pub-id>, PMID: <pub-id pub-id-type="pmid">20428494</pub-id></citation></ref>
<ref id="ref111"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sloman</surname> <given-names>L.</given-names></name> <name><surname>Berridge</surname> <given-names>M.</given-names></name> <name><surname>Homatidis</surname> <given-names>S.</given-names></name> <name><surname>Hunter</surname> <given-names>D.</given-names></name> <name><surname>Duck</surname> <given-names>T.</given-names></name></person-group> (<year>1982</year>). <article-title>Gait patterns of depressed patients and normal subjects</article-title>. <source>Am. J. Psychiatry</source> <volume>139</volume>, <fpage>94</fpage>&#x2013;<lpage>97</lpage>. doi: <pub-id pub-id-type="doi">10.1176/ajp.139.1.94</pub-id>, PMID: <pub-id pub-id-type="pmid">7055284</pub-id></citation></ref>
<ref id="ref112"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stekelenburg</surname> <given-names>J.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2004</year>). <article-title>The neural correlates of perceiving human bodies: an ERP study on the body-inversion effect</article-title>. <source>Neuroreport</source> <volume>15</volume>, <fpage>777</fpage>&#x2013;<lpage>780</lpage>. doi: <pub-id pub-id-type="doi">10.1097/00001756-200404090-00007</pub-id>, PMID: <pub-id pub-id-type="pmid">15073513</pub-id></citation></ref>
<ref id="ref113"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sugovic</surname> <given-names>M.</given-names></name> <name><surname>Turk</surname> <given-names>P.</given-names></name> <name><surname>Witt</surname> <given-names>J. K.</given-names></name></person-group> (<year>2016</year>). <article-title>Perceived distance and obesity: It&#x2019;s what you weigh, not what you think</article-title>. <source>Acta Psychol.</source> <volume>165</volume>, <fpage>1</fpage>&#x2013;<lpage>8</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.actpsy.2016.01.012</pub-id>, PMID: <pub-id pub-id-type="pmid">26854404</pub-id></citation></ref>
<ref id="ref114"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sun</surname> <given-names>C.</given-names></name> <name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Chen</surname> <given-names>Y.</given-names></name> <name><surname>Tang</surname> <given-names>R.</given-names></name></person-group> (<year>2021</year>). <article-title>The influence of induced emotions on distance and size perception and on the grip scaling during grasping</article-title>. <source>Front. Psychol.</source> <volume>12</volume>:<fpage>651885</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2021.651885</pub-id>, PMID: <pub-id pub-id-type="pmid">34650465</pub-id></citation></ref>
<ref id="ref115"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Takahashi</surname> <given-names>K.</given-names></name> <name><surname>Meilinger</surname> <given-names>T.</given-names></name> <name><surname>Watanabe</surname> <given-names>K.</given-names></name> <name><surname>B&#x00FC;lthoff</surname> <given-names>H. H.</given-names></name></person-group> (<year>2013</year>). <article-title>Psychological influences on distance estimation in a virtual reality environment</article-title>. <source>Front. Hum. Neurosci.</source> <volume>7</volume>:<fpage>580</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnhum.2013.00580</pub-id>, PMID: <pub-id pub-id-type="pmid">24065905</pub-id></citation></ref>
<ref id="ref116"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tamietto</surname> <given-names>M.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2008</year>). <article-title>Affective blindsight in the intact brain: neural interhemispheric summation for unseen fearful expressions</article-title>. <source>Neuropsychologia</source> <volume>46</volume>, <fpage>820</fpage>&#x2013;<lpage>828</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2007.11.002</pub-id>, PMID: <pub-id pub-id-type="pmid">18160081</pub-id></citation></ref>
<ref id="ref117"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Teneggi</surname> <given-names>C.</given-names></name> <name><surname>Canzoneri</surname> <given-names>E.</given-names></name> <name><surname>di Pellegrino</surname> <given-names>G.</given-names></name> <name><surname>Serino</surname> <given-names>A.</given-names></name></person-group> (<year>2013</year>). <article-title>Social modulation of peripersonal space boundaries</article-title>. <source>Curr. Biol.</source> <volume>23</volume>, <fpage>406</fpage>&#x2013;<lpage>411</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cub.2013.01.043</pub-id></citation></ref>
<ref id="ref118"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tomasello</surname> <given-names>M.</given-names></name> <name><surname>Hare</surname> <given-names>B.</given-names></name> <name><surname>Lehmann</surname> <given-names>H.</given-names></name> <name><surname>Call</surname> <given-names>J.</given-names></name></person-group> (<year>2007</year>). <article-title>Reliance on head versus eyes in the gaze following of great apes and human infants: the cooperative eye hypothesis</article-title>. <source>J. Hum. Evol.</source> <volume>52</volume>, <fpage>314</fpage>&#x2013;<lpage>320</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jhevol.2006.10.001</pub-id>, PMID: <pub-id pub-id-type="pmid">17140637</pub-id></citation></ref>
<ref id="ref119"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Trope</surname> <given-names>Y.</given-names></name> <name><surname>Liberman</surname> <given-names>N.</given-names></name></person-group> (<year>2010</year>). <article-title>Construal-level theory of psychological distance</article-title>. <source>Psychol. Rev.</source> <volume>117</volume>, <fpage>440</fpage>&#x2013;<lpage>463</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0018963</pub-id>, PMID: <pub-id pub-id-type="pmid">20438233</pub-id></citation></ref>
<ref id="ref120"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Valenti</surname> <given-names>J. J.</given-names></name> <name><surname>Firestone</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). <article-title>Finding the &#x201C;odd one out&#x201D;: memory color effects and the logic of appearance</article-title>. <source>Cognition</source> <volume>191</volume>:<fpage>103934</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cognition.2019.04.003</pub-id>, PMID: <pub-id pub-id-type="pmid">31382106</pub-id></citation></ref>
<ref id="ref121"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van de Riet</surname> <given-names>W. A. C.</given-names></name> <name><surname>Grezes</surname> <given-names>J.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2009</year>). <article-title>Specific and common brain regions involved in the perception of faces and bodies and the representation of their emotional expressions</article-title>. <source>Soc. Neurosci.</source> <volume>4</volume>, <fpage>101</fpage>&#x2013;<lpage>120</lpage>. doi: <pub-id pub-id-type="doi">10.1080/17470910701865367</pub-id>, PMID: <pub-id pub-id-type="pmid">19255912</pub-id></citation></ref>
<ref id="ref122"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Hoort</surname> <given-names>B.</given-names></name> <name><surname>Guterstam</surname> <given-names>A.</given-names></name> <name><surname>Ehrsson</surname> <given-names>H. H.</given-names></name></person-group> (<year>2011</year>). <article-title>Being Barbie: the size of one&#x2019;s own body determines the perceived size of the world</article-title>. <source>PloS One</source> <volume>6</volume>:<fpage>e20195</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0020195</pub-id>, PMID: <pub-id pub-id-type="pmid">21633503</pub-id></citation></ref>
<ref id="ref500"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Varela</surname> <given-names>F. J.</given-names></name> <name><surname>Thompson</surname> <given-names>E.</given-names></name> <name><surname>Rosch</surname> <given-names>E.</given-names></name></person-group> (<year>1991</year>). <source>The embodied mind: Cognitive science and human experience</source>. Cambridge, MA: <publisher-name>The MIT Press</publisher-name>.</citation></ref>
<ref id="ref123"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Witkower</surname> <given-names>Z.</given-names></name> <name><surname>Hill</surname> <given-names>A. K.</given-names></name> <name><surname>Koster</surname> <given-names>J.</given-names></name> <name><surname>Tracy</surname> <given-names>J. L.</given-names></name></person-group> (<year>2021</year>). <article-title>Beyond face value: evidence for the universality of bodily expressions of emotion</article-title>. <source>Affect. Sci.</source> <volume>2</volume>, <fpage>221</fpage>&#x2013;<lpage>229</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s42761-021-00052-y</pub-id></citation></ref>
<ref id="ref124"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xiao</surname> <given-names>Y. J.</given-names></name> <name><surname>Van Bavel</surname> <given-names>J. J.</given-names></name></person-group> (<year>2012</year>). <article-title>See your friends close and your enemies closer</article-title>. <source>Personal. Soc. Psychol. Bull.</source> <volume>38</volume>, <fpage>959</fpage>&#x2013;<lpage>972</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0146167212442228</pub-id></citation></ref>
<ref id="ref125"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yamakawa</surname> <given-names>Y.</given-names></name> <name><surname>Kanai</surname> <given-names>R.</given-names></name> <name><surname>Matsumura</surname> <given-names>M.</given-names></name> <name><surname>Naito</surname> <given-names>E.</given-names></name></person-group> (<year>2009</year>). <article-title>Social distance evaluation in human parietal cortex</article-title>. <source>PLoS One</source> <volume>4</volume>:<fpage>e4360</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0004360</pub-id>, PMID: <pub-id pub-id-type="pmid">19204791</pub-id></citation></ref>
<ref id="ref126"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yamazaki</surname> <given-names>Y.</given-names></name> <name><surname>Hashimoto</surname> <given-names>T.</given-names></name> <name><surname>Iriki</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>The posterior parietal cortex and non-spatial cognition</article-title>. <source>F1000 Biol. Rep.</source> <volume>1</volume>:<fpage>74</fpage>. doi: <pub-id pub-id-type="doi">10.3410/B1-74</pub-id>, PMID: <pub-id pub-id-type="pmid">20948614</pub-id></citation></ref>
<ref id="ref127"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yee</surname> <given-names>N.</given-names></name> <name><surname>Bailenson</surname> <given-names>J. N.</given-names></name> <name><surname>Ducheneaut</surname> <given-names>N.</given-names></name></person-group> (<year>2009</year>). <article-title>The Proteus effect: implications of transformed digital self-representation on online and offline behavior</article-title>. <source>Commun. Res.</source> <volume>36</volume>, <fpage>285</fpage>&#x2013;<lpage>312</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0093650208330254</pub-id></citation></ref>
<ref id="ref128"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zadra</surname> <given-names>J. R.</given-names></name> <name><surname>Weltman</surname> <given-names>A. L.</given-names></name> <name><surname>Proffitt</surname> <given-names>D. R.</given-names></name></person-group> (<year>2016</year>). <article-title>Walkable distances are bioenergetically scaled</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>42</volume>, <fpage>39</fpage>&#x2013;<lpage>51</lpage>. doi: <pub-id pub-id-type="doi">10.1037/xhp0000107</pub-id>, PMID: <pub-id pub-id-type="pmid">26301887</pub-id></citation></ref>
<ref id="ref129"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Ochsner</surname> <given-names>K. N.</given-names></name></person-group> (<year>2012</year>). <article-title>The neuroscience of empathy: progress, pitfalls and promise</article-title>. <source>Nat. Neurosci.</source> <volume>15</volume>, <fpage>675</fpage>&#x2013;<lpage>680</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nn.3085</pub-id>, PMID: <pub-id pub-id-type="pmid">22504346</pub-id></citation></ref>
<ref id="ref130"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>C.</given-names></name> <name><surname>Han</surname> <given-names>M.</given-names></name> <name><surname>Liang</surname> <given-names>Q.</given-names></name> <name><surname>Hu</surname> <given-names>Y.-F.</given-names></name> <name><surname>Kuai</surname> <given-names>S.-G.</given-names></name></person-group> (<year>2019</year>). <article-title>A social interaction field model accurately identifies static and dynamic social groupings</article-title>. <source>Nat. Hum. Behav.</source> <volume>3</volume>, <fpage>847</fpage>&#x2013;<lpage>855</lpage>. doi: <pub-id pub-id-type="doi">10.1038/s41562-019-0618-2</pub-id>, PMID: <pub-id pub-id-type="pmid">31182793</pub-id></citation></ref></ref-list>
</back>
</article>