Abstract
Body expressions provide important perceptual cues for recognizing emotions in others. By adulthood, people are very good at using body expressions for emotion recognition. Thus, an important research question is: How does emotion processing of body expressions develop, particularly during the critical first 2 years and into early childhood? To answer this question, we conducted a meta-analysis of developmental studies that use body stimuli to quantify infants' and young children's ability to discriminate and process emotions from body expressions at different ages. The evidence from our review converges on the finding that infants and children can process emotion expressions across a wide variety of body stimuli and experimental paradigms, and that emotion-processing abilities do not vary with age. We discuss limitations and gaps in the literature in relation to a prominent view that infants learn to extract perceptual cues from different sources about people's emotions under different environmental and social contexts, and suggest naturalistic approaches to further advance our understanding of the development of emotion processing of body expressions.
Introduction
The ability to discriminate and recognize other people's emotions is important for social interactions. Adults process a rich combination of perceptual cues from people's facial, vocal and body expressions to recognize emotions quickly and accurately so they can take appropriate actions (de Gelder, 2009; Belin et al., 2011; Keltner et al., 2016). These cues also include changes in body odor and temperature (Robinson et al., 2012; Rosen et al., 2015; Salazar-López et al., 2015; de Groot and Smeets, 2017). For emotion body expressions, adults seem to focus on perceptual cues in the upper body, including the arms and hands (Pollux et al., 2019; Ross and Flack, 2020). Bodies can provide more diagnostic information about emotions than other perceptual cues under certain circumstances, such as when a person is far away (de Gelder, 2009; Bhatt et al., 2016; Enea and Iancu, 2016). Thus, an important research question is how emotion processing develops, particularly during the critical first 2 years and into early childhood. For example, we recently showed that the focus on the upper body shown by adults may emerge as early as 7 months of age (Geangu and Vuong, 2020). Developmental research, however, has focused predominantly on facial expressions (Geangu et al., 2016a; Bayet and Nelson, 2019).
Our aim in this mini-review is to synthesize evidence from developmental studies of emotion processing of body expressions from infancy until early childhood to address the research question. We have two goals toward this aim. First, we highlight the importance of environmental and social contexts for learning perceptual cues to emotion expressions. As infants grow, different visual information related to faces and bodies becomes more prevalent in the visual field during their daily activities (Smith et al., 2018), and they experience more and more varied emotion expressions under different social contexts. Second, we present a meta-analysis of developmental studies that use body stimuli to quantify infants' and children's ability to discriminate and process emotion expressions at different ages. The evidence suggests that there is a shift from faces being prevalent in the visual field toward other parts of the body (e.g., hands; Fausey et al., 2016; Ausderau et al., 2017; Jayaraman et al., 2017; Smith et al., 2018), and so the meta-analysis may help us relate laboratory-based studies to infants' and children's natural learning environment. We conclude with suggestions for future research directions.
Emotion body expressions in context
A prominent view of the development of emotion processing is that infants learn to extract perceptual cues from different sources about people's emotions and their communicative value (Campos et al., 1994; Leppänen and Nelson, 2009; Widen, 2013; Smith et al., 2018; Walle and Lopez, 2020). With respect to body expressions, infants frequently have people (e.g., parents, siblings) in their visual field throughout the first year of life (Ausderau et al., 2017; Jayaraman et al., 2017). Importantly, the prevalence of different body parts in the visual field changes during development. For example, faces are more prevalent than other body parts during the first 4 months after birth (Jayaraman et al., 2017). This prevalence shifts to other body parts after this age. Fausey et al. (2016) used head-mounted camera recordings in infants' home environment to demonstrate an increase in the proportion of hands in infants' visual field with a corresponding decrease in the proportion of faces, with a larger proportion of hands emerging between 6 and 9 months of age. The changes in prevalence of different body parts are observed across the first 2 years of life, and are likely due to cognitive and motor development that allows infants to more actively explore and interact with their environment and people (Flavell, 1982; Fischer and Silvern, 1985; Ausderau et al., 2017).
Thus, as infants mature and explore their environment, they are likely to extract and process different body parts that become more prevalent in their visual field to recognize different emotion expressions, and possibly relate body parts to perceptual cues in other modalities such as vocal expressions or odor changes. The prevalence of bodies in the visual field may also be relevant for other social tasks. For example, infants as young as 6 months old fixate on the hands of people who reach and grasp objects, and look less at other body parts that are in view (Falck-Ytter et al., 2006; Kochukhova and Gredebäck, 2010; Geangu et al., 2015). These changes in the availability of different body cues to emotions and social interactions also increase the opportunities infants have to learn the relation between body expressions and the social and non-social contexts in which they occur, further contributing to the development of emotion processing of body expressions (Campos et al., 1994; Leppänen and Nelson, 2009; Widen, 2013; Walle and Lopez, 2020). These experiences during maturation may lead to appropriate neuro-physiological responses associated with emotion processing (e.g., Krol et al., 2015; Rajhans et al., 2015; Ross et al., 2019).
Visual information for emotion processing of body expressions
By adulthood, research suggests that combinations of body postures and movements define signature cues for recognizing emotions from body expressions (Atkinson et al., 2004; Atkinson, 2013; Poyo Solanas et al., 2020). For example, signature cues for happy expressions may include an upright posture with raised arms. The cues for anger expressions may include a forward-leaning posture and shaking fists, contrasted to a backward-leaning posture and hands in front of the body for fear expressions. Sad expressions have the most subtle cues that tend to include a dropped position of the head, with arms brought near the body. The existent evidence indicates that adults rely on visual information contained in the upper body (e.g., torso, arms and hands) to recognize emotions expressed in static body images (Pollux et al., 2019; Ross and Flack, 2020).
The naturalistic studies discussed in the previous section provide evidence that bodies are prevalent in infants' visual field from very early on (Fausey et al., 2016; Jayaraman et al., 2017; Smith et al., 2018). The results from these studies are complemented by behavioral and neural evidence that, from birth, infants are sensitive to body postures and movements (e.g., Hirai and Hiraki, 2005; Geangu, 2008; Simion et al., 2008; Geangu et al., 2015; Bhatt et al., 2016; Gillmeister et al., 2019). This initial sensitivity may help them orient and attend to bodies. Like adults, infants also seem to attend to visual information in the upper body, in line with the increased prevalence of body parts in infants' visual field (Geangu and Vuong, 2020). Thus, infants' and young children's reliance on signature cues based on body parts for emotion processing of body expressions may reflect changes to the prevalence of different body parts in the visual field under different contexts during infancy and early childhood (Ausderau et al., 2017; Smith et al., 2018). There is currently no direct evidence for this possibility. Furthermore, developmental studies on the emotion processing of body expressions use different emotions, body stimuli and outcome measurements across different age groups, leading to gaps in the literature.
Review of emotion processing of body expressions
To address this issue and our overarching aim, we synthesize published studies on emotion processing of body expressions by infants and children. This synthesis can provide a holistic view to identify gaps and motivate future research. We conducted a literature search on PUBMED, Scopus, Medline, Embase and PsycInfo in October 2022 for articles which investigated emotion processing of body expressions in typically developing infants and children up to ~7.5 years old. Although studies may include older age groups or other developmental conditions, we focused on typical development and body stimuli (or stimuli that included the body) within our age range. The electronic searches were complemented with hand citation searches. There were 1,787 unique articles, with 3 additional articles from hand searches. QV undertook the searching and screening processes. See the Supplementary material for details.
Study characteristics
Table 1 summarizes the 38 articles included in the review. The studies are ordered by the youngest age group (mean age in months), and range from 3.4 to 87.1 months old (~7.3 years old). Most studies balanced the number of male and female participants. Several studies included comparisons to older age groups (e.g., adults) or developmental conditions (e.g., hearing impairments or mental disabilities). We include developmental milestones from Ausderau et al. (2017) to illustrate some known developmental changes occurring at different ages.
Table 1
| Age category | Milestones | References | Mean months | N | Sex | Other age groups tested | Body stimulus | Motion | Face | Voice | Emotion | Outcome measurements | Conditions/notes | Meta |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Birth to 6 months | Visually tracks person moving across room; Regards toys (3 months) | Zieber et al. (2014b) | 3.4, 3.5, 6.6 | 16, 16, 32 | 9F, 7F, 14F | | Full light videos (Atkinson) | Yes* | No | Yes* | Angry, happy, neutral | Preference | Upright and inverted bodies (voices from Sauter) | Y |
| | | Heck et al. (2018) | 3.4, 5.0 | 60, 32 | 23F, 18F | | Full light videos (Atkinson) | Yes | No | Yes | Angry, happy | Preference | Body/voice congruency | N |
| | Calms in response to parent or soothing voice | Missana et al. (2015) | 4.3, 8.4 | 20, 20 | 10F, 9F | | Point light videos (Atkinson) | Yes | No | No | Fear, happy | ERP (Pb, Nc and Pc components) | Upright and inverted bodies | N |
| | Lifts head to look around; Reaches/grasps hanging toys (4–5 months) | Missana and Grossmann (2015) | 4.3, 8.4 | 20, 20 | 10F, 9F | | Point light videos (Atkinson) | Yes | No | No | Fear, happy | ERP (frontal asymmetry) | Upright and inverted bodies | Y |
| | Transfers objects from hand to hand; Begins to display separation anxiety and preference for specific caregiver | Hock et al. (2017) | 6.4 | 30 | 19F | | Full light images (Atkinson) | No | Yes* | No | Angry, happy, sad | Preference | | Y |
| | | Zieber et al. (2014a) | 6.5 | 30* | 18F | | Full light videos (Atkinson) | Yes | No | Yes* | Angry, happy, neutral | Preference | Upright and inverted bodies (voices from Sauter) | Y |
| 6 months to 1 year | Sits well without support | Geangu and Vuong (2020) | 7.6 | 48 | 30F | | Body images (BEAST) | No | No | No | Angry, fear, happy, neutral | Eye tracking (proportion looking times, proportion fixations, fixation durations) | | Y |
| | | Geangu and Vuong (2023) | 7.6 | 48 | 30F | | Body images (BEAST) | No | No | No | Angry, fear, happy, neutral | Eye tracking (pupil dilation) | | Y |
| | Crawls on belly; Reach is smooth and efficient in all directions | Rajhans et al. (2016) | 8.2 | 32 | 16F | | Full light images (Atkinson) | No | No | No | Fear, happy | ERP (P1, N290, P400 and Nc components) | Priming by body on faces; Body/face congruency | Y |
| | | Krol et al. (2015) | 8.3 | 28 | 15F | | Full light images (Atkinson) | No | No | No | Fear, happy | ERP (Nc component) | Compared groups with low and high exclusive breastfeeding (EBF) durations | Y |
| | | Missana et al. (2014) | 8.4 | 15 | 10F | | Full light images (Atkinson) | No | No | No | Fear, happy | ERP (N290 and Nc components) | Upright and inverted bodies | Y |
| | Visually follows pointing, engages in joint attention (9 months) | Rajhans et al. (2015) | 8.4 | 27 | 13F | | Full light images (Atkinson) | No | No | No | Fear, happy | ERP (Nc component) | Also assessed temperament and maternal empathy | N |
| | Creeps on hands and knees; Begins standing unsupported; Gives object to adult to communicate need for help | Addabbo et al. (2020) | 11.6 | 17 | 6F | | Action videos (upper body) | Yes | No | No | Angry, happy | EMG (corrugator supercilii; medial frontalis; zygomaticus major) | | Y |
| | Walks independently | Ogren et al. (2019) | 14.7, 15.0 | 26, 26 | 15F, 14F | | Point light videos | Yes | No | No | Angry, happy, sad, neutral | Preference | | Y |
| 2–4 years | Begins running; well-coordinated, balanced gait; Social, parallel play begins (24 months) | Quam and Swingley (2012) | 24.0, 36.0, 48.0, 60.0 | 12, 59, 27, 20 | Not provided | | Live experimenter with puppet | Yes | Yes | Yes | Happy, sad (puppet) | Various | | N |
| | | Witkower et al. (2021) | 24.0, 54.0, 84.0 | 164, 196, 168 | Not provided | 9–12 years | Body images (BEAST) | No | No | No | Angry, fear, sad | Accuracy | | Y |
| | Understands caregivers will return, increasing flexibility in relationship with caregivers; Associative play in groups | Mondloch et al. (2013) | 37.0, 46.5, 71.9 | 12, 24, 12 | Not provided | Adults | Body images | No | Yes | No | Fear, sad | Accuracy | Body/face congruency (faces from NIMSTIM) | N |
| | | Geangu et al. (2016b) | 40.4 | 22 | 12F | | Body images (BEAST) | No | No | No | Angry, fear, happy, neutral | EMG (corrugator supercilii; medial frontalis; zygomaticus major) | | Y |
| | | Nelson and Russell (2011) | 42.7, 53.6, 64.8 | 48, 48, 48 | 24F, 24F, 24F | | Body videos | Yes | Yes* | Yes* | Angry, fear, happy, sad | Accuracy | Faces blurred or not | N |
| | | Ke et al. (2022) | 45.8, 78.2 | 17, 17 | 8F, 10F | | Point light videos (Max Planck) | Yes | No | No | Angry, happy | ERP (N300 and N400 components) | Priming by body on words; word/body congruency | Y |
| 4–6 years | Cooperative play with peers to reach common goals | Lagerlöf and Djerf (2009) | 48.0, 60.0 | 20, 21 | 10F, 11F | 8 years, adults | Dance videos | Yes | Yes | No | Angry, fear, happy, sad | Accuracy | Happy labeled as joy | Y |
| | | Boone and Cunningham (1998) | 49.8, 60.6 | 25, 25 | 13F, 12F | 8 years, adults | Dance videos | Yes | No | No | Angry, fear, happy, sad | Accuracy | | Y |
| | | Parker et al. (2013) | 54.0 | 55 | 24F | | Body images | No | No | No | Angry, disgust, fear, happy, sad, surprise, neutral | Accuracy | Angry labeled as mad, fear labeled as scared | N |
| | | Nelson and Russell (2012) | 55.0, 80.0 | 36, 36 | 18F, 18F | 8–11 years | Body videos | Yes | Yes* | Yes* | Pride | Accuracy | Faces blurred or not | N |
| | | Nelson et al. (2013) | 55.3, 63.8 | 68, 72 | 34F, 36F | | Body videos | Yes | Yes | Yes | Angry, disgust, fear, happy, sad, surprise | Accuracy | | Y |
| | | Nelson and Mondloch (2018) | 60.0 | 32 | 17F | 9 years, adults | Body videos, body images from videos | Yes* | Yes* | No | Angry, fear, happy, sad | Accuracy, eye tracking (relative fixation number, relative fixation duration) | Faces blurred or not | N |
| | | Sanders (2006) | 60.0, 84.0 | Not provided | Not provided | 11, 15 years | Schematic body drawings | No | No | No | Not stated | Accuracy | Compared hearing and non-hearing | N |
| | | Tuminello and Davidson (2011) | 63.3 | 111 | Not provided | | Body images | No | Yes* | No | Anger, fear, happy, sad, surprise, neutral | | Compared African American and European American children | Y |
| | | Hao and Su (2014) | 65.8 | 25 | 13F | | Body videos (faces occluded) | Yes | No | No | Anger, fear, happy, sad | Accuracy | | N |
| | | Brosgole et al. (1986) | 66.5 | 20 | 9F | | Animal line drawings | No | No | No | Angry, happy, sad, neutral | Errors | Compared mild, moderate and severe mental disabilities | N |
| | | Yang et al. (2022) | 67.8 | 41 | 21F | Adults | Body images (BEAST) | No | No | No | Anger, fear, happy, sad | Accuracy | Tested Asian participants | Y |
| | | Gioia and Brosgole (1988) | 71.0 | 10 | 5F | | Animal line drawings | No | No | No | Angry, happy, sad | Errors | Compared mild, moderate and severe mental disorders | N |
| 6–8 years | | Balas et al. (2018) | 72.0 | 20 | 13F | 8–11 years, adults | Body images (BESST) | No | No | No | Angry, sad | Accuracy, d-prime, response criterion | Added spatial noise in vertical, horizontal or both directions | Y |
| | | Tsou et al. (2021) | 72.8 | 71 | 41F | | Social interaction videos | Yes | Yes | No | Not stated | Eye tracking (fixation ratios in defined areas of interest [AOIs]) | Compared hearing and non-hearing | N |
| | | Vieillard and Guidetti (2009) | 74.0 | 28 | 14F | 8 years, adults | Body videos (GEMEP) | Yes | Yes | No | Angry, happy, irritation, pleasure, neutral | Errors | | Y |
| | | Nguyen and Nelson (2021) | 76.9 | 30 | 14F | 8–10 years, adults | Body images | Yes | Yes* | No | Win/lose | Accuracy | | N |
| | | Nicolini et al. (2019) | 79.2 | 15 | 6F | | Body videos | Yes | Yes | Yes | Fear, happy, sad, neutral | Thermal imaging | Compared with and without facial palsy | Y |
| | | Ross et al. (2021) | 87.1 | 32 | Not provided | 8–11 years, adults | Body images (BEAST) | No | No | Yes* | Happy, fear | Accuracy | Body/voice congruency, happy/fear voices | Y |
Summary characteristics of the 38 studies included in the mini-review.
*Included in some groups or conditions.
The studies are listed by first author and year and ordered by the mean age (in months) of the youngest age group tested by the author(s). The mean age is listed for each group tested, along with the sample size and number of females (F). Across these 3 columns, the age groups are presented in the same corresponding order and separated by commas. Other age groups tested in the same study are listed but not considered in this review. Other study characteristics include: body stimulus used; whether the stimuli included body motion, faces or voices; the emotion expressions tested; the outcome measurements; and additional summary information about any conditions tested (e.g., upright vs. inverted bodies) and notes. The Meta column indicates whether the study was included in the meta-analysis or not. Milestones from Ausderau et al. (2017). *Feature was included on some conditions. Atkinson = from Atkinson et al. (2004). BEAST = from de Gelder and van den Stock (2011). BESST = from Thoma et al. (2013). GEMEP = from Bänziger et al. (2012). Max Planck = from Volkova et al. (2014). NIMSTIM = from Tottenham et al. (2009). Sauter = from Sauter et al. (2010). The Excel file version of this table is available at: https://osf.io/tyg6n/.
A few studies considered psychological (Rajhans et al., 2015), social (Krol et al., 2015) and cultural factors (Tuminello and Davidson, 2011; Yang et al., 2022) in emotion processing of body expressions. Anger, fear, happy and sad expressions were tested the most, and ~29% (11/38) included an emotionally neutral condition as recommended by Hepach and Westermann (2016). Other expressions included, for example, disgust, surprise, pride and irritation. The body stimuli ranged from abstract representations (e.g., point-light displays or schematic line drawings) to videos and real-time interactions with experimenters (Quam and Swingley, 2012). Thus the stimuli could include static (e.g., body posture), dynamic (e.g., body movements) information (or both), and they could be combined with other perceptual cues such as faces and voices.
The studies used different outcome measurements, including accuracy, facial muscle activities from electromyography (EMG), eye-tracking measurements (e.g., fixations or pupil dilations), and event-related potentials (ERPs) in electroencephalography (EEG) related to different neural markers of emotion processing. One study measured facial thermal-imaging responses to body expressions (Nicolini et al., 2019). The studies also tested emotion processing of body expressions under different experimental conditions, such as body inversion. Several studies also compared emotion processing between different developmental conditions.
Meta-analysis
The studies in this review highlight the rich variety of body stimuli, outcome measurements and experimental manipulations used to test whether and how infants and children recognize emotion body expressions. Although this richness allows for a broad generalization, there is no quantification of infants' and young children's overall ability to discriminate between different emotion pairs (given differences in these studies). Thus, the goals of the meta-analysis are to combine effect sizes across studies to determine: (1) whether there is an overall ability to discriminate between different expression pairs; (2) whether this ability differs between different pairs; and (3) whether this ability varies with age.
For 22 of the 38 articles, we could derive the mean and standard deviation for each body expression from graphs and/or tables to be included in the meta-analysis. We focused on anger, fear, happy, sad and neutral expressions as most studies used one or more of these expressions, resulting in 10 possible pairs (~14% [3/22] included a neutral condition). We calculated Hedges' g as the effect size and took the absolute value to quantify participants' ability to discriminate expression pairs. We log-transformed any effect sizes calculated from sample proportion data (Nelson et al., 2013; Witkower et al., 2021). For each study included in the meta-analysis, the effect size was calculated separately for each outcome measurement, within-subject experimental condition and age group. The effect sizes were averaged across outcome measurements and within-subject conditions, resulting in 2 (sad vs. neutral) to 21 (anger vs. happy) effect sizes for each pair. A random-effects model with restricted maximum likelihood estimation (REML) was used to test whether the overall effect size for each emotion pair was greater than zero. Lastly, we conducted a meta-regression between effect size and mean age (in months) for each pair. The meta-analysis was conducted using the meta package (v6.1-0; Schwarzer et al., 2015) for R (run in RStudio v1.4.1106). See Supplementary material for details. The data and scripts are available at the Open Science Framework (https://osf.io/tyg6n/).
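To illustrate the effect-size pipeline described above, the following Python sketch computes Hedges' g from group summary statistics and pools a set of effect sizes with a random-effects model. This is a minimal illustration only: the analysis reported here used the meta package in R with REML estimation of between-study variance, whereas this sketch uses the simpler DerSimonian–Laird estimator to show the pooling logic, and all function names and input values are hypothetical.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference between two conditions,
    with Hedges' small-sample correction factor J."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd          # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction
    return j * d

def hedges_g_var(g, n1, n2):
    """Approximate sampling variance of Hedges' g."""
    return (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))

def pool_random_effects(gs, vs):
    """Random-effects pooled estimate via the DerSimonian-Laird
    estimator of tau^2 (the paper used REML instead).
    Returns (pooled mean, standard error, tau^2)."""
    w = [1 / v for v in vs]                                 # fixed-effect weights
    fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)  # fixed-effect mean
    q = sum(wi * (gi - fixed)**2 for wi, gi in zip(w, gs))  # heterogeneity statistic
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(gs) - 1)) / c)                # between-study variance
    w_re = [1 / (v + tau2) for v in vs]                     # random-effects weights
    mean = sum(wi * gi for wi, gi in zip(w_re, gs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return mean, se, tau2

# Hypothetical example: one study's looking-time means per expression,
# then pooling absolute effect sizes from three hypothetical studies.
g = abs(hedges_g(0.8, 1.0, 20, 0.5, 1.0, 20))
v = hedges_g_var(g, 20, 20)
pooled, se, tau2 = pool_random_effects([0.3, 0.5, 0.4], [0.1, 0.1, 0.1])
```

A pooled mean more than roughly 1.96 standard errors above zero would correspond to the test that the overall effect size for an emotion pair is greater than zero.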
Figure 1 presents a forest plot for the 10 expression pairs, with studies ordered chronologically by the mean age in months. For the 6 pairs including two emotions (Row 1 in Figure 1), combining the effect sizes across all studies showed consistent evidence for small to medium effects (g = 0.36 to 0.68; ps < 0.001). The meta-regression showed inconsistent evidence that effect size varied with age for these pairs (ps > 0.05).
Figure 1

Forest plot of effect size estimate (Hedges' g) for each emotion pair. The effect sizes from the 22 studies are ordered chronologically based on mean age in months. Some studies tested groups in different conditions (as indicated in brackets). Horizontal lines depict 95% confidence interval (95%-CI), size of squares represents the weight of individual data sets, and diamonds represent mean effect sizes based on a random-effects model (vertical dashed line). The effect size mean and 95%-CI for each emotion pair (column), respectively, are: Row 1 0.36 [0.23; 0.48]; 0.41 [0.28; 0.54]; 0.68 [0.48; 0.88]; 0.50 [0.26; 0.74]; 0.64 [0.40; 0.88]; 0.50 [0.27; 0.72] (ps < 0.001); Row 2 0.69 [0.22; 1.16] (p < 0.001); 0.20 [−0.03; 0.44] (p = 0.09); 0.28 [0.04; 0.51] (p = 0.02); 0.34 [−0.03; 0.71] (p = 0.07). The scale was truncated to Hedges' g = −0.5 to 3.0 for visualization purposes. Arrows on the confidence interval indicate that the horizontal line extended beyond the limits of the truncated scale.
A similar but weaker pattern was found when each emotion was compared to the neutral condition (Row 2 in Figure 1; 4 pairs). The mean effect size also ranged from small to medium effects. It was significantly >0 for anger and happy expressions (g = 0.69 and 0.28, respectively; ps < 0.02) but not for fear and sad expressions (g = 0.20 and 0.34, respectively; ps > 0.07). There was a significant correlation between effect size and age for anger expressions (p < 0.001) but not for the other expressions (ps > 0.61 for fear and happy expressions; no solution for sad expressions). However, there was a small number of effect sizes that included a neutral condition (e.g., N = 2 for sad, N = 4 for the other emotions) and so we do not make any strong conclusions from these results.
Discussion and future directions
Our review identified a wide range of laboratory-based developmental studies of emotion processing of body expressions. We also note that researchers use different terms for similar or related emotions, such as joy vs. happy (e.g., Lagerlöf and Djerf, 2009), as well as more ambiguous cases such as win and lose (Nguyen and Nelson, 2021) which can be associated with happy/excitement and disappointment. Many individual effect sizes in these studies had confidence intervals that included 0. However across these studies, the evidence suggests that infants and children can discriminate between emotion expressions across a variety of body stimuli and experimental paradigms, and that infants and children can integrate perceptual cues across bodies, faces and voices. A similar pattern was seen for discriminating emotion from neutral body expressions, but this finding is limited by the small number of effect sizes.
The ability to recognize emotions is often inferred from infants' and children's ability to discriminate emotion pairs. Several studies in our review measured neuro-physiological outcomes while participants viewed different emotion body expressions, such as ERP components (e.g., Krol et al., 2015; Rajhans et al., 2016), EMG responses (Geangu et al., 2016b; Addabbo et al., 2020), pupil dilations (Geangu and Vuong, 2023) and facial temperature (Nicolini et al., 2019). Importantly, these measurements are related to emotion processing in adults (Robinson et al., 2012; Kret et al., 2013; Yeh et al., 2016). They suggest that infants and children can process the emotional content of body expressions using static (e.g., body posture) and dynamic (e.g., body movements) cues, rather than merely discriminating between emotion pairs (Ross and Atkinson, 2020). A second finding is that emotion-processing abilities do not vary with age (as indicated by the meta-regression for the 6 emotion pairs), which is surprising given the developmental milestones and changes in visual information that are prevalent in infants' and children's visual field as they mature (Ausderau et al., 2017; Smith et al., 2018).
These 2 main findings should be considered in light of emotion processing in adults. Although body postures and gestures contribute to emotion processing in adulthood, body cues do not necessarily convey all emotions equally (Atkinson et al., 2004; Atkinson, 2013; Poyo Solanas et al., 2020) and may need to interact with other perceptual cues for effective emotion processing in the natural environment. For example, body expressions may be important for disambiguating fear and surprise, which can be easily confused with facial expressions (Smith and Schyns, 2009; Actis-Grosso et al., 2015). Thus our review and meta-analysis underscores the importance of investigating the development of emotion processing from multiple perceptual cues.
The 2 main findings should also be considered in light of potential limitations highlighted by our review. First, sample sizes for young infants tend to be smaller than those for older infants and children, resulting in more variability for the younger groups. Second, young infants were not tested with as many emotion pairs compared to the older age groups, leaving a gap in understanding the early development of emotion processing of body expressions. This younger age group also tended to be tested with fewer emotion expressions within a study (e.g., typically 2 expressions) than older age groups. There was also a smaller proportion of studies that included a neutral condition (~29%; Hepach and Westermann, 2016). Third, there is a relatively small number of body-stimulus databases used across all studies (see Table 1). Nearly all studies with infants younger than 9 months used the stimuli from Atkinson et al. (2004). For other age groups, several studies used static and dynamic body-stimulus databases that have only been validated by adults. A few studies recorded their own body expression videos with different expressivity (e.g., expressive dance movements; Boone and Cunningham, 1998). Finally, few studies presented naturalistic stimuli that combined body, facial and vocal cues. Those that did manipulated the congruency of the emotion expression between different cues, leading to stimuli that were not necessarily naturalistic.
Given these limitations, we suggest several future research directions. The first is to test young infants with a larger variety of emotion body expressions, including neutral expressions (Hepach and Westermann, 2016). It would also be important to test infants longitudinally to map out the developmental trajectory for emotion processing of body expressions. Future work can also combine different outcome measurements (e.g., pupil dilation, EMG and EEG), use naturalistic dynamic multi-sensory perceptual cues (e.g., Geangu et al., 2011; Poulin-Dubois et al., 2018; Quadrelli et al., 2019), test different cultures (e.g., Geangu et al., 2011, 2016a; Geangu, 2015; see Parkinson et al., 2017; Poulin-Dubois et al., 2018; Quadrelli et al., 2019, for adults), and investigate factors contributing to observed individual differences (e.g., Crespo-Llado et al., 2018). One key limitation is that the body stimuli used in laboratory studies are visually impoverished and may not capture many of the perceptual cues that infants and children may experience in their daily activities (e.g., Smith et al., 2018). Given the importance of the maturing infants' environmental and social contexts, future studies can be conducted in the real world and focus on, for example, the frequency of different facial and body emotion expressions in the infants' visual field, parenting behaviors, and the context in which emotion expressions occur (e.g., Fausey et al., 2016; Jayaraman et al., 2017; Smith et al., 2018). These directions will be highly challenging but will be important to address the gaps in understanding the development of emotion processing of body expressions—and emotion processing more generally—highlighted by our mini-review.
Statements
Author contributions
QV contributed to the conception and interpretation of the work, conducted the literature search and meta-analysis, and worked on the draft of the manuscript. EG contributed to the conception and interpretation of the work, contributed to the literature search, and worked on the draft of the manuscript. All authors contributed to manuscript revision, read, and approved the submitted version.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fcogn.2023.1155031/full#supplementary-material
References
Actis-Grosso, R., Bossi, F., and Ricciardelli, P. (2015). Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits. Front. Psychol. 6, 1570. 10.3389/fpsyg.2015.01570
Addabbo, M., Vacaru, S. V., Meyer, M., and Hunnius, S. (2020). Something in the way you move: infants are sensitive to emotions conveyed in action kinematics. Dev. Sci. 23, e12873. 10.1111/desc.12873
Atkinson, A. P. (2013). "Bodily expressions of emotion: visual cues and neural mechanisms," in The Cambridge Handbook of Human Affective Neuroscience, eds J. Armony, and P. Vuilleumier (Cambridge: Cambridge University Press). 10.1017/CBO9780511843716.012
Atkinson, A. P., Dittrich, W. H., Gemmell, A. J., and Young, A. W. (2004). Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33, 717–746. 10.1068/p5096
Ausderau, K. K., Dammann, C., McManus, K., Schneider, M., Emborg, M. E., and Schultz-Darken, N. (2017). Cross-species comparison of behavioral neurodevelopmental milestones in the common marmoset monkey and human child. Dev. Psychobiol. 59, 807–821. 10.1002/dev.21545
**Balas, B., Auen, A., Saville, A., and Schmidt, J. (2018). Body emotion recognition disproportionately depends on vertical orientations during childhood. Int. J. Behav. Dev. 42, 278–283. 10.1177/0165025417690267
Bänziger, T., Mortillaro, M., and Scherer, K. R. (2012). Introducing the Geneva multimodal expression corpus for experimental research on emotion perception. Emotion 12, 1161–1179. 10.1037/a0025827
Bayet, L., and Nelson, C. A. (2019). "The perception of facial emotion in typical and atypical development," in Handbook of Emotional Development (Cham: Springer), 105–138. 10.1007/978-3-030-17332-6_6
Belin, P., Bestelmeyer, P. E. G., Latinus, M., and Watson, R. (2011). Understanding voice perception. Br. J. Psychol. 102, 711–725. 10.1111/j.2044-8295.2011.02041.x
Bhatt, R. S., Hock, A., White, H., Jubran, R., and Galati, A. (2016). The development of body structure knowledge in infancy. Child Dev. Perspect. 10, 45–52. 10.1111/cdep.12162
**Boone, R. T., and Cunningham, J. G. (1998). Children's decoding of emotion in expressive body movement: the development of cue attunement. Dev. Psychol. 34, 1007–1016. 10.1037/0012-1649.34.5.1007
*Brosgole, L., Gioia, J. V., and Zingmond, R. (1986). Facial- and postural-affect recognition in the mentally handicapped and normal young children. Int. J. Neurosci. 30, 127–144. 10.3109/00207458608985662
Campos, J. J., Mumme, D. L., Kermoian, R., and Campos, R. G. (1994). A functionalist perspective on the nature of emotion. Monogr. Soc. Res. Child Dev. 59, 284–303.
Crespo-Llado, M. M., Vanderwert, R. E., and Geangu, E. (2018). Individual differences in infants' neural responses to their peers' cry and laughter. Biol. Psychol. 135, 117–127. 10.1016/j.biopsycho.2018.03.008
de Gelder, B. (2009). Why bodies? Twelve reasons for including bodily expressions in affective neuroscience. Philos. Trans. R. Soc. B Biol. Sci. 364, 3475–3484. 10.1098/rstb.2009.0190
de Gelder, B., and van den Stock, J. (2011). The bodily expressive action stimulus test (BEAST). Construction and validation of a stimulus basis for measuring perception of whole body expression of emotions. Front. Psychol. 2, 181. 10.3389/fpsyg.2011.00181
de Groot, J. H., and Smeets, M. A. (2017). Human fear chemosignaling: evidence from a meta-analysis. Chem. Senses 42, 663–673. 10.1093/chemse/bjx049
Enea, V., and Iancu, S. (2016). Processing emotional body expressions: state-of-the-art. Soc. Neurosci. 11, 495–506. 10.1080/17470919.2015.1114020
Falck-Ytter, T., Gredebäck, G., and von Hofsten, C. (2006). Infants predict other people's action goals. Nat. Neurosci. 9, 878–879. 10.1038/nn1729
Fausey, C. M., Jayaraman, S., and Smith, L. B. (2016). From faces to hands: changing visual input in the first 2 years. Cognition 152, 101–107. 10.1016/j.cognition.2016.03.005
Fischer, K. W., and Silvern, L. (1985). Stages and individual differences in cognitive development. Annu. Rev. Psychol. 36, 613–648.
Flavell, J. H. (1982). On cognitive development. Child Dev. 14, 1–10.
Geangu, E. (2008). Notes on self awareness development in early infancy. Cognit. Brain Behav. 12, 103.
Geangu, E. (2015). "Development of empathy during childhood across cultures," in International Encyclopedia of the Social and Behavioral Sciences (Amsterdam: Elsevier), 549–553. 10.1016/B978-0-08-097086-8.23167-X
Geangu, E., Hauf, P., Bhardwaj, R., and Bentz, W. (2011). Infant pupil diameter changes in response to others' positive and negative emotions. PLoS ONE 6, e27132. 10.1371/journal.pone.0027132
Geangu, E., Ichikawa, H., Lao, J., Kanazawa, S., Yamaguchi, M. K., Caldara, R., et al. (2016a). Culture shapes 7-month-olds' perceptual strategies in discriminating facial expressions of emotion. Curr. Biol. 26, R663–R664. 10.1016/j.cub.2016.05.072
**Geangu, E., Quadrelli, E., Conte, S., Croci, E., and Turati, C. (2016b). Three-year-olds' rapid facial electromyographic responses to emotional facial expressions and body postures. J. Exp. Child Psychol. 144, 1–14. 10.1016/j.jecp.2015.11.001
Geangu, E., Quadrelli, E., Lewis, J. W., Macchi Cassia, V., and Turati, C. (2015). By the sound of it: an ERP investigation of human action sound processing in 7-month-old infants. Dev. Cognit. Neurosci. 12, 134–144. 10.1016/j.dcn.2015.01.005
**Geangu, E., and Vuong, Q. C. (2020). Look up to the body: an eye-tracking investigation of 7-months-old infants' visual exploration of emotional body expressions. Infant Behav. Dev. 60, 101473. 10.1016/j.infbeh.2020.101473
**Geangu, E., and Vuong, Q. C. (2023). Seven-months-old infants show increased arousal to static emotion body expressions: evidence from pupil dilation. Infancy 14, 12535. 10.1111/infa.12535
Gillmeister, H., Stets, M., Grigorova, M., and Rigato, S. (2019). How do bodies become special? Electrophysiological evidence for the emergence of body-related cortical processing in the first 14 months of life. Dev. Psychol. 55, 2025. 10.1037/dev0000762
*Gioia, J. V., and Brosgole, L. (1988). Visual and auditory affect recognition in singly diagnosed mentally retarded patients, mentally retarded patients with autism and normal young children. Int. J. Neurosci. 43, 149–163. 10.3109/00207458808986164
*Hao, J., and Su, Y. (2014). Deaf children's use of clear visual cues in mindreading. Res. Dev. Disabil. 35, 2849–2857. 10.1016/j.ridd.2014.07.034
*Heck, A., Chroust, A., White, H., Jubran, R., and Bhatt, R. S. (2018). Development of body emotion perception in infancy: from discrimination to recognition. Infant Behav. Dev. 50, 42–51. 10.1016/j.infbeh.2017.10.007
Hepach, R., and Westermann, G. (2016). Pupillometry in infancy research. J. Cognit. Dev. 17, 359–377. 10.1080/15248372.2015.1135801
Hirai, M., and Hiraki, K. (2005). An event-related potentials study of biological motion perception in human infants. Cognit. Brain Res. 22, 301–304. 10.1016/j.cogbrainres.2004.08.008
**Hock, A., Oberst, L., Jubran, R., White, H., Heck, A., and Bhatt, R. S. (2017). Integrated emotion processing in infancy: matching of faces and bodies. Infancy 22, 608–625. 10.1111/infa.12177
Jayaraman, S., Fausey, C. M., and Smith, L. B. (2017). Why are faces denser in the visual experiences of younger than older infants? Dev. Psychol. 53, 38. 10.1037/dev0000230
**Ke, H., Vuong, Q. C., and Geangu, E. (2022). Three- and six-year-old children are sensitive to natural body expressions of emotion: an event-related potential emotional priming study. J. Exp. Child Psychol. 224, 105497. 10.1016/j.jecp.2022.105497
Keltner, D., Tracy, J., Sauter, D. A., Cordaro, D. C., and McNeil, G. (2016). Expression of emotion. Handbook Emot. 14, 467–482.
Kochukhova, O., and Gredebäck, G. (2010). Preverbal infants anticipate that food will be brought to the mouth: an eye tracking study of manual feeding and flying spoons. Child Dev. 81, 1729–1738. 10.1111/j.1467-8624.2010.01506.x
Kret, M. E., Roelofs, K., Stekelenburg, J., and de Gelder, B. (2013). Emotional signals from faces, bodies and scenes influence observers' face expressions, fixations and pupil-size. Front. Hum. Neurosci. 7, 810. 10.3389/fnhum.2013.00810
**Krol, K. M., Rajhans, P., Missana, M., and Grossmann, T. (2015). Duration of exclusive breastfeeding is associated with differences in infants' brain responses to emotional body expressions. Front. Behav. Neurosci. 8, 459. 10.3389/fnbeh.2014.00459
**Lagerlöf, I., and Djerf, M. (2009). Children's understanding of emotion in dance. Eur. J. Dev. Psychol. 6, 409–431. 10.1080/17405620701438475
Leppänen, J. M., and Nelson, C. A. (2009). Tuning the developing brain to social signals of emotion. Nat. Rev. Neurosci. 10, 37–47. 10.1038/nrn2554
*Missana, M., Atkinson, A. P., and Grossmann, T. (2015). Tuning the developing brain to emotional body expressions. Dev. Sci. 18, 243–253. 10.1111/desc.12209
**Missana, M., and Grossmann, T. (2015). Infants' emerging sensitivity to emotional body expressions: insights from asymmetrical frontal brain activity. Dev. Psychol. 51, 151–160. 10.1037/a0038469
**Missana, M., Rajhans, P., Atkinson, A. P., and Grossmann, T. (2014). Discrimination of fearful and happy body postures in 8-month-old infants: an event-related potential study. Front. Hum. Neurosci. 8, 531. 10.3389/fnhum.2014.00531
*Mondloch, C. J., Horner, M., and Mian, J. (2013). Wide eyes and drooping arms: adult-like congruency effects emerge early in the development of sensitivity to emotional faces and body postures. J. Exp. Child Psychol. 114, 203–216. 10.1016/j.jecp.2012.06.003
**Nelson, N. L., Hudspeth, K., and Russell, J. A. (2013). A story superiority effect for disgust, fear, embarrassment, and pride. Br. J. Dev. Psychol. 31, 334–348. 10.1111/bjdp.12011
*Nelson, N. L., and Mondloch, C. J. (2018). Children's visual attention to emotional expressions varies with stimulus movement. J. Exp. Child Psychol. 172, 13–24. 10.1016/j.jecp.2018.03.001
*Nelson, N. L., and Russell, J. A. (2011). Preschoolers' use of dynamic facial, bodily, and vocal cues to emotion. J. Exp. Child Psychol. 110, 52–61. 10.1016/j.jecp.2011.03.014
*Nelson, N. L., and Russell, J. A. (2012). Children's understanding of nonverbal expressions of pride. J. Exp. Child Psychol. 111, 379–385. 10.1016/j.jecp.2011.09.004
*Nguyen, T. T., and Nelson, N. L. (2021). Winners and losers: recognition of spontaneous emotional expressions increases across childhood. J. Exp. Child Psychol. 209, 105184. 10.1016/j.jecp.2021.105184
**Nicolini, Y., Manini, B., De Stefani, E., Coude, G., Cardone, D., et al. (2019). Autonomic responses to emotional stimuli in children affected by facial palsy: the case of Moebius syndrome. Neural Plast. 14, 7253768. 10.1155/2019/7253768
**Ogren, M., Kaplan, B., Peng, Y., Johnson, K. L., and Johnson, S. P. (2019). Motion or emotion: infants discriminate emotional biological motion based on low-level visual information. Infant Behav. Dev. 57, 101324. 10.1016/j.infbeh.2019.04.006
*Parker, A. E., Mathis, E. T., and Kupersmidt, J. B. (2013). How is this child feeling? Preschool-aged children's ability to recognize emotion in faces and body poses. Early Educ. Dev. 24, 188–211. 10.1080/10409289.2012.657536
Parkinson, C., Walker, T. T., Memmi, S., and Wheatley, T. (2017). Emotions are understood from biological motion across remote cultures. Emotion 17, 459–477. 10.1037/emo0000194
Pollux, P. M., Craddock, M., and Guo, K. (2019). Gaze patterns in viewing static and dynamic body expressions. Acta Psychol. 198, 102862. 10.1016/j.actpsy.2019.05.014
Poulin-Dubois, D., Hastings, P. D., Chiarella, S. S., Geangu, E., Hauf, P., Ruel, A., et al. (2018). The eyes know it: toddlers' visual scanning of sad faces is predicted by their theory of mind skills. PLoS ONE 13, e0208524. 10.1371/journal.pone.0208524
Poyo Solanas, M., Vaessen, M., and de Gelder, B. (2020). The role of computational and subjective features in emotional body expressions. Sci. Rep. 10, 1–13. 10.1038/s41598-020-63125-1
Quadrelli, E., Geangu, E., and Turati, C. (2019). Human action sounds elicit sensorimotor activation early in life. Cortex 117, 323–335. 10.1016/j.cortex.2019.05.009
*Quam, C., and Swingley, D. (2012). Development in children's interpretation of pitch cues to emotions. Child Dev. 83, 236–250. 10.1111/j.1467-8624.2011.01700.x
**Rajhans, P., Jessen, S., Missana, M., and Grossmann, T. (2016). Putting the face in context: body expressions impact facial emotion processing in human infants. Dev. Cogn. Neurosci. 19, 115–121. 10.1016/j.dcn.2016.01.004
*Rajhans, P., Missana, M., Krol, K. M., and Grossmann, T. (2015). The association of temperament and maternal empathy with individual differences in infants' neural responses to emotional body expressions. Dev. Psychopathol. 27, 1205–1216. 10.1017/S0954579415000772
Robinson, D. T., Clay-Warner, J., Moore, C. D., Everett, T., Watts, A., Tucker, T. N., et al. (2012). Toward an unobtrusive measure of emotion during interaction: thermal imaging techniques. Biosociol. Neurosociol. 12, 225–266. 10.1108/S0882-6145(2012)0000029011
Rosen, J. B., Asok, A., and Chakraborty, T. (2015). The smell of fear: innate threat of 2,5-dihydro-2,4,5-trimethylthiazoline, a single molecule component of a predator odor. Front. Neurosci. 9, 292. 10.3389/fnins.2015.00292
Ross, P., Atkins, B., Allison, L., Simpson, H., Duffell, C., et al. (2021). Children cannot ignore what they hear: incongruent emotional information leads to an auditory dominance in children. J. Exp. Child Psychol. 204, 105068. 10.1016/j.jecp.2020.105068
**Ross, P., and Atkinson, A. P. (2020). Expanding simulation models of emotional understanding: the case for different modalities, body-state simulation prominence, and developmental trajectories. Front. Psychol. 11, 309. 10.3389/fpsyg.2020.00309
Ross, P., de Gelder, B., Crabbe, F., and Grosbras, M. H. (2019). Emotion modulation of the body-selective areas in the developing brain. Dev. Cognit. Neurosci. 38, 100660. 10.1016/j.dcn.2019.100660
Ross, P., and Flack, T. R. (2020). Removing hand information specifically impairs emotion recognition for fearful and angry body stimuli. Perception 49, 98–112. 10.1177/0301006619893229
Salazar-López, E., Domínguez, E., Ramos, V. J., De la Fuente, J., Meins, A., Iborra, O., et al. (2015). The mental and subjective skin: emotion, empathy, feelings and thermography. Consc. Cognit. 34, 149–162. 10.1016/j.concog.2015.04.003
*Sanders, G. (2006). The perception and decoding of expressive emotional information by hearing and hearing-impaired children. Early Child Dev. Care 21, 11–26. 10.1080/0300443850210102
Sauter, D. A., Eisner, F., Ekman, P., and Scott, S. K. (2010). Cross-cultural recognition of basic emotions through nonverbal emotional vocalizations. Proc. Natl. Acad. Sci. 107, 2408–2412. 10.1073/pnas.0908239106
Schwarzer, G., Carpenter, J. R., and Rücker, G. (2015). Meta-Analysis with R. London: Springer. 10.1007/978-3-319-21416-0
Simion, F., Regolin, L., and Bulf, H. (2008). A predisposition for biological motion in the newborn baby. Proc. Natl. Acad. Sci. 105, 809–813. 10.1073/pnas.0707021105
Smith, F. W., and Schyns, P. G. (2009). Smile through your fear and sadness: transmitting and identifying facial expression signals over a range of viewing distances. Psychol. Sci. 20, 1202–1208. 10.1111/j.1467-9280.2009.02427.x
Smith, L. B., Jayaraman, S., Clerkin, E., and Chen, Y. (2018). The developing infant creates a curriculum for statistical learning. Trends Cognit. Sci. 22, 325–336. 10.1016/j.tics.2018.02.004
Thoma, P., Bauser, D. S., and Suchan, B. (2013). BESST (Bochum Emotional Stimulus Set): a pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views. Psych. Res. 209, 98–109. 10.1016/j.psychres.2012.11.012
Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., et al. (2009). The NimStim set of facial expressions: judgments from untrained research participants. Psych. Res. 168, 242–249. 10.1016/j.psychres.2008.05.006
*Tsou, Y. T., Li, B., Kret, M. E., Frijns, J. H., and Rieffe, C. (2021). Hearing status affects children's emotion understanding in dynamic social situations: an eye-tracking study. Ear Hear. 42, 1024–1033. 10.1097/AUD.0000000000000994
**Tuminello, E. R., and Davidson, D. (2011). What the face and body reveal: in-group emotion effects and stereotyping of emotion in African American and European American children. J. Exp. Child Psychol. 110, 258–274. 10.1016/j.jecp.2011.02.016
**Vieillard, S., and Guidetti, M. (2009). Children's perception and understanding of (dis)similarities among dynamic bodily/facial expressions of happiness, pleasure, anger, and irritation. J. Exp. Child Psychol. 102, 78–95. 10.1016/j.jecp.2008.04.005
Volkova, E., De La Rosa, S., Bülthoff, H. H., and Mohler, B. (2014). The MPI emotional body expressions database for narrative scenarios. PLoS ONE 9, e113647. 10.1371/journal.pone.0113647
Walle, E. A., and Lopez, L. D. (2020). "Emotion recognition and understanding in infancy and early childhood," in Encyclopedia of Infant and Early Childhood Development, 2nd Edn, ed J. B. Benson (Amsterdam: Elsevier), 537–545. 10.1016/B978-0-12-809324-5.23567-0
Widen, S. C. (2013). Children's interpretation of facial expressions: the long path from valence-based to specific discrete categories. Emot. Rev. 5, 72–77. 10.1177/1754073912451492
**Witkower, Z., Tracy, J. L., Pun, A., and Baron, A. S. (2021). Can children recognize bodily expressions of emotion? J. Nonverb. Behav. 45, 505–518. 10.1007/s10919-021-00368-0
**Yang, Y., Hou, W., and Li, J. (2022). Validation of the bodily expressive action stimulus test among Chinese adults and children. Psych. J. 11, 392–400. 10.1002/pchj.542
Yeh, P. W., Geangu, E., and Reid, V. (2016). Coherent emotional perception from body expressions and the voice. Neuropsychologia 91, 99–108. 10.1016/j.neuropsychologia.2016.07.038
**Zieber, N., Kangas, A., Hock, A., and Bhatt, R. S. (2014a). Infants' perception of emotion from body movements. Child Dev. 85, 675–684. 10.1111/cdev.12134
**Zieber, N., Kangas, A., Hock, A., and Bhatt, R. S. (2014b). The development of intermodal emotion perception from bodies and voices. J. Exp. Child Psychol. 126, 68–79. 10.1016/j.jecp.2014.03.005
Summary
Keywords
emotion, body expression, development, discrimination, recognition, meta-analysis
Citation
Vuong QC and Geangu E (2023) The development of emotion processing of body expressions from infancy to early childhood: A meta-analysis. Front. Cognit. 2:1155031. doi: 10.3389/fcogn.2023.1155031
Received
31 January 2023
Accepted
20 March 2023
Published
11 April 2023
Volume
2 - 2023
Edited by
Alice Mado Proverbio, University of Milano-Bicocca, Italy
Reviewed by
Elisa Roberti, Neurological Institute Foundation Casimiro Mondino (IRCCS), Italy; Marta Calbi, University of Milan, Italy
Copyright
© 2023 Vuong and Geangu.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Quoc C. Vuong quoc.vuong@newcastle.ac.uk
This article was submitted to Perception, a section of the journal Frontiers in Cognition