<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Integr. Neurosci.</journal-id>
<journal-title>Frontiers in Integrative Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Integr. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5145</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnint.2012.00081</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Review Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>The social-sensory interface: category interactions in person perception</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Freeman</surname> <given-names>Jonathan B.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Johnson</surname> <given-names>Kerri L.</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Adams</surname> <given-names>Reginald B.</given-names> <suffix>Jr.</suffix></name>
<xref ref-type="aff" rid="aff4"><sup>4</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Ambady</surname> <given-names>Nalini</given-names></name>
<xref ref-type="aff" rid="aff5"><sup>5</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychological and Brain Sciences, Dartmouth College</institution> <country>Hanover, NH, USA</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Communication Studies, University of California, Los Angeles</institution> <country>Los Angeles, CA, USA</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Psychology, University of California, Los Angeles</institution> <country>Los Angeles, CA, USA</country></aff>
<aff id="aff4"><sup>4</sup><institution>Department of Psychology, The Pennsylvania State University</institution> <country>University Park, PA, USA</country></aff>
<aff id="aff5"><sup>5</sup><institution>Department of Psychology, Stanford University</institution> <country>Stanford, CA, USA</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Jacob Jolij, University of Groningen, Netherlands</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Antonio Pereira, Federal University of Rio Grande do Norte, Brazil; Lars A. Ross, Albert Einstein College of Medicine, USA</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Jonathan B. Freeman, Department of Psychological and Brain Sciences, Dartmouth College, 6207 Moore Hall, Hanover, NH 03755, USA. e-mail: <email>jon.freeman&#x00040;dartmouth.edu</email></p></fn>
</author-notes>
<pub-date pub-type="epreprint">
<day>18</day>
<month>06</month>
<year>2012</year>
</pub-date>
<pub-date pub-type="epub">
<day>17</day>
<month>10</month>
<year>2012</year>
</pub-date>
<pub-date pub-type="collection">
<year>2012</year>
</pub-date>
<volume>6</volume>
<elocation-id>81</elocation-id>
<history>
<date date-type="received">
<day>16</day>
<month>04</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>05</day>
<month>09</month>
<year>2012</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2012 Freeman, Johnson, Adams and Ambady.</copyright-statement>
<copyright-year>2012</copyright-year>
<license license-type="open-access" xlink:href="http://www.frontiersin.org/licenseagreement"><p>This is an open-access article distributed under the terms of the <uri xlink:href="http://creativecommons.org/licenses/by/3.0/">Creative Commons Attribution License</uri>, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.</p>
</license>
</permissions>
<abstract><p>Research is increasingly challenging the claim that distinct sources of social information&#x02014;such as sex, race, and emotion&#x02014;are processed in discrete fashion. Instead, functionally relevant interactions appear to occur among them. In the present article, we describe research examining how cues conveyed by the human face, voice, and body interact to form the unified representations that guide our perceptions of and responses to other people. We explain how these information sources are often thrown into interaction through bottom-up forces (e.g., phenotypic cues) as well as top-down forces (e.g., stereotypes and prior knowledge). Such interactions point to a person perception process that is driven by an intimate interface between bottom-up perceptual and top-down social processes. Incorporating data from neuroimaging, event-related potentials (ERP), computational modeling, computer mouse-tracking, and other behavioral measures, we discuss the structure of this interface, and we consider its implications and adaptive purposes. We argue that an increased understanding of person perception will likely require a synthesis of insights and techniques, from social psychology to the cognitive, neural, and vision sciences.</p></abstract>
<kwd-group>
<kwd>person perception</kwd>
<kwd>visual perception</kwd>
<kwd>face perception</kwd>
<kwd>social categorization</kwd>
</kwd-group>
<counts>
<fig-count count="3"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="129"/>
<page-count count="13"/>
<word-count count="11816"/>
</counts>
</article-meta>
</front>
<body>
<p>With only a fleeting glimpse, a constellation of near-instant judgments is often made about another person. Although frequently warned not to &#x0201C;judge a book by its cover,&#x0201D; our tendency to make meaning out of the sensory information conveyed by others is typically beyond our conscious control. From minimal cues afforded by the face, voice, and body, we unwittingly infer the intentions, thoughts, personalities, emotions, and category memberships (e.g., sex, race, and age) of those around us. While some of these judgments may be expectancy-driven and biased by our stereotypes (Brewer, <xref ref-type="bibr" rid="B24">1988</xref>; Fiske and Neuberg, <xref ref-type="bibr" rid="B37">1990</xref>; Macrae and Bodenhausen, <xref ref-type="bibr" rid="B81">2000</xref>), others may be surprisingly accurate and expose humans&#x00027; exquisite ability to perceive other people from only the briefest of observations (Ambady et al., <xref ref-type="bibr" rid="B11">2000</xref>; Willis and Todorov, <xref ref-type="bibr" rid="B122">2006</xref>).</p>
<p>This astounding ability to perceive other people, however, is plagued by a basic contradiction. As readily and rapidly as we may dispense our judgments of others, each judgment requires an astonishing complexity of mental processing. Despite their complexity, however, these judgments occur with remarkable ease. From a single face, for example, any number of perceptions (e.g., sex and emotion) are immediately available, but each requires the integration of an enormous amount of information. Unlike objects, other people are highly complex stimuli, embedded in a rich set of contexts and grounded in multiple sensory modalities. All the features and configural properties of a person&#x00027;s face must be bound together, along with that person&#x00027;s hair and array of bodily cues. Auditory cues of a person&#x00027;s voice are available as well, and these must be bound together with the person&#x00027;s visual cues to form a coherent social percept. Such a complexity of bottom-up sensory information is matched, however, by a similar complexity in top-down information sources that are uniquely present in person perception. For example, people bring a great deal of prior knowledge, stereotypic expectations, and affective and motivational states to the process of perceiving others. The influences of these top-down factors may often seep down into the perceptual process itself. How does such a vast array of information&#x02014;both bottom-up and top-down&#x02014;rapidly conspire to drive perception in the very short time it takes to arrive at an instant judgment of another person?</p>
<p>In this article, we first provide background on person perception from the perspective of social psychology, followed by background from the perspective of the cognitive, vision, and neural sciences. We then describe how these literatures have traditionally converged on the argument for non-interactive processing of different category dimensions. We then discuss more recent evidence for category interactions, either through top-down (e.g., driven by stereotypic expectations) or bottom-up (e.g., driven by perceptual cues) mechanisms. Finally, we explain how a recent framework and model of person perception can capture such effects and potentially yield a clearer picture of person perception. We then finish with some concluding remarks.</p>
<sec>
<title>Social psychology on person perception</title>
<p>If person perception is characterized, on the one hand, by being highly complex, and on the other by being highly efficient, social psychological research has historically placed a great deal of focus on the latter. Seminal work in social psychology by Allport (<xref ref-type="bibr" rid="B9">1954</xref>), Sherif (<xref ref-type="bibr" rid="B104">1967</xref>), and Tajfel (<xref ref-type="bibr" rid="B115">1969</xref>), for example, argued that individuals perceive others via spontaneous, perhaps inevitable, category-based impressions that are highly efficient and designed to economize on mental resources. Since then, a vast array of studies has demonstrated that such category-based impressions bring about a host of cognitive, affective, and behavioral outcomes. Mere activation of a social category, it has been shown, readily changes how individuals think about others, feel about them, and behave toward them, often in ways that may operate non-consciously (e.g., Brewer, <xref ref-type="bibr" rid="B24">1988</xref>; Devine, <xref ref-type="bibr" rid="B32">1989</xref>; Fiske and Neuberg, <xref ref-type="bibr" rid="B37">1990</xref>; Gilbert and Hixon, <xref ref-type="bibr" rid="B49">1991</xref>; Bargh, <xref ref-type="bibr" rid="B16">1994</xref>, <xref ref-type="bibr" rid="B17">1999</xref>; Fazio et al., <xref ref-type="bibr" rid="B36">1995</xref>; Dovidio et al., <xref ref-type="bibr" rid="B34">1997</xref>; Sinclair and Kunda, <xref ref-type="bibr" rid="B105">1999</xref>). A strong emphasis in social psychology, therefore, has been to document the downstream implications of person categorization and its myriad outcomes for social interaction.</p>
<p>With a focus on subsequent interpersonal phenomena, a clear research strategy emerged in the literature. Often, a single category of interest would be isolated (e.g., race) and its influences on subsequent behavior measured, while all other categories were controlled (e.g., sex, age, and emotion). This afforded tremendous insights into the downstream dynamics of single categorizations, but it lacked breadth in understanding the complexity of real-world categorization. In reality, social targets may be categorized along any number of possible dimensions. Of the many potential categories, then, which get the privilege of perceivers&#x00027; processing, and which are thrown aside? A prevalent answer to this question has been that one category (e.g., race) comes to dominate perception, while all others (e.g., sex and age) are inhibited from entering working memory (Macrae et al., <xref ref-type="bibr" rid="B82">1995</xref>; Sinclair and Kunda, <xref ref-type="bibr" rid="B105">1999</xref>). Presumably, this category selection process makes the perceiver&#x00027;s job easier (e.g., Bodenhausen and Macrae, <xref ref-type="bibr" rid="B21">1998</xref>), in keeping with the longstanding notion that social categorization is for maximizing cognitive efficiency (Allport, <xref ref-type="bibr" rid="B9">1954</xref>). Although this perspective has been valuable, its upshot has been a tendency to view each category membership as isolated and independent (with one dominating and all others cast aside), and to neglect targets&#x00027; multiple simultaneous memberships and how they may, in some cases, interact.</p>
<p>Such multiple social categorization is one of the most fascinating and distinctive aspects of person perception. For instance, whereas the perception of an object generally affords only one focal type of construal (e.g., &#x0201C;that&#x00027;s a table&#x0201D;), multiple construals are simultaneously available when perceiving other people and each is highly relevant. A single face stimulus, for example, permits a rich array of judgments, including basic categories (e.g., sex, race, age, and emotion), perceptually ambiguous categories (e.g., sexual orientation), personality traits (e.g., warmth and competence), and intentions (e.g., deception), among many others. Although the results of studies examining personality judgments have long implied that they may occur in parallel (e.g., Ambady et al., <xref ref-type="bibr" rid="B11">2000</xref>; Todorov and Uleman, <xref ref-type="bibr" rid="B117">2003</xref>; Willis and Todorov, <xref ref-type="bibr" rid="B122">2006</xref>), the underlying basis of the parallelism and the mutual influences each judgment may exert on one another have rarely been investigated. With respect to basic social categories, parallel memberships have been examined in the context of high-level impressions, social reasoning, and memory (e.g., Stangor et al., <xref ref-type="bibr" rid="B112">1992</xref>; Kunda and Thagard, <xref ref-type="bibr" rid="B72">1996</xref>; Vescio et al., <xref ref-type="bibr" rid="B121">2004</xref>; Smith, <xref ref-type="bibr" rid="B107">2006</xref>; Crisp and Hewstone, <xref ref-type="bibr" rid="B31">2007</xref>), but their simultaneous interplay has scarcely been considered in lower-level sensory-based perceptions. Thus, although construing others is uniquely characterized by an enormous number of simultaneously available perceptions, the social literature has tended to overlook their compound nature and how they might interact.</p>
</sec>
<sec>
<title>Cognitive, vision, and neural sciences on person perception</title>
<p>In the cognitive face-processing literature, non-interactive processing has even been argued formally. Such work has been largely guided by the cognitive architecture laid out in the influential Bruce and Young (<xref ref-type="bibr" rid="B27">1986</xref>) model of face perception, which proposed a dual processing route. Initially, a structural encoding mechanism constructs a representation of a face&#x00027;s features and configuration. Processing results from structural encoding are then sent down two non-interactive, functionally independent routes. One route works on processing a target&#x00027;s static cues, such as identity, while a separate route works on processing more complex and dynamic cues, including emotion expressions, speech, and other &#x0201C;visually derived semantic information,&#x0201D; such as social categories. Haxby and colleagues (<xref ref-type="bibr" rid="B57">2000</xref>) extended the Bruce and Young model to the neural level. They proposed that the early perception of facial features is first mediated by the inferior occipital gyrus (IOG), analogous to Bruce and Young&#x00027;s structural encoding mechanism. The labor is then divided between the lateral fusiform gyrus, which processes static cues such as identity, and the superior temporal sulcus (STS), which processes dynamic cues such as emotion expressions (Haxby et al., <xref ref-type="bibr" rid="B57">2000</xref>). The model was supported by a number of fMRI studies demonstrating that fusiform regions tend to be more sensitive to identity, whereas the STS tends to be more sensitive to emotion (LaBar et al., <xref ref-type="bibr" rid="B75">2003</xref>; Winston et al., <xref ref-type="bibr" rid="B123">2004</xref>).
Additional evidence came from lesion studies, which showed that distinct lesions correspond with selective impairments in processing identity versus emotion expressions (Tranel and Damasio, <xref ref-type="bibr" rid="B118">1988</xref>; Young et al., <xref ref-type="bibr" rid="B124">1993</xref>). A popular view, therefore, has been that the processing of multiple dimensions, such as identity and emotion, runs independently and in parallel. As such, although multiple dimensions may be processed simultaneously, their processing is not generally thought to cross paths.</p>
<p>In contrast, a growing body of research emerging from the vision sciences has found a great deal of evidence for interdependence in processing various facial dimensions. Using selective attention paradigms such as the Garner interference paradigm (Garner, <xref ref-type="bibr" rid="B47">1976</xref>), a number of studies have tested perceivers&#x00027; ability to selectively attend to one dimension (e.g., facial identity) while ignoring task-irrelevant dimensions (e.g., facial emotion). Over the years, researchers have reported interference effects for many facial dimensions, including sex and emotion (Atkinson et al., <xref ref-type="bibr" rid="B14">2005</xref>), sex and age (Quinn and Macrae, <xref ref-type="bibr" rid="B96">2005</xref>), identity and sex (Ganel and Goshen-Gottstein, <xref ref-type="bibr" rid="B46">2004</xref>), identity and emotion (Schweinberger et al., <xref ref-type="bibr" rid="B103">1999</xref>), eye gaze and emotion (Graham and LaBar, <xref ref-type="bibr" rid="B50">2007</xref>), and sex and race (Johnson et al., <xref ref-type="bibr" rid="B67">in press</xref>). Such findings suggest interdependence among various facial dimensions, casting doubt on the traditional view that the processing of various facial dimensions is strictly separated. Calder and Young (<xref ref-type="bibr" rid="B28">2005</xref>) proposed a principal component analysis (PCA) framework for face perception, accounting for such interference effects by way of a single multidimensional face-coding system. According to their framework, neural dissociations typically taken as evidence for distinct processing pathways (e.g., identity processed via fusiform regions and emotion processed via the STS) reflect statistical regularities inherent in the visual input itself, rather than separate neural structures dedicated for particular facial dimensions. 
Such work presents serious challenges for the traditional view that the processing of one facial dimension is insulated from the processing of all other dimensions.</p>
<p>Additional evidence for inherent inseparability between multiple facial dimensions comes from neuronal recordings in non-human primates. In monkey temporal cortex, for example, there are groups of neurons that are sensitive to the conjunction of both identity and emotion, as well as the conjunction of eye gaze and emotion (Hasselmo et al., <xref ref-type="bibr" rid="B56">1989</xref>; Perrett et al., <xref ref-type="bibr" rid="B93">1992</xref>). There also appears to be a temporal evolution in how these neurons represent aspects of facial information. In one study, the transient response of face-sensitive temporal cortex neurons was found to initially reflect a rough, global discrimination of a visual stimulus as a face (rather than some other shape). Subsequent firing of this same neuronal population, however, appeared to sharpen over time by coming to represent finer facial information, such as emotion expression and, slightly later, facial identity (Sugase et al., <xref ref-type="bibr" rid="B114">1999</xref>). In humans, studies recording event-related potentials (ERP) also suggest a dynamic evolution of face representation, from more global (structural) encoding of the face to finer-grained information, such as sex category (Freeman et al., <xref ref-type="bibr" rid="B42">2010</xref>). Such findings suggest that there are overlapping neuronal populations involved in encoding multiple aspects of facial information. Taken together with the interference effects above, it appears that the processing of a single facial dimension may, at least in some cases, be neurally coextensive with the processing of other dimensions, and may readily interact and influence those other dimensions.</p>
</sec>
<sec>
<title>Combinatorial person perception</title>
<p>Evidence that the perceptual processing of a social target&#x00027;s various identities may be coextensive during perception implies that those identities may be thrown into interaction. Thus, the dynamics of social perceptions raise the intriguing possibility that the perception of multiple social categories and transient states are not only coactive during perception, but that they also are mutually dependent upon one another. As such, social perceptions are combinatorial. The perception of one social category may systematically facilitate or inhibit the perception of another social category. Such impacts appear to occur via two distinct routes&#x02014;one through the top-down influence of factors that originate in the perceiver (e.g., existing knowledge structures and motivations) and one through the bottom-up influence of factors that originate in the target of perception (e.g., overlapping visual cues). Next we review evidence supporting these two routes by which complexities in both the perceiver and the percept are likely to impact perceptions and their efficiency. Then, we review work that examines the underlying cognitive and neural processing through which these two forms of influence dynamically collaborate to yield a coherent and adaptively sensitive social percept.</p>
<sec>
<title>Top-down perceiver impacts</title>
<p>Some factors that impinge on the combinatorial nature of social perception originate in the perceiver. Although perception in general was long presumed to be impenetrable to and isolated from higher-order cognitive processes, recent evidence suggests otherwise. Instead, low-level sensory processes may be modulated by social cognitive factors (e.g., Bar, <xref ref-type="bibr" rid="B15">2004</xref>; Amodio and Frith, <xref ref-type="bibr" rid="B13">2006</xref>; Kveraga et al., <xref ref-type="bibr" rid="B74">2007</xref>), and this is apparent at the behavioral and neural levels. For instance, interconnectivity has been identified between the amygdala and the STS, tethering brain regions responsible for processing emotion content and for the visual analysis of human actions, respectively (Amaral et al., <xref ref-type="bibr" rid="B10">2003</xref>). In terms of functionally adaptive face processing, the amygdala, orbitofrontal cortex (OFC), and STS form a three-node pathway that has been referred to as the &#x0201C;social brain&#x0201D; (Brothers, <xref ref-type="bibr" rid="B25">1990</xref>), which is important for processing social and emotional meaning from the face. Pathways from the STS to the amygdala support adaptive behavioral responses to biological movement, including facial expression and looking behavior (Aggleton et al., <xref ref-type="bibr" rid="B8">1980</xref>; Brothers and Ring, <xref ref-type="bibr" rid="B26">1993</xref>), and pathways to the OFC support adaptive behavioral responding, conceptual knowledge retrieval, and decision making during the processing of emotion information (Bechara et al., <xref ref-type="bibr" rid="B18">2000</xref>).
Beyond these connections, the amygdala is also densely interconnected with regions involved in affective, cognitive, perceptual, and behavioral responses to faces where exteroceptive and interoceptive information can be integrated, and it is known to exert top-down modulation on extrastriate responses (Adolphs, <xref ref-type="bibr" rid="B7">2003</xref>). Specifically, the amygdala is known to be reciprocally connected to regions involved in face perception, such as the IOG, which is involved in low-level structural encoding of faces (Haxby et al., <xref ref-type="bibr" rid="B57">2000</xref>), and the fusiform gyrus, which is thought to be specialized for identity processing (Kanwisher, <xref ref-type="bibr" rid="B69">2000</xref>; Kanwisher and Yovel, <xref ref-type="bibr" rid="B70">2006</xref>). As such, it appears to act as an integrative center for the processing and relaying of socially relevant facial information.</p>
<p>The bidirectional and dynamic nature of the neural processing subserving social perception opens up the opportunity for social perceptions to be modulated by factors that are inherent to the perceiver, including existing knowledge structures (i.e., stereotypes) and current motivation states. Indeed, mounting evidence demonstrates that such factors impact social perception systematically, leading to functional biases or attunements in perceptions of the world and the people within it.</p>
<p>A perceiver&#x00027;s knowledge structures may impact perception through expectations. The social categories to which people belong each activate a network of knowledge structures that are associated with the particular category (Devine, <xref ref-type="bibr" rid="B32">1989</xref>; Bargh, <xref ref-type="bibr" rid="B17">1999</xref>). For instance, perceiving the category male is likely to elicit stereotypes of assertiveness and strength (Hess et al., <xref ref-type="bibr" rid="B62">2010</xref>); likewise, perceiving an emotion such as anger may facilitate activation of the sex category male (Hess et al., <xref ref-type="bibr" rid="B58">2009</xref>). Once these knowledge structures are activated, they are thought to have a pronounced impact on basic perceptual processes (Freeman and Ambady, <xref ref-type="bibr" rid="B40">2011a</xref>).</p>
<p>Indeed, recent evidence suggests that stereotyped expectations that are elicited from cues to a social category can bias low-level aspects of perception. Race-cuing features, for instance, alter judgments of a target&#x00027;s skin tone (Levin and Banaji, <xref ref-type="bibr" rid="B77">2006</xref>). When facial cues implied a Black identity, participants were prone to overestimate the pigmentation of a target&#x00027;s face; when facial cues implied a White identity, in contrast, participants underestimated the pigmentation of a target&#x00027;s face. Thus, social category knowledge structures biased the perceived luminance properties of a face. In other research, race-cuing hairstyles led perceivers to disambiguate the race of an otherwise race-ambiguous face in a category-consistent manner (MacLin and Malpass, <xref ref-type="bibr" rid="B79">2001</xref>, <xref ref-type="bibr" rid="B80">2003</xref>). Not only did these race categories influence memory for the faces, but, as was found in the study above, race-cuing hairstyles also influenced low-level aspects of perception. Black faces were perceived to have a darker skin tone, wider faces and mouths, and less protruding eyes, relative to Hispanic faces.</p>
<p>Stereotyped expectations elicited by cues to one social category can similarly bias the perception of another. For instance, Johnson et al. (<xref ref-type="bibr" rid="B67">in press</xref>) demonstrated that sex categorizations and their efficiency were influenced by race-category membership. Male categorizations were more efficient for Black faces, but less efficient for Asian faces; female categorizations, in contrast, were more efficient for Asian faces, but less efficient for Black faces. These results were obtained, in part, because the stereotypes associated with race and sex are substantively overlapping. For example, both Black individuals and men are stereotyped as aggressive; both Asian individuals and women are stereotyped as docile. In another series of studies, contextual cues surrounding a face were found to alter race perception via stereotypes. If a face was surrounded by business attire, it was more likely to be perceived as White (as businesspeople and White people are both stereotyped as high status); when surrounded by janitor attire, the face was more likely to be perceived as Black (as janitors and Black people are both stereotyped as low status). These effects of stereotypes became more pronounced as the face&#x00027;s race increased in ambiguity. Further, even when a participant&#x00027;s ultimate response was not biased by the context and by stereotypes, their hand movement en route to the response nevertheless often swerved toward the opposite response, which was stereotypically associated with the attire (Freeman et al., <xref ref-type="bibr" rid="B44">2011</xref>). Thus, even in cases where stereotypes do not exert an influence on a perceptual outcome, they may still substantially alter the perceptual process.</p>
<p>Additionally, a perceiver&#x00027;s motivation state may alter perceptual processing. Visual cues to identity are potent sources of information that, under many circumstances, compel surprisingly accurate social judgments (Ambady and Rosenthal, <xref ref-type="bibr" rid="B12">1992</xref>). At times, however, observers&#x00027; judgments are prone to functional perceptual biases (Haselton and Nettle, <xref ref-type="bibr" rid="B55">2006</xref>). From this perspective, perceptual judgments are always rendered with some degree of uncertainty, and the relative costs associated with various errors are likely to be asymmetric. Motivational factors, therefore, will tend to bias perceptions of the physical world in a manner that minimizes potential costs to the perceiver. Race-category labels that are paired with otherwise race-ambiguous faces change how a face is processed (Corneille et al., <xref ref-type="bibr" rid="B30">2004</xref>; Michel et al., <xref ref-type="bibr" rid="B87">2007</xref>) and determine whether a face will be remembered (see also Pauker and Ambady, <xref ref-type="bibr" rid="B90">2009</xref>; Pauker et al., <xref ref-type="bibr" rid="B91">2009</xref>), in ways that appear to be, at least in part, motivationally driven (see also Sacco and Hugenberg, <xref ref-type="bibr" rid="B99">2009</xref>). Perceivers are also likely to categorize targets as Black&#x02014;a social category that is stereotyped to be dangerous&#x02014;when personal safety is a concern (Miller et al., <xref ref-type="bibr" rid="B88">2010</xref>), and perceivers who are high in racial prejudice are more likely to categorize race-ambiguous faces as Black (Hugenberg and Bodenhausen, <xref ref-type="bibr" rid="B64">2004</xref>). A motivation to identify coalitional alliances has also been identified as a functional underpinning for race categorization (Kurzban et al., <xref ref-type="bibr" rid="B73">2001</xref>).</p>
<p>Sex categorizations show a similar pattern of functionally biased perceptions. Because men overall tend to be physically larger and stronger than women, they pose a greater potential threat to perceivers. Under conditions of uncertainty, therefore, a functional bias is likely to favor a male percept. In fact, a male categorization has long been argued to comprise the &#x0201C;default&#x0201D; social category judgment (Zarate and Smith, <xref ref-type="bibr" rid="B125">1990</xref>; Stroessner, <xref ref-type="bibr" rid="B113">1996</xref>). Under conditions that may signal potential threat, this tendency appears to be exacerbated. When categorizing the sex of bodies, for example, perceivers show a pronounced male categorization bias for every body shape that is not, in reality, exclusive to women (Johnson et al., <xref ref-type="bibr" rid="B67">in press</xref>), and this tendency is most pronounced when perceivers are in a fearful state. Similarly, point-light defined arm motions that depict a person throwing an object are overwhelmingly categorized as male when the person conveys a threatening emotion state (i.e., anger), relative to any other emotion state (Johnson et al., <xref ref-type="bibr" rid="B68">2011</xref>). Moreover, the findings from Johnson et al. (<xref ref-type="bibr" rid="B67">in press</xref>) are consistent with the notion that Black targets&#x02014;who are stereotyped as dangerous&#x02014;are likely to more readily compel male categorizations.</p>
<p>Although perceivers are generally adept in achieving accurate social perception, accuracy goals may sometimes be overshadowed by other motivational concerns (e.g., situational desires or physical safety concerns). In such circumstances, current motivations may outweigh accuracy objectives, leading social perceptions to be functionally biased in a directional fashion. In sum, both perceptual attunements and functional biases may emerge from the top-down modulation of social perception, either through motivation, existing knowledge structures, or both.<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref></p>
</sec>
<sec>
<title>Bottom-up target impacts</title>
<p>Other factors that impinge on the combinatorial nature of social perception originate in the target of perception. This is because some perceptual attunements and biases are driven by the incoming sensory information itself. As such, cues to social identities may be confounded at the level of the stimulus. Such effects are now well documented for important intersections of social categories including sex and emotion, sex and race, and race and emotion. Importantly, because these categories share cues, their perception becomes inextricably tethered, in turn producing attunements and biases that are moderated by the unique combination of cues and categories.</p>
<p>One particularly intriguing juxtaposition of these dual routes of influence is in the perception of sex and emotion categories. This particular effect has received some attention over the years, initially with respect to shared stereotypes between emotions and sex categories. For instance, for many years, researchers found that facial expressions of emotion were perceived to vary between men and women (Grossman and Wood, <xref ref-type="bibr" rid="B52">1993</xref>; Plant et al., <xref ref-type="bibr" rid="B94">2000</xref>). Ambiguities in emotion expression tended to be resolved in a manner that was consistent with gender stereotypes (Hess et al., <xref ref-type="bibr" rid="B61">2000</xref>), and many interpreted such findings as evidence for a top-down modulation of emotion perception in a manner described above. Thus, the prevailing belief was that common associations between sex and emotion categories lead to biases in perceptual judgments.</p>
<p>More recent research clarified that such results may also emerge via an alternate route. One argument put forth by Marsh et al. (<xref ref-type="bibr" rid="B84">2005</xref>) proposed that some facial expressions in humans, in this case anger and fear, evolved to mimic more stable appearance cues related to facial maturity. Likewise, gender appearance is associated with facial features that perceptually overlap with facial maturity (Zebrowitz, <xref ref-type="bibr" rid="B126">1997</xref>). Like facial maturity and masculinized facial features, anger is characterized by a low, bulging brow and small eyes. Conversely, like babyfacedness and feminized features, fear is distinguished by a raised and arched brow and widened eyes. Perhaps not too surprisingly, then, several studies have hinted that emotional expressions and gender cues are confounded (Hess et al., <xref ref-type="bibr" rid="B59">2004</xref>). One study examined the confound between gender and the emotional expressions of anger and happiness (Becker et al., <xref ref-type="bibr" rid="B19">2007</xref>). In a more recent study (Hess et al., <xref ref-type="bibr" rid="B58">2009</xref>), both happy and fearful expressions were found to bias perception of otherwise androgynous faces toward female categorization, whereas anger expressions biased perception toward male categorization.</p>
<p>Such physical resemblance has been revealed in an even more compelling manner through computer-based models that are trained with facial metric data to detect appearance-based and expression cues in faces (e.g., Said et al., <xref ref-type="bibr" rid="B100">2009</xref>; Zebrowitz et al., <xref ref-type="bibr" rid="B128">2010</xref>). Critically, such studies avoid confounds with socially learned stereotypes. In one study, Zebrowitz et al. (<xref ref-type="bibr" rid="B127">2007</xref>) trained a connectionist model to detect babyfacedness versus maturity in the face, and then applied this model to detecting such cues in surprise, anger, happy, and neutral expressions. They found that the model detected babyfacedness in surprise expressions and maturity in anger expressions due to similarities in the height of the brow. Additionally, the authors found that objective babyfacedness (as determined by the connectionist model) mediated human judges&#x00027; impressions of surprise and anger in those faces. In this way, they were able to provide direct evidence for babyfacedness overgeneralization effects on a wide array of perceived personality traits.</p>
<p>Overlapping perceptual cues affect a number of other category dimensions as well. Some sex and race categories, for example, appear to share overlapping features. In one study using a statistical face model (derived from laser scans of many faces), cues associated with the Black category and cues associated with the male category were found to share a degree of overlap. This, in turn, facilitated the sex categorization of Black men relative to White or Asian men (Johnson et al., <xref ref-type="bibr" rid="B67">in press</xref>). A similar overlap exists between eye gaze and emotional expressions. Gaze has the interesting property of offering functional information to a perceiver that, when paired with certain expressions, can lead to interesting interactive effects. According to the shared signal hypothesis (Adams et al., <xref ref-type="bibr" rid="B4">2003</xref>, Adams and Kleck, <xref ref-type="bibr" rid="B5">2005</xref>), threat-relevant cues that share a congruent underlying signal value should facilitate the processing efficiency of an emotion. Because direct and averted eye gaze convey a heightened probability that a target will approach or avoid the perceiver, respectively (see Adams and Nelson, <xref ref-type="bibr" rid="B6">2012</xref>, for review), and anger and fear are associated with approach and avoidance intentions, respectively (see Harmon-Jones, <xref ref-type="bibr" rid="B54">2003</xref>, for review), this hypothesis suggests that processing should be facilitated when emotion and eye gaze are combined in a congruent manner (i.e., direct-gaze anger, both signaling approach, and averted-gaze fear, both signaling avoidance) relative to an incongruent manner (i.e., direct-gaze fear and averted-gaze anger).</p>
<p>In support of both functional affordances described above, using speeded reaction time tasks and self-reported perception of emotional intensity, Adams et al. (<xref ref-type="bibr" rid="B4">2003</xref>) and Adams and Kleck (<xref ref-type="bibr" rid="B5">2005</xref>) found that direct gaze facilitated the processing efficiency and accuracy, and increased the perceived intensity, of facially communicated approach-oriented emotions (e.g., anger and joy), whereas averted gaze facilitated the processing efficiency, accuracy, and perceived intensity of facially communicated avoidance-oriented emotions (e.g., fear and sadness). Similar effects were replicated by Sander et al. (<xref ref-type="bibr" rid="B101">2007</xref>) using dynamic threat displays, and by Hess et al. (<xref ref-type="bibr" rid="B60">2007</xref>), who found that direct relative to averted anger expressions and averted relative to direct fear expressions elicited more negative responsivity in observers. The converse effect holds as well; facial emotion influences how eye gaze is perceived. Direct eye gaze is recognized faster when paired with angry faces, and averted eye gaze is recognized faster when paired with fearful faces (Adams and Franklin, <xref ref-type="bibr" rid="B2">2009</xref>). In addition, perceivers tend to judge eye gaze more often as looking at them when it is presented on happy and angry faces than on neutral or fearful faces (Lobmaier et al., <xref ref-type="bibr" rid="B78">2008</xref>; Slepian et al., <xref ref-type="bibr" rid="B106">2011</xref>; see also, Martin and Rovira, <xref ref-type="bibr" rid="B85">1982</xref>). 
Further, Mathews et al. (<xref ref-type="bibr" rid="B85a">2003</xref>) found a faster cueing effect (whereby the attention of an observer is automatically shifted in the direction that a target face is looking) for fear faces than for neutral faces among observers high, but not low, in anxiety, arguably because anxiety increases the observer&#x00027;s attunement to the threat afforded by an expressive display. When eye gaze was shifted dynamically <italic>after</italic> emotion was presented, however, fearful faces were found to induce higher levels of cueing compared to other emotions for all participants, regardless of anxiety level (Tipples, <xref ref-type="bibr" rid="B116">2006</xref>; Putman et al., <xref ref-type="bibr" rid="B95">2007</xref>). More recently, Fox et al. (<xref ref-type="bibr" rid="B39">2007</xref>) found that fear expressions coupled with averted gaze yielded greater reflexive orienting than did neutral or anger expressions, whereas anger expressions coupled with direct gaze yielded greater attention capture than did neutral or fear expressions. These effects were also moderated by trait anxiety.</p>
<p>On the neural level, gaze has been found to influence amygdala responses to threatening emotion expressions. In an initial study, Adams et al. (<xref ref-type="bibr" rid="B4a">2003</xref>) found greater amygdala responses to threat-related ambiguity (i.e., to averted-gaze fear and direct-gaze anger). This study, however, was based on relatively sustained presentations of threat stimuli (2000 ms), whereas some more recent studies employing more rapid presentations have found greater amygdala responses to congruent threat&#x02013;gaze pairs (direct anger and averted fear) (Sato et al., <xref ref-type="bibr" rid="B102">2004</xref>; Hadjikhani et al., <xref ref-type="bibr" rid="B53">2008</xref>). Although these latter findings corroborate Adams et al.&#x00027;s early behavioral findings for gaze&#x02013;emotion interactivity, they opened up new questions regarding the role of the amygdala in processing these compound threat cues. In subsequent work, Adams et al. (<xref ref-type="bibr" rid="B3">2012</xref>) found direct evidence supporting the conclusion that early, reflexive responses to threat&#x02013;gaze pairs are more attuned to congruent pairings, whereas later, reflective responses are more attuned to threat-related ambiguity. These differential responses support both an early process that detects threat and sets in motion adaptive responding, and a slightly slower process that is geared toward confirming and perpetuating a survival response, or disconfirming and inhibiting an inappropriate response. It is in this interplay of reflexive and reflective processes that threat perception can benefit from different attunements to a threatening stimulus, with different but complementary processing demands, to achieve the most timely and adaptive response to other people. In short, many characteristics may interact in person perception because they directly overlap, often in functionally adaptive ways.</p>
</sec>
</sec>
<sec>
<title>The social-sensory interface</title>
<p>The two routes by which social perceptions may be attuned or biased are now well documented, and such research provides an important foundation for understanding the basic mechanisms of social perception. More interesting, to our minds, is how these dual routes work in concert to enable judgments of people who vary along multiple dimensions and across multiple sensory modalities.</p>
<p>Recently, Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>) proposed a dynamic interactive framework to account for findings such as those reviewed above, and to map out how multiple category dimensions are perceived&#x02014;and in many cases may interact&#x02014;in a neurally plausible person perception system. In this system, multiple category dimensions (e.g., sex, race, and emotion) dynamically accumulate evidence in parallel, sometimes in conflicting ways. Importantly, as we will describe shortly, while the system is attempting to stabilize onto particular perceptions over time, it will often throw different category dimensions into interaction with one another. This may occur through either bottom-up or top-down forces, mapping onto the two routes described above. Before describing why and how these interactions would occur, we first outline the structure and function of the system.</p>
<p>Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>) captured their theoretical system with a computational neural network model. The perceptual process that emerges in this system is a highly integrative one. It incorporates whatever bottom-up evidence is available (from others&#x00027; facial, vocal, or bodily cues), while also taking into account any relevant top-down sources that could be brought to bear on perception. Thus, the system arrives at stable person construals not only by integrating bottom-up facial, vocal, and bodily cues, but also by coordinating with and being constrained by higher-order social cognition (e.g., prior knowledge, stereotypes, motivation, and prejudice). As such, this system permits social top-down factors to fluidly interact with bottom-up sensory information to shape how we see and hear other people. Accordingly, our basic construals of others are always compromises between the sensory information &#x0201C;actually&#x0201D; there and the variety of baggage we bring to the perceptual process. Although it was long assumed that perception is solely bottom-up and insulated from any top-down influence of higher-order processes (e.g., Marr, <xref ref-type="bibr" rid="B83">1982</xref>; Fodor, <xref ref-type="bibr" rid="B38">1983</xref>), it has become clear that perception arises instead from both bottom-up and top-down influences (e.g., Engel et al., <xref ref-type="bibr" rid="B35">2001</xref>; Gilbert and Sigman, <xref ref-type="bibr" rid="B48">2007</xref>). Thus, we should expect top-down factors to be able to flexibly weigh in on the basic perceptual processing of other people.</p>
<p>In this framework, person perception is treated as an ongoing, dynamic process where bottom-up cues and top-down factors interact over time to stabilize onto particular perceptions (e.g., male or female; Black, White, or Asian). This is because person perception, as implemented in a human brain, would involve continuous changes in a pattern of neuronal activity (Usher and McClelland, <xref ref-type="bibr" rid="B120">2001</xref>; Smith and Ratcliff, <xref ref-type="bibr" rid="B108">2004</xref>; Spivey and Dale, <xref ref-type="bibr" rid="B111">2006</xref>). Consider, for example, the perception of another&#x00027;s face. Early in processing, representations of the face would tend to be partially consistent with multiple categories (e.g., both male and female) because the initial rough &#x0201C;gist&#x0201D; of the face partially supports both categories. As more information accumulates, the pattern of neuronal activity would gradually sharpen into an increasingly confident representation (e.g., male), while other competing, partially-active representations (e.g., female) would be pushed out (Usher and McClelland, <xref ref-type="bibr" rid="B120">2001</xref>; Spivey and Dale, <xref ref-type="bibr" rid="B111">2006</xref>; Freeman et al., <xref ref-type="bibr" rid="B43">2008</xref>). During the hundreds of milliseconds it takes for the neuronal activity to achieve a stable pattern (&#x0007E;100% male or &#x0007E;100% female), both bottom-up processing of the face as well as top-down factors (e.g., stereotypes) could gradually exert their influences, jointly determining the pattern to which the system gravitates (Grossberg, <xref ref-type="bibr" rid="B51">1980</xref>; Spivey, <xref ref-type="bibr" rid="B110">2007</xref>; Freeman and Ambady, <xref ref-type="bibr" rid="B40">2011a</xref>). Thus, this approach proposes that person perception involves ongoing competition between partially-active categories (e.g., male and female). 
Further, the competition is gradually weighed in on by both bottom-up sensory cues as well as top-down social factors, until a stable categorization is achieved. Accordingly, bottom-up cues and top-down factors mutually constrain one another to shape person perception.</p>
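The kind of category competition described above can be illustrated with a minimal sketch of leaky, competing accumulators in the spirit of Usher and McClelland (2001). Everything here (the `compete` function, its parameter values, and the evidence numbers) is an illustrative assumption, not the published model: each category accumulates its bottom-up evidence plus any top-down bias, is inhibited by its rivals, and the first to cross a threshold becomes the stable categorization.

```python
import random

def compete(evidence, top_down=None, rate=0.1, leak=0.05, inhibit=0.2,
            noise=0.03, threshold=1.0, max_steps=10000, seed=None):
    """Leaky, competing accumulators for a set of candidate categories.

    `evidence` maps each category to its momentary bottom-up support;
    `top_down` optionally adds a bias (e.g., a stereotype-driven push).
    Returns the winning category and the number of steps taken.
    All parameter values are illustrative.
    """
    rng = random.Random(seed)
    cats = list(evidence)
    top_down = top_down or {}
    x = {c: 0.0 for c in cats}
    winner = cats[0]
    for step in range(1, max_steps + 1):
        total = sum(x.values())
        for c in cats:
            drive = evidence[c] + top_down.get(c, 0.0)
            # accumulate evidence, leak, and be inhibited by rivals
            dx = rate * drive - leak * x[c] - inhibit * (total - x[c])
            x[c] = max(0.0, x[c] + dx + rng.gauss(0.0, noise))
        winner = max(cats, key=lambda c: x[c])
        if x[winner] >= threshold:
            break
    return winner, step
```

With bottom-up support of, say, 0.75 for male and 0.25 for female, the male accumulator nearly always reaches threshold first; with evenly split evidence, the noise makes the outcome far more probabilistic and settling takes longer.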
<p>How might this dynamic social&#x02013;sensory interface be instantiated at the neural level specifically? Let us consider sex categorization. One possibility is that visual processing of another&#x00027;s face and body in the occipitotemporal cortex (e.g., the lateral fusiform gyrus, fusiform face area, extrastriate body area, and fusiform body area) continuously sends off ongoing results of processing into multimodal integrative regions, such as the STS (Campanella and Belin, <xref ref-type="bibr" rid="B29">2007</xref>; Peelen and Downing, <xref ref-type="bibr" rid="B92">2007</xref>; Freeman et al., <xref ref-type="bibr" rid="B42">2010</xref>). There, ongoing visual-processing results of the face begin integrating with ongoing auditory-processing results of the voice, which are emanating from the temporal voice area (Lattner et al., <xref ref-type="bibr" rid="B76">2005</xref>; Campanella and Belin, <xref ref-type="bibr" rid="B29">2007</xref>). While the available bottom-up information (facial, vocal, and bodily cues) begins integrating in multimodal regions such as the STS, the intermediary results of this integration are sent off to higher-order regions, such as the prefrontal cortex (Kim and Shadlen, <xref ref-type="bibr" rid="B71">1999</xref>), in addition to regions involved in decision-making and response selection, such as the basal ganglia (Bogacz and Gurney, <xref ref-type="bibr" rid="B22">2007</xref>). In doing so, bottom-up processing provides tentative support for perceptual alternatives (e.g., some cues provide 75% support for the male category and other cues provide 25% support for the female category; Freeman et al., <xref ref-type="bibr" rid="B43">2008</xref>; Freeman and Ambady, <xref ref-type="bibr" rid="B41">2011b</xref>). 
The basal ganglia and higher-order regions such as the prefrontal cortex force these partially-active representations (e.g., 75% male and 25% female) to compete, and the ongoing results of this competition are fed back to lower-level cortices where visual and auditory specification is more precise and results can be verified (Treisman, <xref ref-type="bibr" rid="B119">1996</xref>; Bouvier and Treisman, <xref ref-type="bibr" rid="B23">2010</xref>). Before these processing results are fed back, however, they may be slightly adjusted by higher-order regions&#x00027; top-down biases, e.g., activated stereotypes and motivational states. Lower-level regions then update higher-order regions by sending back revised information (e.g., 85% male and 15% female). Across cycles of this ongoing interaction between the processing of bottom-up sensory cues (instantiated in lower-level regions) and top-down social factors (instantiated in higher-order regions), the entire system comes to settle into a steady state (e.g., &#x0007E;100% male), presumably reflecting an ultimate, stable perception of another person. This general kind of processing has been captured in a computational model, described below.</p>
<sec>
<title>Dynamic interactive model</title>
<p>A general diagram of the dynamic interactive model appears in Figure <xref ref-type="fig" rid="F1">1</xref>. It is a recurrent connectionist network with stochastic interactive activation (McClelland, <xref ref-type="bibr" rid="B86">1991</xref>). The figure depicts a number of pools; in specific instantiations of the model, each pool will contain a variety of nodes (e.g., <sc>Male</sc>, <sc>Black</sc>, <sc>Aggressive</sc>, and <sc>Female Cues</sc>). Specific details on the model&#x00027;s structure may be found in Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>). The model provides an approximation of the kind of processing that might take place in a human brain (Rumelhart et al., <xref ref-type="bibr" rid="B98">1986</xref>; Smolensky, <xref ref-type="bibr" rid="B109">1989</xref>; Rogers and McClelland, <xref ref-type="bibr" rid="B97">2004</xref>; Spivey, <xref ref-type="bibr" rid="B110">2007</xref>), such as that described above, specifically in the context of perceiving other people.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>A general diagram of the dynamic interactive model.</bold> Adapted from Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>).</p></caption>
<graphic xlink:href="fnint-06-00081-g0001.tif"/>
</fig>
<p>Initially, the network is stimulated simultaneously by both bottom-up and top-down inputs (see Figure <xref ref-type="fig" rid="F1">1</xref>). This may include inputs such as visual input of another&#x00027;s face, auditory input of another&#x00027;s voice, or higher-level input from systems responsible for top-down attention, motivations, or prejudice, for example. Each model instantiation contains a variety of nodes that are organized into, at most, four interactive levels of processing (one level representing each of the following: cues, categories, stereotypes, and high-level cognitive states). Every node has a transient level of activation at each moment in time. This activation corresponds with the strength of a tentative hypothesis that the node is represented in the input. Once the network is initially stimulated, activation flows among all nodes simultaneously as a function of their connection weights. Activation is also altered by a small amount of random noise, making the system&#x00027;s states inherently probabilistic. Because many connections between nodes are bi-directional, this flow results in a continual back-and-forth of activation between many nodes in the system. As such, nodes in the system continually re-adjust each other&#x00027;s activation and mutually constrain one another to find an overall pattern of activation that best fits the inputs. Gradually, the flows of activation lead the network to converge on a stable, steady state, where the activation of each node reaches an asymptote. This final steady state, it is argued, corresponds to an ultimate perception of another person. Through this ongoing mutual constraint-satisfaction process, multiple sources of information&#x02014;both bottom-up cues and top-down factors&#x02014;are interacting over time toward integrating into a stable perception.<xref ref-type="fn" rid="fn0002"><sup>2</sup></xref></p>
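A minimal sketch of this settling process, under assumed parameters (the update rule, weights, and noise-annealing schedule below are ours, not the equations of Freeman and Ambady, 2011a), might look like the following: activation flows along bidirectional weighted connections, perturbed by random noise, until every node&#x00027;s activation change falls below a tolerance, i.e., reaches an asymptote.

```python
import random

def settle(weights, external, rate=0.1, noise=0.01, tol=1e-3,
           max_steps=5000, seed=0):
    """Stochastic interactive activation (illustrative sketch).

    `weights[i][j]` is the (often bidirectional) connection weight from
    node j to node i; `external` is the bottom-up/top-down input to each
    node. Noise is annealed so the network can reach a steady state in
    which every node's activation has asymptoted.
    """
    rng = random.Random(seed)
    n = len(external)
    act = [0.0] * n
    for step in range(max_steps):
        sd = noise * (0.995 ** step)  # annealed random perturbation
        new = []
        for i in range(n):
            net = external[i] + sum(weights[i][j] * act[j] for j in range(n))
            a = act[i] + rate * net + rng.gauss(0.0, sd)
            new.append(min(1.0, max(0.0, a)))  # activations bounded in [0, 1]
        if max(abs(new[i] - act[i]) for i in range(n)) < tol:
            return new, step  # steady state: all nodes at asymptote
        act = new
    return act, max_steps

# Two mutually inhibiting category nodes (say, MALE and FEMALE) with
# slightly asymmetric input; the favored node is driven toward full
# activation while its competitor is pushed out.
W = [[0.0, -1.0],
     [-1.0, 0.0]]
final, steps = settle(W, [0.55, 0.45])
```

Across repeated runs, the node with the stronger input almost always stabilizes near full activation while its competitor is suppressed toward zero; with more nearly equal inputs, the noise makes the final state increasingly probabilistic.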
<p>As such, this model captures the intimate interaction between bottom-up and top-down processing theorized here. Thus, together, the approach and model treat perceptions of other people as continuously evolving over fractions of a second and emerging from the interaction between multiple bottom-up sensory cues and top-down social factors. Accordingly, person perception readily makes compromises between the variety of sensory cues inherent to another person and the baggage an individual perceiver brings to the perceptual process. Now, let us consider how this system naturally brings about category interactions such as those described earlier, either through top-down perceiver impacts or bottom-up target impacts.</p>
</sec>
<sec>
<title>Accounting for interactions via top-down impacts</title>
<p>A specific instantiation of the general model appears in Figure <xref ref-type="fig" rid="F2">2</xref>. Solid-line connections are excitatory (positive weight) and dashed-line connections are inhibitory (negative weight). Further details and particular connection weights are provided in Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>). This instantiation of the model is intended to capture how a perceiver would categorize either sex or race in a particular task context. When the network is presented with a face, its visual input stimulates nodes in the cue level. Cue nodes excite category nodes consistent with them and inhibit category nodes inconsistent with them. They also receive feedback from category nodes. At the same time that cue nodes receive input from visual processing, higher-level input stimulates higher-order nodes, in this case representing task demands. This higher-level input would originate from a top-down attentional system driven by memory of the task instructions. These higher-order nodes excite category nodes consistent with them, inhibit category nodes inconsistent with them, and are also activated by category nodes in turn. Thus, activation of the <sc>Race Task Demand</sc> node would facilitate activation of race categories (<sc>Black</sc>, <sc>White</sc>, <sc>Asian</sc>) and inhibit activation of sex categories (<sc>Male</sc>, <sc>Female</sc>), and vice-versa for the <sc>Sex Task Demand</sc> node.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>An instantiation of the dynamic interactive model that gives rise to category interactions driven by top-down stereotypes.</bold> Adapted from Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>).</p></caption>
<graphic xlink:href="fnint-06-00081-g0002.tif"/>
</fig>
<p>One manner by which many categories may interact is through overlapping stereotype contents. For instance, particular social categories in one dimension (e.g., race) may facilitate and inhibit the activation of categories in another dimension (e.g., sex) due to shared activations in the stereotype level. Stereotypes associated with the sex category, male, include aggressive, dominant, athletic, and competitive, and these are also associated with the race category, Black. Similarly, stereotypes of shy, family-oriented, and soft-spoken apply not only to the sex category, female, but also to the race category, Asian (Bem, <xref ref-type="bibr" rid="B20">1974</xref>; Devine and Elliot, <xref ref-type="bibr" rid="B33">1995</xref>; Ho and Jackson, <xref ref-type="bibr" rid="B63">2001</xref>). Thus, there is some overlap in the stereotypes belonging to the Black and male categories and in the stereotypes belonging to the Asian and female categories. Johnson et al. (<xref ref-type="bibr" rid="B67">in press</xref>) found that sex categorization was quickened when a computer-generated male face was made to be Black, relative to White or Asian. Conversely, for a female face, sex categorization was quickened when made to be Asian, relative to White or Black. Moreover, when faces were sex-ambiguous they were overwhelmingly categorized as male when Black, but overwhelmingly categorized as female when Asian. Later work found that such influences have downstream implications for interpreting ambiguous identities (e.g., sexual orientation; Johnson and Ghavami, <xref ref-type="bibr" rid="B66">2011</xref>). How could a dynamic interactive model account for such interactions between sex and race, presumably driven by top-down stereotypes?</p>
<p>In the model, category activation along one dimension, e.g., sex, may be constrained by feedback from stereotype activations triggered by the other dimension, e.g., race (see Figure <xref ref-type="fig" rid="F2">2</xref>). Sex categorization, for example, is constrained by race-triggered stereotype activations. Because the stereotypes of Black and male categories happen to partially overlap, Black men would be categorized more efficiently relative to White and Asian men. As shown in Figure <xref ref-type="fig" rid="F2">2</xref>, <sc>Aggressive</sc> happens to be positively linked and <sc>Docile</sc> happens to be negatively linked with both <sc>Black</sc> and <sc>Male</sc> categories. This overlap would lead the race-triggered excitation of <sc>Aggressive</sc> and race-triggered inhibition of <sc>Docile</sc> to feed back excitation to the <sc>Male</sc> category and inhibition to the <sc>Female</sc> category. This would facilitate a male categorization or, in cases of sex-ambiguous targets, bias categorizations toward male (rather than female). A similar effect would occur with the <sc>Asian</sc> and <sc>Female</sc> categories, where race-triggered excitation of <sc>Docile</sc> and race-triggered inhibition of <sc>Aggressive</sc> would come to facilitate a female categorization or bias categorizations toward female. Thus, a dynamic interactive model predicts that incidental overlap in stereotype contents could powerfully shape the perception of another category dimension.</p>
<p>When actual simulations were run with the network appearing in Figure <xref ref-type="fig" rid="F2">2</xref>, it was found that race category was readily used to disambiguate sex categorization. When a sex-ambiguous face was Black, the network was biased toward male categorization, with a 26% likelihood of categorizing it as female. When White, random noise seemed to drive sex categorization one way or the other, with a 52% likelihood (random chance: 50%) of female categorization. When Asian, however, the network was biased toward female categorization, with a 75% likelihood of female categorization (Freeman and Ambady, <xref ref-type="bibr" rid="B40">2011a</xref>). Thus, a dynamic interactive model predicts that perceivers would be biased to perceive sex-ambiguous Black faces as men and, conversely, to perceive sex-ambiguous Asian faces as women. This is because the presumably task-irrelevant race category placed excitatory and inhibitory pressures on stereotype nodes that were incidentally shared with sex categories. Indeed, Johnson et al. (<xref ref-type="bibr" rid="B67">in press</xref>) obtained this precise pattern of results with human perceivers.</p>
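The qualitative logic of this simulation can be re-created in a toy network. The node set and connection weights below are made-up stand-ins for those in Figure 2 (with Aggressive and Docile serving as the stereotype level), so the specific proportions will not match the published 26%/52%/75% figures; only the direction of the race-driven bias is meant to be reproduced.

```python
import random

NODES = ["black", "asian", "male", "female", "aggressive", "docile"]
IDX = {n: i for i, n in enumerate(NODES)}

# Symmetric excitatory (+) and inhibitory (-) links; these weights are
# made-up stand-ins for the published parameters.
LINKS = [
    ("black", "aggressive", +0.5), ("black", "docile", -0.5),
    ("asian", "docile", +0.5), ("asian", "aggressive", -0.5),
    ("male", "aggressive", +0.5), ("male", "docile", -0.5),
    ("female", "docile", +0.5), ("female", "aggressive", -0.5),
    ("male", "female", -1.0), ("black", "asian", -1.0),
    ("aggressive", "docile", -1.0),
]

def categorize_sex(race, sex_input=0.4, race_input=1.0, rate=0.1,
                   noise=0.02, steps=200, seed=None):
    """Settle the network for a sex-ambiguous face of the given race
    and return the winning sex category."""
    rng = random.Random(seed)
    n = len(NODES)
    w = [[0.0] * n for _ in range(n)]
    for a, b, wt in LINKS:
        w[IDX[a]][IDX[b]] = w[IDX[b]][IDX[a]] = wt
    ext = [0.0] * n
    ext[IDX[race]] = race_input                        # unambiguous race cues
    ext[IDX["male"]] = ext[IDX["female"]] = sex_input  # ambiguous sex cues
    act = [0.0] * n
    for _ in range(steps):
        net = [ext[i] + sum(w[i][j] * act[j] for j in range(n))
               for i in range(n)]
        act = [min(1.0, max(0.0, act[i] + rate * net[i] + rng.gauss(0.0, noise)))
               for i in range(n)]
    return "female" if act[IDX["female"]] > act[IDX["male"]] else "male"
```

Across runs, a sex-ambiguous face paired with Black race cues settles overwhelmingly on the male category, and one paired with Asian race cues on the female category, mirroring the direction (though not the exact probabilities) of the published simulation.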
</sec>
<sec>
<title>Accounting for interactions via bottom-up impacts</title>
<p>As discussed earlier, different categories may be thrown into interaction because the perceptual cues supporting those categories partly overlap and are therefore directly confounded. For instance, sex categorization is facilitated for faces of happy women and angry men, relative to happy men and angry women. Further studies solidified the evidence that this interaction between sex and emotion is due to direct, physical overlap in cues rather than merely top-down stereotypes (see also Becker et al., <xref ref-type="bibr" rid="B19">2007</xref>; Hess et al., <xref ref-type="bibr" rid="B58">2009</xref>; Oosterhof and Todorov, <xref ref-type="bibr" rid="B89">2009</xref>). Thus, these studies suggest the features that make a face angrier are also partly those that make a face more masculine. Similarly, the features that make a face happier are also partly those that make a face more feminine. For instance, anger displays involve the center of the brow drawn downward, a compression of the mouth, and flared nostrils. However, men also have larger brows, which may cause them to appear drawn downward. They also have a more defined jaw and thinner lips, which may make the mouth appear more compressed, and they have larger noses, which may lead to the appearance of flared nostrils. A similar overlap exists for happy displays and the female face (Becker et al., <xref ref-type="bibr" rid="B19">2007</xref>). For instance, women have rounder faces than men, and the appearance of roundness increases when displaying happiness (i.e., a smile draws out the width of the face). Previous studies suggest that it is this direct, physical overlap in the cues signaling maleness and anger and in the cues signaling femaleness and happiness that leads to more efficient perceptions of angry men and happy women (relative to happy men and angry women).</p>
<p>A second instantiation of the general model appears in Figure <xref ref-type="fig" rid="F3">3</xref> (particular connection weights found in Freeman and Ambady, <xref ref-type="bibr" rid="B40">2011a</xref>). Differing from the previous instantiation, here nodes in the cue level represent a single perceptual cue (e.g., defined jaw and smile). Note that one cue node, <sc>Facial Hair</sc>, has an excitatory connection with <sc>Male</sc> and inhibitory connection with <sc>Female</sc>, whereas another cue node, <sc>Round Eyes</sc>, has an excitatory connection with <sc>Female</sc> and inhibitory connection with <sc>Male</sc>. Similarly, one cue node, <sc>Tensed Eyelids</sc>, has an excitatory connection with <sc>Anger</sc> and inhibitory connection with <sc>Happy</sc>, and vice-versa for the cue node, <sc>Smile</sc>. These four cue nodes represent the perceptual cues that independently relate to sex categories and independently relate to emotion categories. However, also note that one cue, <sc>Furrowed Brow</sc>, has an excitatory connection both with <sc>Anger</sc> and with <sc>Male</sc> (since a furrowed brow conveys both categories, described earlier). Similarly, another cue, <sc>Round Face</sc>, has an excitatory connection both with <sc>Happy</sc> and with <sc>Female</sc> (since a rounder face conveys both categories, described earlier). Thus, these two cue nodes represent the bottom-up overlap in the perceptual cues conveying sex and emotion. Specific cues used in this simulation were chosen arbitrarily; they are merely intended to simulate the set of non-overlapping and overlapping perceptual cues that convey sex and emotion.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p><bold>An instantiation of the dynamic interactive model that gives rise to category interactions driven by bottom-up perceptual cues.</bold> Adapted from Freeman and Ambady (<xref ref-type="bibr" rid="B40">2011a</xref>).</p></caption>
<graphic xlink:href="fnint-06-00081-g0003.tif"/>
</fig>
<p>When actual simulations were run with the network, the overlapping perceptual cues created bottom-up pressure that gave rise to interactions between sex and emotion. When a male face was angry, the <sc>Male</sc> category&#x00027;s activation grew more quickly and stabilized on a stronger state, relative to when a male face was happy. Conversely, when a female face was angry, the <sc>Female</sc> category&#x00027;s activation grew more slowly and stabilized on a weaker state, relative to when a female face was happy. This led sex categorization of angry men and happy women to be completed more quickly (Freeman and Ambady, <xref ref-type="bibr" rid="B40">2011a</xref>). This pattern of results converges with the experimental data of previous studies (Becker et al., <xref ref-type="bibr" rid="B19">2007</xref>). Thus, the categorization of one dimension (e.g., sex) may be shaped by direct bottom-up overlap with the perceptual features supporting another dimension (e.g., emotion). This highlights how the model naturally accounts for category interactions driven by bottom-up perceptual overlaps.</p>
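<p>The cue-to-category dynamics described above can be sketched as a small interactive-activation network. The following Python sketch is illustrative only: the cue names follow Figure 3, but the weights, update rule, and parameters are our assumptions for exposition, not the connection weights or implementation of Freeman and Ambady (2011a). It shows how the shared <sc>Furrowed Brow</sc> and <sc>Round Face</sc> cues lead <sc>Male</sc> to settle on a stronger state for an angry male face, and <sc>Female</sc> on a stronger state for a happy female face.</p>

```python
import math

# Illustrative interactive-activation sketch of the cue/category dynamics.
# All weights, rates, and the squashing function are assumptions for
# exposition, not the parameters of the published model.

CATS = ["male", "female", "anger", "happy"]

# cue -> category weights (positive = excitatory, negative = inhibitory)
W = {
    "facial_hair":    {"male": 1.0, "female": -1.0},
    "round_eyes":     {"female": 1.0, "male": -1.0},
    "tensed_eyelids": {"anger": 1.0, "happy": -1.0},
    "smile":          {"happy": 1.0, "anger": -1.0},
    # overlapping cues excite one category on EACH dimension
    "furrowed_brow":  {"anger": 1.0, "male": 1.0},
    "round_face":     {"happy": 1.0, "female": 1.0},
}

# mutual inhibition between competing categories within a dimension
RIVAL = {"male": "female", "female": "male", "anger": "happy", "happy": "anger"}

def settle(active_cues, steps=60, rate=0.2, inhibition=0.5):
    """Synchronously update category activations until they stabilize."""
    act = {c: 0.0 for c in CATS}
    for _ in range(steps):
        new = {}
        for cat in CATS:
            net = sum(W[cue].get(cat, 0.0) for cue in active_cues)
            net -= inhibition * act[RIVAL[cat]]  # within-dimension competition
            # leaky integration toward a squashed net input, floored at 0
            new[cat] = max(0.0, act[cat] + rate * (math.tanh(net) - act[cat]))
        act = new
    return act

angry_male = settle({"facial_hair", "tensed_eyelids", "furrowed_brow"})
happy_male = settle({"facial_hair", "smile"})
angry_female = settle({"round_eyes", "tensed_eyelids", "furrowed_brow"})
happy_female = settle({"round_eyes", "smile", "round_face"})

# The shared furrowed-brow cue pushes MALE to a stronger stable state for the
# angry face; the shared round-face cue does the same for FEMALE when happy.
print(angry_male["male"] > happy_male["male"])          # True
print(happy_female["female"] > angry_female["female"])  # True
```

<p>With only the non-overlapping cues, sex and emotion activations would settle independently; it is the two overlapping cue nodes that couple the dimensions and reproduce the angry-men/happy-women advantage.</p>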
</sec>
</sec>
<sec sec-type="conclusion" id="s1">
<title>Conclusion</title>
<p>One of the most fascinating aspects of person perception, which distinguishes it from most kinds of object perception, is that a single social percept can simultaneously convey an enormous amount of information. From another&#x00027;s face, multiple possible construals are available in parallel, including sex, race, age, emotion, sexual orientation, social status, intentions, and personality characteristics, among others. Here we have reviewed two ways in which many of these construals may interact with one another. One is through top-down perceiver impacts, where a perceiver&#x00027;s existing knowledge structures, stereotypes, motivations, and other social factors throw different dimensions into interaction. The other is through bottom-up target impacts, where the perceptual cues supporting different dimensions are inextricably linked, leading those dimensions to interact. Further, these interactions in person perception may often occur in functionally adaptive ways. We then discussed a recent computational model of person perception that, we argued, is able to account for many of these interactions, whether driven by top-down or bottom-up forces. In short, person perception is combinatorial, and treating our targets of perception as having multiple intersecting identities is critical for an accurate understanding of how we perceive other people. Research investigating the underlying mechanisms of person perception is growing rapidly. To take up this new level of analysis successfully, collaboration between scientists in traditionally divided domains is needed, such as at the social-visual interface (Adams et al., <xref ref-type="bibr" rid="B1">2011</xref>). Here, we have argued that there is a coextension between sensory and social processes typically investigated independently. 
To map out how low-level visual information (traditionally home to the vision sciences) may meaningfully interact with and be shaped by high-level social factors (traditionally home to social psychology), and how this is instantiated through all the cognitive and neural processing lying in between them, interdisciplinary collaboration will be important. The emerging study of social vision offers an exciting and multilevel approach that may help bring about a more unified understanding of person perception. At the same time, it provides a unique bridge between far-reaching areas of the field, from researchers in social psychology to the cognitive, neural, and vision sciences.</p>
<sec>
<title>Conflict of interest statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p></sec>
</sec>
</body>
<back>
<ack>
<p>This work was supported by NIH NRSA grant (F31-MH092000) to Jonathan B. Freeman, NSF research grant (BCS-1052896) to Kerri L. Johnson, NIH research grant (R01-AG035028-01) to Reginald B. Adams Jr., and NSF research grant (BCS-0435547) to Nalini Ambady.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Nakayama</surname> <given-names>K.</given-names></name> <name><surname>Shimojo</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <source>The Science of Social Vision</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</citation>
</ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Franklin</surname> <given-names>R. G.</given-names> <suffix>Jr.</suffix></name></person-group> (<year>2009</year>). <article-title>Influence of emotional expression on the processing of gaze direction</article-title>. <source>Motiv. Emotion</source> <volume>33</volume>, <fpage>106</fpage>&#x02013;<lpage>112</lpage>.</citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names> <suffix>Jr.</suffix></name> <name><surname>Franklin</surname> <given-names>R. G.</given-names> <suffix>Jr.</suffix></name> <name><surname>Kveraga</surname> <given-names>K.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name> <name><surname>Whalen</surname> <given-names>P. J.</given-names></name> <etal/></person-group>. (<year>2012</year>). <article-title>Amygdala responses to averted vs direct gaze fear vary as a function of presentation speed</article-title>. <source>Soc. Cogn. Affect. Neurosci</source>. <volume>7</volume>, <fpage>568</fpage>&#x02013;<lpage>577</lpage>. <pub-id pub-id-type="doi">10.1093/scan/nsr038</pub-id><pub-id pub-id-type="pmid">21666261</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Gordon</surname> <given-names>H. L.</given-names></name> <name><surname>Baird</surname> <given-names>A. A.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2003</year>). <article-title>Effects of gaze on amygdala sensitivity to anger and fear faces</article-title>. <source>Science</source> <volume>300</volume>, <fpage>1536</fpage>. <pub-id pub-id-type="doi">10.1126/science.1082244</pub-id><pub-id pub-id-type="pmid">12791983</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names> <suffix>Jr.</suffix></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2005</year>). <article-title>Effects of direct and averted gaze on the perception of facially communicated emotion</article-title>. <source>Emotion</source> <volume>5</volume>, <fpage>3</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.5.1.3</pub-id><pub-id pub-id-type="pmid">15755215</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Nelson</surname> <given-names>A. J.</given-names></name></person-group> (<year>2012</year>). <article-title>Intersecting identities and expressions: the compound nature of social perception</article-title>, in <source>The Handbook of Social Neuroscience</source>, eds <person-group person-group-type="editor"><name><surname>Decety</surname> <given-names>J.</given-names></name> <name><surname>Cacioppo</surname> <given-names>J. T.</given-names></name></person-group> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>).</citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adolphs</surname> <given-names>R.</given-names></name></person-group> (<year>2003</year>). <article-title>Cognitive neuroscience of human social behavior</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>4</volume>, <fpage>165</fpage>&#x02013;<lpage>178</lpage>. <pub-id pub-id-type="doi">10.1038/nrn1056</pub-id><pub-id pub-id-type="pmid">12612630</pub-id></citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aggleton</surname> <given-names>J. P.</given-names></name> <name><surname>Burton</surname> <given-names>M. J.</given-names></name> <name><surname>Passingham</surname> <given-names>R. E.</given-names></name></person-group> (<year>1980</year>). <article-title>Cortical and subcortical afferents to the amygdala in the rhesus monkey (<italic>Macaca mulatta</italic>)</article-title>. <source>Brain Res</source>. <volume>190</volume>, <fpage>347</fpage>&#x02013;<lpage>368</lpage>. <pub-id pub-id-type="doi">10.1016/0006-8993(80)90279-6</pub-id><pub-id pub-id-type="pmid">6768425</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Allport</surname> <given-names>G. W.</given-names></name></person-group> (<year>1954</year>). <source>The Nature of Prejudice</source>. <publisher-loc>Oxford</publisher-loc>: <publisher-name>Addison-Wesley</publisher-name>. <pub-id pub-id-type="pmid">11970776</pub-id></citation>
</ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amaral</surname> <given-names>D. G.</given-names></name> <name><surname>Behniea</surname> <given-names>H.</given-names></name> <name><surname>Kelly</surname> <given-names>J. L.</given-names></name></person-group> (<year>2003</year>). <article-title>Topographic organization of projections from the amygdala to the visual cortex in the macaque monkey</article-title>. <source>Neuroscience</source> <volume>118</volume>, <fpage>1099</fpage>&#x02013;<lpage>1120</lpage>. <pub-id pub-id-type="doi">10.1016/S0306-4522(02)01001-1</pub-id><pub-id pub-id-type="pmid">12732254</pub-id></citation>
</ref>
<ref id="B11">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Bernieri</surname> <given-names>F. J.</given-names></name> <name><surname>Richeson</surname> <given-names>J. A.</given-names></name></person-group> (<year>2000</year>). <article-title>Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream</article-title>, in <source>Advances in Experimental Social Psychology</source>, <volume>Vol. 32</volume>, ed <person-group person-group-type="editor"><name><surname>Zanna</surname> <given-names>M. P.</given-names></name></person-group> (<publisher-loc>San Diego, CA</publisher-loc>: <publisher-name>Academic Press</publisher-name>), <fpage>201</fpage>&#x02013;<lpage>271</lpage>.</citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Rosenthal</surname> <given-names>R.</given-names></name></person-group> (<year>1992</year>). <article-title>Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis</article-title>. <source>Psychol. Bull</source>. <volume>111</volume>, <fpage>256</fpage>&#x02013;<lpage>274</lpage>.</citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amodio</surname> <given-names>D. M.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>2006</year>). <article-title>Meeting of minds: the medial frontal cortex and social cognition</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>7</volume>, <fpage>268</fpage>&#x02013;<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1038/nrn1884</pub-id><pub-id pub-id-type="pmid">16552413</pub-id></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Atkinson</surname> <given-names>A. P.</given-names></name> <name><surname>Tipples</surname> <given-names>J.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name></person-group> (<year>2005</year>). <article-title>Asymmetric interference between sex and emotion in face perception</article-title>. <source>Percept. Psychophys</source>. <volume>67</volume>, <fpage>1199</fpage>&#x02013;<lpage>1213</lpage>. <pub-id pub-id-type="pmid">16502842</pub-id></citation>
</ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bar</surname> <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>Visual objects in context</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>5</volume>, <fpage>617</fpage>&#x02013;<lpage>629</lpage>. <pub-id pub-id-type="doi">10.1038/nrn1476</pub-id><pub-id pub-id-type="pmid">15263892</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Bargh</surname> <given-names>J. A.</given-names></name></person-group> (<year>1994</year>). <article-title>The four horsemen of automaticity: awareness, intention, efficiency, and control in social cognition</article-title>, in <source>Handbook of Social Cognition</source>, <italic>Vol. 1: Basic Processes; Vol. 2: Applications</italic> <edition>2nd Edn</edition>, eds <person-group person-group-type="editor"><name><surname>Wyer</surname> <given-names>R. S.</given-names> <suffix>Jr</suffix></name> <name><surname>Srull</surname> <given-names>T. K.</given-names></name></person-group> (<publisher-loc>Hillsdale, NJ, England</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates, Inc</publisher-name>), <fpage>1</fpage>&#x02013;<lpage>40</lpage>.</citation>
</ref>
<ref id="B17">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Bargh</surname> <given-names>J. A.</given-names></name></person-group> (<year>1999</year>). <article-title>The cognitive monster: the case against the controllability of automatic stereotype effects</article-title>, in <source>Dual-Process Theories in Social Psychology</source>, eds <person-group person-group-type="editor"><name><surname>Chaiken</surname> <given-names>S.</given-names></name> <name><surname>Trope</surname> <given-names>Y.</given-names></name></person-group> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>Guilford Press</publisher-name>), <fpage>361</fpage>&#x02013;<lpage>382</lpage>.</citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bechara</surname> <given-names>A.</given-names></name> <name><surname>Damasio</surname> <given-names>H.</given-names></name> <name><surname>Damasio</surname> <given-names>A. R.</given-names></name></person-group> (<year>2000</year>). <article-title>Emotion, decision making and the orbitofrontal cortex</article-title>. <source>Cereb. Cortex</source> <volume>10</volume>, <fpage>295</fpage>&#x02013;<lpage>307</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/10.3.295</pub-id><pub-id pub-id-type="pmid">10731224</pub-id></citation>
</ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Becker</surname> <given-names>D. V.</given-names></name> <name><surname>Kenrick</surname> <given-names>D. T.</given-names></name> <name><surname>Neuberg</surname> <given-names>S. L.</given-names></name> <name><surname>Blackwell</surname> <given-names>K. C.</given-names></name> <name><surname>Smith</surname> <given-names>D. M.</given-names></name></person-group> (<year>2007</year>). <article-title>The confounded nature of angry men and happy women</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>92</volume>, <fpage>179</fpage>&#x02013;<lpage>190</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.92.2.179</pub-id><pub-id pub-id-type="pmid">17279844</pub-id></citation>
</ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bem</surname> <given-names>S.</given-names></name></person-group> (<year>1974</year>). <article-title>The measurement of psychological androgyny</article-title>. <source>J. Consult. Clin. Psychol</source>. <volume>42</volume>, <fpage>155</fpage>&#x02013;<lpage>162</lpage>. <pub-id pub-id-type="doi">10.1037/h0036215</pub-id><pub-id pub-id-type="pmid">4823550</pub-id></citation>
</ref>
<ref id="B21">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Bodenhausen</surname> <given-names>G. V.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>1998</year>). <article-title>Stereotype activation and inhibition</article-title>, in <source>Stereotype Activation and Inhibition</source> (<publisher-loc>Mahwah, NJ</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates Publishers</publisher-name>), <fpage>1</fpage>&#x02013;<lpage>52</lpage>.</citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bogacz</surname> <given-names>R.</given-names></name> <name><surname>Gurney</surname> <given-names>K.</given-names></name></person-group> (<year>2007</year>). <article-title>The basal ganglia and cortex implement optimal decision making between alternative actions</article-title>. <source>Neural Comput</source>. <volume>19</volume>, <fpage>442</fpage>&#x02013;<lpage>477</lpage>. <pub-id pub-id-type="doi">10.1162/neco.2007.19.2.442</pub-id><pub-id pub-id-type="pmid">17206871</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bouvier</surname> <given-names>S.</given-names></name> <name><surname>Treisman</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Visual feature binding requires reentry</article-title>. <source>Psychol. Sci</source>. <volume>21</volume>, <fpage>200</fpage>&#x02013;<lpage>204</lpage>. <pub-id pub-id-type="doi">10.1177/0956797609357858</pub-id><pub-id pub-id-type="pmid">20424045</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Brewer</surname> <given-names>M. B.</given-names></name></person-group> (<year>1988</year>). <article-title>A dual process model of impression formation</article-title>, in <source>A Dual-Process Model of Impression Formation: Advances in Social Cognition</source>, <volume>Vol. 1</volume>, eds <person-group person-group-type="editor"><name><surname>Srull</surname> <given-names>T. K.</given-names></name> <name><surname>Wyer</surname> <given-names>R. S.</given-names></name></person-group> (<publisher-loc>Hillsdale, NJ</publisher-loc>: <publisher-name>Erlbaum</publisher-name>), <fpage>1</fpage>&#x02013;<lpage>36</lpage></citation>
</ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brothers</surname> <given-names>L.</given-names></name></person-group> (<year>1990</year>). <article-title>The neural basis of primate social communication</article-title>. <source>Motiv. Emotion</source> <volume>14</volume>, <fpage>81</fpage>&#x02013;<lpage>91</lpage>.</citation>
</ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brothers</surname> <given-names>L.</given-names></name> <name><surname>Ring</surname> <given-names>B.</given-names></name></person-group> (<year>1993</year>). <article-title>Mesial temporal neurons in the macaque monkey with responses selective for aspects of social stimuli</article-title>. <source>Behav. Brain Res</source>. <volume>57</volume>, <fpage>53</fpage>&#x02013;<lpage>61</lpage>. <pub-id pub-id-type="pmid">8292255</pub-id></citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bruce</surname> <given-names>V.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name></person-group> (<year>1986</year>). <article-title>A theoretical perspective for understanding face recognition</article-title>. <source>Br. J. Psychol</source>. <volume>77</volume>, <fpage>305</fpage>&#x02013;<lpage>327</lpage>. <pub-id pub-id-type="pmid">3756376</pub-id></citation>
</ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Calder</surname> <given-names>A. J.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name></person-group> (<year>2005</year>). <article-title>Understanding the recognition of facial identity and facial expression</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>6</volume>, <fpage>641</fpage>&#x02013;<lpage>651</lpage>. <pub-id pub-id-type="doi">10.1038/nrn1724</pub-id><pub-id pub-id-type="pmid">16062171</pub-id></citation>
</ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Campanella</surname> <given-names>S.</given-names></name> <name><surname>Belin</surname> <given-names>P.</given-names></name></person-group> (<year>2007</year>). <article-title>Integrating face and voice in person perception</article-title>. <source>Trends Cogn. Sci</source>. <volume>11</volume>, <fpage>535</fpage>&#x02013;<lpage>543</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2007.10.001</pub-id><pub-id pub-id-type="pmid">17997124</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Corneille</surname> <given-names>O.</given-names></name> <name><surname>Huart</surname> <given-names>J.</given-names></name> <name><surname>Emile</surname> <given-names>J.</given-names></name> <name><surname>Br&#x000E9;dart</surname> <given-names>E.</given-names></name></person-group> (<year>2004</year>). <article-title>When memory shifts toward more typical category exemplars: accentuation effects in the recollection of ethnically ambiguous faces</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>86</volume>, <fpage>236</fpage>&#x02013;<lpage>250</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.86.2.236</pub-id><pub-id pub-id-type="pmid">14769081</pub-id></citation>
</ref>
<ref id="B31">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Crisp</surname> <given-names>R. J.</given-names></name> <name><surname>Hewstone</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Multiple social categorization</article-title>. <source>Adv. Exp. Soc. Psychol</source>. <volume>39</volume>, <fpage>164</fpage>&#x02013;<lpage>254</lpage>.</citation>
</ref>
<ref id="B32">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Devine</surname> <given-names>P.</given-names></name></person-group> (<year>1989</year>). <article-title>Stereotypes and prejudice: their automatic and controlled components</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>56</volume>, <fpage>5</fpage>&#x02013;<lpage>18</lpage>.</citation>
</ref>
<ref id="B33">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Devine</surname> <given-names>P. G.</given-names></name> <name><surname>Elliot</surname> <given-names>A. J.</given-names></name></person-group> (<year>1995</year>). <article-title>Are racial stereotypes really fading? The Princeton trilogy revisited</article-title>. <source>Pers. Soc. Psychol. Bull</source>. <volume>21</volume>, <fpage>1139</fpage>&#x02013;<lpage>1150</lpage>.</citation>
</ref>
<ref id="B34">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dovidio</surname> <given-names>J. F.</given-names></name> <name><surname>Kawakami</surname> <given-names>K.</given-names></name> <name><surname>Johnson</surname> <given-names>C.</given-names></name> <name><surname>Johnson</surname> <given-names>B.</given-names></name> <name><surname>Howard</surname> <given-names>A.</given-names></name></person-group> (<year>1997</year>). <article-title>The nature of prejudice: automatic and controlled processes</article-title>. <source>J. Exp. Soc. Psychol</source>. <volume>33</volume>, <fpage>510</fpage>&#x02013;<lpage>540</lpage>.</citation>
</ref>
<ref id="B35">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Engel</surname> <given-names>A. K.</given-names></name> <name><surname>Fries</surname> <given-names>P.</given-names></name> <name><surname>Singer</surname> <given-names>W.</given-names></name></person-group> (<year>2001</year>). <article-title>Dynamic predictions: oscillations and synchrony in top-down processing</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>2</volume>, <fpage>704</fpage>&#x02013;<lpage>716</lpage>. <pub-id pub-id-type="doi">10.1038/35094565</pub-id><pub-id pub-id-type="pmid">11584308</pub-id></citation>
</ref>
<ref id="B36">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fazio</surname> <given-names>R. H.</given-names></name> <name><surname>Jackson</surname> <given-names>J. R.</given-names></name> <name><surname>Dunton</surname> <given-names>B. C.</given-names></name> <name><surname>Williams</surname> <given-names>C. J.</given-names></name></person-group> (<year>1995</year>). <article-title>Variability in automatic activation as an unobtrusive measure of racial attitudes: a bona fide pipeline?</article-title> <source>J. Pers. Soc. Psychol</source>. <volume>69</volume>, <fpage>1013</fpage>&#x02013;<lpage>1027</lpage>. <pub-id pub-id-type="pmid">8531054</pub-id></citation>
</ref>
<ref id="B37">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fiske</surname> <given-names>S. T.</given-names></name> <name><surname>Neuberg</surname> <given-names>S. L.</given-names></name></person-group> (<year>1990</year>). <article-title>A continuum model of impression formation from category-based to individuating processes: influences of information and motivation on attention and interpretation</article-title>. <source>Adv. Exp. Soc. Psychol</source>. <volume>23</volume>, <fpage>1</fpage>&#x02013;<lpage>74</lpage>.</citation>
</ref>
<ref id="B38">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Fodor</surname> <given-names>J. A.</given-names></name></person-group> (<year>1983</year>). <source>The Modularity of Mind</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation>
</ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>E.</given-names></name> <name><surname>Mathews</surname> <given-names>A.</given-names></name> <name><surname>Calder</surname> <given-names>A. J.</given-names></name> <name><surname>Yiend</surname> <given-names>J.</given-names></name></person-group> (<year>2007</year>). <article-title>Anxiety and sensitivity to gaze direction in emotionally expressive faces</article-title>. <source>Emotion</source> <volume>7</volume>, <fpage>478</fpage>&#x02013;<lpage>486</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.7.3.478</pub-id><pub-id pub-id-type="pmid">17683204</pub-id></citation>
</ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2011a</year>). <article-title>A dynamic interactive theory of person construal</article-title>. <source>Psychol. Rev</source>. <volume>118</volume>, <fpage>247</fpage>&#x02013;<lpage>279</lpage>. <pub-id pub-id-type="doi">10.1037/a0022327</pub-id><pub-id pub-id-type="pmid">21355661</pub-id></citation>
</ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2011b</year>). <article-title>When two become one: temporally dynamic integration of the face and voice</article-title>. <source>J. Exp. Soc. Psychol</source>. <volume>47</volume>, <fpage>259</fpage>&#x02013;<lpage>263</lpage>.</citation>
</ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Holcomb</surname> <given-names>P. J.</given-names></name></person-group> (<year>2010</year>). <article-title>The face-sensitive N170 encodes social category information</article-title>. <source>Neuroreport</source> <volume>21</volume>, <fpage>24</fpage>&#x02013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1097/WNR.0b013e3283320d54</pub-id><pub-id pub-id-type="pmid">19864961</pub-id></citation>
</ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Rule</surname> <given-names>N. O.</given-names></name> <name><surname>Johnson</surname> <given-names>K. L.</given-names></name></person-group> (<year>2008</year>). <article-title>Will a category cue attract you? motor output reveals dynamic competition across person construal</article-title>. <source>J. Exp. Psychol. Gen</source>. <volume>137</volume>, <fpage>673</fpage>&#x02013;<lpage>690</lpage>. <pub-id pub-id-type="doi">10.1037/a0013875</pub-id><pub-id pub-id-type="pmid">18999360</pub-id></citation>
</ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Penner</surname> <given-names>A. M.</given-names></name> <name><surname>Saperstein</surname> <given-names>A.</given-names></name> <name><surname>Scheutz</surname> <given-names>M.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2011</year>). <article-title>Looking the part: social status cues shape race perception</article-title>. <source>PLoS ONE</source> <volume>6</volume>:<fpage>e25107</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0025107</pub-id><pub-id pub-id-type="pmid">21977227</pub-id></citation>
</ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Rule</surname> <given-names>N. O.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2010</year>). <article-title>The neural basis of categorical face perception: graded representations of face gender in fusiform and orbitofrontal cortices</article-title>. <source>Cereb. Cortex</source> <volume>20</volume>, <fpage>1314</fpage>&#x02013;<lpage>1322</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhp195</pub-id><pub-id pub-id-type="pmid">19767310</pub-id></citation>
</ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ganel</surname> <given-names>T.</given-names></name> <name><surname>Goshen-Gottstein</surname> <given-names>Y.</given-names></name></person-group> (<year>2004</year>). <article-title>Effects of familiarity on the perceptual integrality of the identity and expression of faces: the parallel-route hypothesis revisited</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform</source>. <volume>30</volume>, <fpage>583</fpage>&#x02013;<lpage>597</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.30.3.583</pub-id><pub-id pub-id-type="pmid">15161388</pub-id></citation>
</ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Garner</surname> <given-names>W. R.</given-names></name></person-group> (<year>1976</year>). <article-title>Interaction of stimulus dimensions in concept and choice processes</article-title>. <source>Cognit. Psychol</source>. <volume>8</volume>, <fpage>98</fpage>&#x02013;<lpage>123</lpage>.</citation>
</ref>
<ref id="B48">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gilbert</surname> <given-names>C. D.</given-names></name> <name><surname>Sigman</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Brain states: top-down influences in sensory processing</article-title>. <source>Neuron</source> <volume>54</volume>, <fpage>677</fpage>&#x02013;<lpage>696</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2007.05.019</pub-id><pub-id pub-id-type="pmid">17553419</pub-id></citation>
</ref>
<ref id="B49">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gilbert</surname> <given-names>D. T.</given-names></name> <name><surname>Hixon</surname> <given-names>J. G.</given-names></name></person-group> (<year>1991</year>). <article-title>The trouble of thinking: activation and application of stereotypic beliefs</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>60</volume>, <fpage>509</fpage>&#x02013;<lpage>517</lpage>.</citation>
</ref>
<ref id="B50">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Graham</surname> <given-names>R.</given-names></name> <name><surname>LaBar</surname> <given-names>K. S.</given-names></name></person-group> (<year>2007</year>). <article-title>Garner interference reveals dependencies between emotional expression and gaze in face perception</article-title>. <source>Emotion</source> <volume>7</volume>, <fpage>296</fpage>&#x02013;<lpage>313</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.7.2.296</pub-id><pub-id pub-id-type="pmid">17516809</pub-id></citation>
</ref>
<ref id="B51">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grossberg</surname> <given-names>S.</given-names></name></person-group> (<year>1980</year>). <article-title>How does a brain build a cognitive code?</article-title> <source>Psychol. Rev</source>. <volume>87</volume>, <fpage>1</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="pmid">7375607</pub-id></citation>
</ref>
<ref id="B52">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grossman</surname> <given-names>M.</given-names></name> <name><surname>Wood</surname> <given-names>W.</given-names></name></person-group> (<year>1993</year>). <article-title>Sex differences in intensity of emotional experience: a social role interpretation</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>65</volume>, <fpage>1010</fpage>&#x02013;<lpage>1022</lpage>. <pub-id pub-id-type="pmid">8246109</pub-id></citation>
</ref>
<ref id="B53">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hadjikhani</surname> <given-names>N.</given-names></name> <name><surname>Hoge</surname> <given-names>R.</given-names></name> <name><surname>Snyder</surname> <given-names>J.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2008</year>). <article-title>Pointing with the eyes: the role of gaze in communicating danger</article-title>. <source>Brain Cogn</source>. <volume>68</volume>, <fpage>1</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2008.01.008</pub-id><pub-id pub-id-type="pmid">18586370</pub-id></citation>
</ref>
<ref id="B54">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harmon-Jones</surname> <given-names>E.</given-names></name></person-group> (<year>2003</year>). <article-title>Clarifying the emotive functions of asymmetrical frontal cortical activity</article-title>. <source>Psychophysiology</source> <volume>40</volume>, <fpage>838</fpage>&#x02013;<lpage>848</lpage>. <pub-id pub-id-type="doi">10.1111/1469-8986.00121</pub-id><pub-id pub-id-type="pmid">14986837</pub-id></citation>
</ref>
<ref id="B55">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haselton</surname> <given-names>M. G.</given-names></name> <name><surname>Nettle</surname> <given-names>D.</given-names></name></person-group> (<year>2006</year>). <article-title>The paranoid optimist: an integrative evolutionary model of cognitive biases</article-title>. <source>Pers. Soc. Psychol. Rev</source>. <volume>10</volume>, <fpage>47</fpage>&#x02013;<lpage>66</lpage>. <pub-id pub-id-type="doi">10.1207/s15327957pspr1001_3</pub-id><pub-id pub-id-type="pmid">16430328</pub-id></citation>
</ref>
<ref id="B56">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hasselmo</surname> <given-names>M. E.</given-names></name> <name><surname>Rolls</surname> <given-names>E. T.</given-names></name> <name><surname>Baylis</surname> <given-names>G. C.</given-names></name></person-group> (<year>1989</year>). <article-title>The role of expression and identity in the face-selective responses of neurons in the temporal cortex of the monkey</article-title>. <source>Behav. Brain Res</source>. <volume>32</volume>, <fpage>203</fpage>&#x02013;<lpage>218</lpage>. <pub-id pub-id-type="pmid">2713076</pub-id></citation>
</ref>
<ref id="B57">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haxby</surname> <given-names>J. V.</given-names></name> <name><surname>Hoffman</surname> <given-names>E. A.</given-names></name> <name><surname>Gobbini</surname> <given-names>M. I.</given-names></name></person-group> (<year>2000</year>). <article-title>The distributed human neural system for face perception</article-title>. <source>Trends Cogn. Sci</source>. <volume>4</volume>, <fpage>223</fpage>&#x02013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1016/S1364-6613(00)01482-0</pub-id><pub-id pub-id-type="pmid">10827445</pub-id></citation>
</ref>
<ref id="B58">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names> <suffix>Jr.</suffix></name> <name><surname>Grammer</surname> <given-names>K.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2009</year>). <article-title>Face gender and emotion expression: are angry women more like men?</article-title> <source>J. Vis</source>. <volume>9</volume>, <fpage>1</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1167/9.12.19</pub-id><pub-id pub-id-type="pmid">20053110</pub-id></citation>
</ref>
<ref id="B59">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names> <suffix>Jr.</suffix></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2004</year>). <article-title>Facial appearance, gender, and emotion expression</article-title>. <source>Emotion</source> <volume>4</volume>, <fpage>378</fpage>&#x02013;<lpage>388</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.4.4.378</pub-id><pub-id pub-id-type="pmid">15571436</pub-id></citation>
</ref>
<ref id="B60">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2007</year>). <article-title>Looking at you or looking elsewhere: the influence of head orientation on the signal value of emotional facial expressions</article-title>. <source>Motiv. Emotion</source> <volume>31</volume>, <fpage>137</fpage>&#x02013;<lpage>144</lpage>.</citation>
</ref>
<ref id="B61">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Sen&#x000E9;cal</surname> <given-names>S.</given-names></name> <name><surname>Kirouac</surname> <given-names>G.</given-names></name> <name><surname>Herrera</surname> <given-names>P.</given-names></name> <name><surname>Philippot</surname> <given-names>P.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2000</year>). <article-title>Emotional expressivity in men and women: stereotypes and self-perceptions</article-title>. <source>Cogn. Emotion</source> <volume>14</volume>, <fpage>609</fpage>&#x02013;<lpage>642</lpage>.</citation>
</ref>
<ref id="B62">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Thibault</surname> <given-names>P.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2010</year>). <article-title>The influence of gender, social roles and facial appearance on perceived emotionality</article-title>. <source>Eur. J. Soc. Psychol</source>. <volume>40</volume>, <fpage>1310</fpage>&#x02013;<lpage>1317</lpage>.</citation>
</ref>
<ref id="B63">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ho</surname> <given-names>C.</given-names></name> <name><surname>Jackson</surname> <given-names>J. W.</given-names></name></person-group> (<year>2001</year>). <article-title>Attitudes toward Asian Americans: theory and measurement</article-title>. <source>J. Appl. Soc. Psychol</source>. <volume>31</volume>, <fpage>1553</fpage>&#x02013;<lpage>1581</lpage>.</citation>
</ref>
<ref id="B64">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hugenberg</surname> <given-names>K.</given-names></name> <name><surname>Bodenhausen</surname> <given-names>G. V.</given-names></name></person-group> (<year>2004</year>). <article-title>Ambiguity in social categorization: the role of prejudice and facial affect in race categorization</article-title>. <source>Psychol. Sci</source>. <volume>15</volume>, <fpage>342</fpage>&#x02013;<lpage>345</lpage>. <pub-id pub-id-type="doi">10.1111/j.0956-7976.2004.00680.x</pub-id><pub-id pub-id-type="pmid">15102145</pub-id></citation>
</ref>
<ref id="B66">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>K. L.</given-names></name> <name><surname>Ghavami</surname> <given-names>N.</given-names></name></person-group> (<year>2011</year>). <article-title>At the crossroads of conspicuous and concealable: what race categories communicate about sexual orientation</article-title>. <source>PLoS ONE</source> <volume>6</volume>:<fpage>e18025</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0018025</pub-id><pub-id pub-id-type="pmid">21483863</pub-id></citation>
</ref>
<ref id="B67">
<citation citation-type="other"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>K. L.</given-names></name> <name><surname>Iida</surname> <given-names>M.</given-names></name> <name><surname>Tassinary</surname> <given-names>L. G.</given-names></name></person-group> (<year>in press</year>). <article-title>Person (mis)perception: functionally biased sex categorization of bodies</article-title>. <source>Proc. R. Soc. B Biol. Sci</source>.</citation>
</ref>
<ref id="B68">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>K. L.</given-names></name> <name><surname>McKay</surname> <given-names>L.</given-names></name> <name><surname>Pollick</surname> <given-names>F. E.</given-names></name></person-group> (<year>2011</year>). <article-title>Why &#x0201C;He throws like a girl&#x0201D; (but only when he&#x00027;s sad): emotion affects sex-decoding of biological motion displays</article-title>. <source>Cognition</source> <volume>119</volume>, <fpage>265</fpage>&#x02013;<lpage>280</lpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2011.01.016</pub-id><pub-id pub-id-type="pmid">21349506</pub-id></citation>
</ref>
<ref id="B69">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name></person-group> (<year>2000</year>). <article-title>Domain specificity in face perception</article-title>. <source>Nat. Neurosci</source>. <volume>3</volume>, <fpage>759</fpage>&#x02013;<lpage>763</lpage>. <pub-id pub-id-type="doi">10.1038/77664</pub-id><pub-id pub-id-type="pmid">10903567</pub-id></citation>
</ref>
<ref id="B70">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2006</year>). <article-title>The fusiform face area: a cortical region specialized for the perception of faces</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci</source>. <volume>361</volume>, <fpage>2109</fpage>&#x02013;<lpage>2128</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2006.1934</pub-id><pub-id pub-id-type="pmid">17118927</pub-id></citation>
</ref>
<ref id="B71">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kim</surname> <given-names>J.-N.</given-names></name> <name><surname>Shadlen</surname> <given-names>M. N.</given-names></name></person-group> (<year>1999</year>). <article-title>Neural correlates of a decision in the dorsolateral prefrontal cortex of the macaque</article-title>. <source>Nat. Neurosci</source>. <volume>2</volume>, <fpage>176</fpage>&#x02013;<lpage>185</lpage>. <pub-id pub-id-type="doi">10.1038/5739</pub-id><pub-id pub-id-type="pmid">10195203</pub-id></citation>
</ref>
<ref id="B72">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kunda</surname> <given-names>Z.</given-names></name> <name><surname>Thagard</surname> <given-names>P.</given-names></name></person-group> (<year>1996</year>). <article-title>Forming impressions from stereotypes, traits, and behaviors: a parallel-constraint-satisfaction theory</article-title>. <source>Psychol. Rev</source>. <volume>103</volume>, <fpage>284</fpage>&#x02013;<lpage>308</lpage>.</citation>
</ref>
<ref id="B73">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kurzban</surname> <given-names>R.</given-names></name> <name><surname>Tooby</surname> <given-names>J.</given-names></name> <name><surname>Cosmides</surname> <given-names>L.</given-names></name></person-group> (<year>2001</year>). <article-title>Can race be erased? Coalitional computation and social categorization</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A</source>. <volume>98</volume>, <fpage>15387</fpage>&#x02013;<lpage>15392</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.251541498</pub-id><pub-id pub-id-type="pmid">11742078</pub-id></citation>
</ref>
<ref id="B74">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kveraga</surname> <given-names>K.</given-names></name> <name><surname>Boshyan</surname> <given-names>J.</given-names></name> <name><surname>Bar</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Magnocellular projections as the trigger of top-down facilitation in recognition</article-title>. <source>J. Neurosci</source>. <volume>27</volume>, <fpage>13232</fpage>&#x02013;<lpage>13240</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.3481-07.2007</pub-id><pub-id pub-id-type="pmid">18045917</pub-id></citation>
</ref>
<ref id="B75">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>LaBar</surname> <given-names>K. S.</given-names></name> <name><surname>Crupain</surname> <given-names>M. J.</given-names></name> <name><surname>Voyvodic</surname> <given-names>J. T.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>2003</year>). <article-title>Dynamic perception of facial affect and identity in the human brain</article-title>. <source>Cereb. Cortex</source> <volume>13</volume>, <fpage>1023</fpage>&#x02013;<lpage>1033</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/13.10.1023</pub-id><pub-id pub-id-type="pmid">12967919</pub-id></citation>
</ref>
<ref id="B76">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lattner</surname> <given-names>S.</given-names></name> <name><surname>Meyer</surname> <given-names>M. E.</given-names></name> <name><surname>Friederici</surname> <given-names>A. D.</given-names></name></person-group> (<year>2005</year>). <article-title>Voice perception: sex, pitch, and the right hemisphere</article-title>. <source>Hum. Brain Mapp</source>. <volume>24</volume>, <fpage>11</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20065</pub-id><pub-id pub-id-type="pmid">15593269</pub-id></citation>
</ref>
<ref id="B77">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Levin</surname> <given-names>D. T.</given-names></name> <name><surname>Banaji</surname> <given-names>M. R.</given-names></name></person-group> (<year>2006</year>). <article-title>Distortions in the perceived lightness of faces: the role of race categories</article-title>. <source>J. Exp. Psychol. Gen</source>. <volume>135</volume>, <fpage>501</fpage>&#x02013;<lpage>512</lpage>. <pub-id pub-id-type="doi">10.1037/0096-3445.135.4.501</pub-id><pub-id pub-id-type="pmid">17087569</pub-id></citation>
</ref>
<ref id="B78">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lobmaier</surname> <given-names>J. S.</given-names></name> <name><surname>Tiddeman</surname> <given-names>B. P.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2008</year>). <article-title>Emotional expression modulates perceived gaze direction</article-title>. <source>Emotion</source> <volume>8</volume>, <fpage>573</fpage>&#x02013;<lpage>577</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.8.4.573</pub-id><pub-id pub-id-type="pmid">18729587</pub-id></citation>
</ref>
<ref id="B79">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>MacLin</surname> <given-names>O. H.</given-names></name> <name><surname>Malpass</surname> <given-names>R. S.</given-names></name></person-group> (<year>2001</year>). <article-title>Racial categorization of faces: the ambiguous-race face effect</article-title>. <source>Psychol. Public Policy Law</source> <volume>7</volume>, <fpage>98</fpage>&#x02013;<lpage>118</lpage>.</citation>
</ref>
<ref id="B80">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>MacLin</surname> <given-names>O. H.</given-names></name> <name><surname>Malpass</surname> <given-names>R. S.</given-names></name></person-group> (<year>2003</year>). <article-title>The ambiguous-race face illusion</article-title>. <source>Perception</source> <volume>32</volume>, <fpage>249</fpage>&#x02013;<lpage>252</lpage>. <pub-id pub-id-type="pmid">12696668</pub-id></citation>
</ref>
<ref id="B81">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macrae</surname> <given-names>C. N.</given-names></name> <name><surname>Bodenhausen</surname> <given-names>G. V.</given-names></name></person-group> (<year>2000</year>). <article-title>Social cognition: thinking categorically about others</article-title>. <source>Annu. Rev. Psychol</source>. <volume>51</volume>, <fpage>93</fpage>&#x02013;<lpage>120</lpage>. <pub-id pub-id-type="doi">10.1146/annurev.psych.51.1.93</pub-id><pub-id pub-id-type="pmid">10751966</pub-id></citation>
</ref>
<ref id="B82">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macrae</surname> <given-names>C. N.</given-names></name> <name><surname>Bodenhausen</surname> <given-names>G. V.</given-names></name> <name><surname>Milne</surname> <given-names>A. B.</given-names></name></person-group> (<year>1995</year>). <article-title>The dissection of selection in person perception: inhibitory processes in social stereotyping</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>69</volume>, <fpage>397</fpage>&#x02013;<lpage>407</lpage>. <pub-id pub-id-type="pmid">7562387</pub-id></citation>
</ref>
<ref id="B83">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Marr</surname> <given-names>D.</given-names></name></person-group> (<year>1982</year>). <source>Vision</source>. <publisher-loc>San Francisco, CA</publisher-loc>: <publisher-name>W. H. Freeman</publisher-name>.</citation>
</ref>
<ref id="B84">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Marsh</surname> <given-names>A. A.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2005</year>). <article-title>Why do fear and anger look the way they do? form and social function in facial expressions</article-title>. <source>Pers. Soc. Psychol. Bull</source>. <volume>31</volume>, <fpage>73</fpage>&#x02013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1177/0146167204271306</pub-id><pub-id pub-id-type="pmid">15574663</pub-id></citation>
</ref>
<ref id="B85">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Martin</surname> <given-names>W.</given-names></name> <name><surname>Rovira</surname> <given-names>M.</given-names></name></person-group> (<year>1982</year>). <article-title>Response biases in eye-gaze perception</article-title>. <source>J. Psychol</source>. <volume>110</volume>, <fpage>203</fpage>&#x02013;<lpage>209</lpage>. <pub-id pub-id-type="doi">10.1080/00223980.1982.9915341</pub-id><pub-id pub-id-type="pmid">7069638</pub-id></citation>
</ref>
<ref id="B85a">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mathews</surname> <given-names>A.</given-names></name> <name><surname>Fox</surname> <given-names>E.</given-names></name> <name><surname>Yiend</surname> <given-names>J.</given-names></name> <name><surname>Calder</surname> <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>The face of fear: effects of eye gaze and emotion on visual attention</article-title>. <source>Vis. Cogn</source>. <volume>10</volume>, <fpage>823</fpage>&#x02013;<lpage>835</lpage>. <pub-id pub-id-type="doi">10.1080/13506280344000095</pub-id><pub-id pub-id-type="pmid">17453064</pub-id></citation>
</ref>
<ref id="B86">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McClelland</surname> <given-names>J. L.</given-names></name></person-group> (<year>1991</year>). <article-title>Stochastic interactive processes and the effect of context on perception</article-title>. <source>Cognit. Psychol</source>. <volume>23</volume>, <fpage>1</fpage>&#x02013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1016/0010-0285(91)90002-6</pub-id><pub-id pub-id-type="pmid">2001614</pub-id></citation>
</ref>
<ref id="B87">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Michel</surname> <given-names>C.</given-names></name> <name><surname>Corneille</surname> <given-names>O.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2007</year>). <article-title>Race categorization modulates holistic face encoding</article-title>. <source>Cogn. Sci</source>. <volume>31</volume>, <fpage>911</fpage>&#x02013;<lpage>924</lpage>. <pub-id pub-id-type="doi">10.1080/03640210701530805</pub-id><pub-id pub-id-type="pmid">21635322</pub-id></citation>
</ref>
<ref id="B88">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Miller</surname> <given-names>S. L.</given-names></name> <name><surname>Maner</surname> <given-names>J. K.</given-names></name> <name><surname>Becker</surname> <given-names>D. V.</given-names></name></person-group> (<year>2010</year>). <article-title>Self-protective biases in group categorization: threat cues shape the psychological boundary between &#x0201C;us&#x0201D; and &#x0201C;them&#x0201D;</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>99</volume>, <fpage>62</fpage>&#x02013;<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1037/a0018086</pub-id><pub-id pub-id-type="pmid">20565186</pub-id></citation>
</ref>
<ref id="B89">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oosterhof</surname> <given-names>N. N.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>Shared perceptual basis of emotional expressions and trustworthiness impressions from faces</article-title>. <source>Emotion</source> <volume>9</volume>, <fpage>128</fpage>&#x02013;<lpage>133</lpage>. <pub-id pub-id-type="doi">10.1037/a0014520</pub-id><pub-id pub-id-type="pmid">19186926</pub-id></citation>
</ref>
<ref id="B90">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pauker</surname> <given-names>K.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2009</year>). <article-title>Multiracial faces: how categorization affects memory at the boundaries of race</article-title>. <source>J. Soc. Issues</source> <volume>65</volume>, <fpage>69</fpage>&#x02013;<lpage>86</lpage>.</citation>
</ref>
<ref id="B91">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pauker</surname> <given-names>K.</given-names></name> <name><surname>Weisbuch</surname> <given-names>M.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name> <name><surname>Sommers</surname> <given-names>S. R.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names> <suffix>Jr.</suffix></name> <name><surname>Ivcevic</surname> <given-names>Z.</given-names></name></person-group> (<year>2009</year>). <article-title>Not so black and white: memory for ambiguous group members</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>96</volume>, <fpage>795</fpage>&#x02013;<lpage>810</lpage>. <pub-id pub-id-type="doi">10.1037/a0013265</pub-id><pub-id pub-id-type="pmid">19309203</pub-id></citation>
</ref>
<ref id="B92">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peelen</surname> <given-names>M. V.</given-names></name> <name><surname>Downing</surname> <given-names>P. E.</given-names></name></person-group> (<year>2007</year>). <article-title>The neural basis of visual body perception</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>8</volume>, <fpage>636</fpage>&#x02013;<lpage>648</lpage>. <pub-id pub-id-type="doi">10.1038/nrn2195</pub-id><pub-id pub-id-type="pmid">17643089</pub-id></citation>
</ref>
<ref id="B93">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perrett</surname> <given-names>D. I.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name> <name><surname>Oram</surname> <given-names>M. W.</given-names></name> <name><surname>Benson</surname> <given-names>P. J.</given-names></name> <name><surname>Rolls</surname> <given-names>E. T.</given-names></name></person-group> (<year>1992</year>). <article-title>Organization and function of cells responsive to faces in the temporal cortex</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci</source>. <volume>335</volume>, <fpage>23</fpage>&#x02013;<lpage>30</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.1992.0003</pub-id><pub-id pub-id-type="pmid">1348133</pub-id></citation>
</ref>
<ref id="B94">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Plant</surname> <given-names>E. A.</given-names></name> <name><surname>Hyde</surname> <given-names>J. S.</given-names></name> <name><surname>Keltner</surname> <given-names>D.</given-names></name> <name><surname>Devine</surname> <given-names>P. G.</given-names></name></person-group> (<year>2000</year>). <article-title>The gender stereotyping of emotion</article-title>. <source>Psychol. Women Q</source>. <volume>24</volume>, <fpage>81</fpage>&#x02013;<lpage>92</lpage>.</citation>
</ref>
<ref id="B95">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Putman</surname> <given-names>P.</given-names></name> <name><surname>Hermans</surname> <given-names>E. J.</given-names></name> <name><surname>van Honk</surname> <given-names>J.</given-names></name></person-group> (<year>2007</year>). <article-title>Exogenous cortisol shifts a motivated bias from fear to anger in spatial working memory for facial expressions</article-title>. <source>Psychoneuroendocrinology</source> <volume>32</volume>, <fpage>14</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1016/j.psyneuen.2006.09.010</pub-id><pub-id pub-id-type="pmid">17088024</pub-id></citation>
</ref>
<ref id="B96">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Quinn</surname> <given-names>K. A.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2005</year>). <article-title>Categorizing others: the dynamics of person construal</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>88</volume>, <fpage>467</fpage>&#x02013;<lpage>479</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.88.3.467</pub-id><pub-id pub-id-type="pmid">15740440</pub-id></citation>
</ref>
<ref id="B97">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Rogers</surname> <given-names>T. T.</given-names></name> <name><surname>McClelland</surname> <given-names>J. L.</given-names></name></person-group> (<year>2004</year>). <source>Semantic Cognition: A Parallel Distributed Processing Approach</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation>
</ref>
<ref id="B98">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Rumelhart</surname> <given-names>D. E.</given-names></name> <name><surname>Hinton</surname> <given-names>G. E.</given-names></name> <name><surname>McClelland</surname> <given-names>J. L.</given-names></name></person-group> (<year>1986</year>). <source>A General Framework for Parallel Distributed Processing</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation>
</ref>
<ref id="B99">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sacco</surname> <given-names>D. F.</given-names></name> <name><surname>Hugenberg</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>The look of fear and anger: facial maturity modulates recognition of fearful and angry expressions</article-title>. <source>Emotion</source> <volume>9</volume>, <fpage>39</fpage>&#x02013;<lpage>49</lpage>. <pub-id pub-id-type="doi">10.1037/a0014081</pub-id><pub-id pub-id-type="pmid">19186915</pub-id></citation>
</ref>
<ref id="B100">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Said</surname> <given-names>C. P.</given-names></name> <name><surname>Sebe</surname> <given-names>N.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>Structural resemblance to emotional expressions predicts evaluation of emotionally neutral faces</article-title>. <source>Emotion</source> <volume>9</volume>, <fpage>260</fpage>&#x02013;<lpage>264</lpage>. <pub-id pub-id-type="doi">10.1037/a0014681</pub-id><pub-id pub-id-type="pmid">19348537</pub-id></citation>
</ref>
<ref id="B101">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sander</surname> <given-names>D.</given-names></name> <name><surname>Grandjean</surname> <given-names>D.</given-names></name> <name><surname>Kaiser</surname> <given-names>S.</given-names></name> <name><surname>Wehrle</surname> <given-names>T.</given-names></name> <name><surname>Scherer</surname> <given-names>K. R.</given-names></name></person-group> (<year>2007</year>). <article-title>Interaction effects of perceived gaze direction and dynamic facial expression: evidence for appraisal theories of emotion</article-title>. <source>Eur. J. Cogn. Psychol</source>. <volume>19</volume>, <fpage>470</fpage>&#x02013;<lpage>480</lpage>.</citation>
</ref>
<ref id="B102">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sato</surname> <given-names>W.</given-names></name> <name><surname>Yoshikawa</surname> <given-names>S.</given-names></name> <name><surname>Kochiyama</surname> <given-names>T.</given-names></name> <name><surname>Matsumura</surname> <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>The amygdala processes the emotional significance of facial expressions: an fMRI investigation using the interaction between expression and face direction</article-title>. <source>Neuroimage</source> <volume>22</volume>, <fpage>1006</fpage>&#x02013;<lpage>1013</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.02.030</pub-id><pub-id pub-id-type="pmid">15193632</pub-id></citation>
</ref>
<ref id="B103">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Burton</surname> <given-names>A. M.</given-names></name> <name><surname>Kelly</surname> <given-names>S. W.</given-names></name></person-group> (<year>1999</year>). <article-title>Asymmetric dependencies in perceiving identity and emotion: experiments with morphed faces</article-title>. <source>Percept. Psychophys</source>. <volume>61</volume>, <fpage>1102</fpage>&#x02013;<lpage>1115</lpage>. <pub-id pub-id-type="pmid">10497431</pub-id></citation>
</ref>
<ref id="B104">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Sherif</surname> <given-names>M.</given-names></name></person-group> (<year>1967</year>). <source>Group Conflict and Co-operation: Their Social Psychology</source>. <publisher-loc>London</publisher-loc>: <publisher-name>Routledge and K. Paul</publisher-name>.</citation>
</ref>
<ref id="B105">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sinclair</surname> <given-names>L.</given-names></name> <name><surname>Kunda</surname> <given-names>Z.</given-names></name></person-group> (<year>1999</year>). <article-title>Reactions to a black professional: motivated inhibition and activation of conflicting stereotypes</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>77</volume>, <fpage>885</fpage>&#x02013;<lpage>904</lpage>. <pub-id pub-id-type="pmid">10573871</pub-id></citation>
</ref>
<ref id="B106">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Slepian</surname> <given-names>M. L.</given-names></name> <name><surname>Weisbuch</surname> <given-names>M.</given-names></name> <name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Ambady</surname> <given-names>N.</given-names></name></person-group> (<year>2011</year>). <article-title>Gender moderates the relationship between emotion and perceived gaze</article-title>. <source>Emotion</source> <volume>11</volume>, <fpage>1439</fpage>&#x02013;<lpage>1444</lpage>. <pub-id pub-id-type="doi">10.1037/a0026163</pub-id><pub-id pub-id-type="pmid">22142212</pub-id></citation>
</ref>
<ref id="B107">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>E. R.</given-names></name></person-group> (<year>2006</year>). <article-title>Multiply categorizable social objects: representational models and some potential determinants of category use</article-title>, in <source>Multiple Social Categorization: Processes, Models and Applications</source> ed <person-group person-group-type="editor"><name><surname>Crisp</surname> <given-names>R.</given-names></name></person-group> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>Psychology Press</publisher-name>), <fpage>50</fpage>&#x02013;<lpage>62</lpage>.</citation>
</ref>
<ref id="B108">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>P. L.</given-names></name> <name><surname>Ratcliff</surname> <given-names>R.</given-names></name></person-group> (<year>2004</year>). <article-title>Psychology and neurobiology of simple decisions</article-title>. <source>Trends Neurosci</source>. <volume>27</volume>, <fpage>161</fpage>&#x02013;<lpage>168</lpage>. <pub-id pub-id-type="doi">10.1016/j.tins.2004.01.006</pub-id><pub-id pub-id-type="pmid">15036882</pub-id></citation>
</ref>
<ref id="B109">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Smolensky</surname> <given-names>P.</given-names></name></person-group> (<year>1989</year>). <article-title>Connectionist modeling: neural computation/mental connections</article-title>, in <source>Neural Connections, Mental Computations</source>, eds <person-group person-group-type="editor"><name><surname>Nadel</surname> <given-names>L.</given-names></name> <name><surname>Cooper</surname> <given-names>A.</given-names></name> <name><surname>Culicover</surname> <given-names>P.</given-names></name> <name><surname>Harnish</surname> <given-names>R. M.</given-names></name></person-group> (<publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>), <fpage>233</fpage>&#x02013;<lpage>250</lpage>.</citation>
</ref>
<ref id="B110">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Spivey</surname> <given-names>M. J.</given-names></name></person-group> (<year>2007</year>). <source>The Continuity of Mind</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</citation>
</ref>
<ref id="B111">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spivey</surname> <given-names>M. J.</given-names></name> <name><surname>Dale</surname> <given-names>R.</given-names></name></person-group> (<year>2006</year>). <article-title>Continuous dynamics in real-time cognition</article-title>. <source>Curr. Dir. Psychol. Sci</source>. <volume>15</volume>, <fpage>207</fpage>&#x02013;<lpage>211</lpage>.</citation>
</ref>
<ref id="B112">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stangor</surname> <given-names>C.</given-names></name> <name><surname>Lynch</surname> <given-names>L.</given-names></name> <name><surname>Duan</surname> <given-names>C.</given-names></name> <name><surname>Glass</surname> <given-names>B.</given-names></name></person-group> (<year>1992</year>). <article-title>Categorization of individuals on the basis of multiple social features</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>62</volume>, <fpage>207</fpage>&#x02013;<lpage>218</lpage>.</citation>
</ref>
<ref id="B113">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stroessner</surname> <given-names>S. J.</given-names></name></person-group> (<year>1996</year>). <article-title>Social categorization by race or sex: effects of perceived non-normalcy on response times</article-title>. <source>Soc. Cogn</source>. <volume>14</volume>, <fpage>247</fpage>&#x02013;<lpage>276</lpage>.</citation>
</ref>
<ref id="B114">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sugase</surname> <given-names>Y.</given-names></name> <name><surname>Yamane</surname> <given-names>S.</given-names></name> <name><surname>Ueno</surname> <given-names>S.</given-names></name> <name><surname>Kawano</surname> <given-names>K.</given-names></name></person-group> (<year>1999</year>). <article-title>Global and fine information coded by single neurons in the temporal visual cortex</article-title>. <source>Nature</source> <volume>400</volume>, <fpage>869</fpage>&#x02013;<lpage>873</lpage>. <pub-id pub-id-type="doi">10.1038/23703</pub-id><pub-id pub-id-type="pmid">10476965</pub-id></citation>
</ref>
<ref id="B115">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tajfel</surname> <given-names>H.</given-names></name></person-group> (<year>1969</year>). <article-title>Cognitive aspects of prejudice</article-title>. <source>J. Soc. Issues</source> <volume>25</volume>, <fpage>79</fpage>&#x02013;<lpage>97</lpage>. <pub-id pub-id-type="pmid">5373848</pub-id></citation>
</ref>
<ref id="B116">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tipples</surname> <given-names>J.</given-names></name></person-group> (<year>2006</year>). <article-title>Fear and fearfulness potentiate automatic orienting to eye gaze</article-title>. <source>Cogn. Emotion</source> <volume>20</volume>, <fpage>309</fpage>&#x02013;<lpage>320</lpage>.</citation>
</ref>
<ref id="B117">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Todorov</surname> <given-names>A.</given-names></name> <name><surname>Uleman</surname> <given-names>J. S.</given-names></name></person-group> (<year>2003</year>). <article-title>The efficiency of binding spontaneous trait inferences to actors&#x00027; faces</article-title>. <source>J. Exp. Soc. Psychol</source>. <volume>39</volume>, <fpage>549</fpage>&#x02013;<lpage>562</lpage>.</citation>
</ref>
<ref id="B118">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tranel</surname> <given-names>D.</given-names></name> <name><surname>Damasio</surname> <given-names>A. R.</given-names></name></person-group> (<year>1988</year>). <article-title>Non-conscious face recognition in patients with face agnosia</article-title>. <source>Behav. Brain Res</source>. <volume>30</volume>, <fpage>235</fpage>&#x02013;<lpage>249</lpage>. <pub-id pub-id-type="pmid">3178995</pub-id></citation>
</ref>
<ref id="B119">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Treisman</surname> <given-names>A.</given-names></name></person-group> (<year>1996</year>). <article-title>The binding problem</article-title>. <source>Curr. Opin. Neurobiol</source>. <volume>6</volume>, <fpage>171</fpage>&#x02013;<lpage>178</lpage>. <pub-id pub-id-type="doi">10.1016/S0959-4388(96)80070-5</pub-id><pub-id pub-id-type="pmid">8725958</pub-id></citation>
</ref>
<ref id="B120">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Usher</surname> <given-names>M.</given-names></name> <name><surname>McClelland</surname> <given-names>J. L.</given-names></name></person-group> (<year>2001</year>). <article-title>The time course of perceptual choice: the leaky, competing accumulator model</article-title>. <source>Psychol. Rev</source>. <volume>108</volume>, <fpage>550</fpage>&#x02013;<lpage>592</lpage>. <pub-id pub-id-type="pmid">11488378</pub-id></citation>
</ref>
<ref id="B121">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vescio</surname> <given-names>T. K.</given-names></name> <name><surname>Judd</surname> <given-names>C. M.</given-names></name> <name><surname>Kwan</surname> <given-names>V. S.</given-names></name></person-group> (<year>2004</year>). <article-title>The crossed-categorization hypothesis: evidence of reductions in the strength of categorization, but not intergroup bias</article-title>. <source>J. Exp. Soc. Psychol</source>. <volume>40</volume>, <fpage>478</fpage>&#x02013;<lpage>496</lpage>.</citation>
</ref>
<ref id="B122">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Willis</surname> <given-names>J.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>First impressions: making up your mind after a 100-ms exposure to a face</article-title>. <source>Psychol. Sci</source>. <volume>17</volume>, <fpage>592</fpage>&#x02013;<lpage>598</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01750.x</pub-id><pub-id pub-id-type="pmid">16866745</pub-id></citation>
</ref>
<ref id="B123">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winston</surname> <given-names>J. S.</given-names></name> <name><surname>Henson</surname> <given-names>R. N. A.</given-names></name> <name><surname>Fine-Goulden</surname> <given-names>M. R.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2004</year>). <article-title>fMRI-adaptation reveals dissociable neural representations of identity and expression in face perception</article-title>. <source>J. Neurophysiol</source>. <volume>92</volume>, <fpage>1830</fpage>&#x02013;<lpage>1839</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00155.2004</pub-id><pub-id pub-id-type="pmid">15115795</pub-id></citation>
</ref>
<ref id="B124">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Young</surname> <given-names>A. W.</given-names></name> <name><surname>Newcombe</surname> <given-names>F.</given-names></name> <name><surname>de Haan</surname> <given-names>E. H. F.</given-names></name> <name><surname>Small</surname> <given-names>M.</given-names></name> <name><surname>Hay</surname> <given-names>D. C.</given-names></name></person-group> (<year>1993</year>). <article-title>Face perception after brain injury: selective impairments affecting identity and expression</article-title>. <source>Brain</source> <volume>116</volume>, <fpage>941</fpage>&#x02013;<lpage>959</lpage>. <pub-id pub-id-type="doi">10.1093/brain/116.4.941</pub-id><pub-id pub-id-type="pmid">8353717</pub-id></citation>
</ref>
<ref id="B125">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zarate</surname> <given-names>M. A.</given-names></name> <name><surname>Smith</surname> <given-names>E. R.</given-names></name></person-group> (<year>1990</year>). <article-title>Person categorization and stereotyping</article-title>. <source>Soc. Cogn</source>. <volume>8</volume>, <fpage>161</fpage>&#x02013;<lpage>185</lpage>.</citation>
</ref>
<ref id="B126">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Zebrowitz</surname> <given-names>L. A.</given-names></name></person-group> (<year>1997</year>). <source>Reading Faces: Window to the Soul? New Directions in Social Psychology</source>. <publisher-loc>Boulder, CO</publisher-loc>: <publisher-name>Westview Press</publisher-name>.</citation>
</ref>
<ref id="B127">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zebrowitz</surname> <given-names>L. A.</given-names></name> <name><surname>Kikuchi</surname> <given-names>M.</given-names></name> <name><surname>Fellous</surname> <given-names>J. M.</given-names></name></person-group> (<year>2007</year>). <article-title>Are effects of emotion expression on trait impressions mediated by babyfaceness? evidence from connectionist modeling</article-title>. <source>Pers. Soc. Psychol. Bull</source>. <volume>33</volume>, <fpage>648</fpage>&#x02013;<lpage>662</lpage>. <pub-id pub-id-type="doi">10.1177/0146167206297399</pub-id><pub-id pub-id-type="pmid">17440203</pub-id></citation>
</ref>
<ref id="B128">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zebrowitz</surname> <given-names>L. A.</given-names></name> <name><surname>Kikuchi</surname> <given-names>M.</given-names></name> <name><surname>Fellous</surname> <given-names>J.-M.</given-names></name></person-group> (<year>2010</year>). <article-title>Facial resemblance to emotions: group differences, impression effects, and race stereotypes</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>98</volume>, <fpage>175</fpage>&#x02013;<lpage>189</lpage>. <pub-id pub-id-type="doi">10.1037/a0017990</pub-id><pub-id pub-id-type="pmid">20085393</pub-id></citation>
</ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup>It is important to note that such findings could also potentially be accounted for by a response bias. In that case, the top-down effect may reflect a higher-order, decision-level response criterion altered by stereotypic expectations or a perceiver&#x00027;s motivational state, rather than a genuine change in perception.</p></fn>
<fn id="fn0002"><p><sup>2</sup>Given that the connectionist model employs localist nodes that are relatively abstract and that model parameters are hand-coded, the model should be seen as an existence proof for the kind of processing proposed. Future research will need to expand and refine the model to assess its plausibility more rigorously.</p></fn>
</fn-group>
</back>
</article>