<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2013.00318</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Review Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Face Adaptation Effects: Reviewing the Impact of Adapting Information, Time, and Transfer</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Strobach</surname> <given-names>Tilo</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="corresp" rid="cor1">&#x0002A;</xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Carbon</surname> <given-names>Claus-Christian</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, Humboldt-University</institution>, <addr-line>Berlin</addr-line>, <country>Germany</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Psychology, Ludwig-Maximilians-University</institution>, <addr-line>Munich</addr-line>, <country>Germany</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of General Psychology and Methodology, University of Bamberg</institution>, <addr-line>Bamberg</addr-line>, <country>Germany</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Peter J. Hills, Anglia Ruskin University, UK</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Alice O&#x02019;Toole, University of Texas at Dallas, USA; Graham Hole, University of Sussex, UK</p></fn>
<corresp content-type="corresp" id="cor1">&#x0002A;Correspondence: Tilo Strobach, Department of Psychology, Humboldt-University Berlin, Rudower Chaussee 18, 12489 Berlin, Germany e-mail: <email>tilo.strobach&#x00040;hu-berlin.de</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Frontiers in Perception Science, a specialty of Frontiers in Psychology.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>03</day>
<month>06</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="collection">
<year>2013</year>
</pub-date>
<volume>4</volume>
<elocation-id>318</elocation-id>
<history>
<date date-type="received">
<day>01</day>
<month>03</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>16</day>
<month>05</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2013 Strobach and Carbon.</copyright-statement>
<copyright-year>2013</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/3.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.</p></license>
</permissions>
<abstract><p>The ability to adapt is essential to live and survive in an ever-changing environment such as the human ecosystem. Here we review the literature on adaptation effects of face stimuli to give an overview of existing findings in this area, highlight gaps in its research literature, initiate new directions in face adaptation research, and help to design future adaptation studies. Furthermore, this review should lead to a better understanding of the processing characteristics as well as the mental representations of face-relevant information. The review systematizes studies at a behavioral level with respect to a framework that includes three dimensions representing the major characteristics of studies in this field of research. These dimensions comprise (1) the specificity of adapting face information, e.g., identity, gender, or age aspects of the material to be adapted to, (2) aspects of timing (e.g., the sustainability of adaptation effects), and (3) transfer relations between face images presented during adaptation and adaptation tests (e.g., images of the same or different identities). The review concludes with options for how to combine findings across different dimensions to demonstrate the relevance of our framework for future studies.</p></abstract>
<kwd-group>
<kwd>face adaptation</kwd>
<kwd>figural adaptation effects</kwd>
<kwd>memory representation</kwd>
<kwd>learning</kwd>
<kwd>plasticity</kwd>
<kwd>perception</kwd>
<kwd>transfer</kwd>
<kwd>delay</kwd>
</kwd-group>
<counts>
<fig-count count="2"/>
<table-count count="4"/>
<equation-count count="0"/>
<ref-count count="95"/>
<page-count count="12"/>
<word-count count="9744"/>
</counts>
</article-meta>
</front>
<body>
<p>The term <italic>adaptation</italic> refers to the ability to adjust to novel information and experiences. This ability to adapt is essential for living and surviving in an ever-changing environment such as the human ecosystem (Carbon and Ditye, <xref ref-type="bibr" rid="B14">2012</xref>). Visual adaptation in particular is an effect of the processes by which the visual system encodes and represents information, and includes a process by which the visual system (passively or actively) alters its function in response to the lack of fit between mental representations and perceived objects (e.g., Clifford et al., <xref ref-type="bibr" rid="B21">2000</xref>; Clifford and Rhodes, <xref ref-type="bibr" rid="B20">2005</xref>). Such responses result in adaptation effects. In experimental situations, such adaptation effects are typically assessed in an adaptation test phase following an adaptation phase. Intensive investigations of these adaptation effects provide an excellent opportunity for an exploration and deeper understanding of the processing architecture as well as the representation of particular stimuli (Li et al., <xref ref-type="bibr" rid="B53">2008a</xref>; Webster, <xref ref-type="bibr" rid="B87">2011</xref>).</p>
<p>The present paper provides a systematic review of the literature on adaptation effects of face stimuli<xref ref-type="fn" rid="fn1"><sup>1</sup></xref>: which areas does this review on face adaptation cover and which areas does it neglect? This review focuses on empirical studies that investigate face adaptation effects at a behavioral level; this is typically realized by an overt categorization of face stimuli in a test phase. In fact, we focus on adaptation under optimal conditions in a fully developed and (more or less) optimally functioning cognitive system; i.e., we review findings of studies typically investigating adaptation effects in younger adults with particularly impressive face recognition skills; for instance, normal persons can discriminate thousands of faces (Jeffery and Rhodes, <xref ref-type="bibr" rid="B34">2011</xref>) once they have reached so-called &#x0201C;face expertise&#x0201D; (Schwaninger et al., <xref ref-type="bibr" rid="B74">2003</xref>). This focus on complex objects of the face category is realized in an exclusive and extensive way; that is, we do not relate findings in the area of face adaptation effects to other visual coding mechanisms such as color coding, as was done in previous work (Webster, <xref ref-type="bibr" rid="B87">2011</xref>; Webster and MacLeod, <xref ref-type="bibr" rid="B89">2011</xref>). By reviewing face issues exclusively, we believe we can serve the main aim of this review most efficiently: to clearly highlight and systematize existing findings as well as gaps in the research literature in the area of face adaptation effects. This systematization should stimulate new directions in face adaptation research and help to design future adaptation studies. 
In contrast, we do not include results about the adaptation of neural processes to face stimuli, for instance, studies on modulations of the N170 as a result of prior adaptation (e.g., Kov&#x000E1;cs et al., <xref ref-type="bibr" rid="B44">2006</xref>; Kloth et al., <xref ref-type="bibr" rid="B42">2010</xref>), as questions in this area of research involve further dimensions and use different theoretical frameworks, mostly based on specific brain processes and structures. In addition, we omit research from developmental and evolutionary perspectives on face adaptation effects, as these have already been the major aim of recent review papers (Leopold and Rhodes, <xref ref-type="bibr" rid="B51">2010</xref>; Jeffery and Rhodes, <xref ref-type="bibr" rid="B34">2011</xref>).</p>
<p>We start this review by presenting a framework that enables a systematic organization of findings in the field of face adaptation effects. This framework includes dimensions representing the major characteristics of studies (i.e., experimental manipulations) or operational parameters in this field. As discussed in detail in a later section, the dimensions enable a categorization of (1) the various types of adapting face information, (2) the timing characteristics of adaptation effects (e.g., the delay between adaptation and adaptation test phases), and (3) the transfer relations between face images presented during these phases (e.g., images of the same or different identities). In the main section of the present review, we discuss the research literature with respect to each of the framework&#x02019;s dimensions. Finally, we present options for how to combine findings across different dimensions to demonstrate the relevance of our framework for future studies.</p>
<p>Investigations of adaptation effects in faces are highly relevant for the progress of cognitive research, since these effects offer a window into the processes and dynamics of highly complex object processing. First of all, faces are arguably the most important social stimuli, since they are the primary means by which we perceive identity information, emotional information, etc. (see below). This expertise and its investigation via adaptation effects provide an essential tool for dissecting different levels of the neural code and the visual pathway in face processing. This latter fact is also the motivation for a close look at the timing and, additionally, the transfer characteristics of adaptation effects in faces. As research on adaptation effects in faces is moreover a broad and elaborated field today, represented by a great number of adaptation studies employing different procedures and aiming at different research questions, this research field also offers an excellent opportunity for taking a more general perspective on the effects found, in the form of a review.</p>
<sec id="S1">
<title>Framework to Conceptualize Research on Face Adaptation Effects</title>
<p>As illustrated in Figure <xref ref-type="fig" rid="F1">1</xref> and in Table <xref ref-type="table" rid="T1">1</xref>, we integrate findings in the field of face adaptation research into a conceptual framework that includes three dimensions. The first dimension of this framework represents different types of potential facial information which are susceptible to adaptation; we call this dimension <italic>adapting information</italic>. Instances of adapting information are identity information (e.g., Leopold et al., <xref ref-type="bibr" rid="B50">2001</xref>; Jiang et al., <xref ref-type="bibr" rid="B38">2006</xref>), configural information (e.g., Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Little et al., <xref ref-type="bibr" rid="B56">2008</xref>), gaze information (e.g., Jenkins et al., <xref ref-type="bibr" rid="B37">2006</xref>), emotional information (e.g., Webster et al., <xref ref-type="bibr" rid="B88">2004</xref>; Ng et al., <xref ref-type="bibr" rid="B60">2008</xref>; Adams et al., <xref ref-type="bibr" rid="B1">2010</xref>), age information (e.g., Schweinberger et al., <xref ref-type="bibr" rid="B76">2010</xref>), gender information (e.g., Kov&#x000E1;cs et al., <xref ref-type="bibr" rid="B45">2007</xref>; Bestelmeyer et al., <xref ref-type="bibr" rid="B6">2008</xref>), ethnicity information (e.g., Jaquet et al., <xref ref-type="bibr" rid="B32">2008</xref>; Ng et al., <xref ref-type="bibr" rid="B60">2008</xref>), attractiveness information (e.g., Anzures et al., <xref ref-type="bibr" rid="B3">2009</xref>), or viewpoint information (e.g., Chen et al., <xref ref-type="bibr" rid="B19">2010</xref>). The present list of adapting information is completed by face information investigated in the context of face distortion aftereffects (FDAEs; e.g., Webster and MacLin, <xref ref-type="bibr" rid="B90">1999</xref>). 
This method of manipulating faces may affect several of the types of facial information listed above (e.g., configural information, age, identity, gender; see also later sections). However, FDAEs are realized by unique manipulation algorithms (i.e., distortions by expanding or contracting the frontal-view original face image relative to a midpoint on the nose). Further, these algorithms reduce control over which specific types of information are affected. For example, manipulating faces in the context of FDAEs affects facial features (e.g., eyes, mouth, etc.), while manipulations of configural information (i.e., spatial distances between these features) largely leave these features unaffected. Generally, these different types of adapting information realize different levels of ecological validity. While differences in age, viewpoint, gaze, or emotion are plausible and realistic in an ecological context, manipulations of identity or configural information have less validity, since such changes do not typically occur in the ecosystem.</p>
<fig position="float" id="F1">
<label>Figure 1</label>
<caption>
<p><bold>Framework to review face adaptation effects including dimensions for different types of adapting information, transfer effects, and timing between adaptation and adaptation test phases</bold>.</p></caption>
<graphic xlink:href="fpsyg-04-00318-g001.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption>
<p><bold>Overview of types of adapting face information and related references</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Adapting face information</th>
<th align="left">Reference</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Age information</td>
<td align="left">O&#x02019;Neil and Webster (<xref ref-type="bibr" rid="B62">2011</xref>), Schweinberger et al. (<xref ref-type="bibr" rid="B76">2010</xref>)</td>
</tr>
<tr>
<td align="left">Attractiveness information</td>
<td align="left">Anzures et al. (<xref ref-type="bibr" rid="B3">2009</xref>), Carbon et al. (<xref ref-type="bibr" rid="B17">2007a</xref>), MacLin and Webster (<xref ref-type="bibr" rid="B57">2001</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B71">2003</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B67">2009b</xref>), Webster and MacLin (<xref ref-type="bibr" rid="B90">1999</xref>)</td>
</tr>
<tr>
<td align="left">Configural information</td>
<td align="left">Carbon and Ditye (<xref ref-type="bibr" rid="B13">2011</xref>), Carbon and Leder (<xref ref-type="bibr" rid="B15">2005</xref>), Carbon et al. (<xref ref-type="bibr" rid="B18">2007b</xref>), Little et al. (<xref ref-type="bibr" rid="B55">2005</xref>), Little et al. (<xref ref-type="bibr" rid="B56">2008</xref>), McKone et al. (<xref ref-type="bibr" rid="B58">2005</xref>), Strobach et al. (<xref ref-type="bibr" rid="B77">2011</xref>)</td>
</tr>
<tr>
<td align="left">Emotion information</td>
<td align="left">Adams et al. (<xref ref-type="bibr" rid="B1">2010</xref>), Benton and Burgess (<xref ref-type="bibr" rid="B5">2008</xref>), Fox and Barton (<xref ref-type="bibr" rid="B24">2007</xref>), Ng et al. (<xref ref-type="bibr" rid="B60">2008</xref>), Webster et al. (<xref ref-type="bibr" rid="B88">2004</xref>)</td>
</tr>
<tr>
<td align="left">Ethnicity information</td>
<td align="left">Jaquet and Rhodes (<xref ref-type="bibr" rid="B30">2008</xref>), Ng et al. (<xref ref-type="bibr" rid="B60">2008</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B72">2010</xref>), Webster et al. (<xref ref-type="bibr" rid="B88">2004</xref>)</td>
</tr>
<tr>
<td align="left">Figural (distortion) information</td>
<td align="left">Burkhardt et al. (<xref ref-type="bibr" rid="B9">2010</xref>), Hills et al. (<xref ref-type="bibr" rid="B27">2010</xref>), Jaquet and Rhodes (<xref ref-type="bibr" rid="B30">2008</xref>), Jaquet et al. (<xref ref-type="bibr" rid="B31">2007</xref>, <xref ref-type="bibr" rid="B32">2008</xref>), Jeffery et al. (<xref ref-type="bibr" rid="B35">2006</xref>, <xref ref-type="bibr" rid="B36">2007</xref>), Morikawa (<xref ref-type="bibr" rid="B59">2005</xref>), Robbins et al. (<xref ref-type="bibr" rid="B73">2007</xref>), Webster and MacLin (<xref ref-type="bibr" rid="B90">1999</xref>), Yamashita et al. (<xref ref-type="bibr" rid="B92">2005</xref>), Zhao and Chubb (<xref ref-type="bibr" rid="B95">2001</xref>)</td>
</tr>
<tr>
<td align="left">Gaze information</td>
<td align="left">Jenkins et al. (<xref ref-type="bibr" rid="B37">2006</xref>), Schweinberger et al. (<xref ref-type="bibr" rid="B75">2007</xref>)</td>
</tr>
<tr>
<td align="left">Gender information</td>
<td align="left">Bestelmeyer et al. (<xref ref-type="bibr" rid="B6">2008</xref>), Buckingham et al. (<xref ref-type="bibr" rid="B8">2006</xref>), Kov&#x000E1;cs et al. (<xref ref-type="bibr" rid="B45">2007</xref>), Ng et al. (<xref ref-type="bibr" rid="B60">2008</xref>), Ng et al. (<xref ref-type="bibr" rid="B61">2006</xref>), Webster et al. (<xref ref-type="bibr" rid="B88">2004</xref>), Yang et al. (<xref ref-type="bibr" rid="B93">2011</xref>)</td>
</tr>
<tr>
<td align="left">Identity information</td>
<td align="left">Anderson and Wilson (<xref ref-type="bibr" rid="B2">2005</xref>), Hurlbert (<xref ref-type="bibr" rid="B29">2001</xref>), Jiang et al. (<xref ref-type="bibr" rid="B38">2006</xref>), Leopold et al. (<xref ref-type="bibr" rid="B50">2001</xref>), Leopold et al. (<xref ref-type="bibr" rid="B52">2005</xref>), Palermo et al. (<xref ref-type="bibr" rid="B64">2011</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B66">2009a</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B68">2011</xref>), Rhodes and Jeffery (<xref ref-type="bibr" rid="B69">2006</xref>), Rhodes et al. (<xref ref-type="bibr" rid="B72">2010</xref>)</td>
</tr>
<tr>
<td align="left">Viewpoint information</td>
<td align="left">Chen et al. (<xref ref-type="bibr" rid="B19">2010</xref>), Fang et al. (<xref ref-type="bibr" rid="B23">2007</xref>)</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The second dimension of the present framework, <italic>time</italic>, orders adaptation effects according to different types of temporal information. The first information type, <italic>delay</italic>, is related to the robustness and sustainability of adaptation effects; basically, the time interval between an adaptation and an adaptation test phase. Delays range from milliseconds (e.g., Leopold et al., <xref ref-type="bibr" rid="B50">2001</xref>; Rhodes et al., <xref ref-type="bibr" rid="B71">2003</xref>) to minutes (e.g., Carbon and Leder, <xref ref-type="bibr" rid="B16">2006</xref>; Kloth and Schweinberger, <xref ref-type="bibr" rid="B41">2008</xref>), but also include days and even weeks under typical laboratory (e.g., Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>) as well as ecologically more valid test contexts (Carbon and Ditye, <xref ref-type="bibr" rid="B14">2012</xref>). The delay characteristics of adaptation effects are essential for providing useful information about the decay of adaptation effects and thus the &#x0201C;recalibration&#x0201D; and &#x0201C;readaptation&#x0201D; abilities of the visual system (Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>). Furthermore, they allow inferences about the robustness and consistency of perceptual information in general. In parallel to the &#x0201C;time&#x0201D; information <italic>delay</italic>, we focus on <italic>adaptation duration</italic>, the time span during which the adapting stimulus is presented (e.g., Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>). Adaptation duration information provides insights into how this time span can modulate the adaptation effect size or the adaptability of faces. 
Moreover, this type of time information has been compared with simple adaptation effects (e.g., with tilt information; Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>) to explore the dynamics of adaptation effects at different levels of the cortical visual hierarchy. Finally, we focus on &#x0201C;time&#x0201D; information of the <italic>test duration</italic> type, i.e., the time span during which the test stimulus is presented (e.g., Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>). Similar to adaptation duration, test duration can give insights into the dynamics of adaptation to face stimuli in contrast to simple adaptation effects.</p>
<p>The third dimension in the present framework is associated with the transfer of adaptation effects. This <italic>transfer</italic> dimension reflects the range and limits of adaptation transfer effects, permitting important inferences about whether processing is linked to specific adapted stimuli or is of a more general quality. In this way, the investigation of adaptation is a tool (rather than a topic) for localizing plasticity and pointing out common coding principles at various levels of visual processing (from retinotopic to high and possibly face-specific levels of visual processing; Webster, <xref ref-type="bibr" rid="B87">2011</xref>; Webster and MacLeod, <xref ref-type="bibr" rid="B89">2011</xref>). There exist two ways of structuring transfer effects: transfer between different (manipulated) image versions of the identical identity during adaptation and test phases (e.g., variations in size or orientation) enables exclusively low-level perceptual effects of adaptation (e.g., at a retinal level; Zhao and Chubb, <xref ref-type="bibr" rid="B95">2001</xref>) to be excluded. Additionally, as proposed by Carbon et al. (<xref ref-type="bibr" rid="B18">2007b</xref>), adaptation transfer can be systematically tested with face images used in the adaptation and test phases showing the same images of the same identity vs. different images of the same identity vs. different images of different identities<xref ref-type="fn" rid="fn2"><sup>2</sup></xref>.</p>
<p>The approach can be extended by investigating transfer effects, <italic>inter alia</italic>, across family members, gender, and/or ethnicity.</p>
<p>As will be seen later, not all studies in the field of face adaptation research allow a localization of their research in all dimensions of the applied framework. For example, many studies apply sets of face images of different identities during adaptation and the identical set of images during a following test phase (e.g., Buckingham et al., <xref ref-type="bibr" rid="B8">2006</xref>; Chen et al., <xref ref-type="bibr" rid="B19">2010</xref>). Such a procedure prevents conclusions about the transferability of adaptation effects, since potential effects in an adaptation test phase may originate from the presentation of the identical image and/or the presentation of other identities&#x02019; images during the adaptation phase.</p>
</sec>
<sec id="S2">
<title>Investigating the Adaptation Effects of Different Types of Face Information &#x02013; The Adapting Information Dimension</title>
<p>Basically, the result patterns in studies on adaptation effects showed an <italic>adaptation bias</italic> and were thus consistent in the following way: values of adaptation test ratings tend toward the (typically extreme) values of the adapting information presented during the adaptation phase; in other words, average or neutral faces are perceptually biased away from the adapting face. After introducing findings in the context of FDAEs, we present findings on facial information loosely ordered by increasing abstractness.</p>
<sec id="S2-1">
<title>Face distortion aftereffects</title>
<p>Webster and MacLin (<xref ref-type="bibr" rid="B90">1999</xref>) investigated FDAEs within face images of a single identity by presenting adaptation images that were distorted by expanding or contracting the frontal-view original face image relative to a midpoint on the nose. After viewing distorted faces during adaptation (e.g., contracted face images), original faces appeared distorted in the direction opposite to the distortion (e.g., as expanded face images). In contrast to this effect after adaptation to distorted faces, no such adaptation effect followed the presentation of original face images (i.e., distorted faces still appeared distorted). In this way, Webster and MacLin provided some of the first evidence for adaptation effects in complex, natural objects, suggesting that adaptation may play an important normalizing role in face perception and that adaptation effects may strongly influence form perception (see also Zhao and Chubb, <xref ref-type="bibr" rid="B95">2001</xref>; Morikawa, <xref ref-type="bibr" rid="B59">2005</xref>; Yamashita et al., <xref ref-type="bibr" rid="B92">2005</xref>; Jeffery et al., <xref ref-type="bibr" rid="B35">2006</xref>, <xref ref-type="bibr" rid="B36">2007</xref>; Jaquet et al., <xref ref-type="bibr" rid="B31">2007</xref>, <xref ref-type="bibr" rid="B32">2008</xref>; Robbins et al., <xref ref-type="bibr" rid="B73">2007</xref>; Jaquet and Rhodes, <xref ref-type="bibr" rid="B30">2008</xref>; Burkhardt et al., <xref ref-type="bibr" rid="B9">2010</xref>; Hills et al., <xref ref-type="bibr" rid="B27">2010</xref>); such a &#x0201C;complex&#x0201D; adaptation phenomenon was recently extended to animals, trees, and everyday objects (e.g., light bulbs; Daelli et al., <xref ref-type="bibr" rid="B22">2010</xref>).</p>
<p>However, the FDAE does not enable a specification of which types of facial information are precisely involved in face adaptation. For example, the use of FDAEs simultaneously affects feature information such as the mouth, eyes, or eyebrows (e.g., Tanaka and Sengco, <xref ref-type="bibr" rid="B81">1997</xref>; Cabeza and Kato, <xref ref-type="bibr" rid="B10">2000</xref>), configural information such as the nose-mouth distance (Young et al., <xref ref-type="bibr" rid="B94">1987</xref>; Rhodes, <xref ref-type="bibr" rid="B65">1988</xref>; Leder and Carbon, <xref ref-type="bibr" rid="B49">2006</xref>), as well as holistic information referring to face processing &#x0201C;as a whole&#x0201D; (e.g., Tanaka and Farah, <xref ref-type="bibr" rid="B80">1993</xref>; Leder and Carbon, <xref ref-type="bibr" rid="B48">2005</xref>). As a consequence, it is essential to systematically vary distinct face information between adaptation and test phases in order to generate precise conclusions about the mechanisms of face adaptation effects, something that is not necessarily possible in the context of FDAEs.</p>
</sec>
<sec id="S2-2">
<title>Configural information</title>
<p>Carbon and colleagues (Carbon and Leder, <xref ref-type="bibr" rid="B15">2005</xref>; Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>; Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>) aimed instead at investigating the effects of adaptation to distorted configural information on subsequent adaptation tests. During adaptation, participants were presented with face images of familiar identities whose eyes-mouth distance was either decreased or increased relative to the original. In a subsequent test phase, participants were asked to select the veridical version (1) out of a series of versions with gradually altered eyes-mouth distances, or (2) from one original and one slightly altered version (e.g., with slightly decreased or increased eyes-mouth distance). The results showed a bias in participants&#x02019; selections in the direction of the respective manipulations; e.g., after viewing face images with extremely decreased eyes-mouth distance, there was an increased likelihood of selecting a version with slightly decreased distances (for similar results with shifting only the eyes along the vertical axis, see Walton and Hills, <xref ref-type="bibr" rid="B84">2012</xref>). Thus, these studies demonstrated adaptation effects of configural information. Similar results were demonstrated by Little et al. (<xref ref-type="bibr" rid="B55">2005</xref>, <xref ref-type="bibr" rid="B56">2008</xref>) following the inspection of manipulated eye spacing: inspecting faces with extremely narrow or wide eye distances resulted in increased normality ratings for subsequently presented face images with moderately manipulated distances. These latter findings demonstrate the generalization of adaptation effects after exposure to manipulated configural information.</p>
</sec>
<sec id="S2-3">
<title>Gaze and viewpoint information</title>
<p>Adaptation to a consistent leftward or rightward gaze produces ratings demonstrating that observers no longer perceive gaze in the adapted direction (Jenkins et al., <xref ref-type="bibr" rid="B37">2006</xref>; Schweinberger et al., <xref ref-type="bibr" rid="B75">2007</xref>). That is, a gaze to the adapted side was subsequently seen as pointing straight. Leftward and rightward viewpoint adaptation resulted in similar adaptation effects (Fang et al., <xref ref-type="bibr" rid="B23">2007</xref>; Chen et al., <xref ref-type="bibr" rid="B19">2010</xref>); thus, a face turned to the adapted side was subsequently seen as pointing straight. Again, these effects can be interpreted as a recalibration mechanism: probably the best heuristic if one constantly lacks a straight viewpoint is to retune the processing of gaze direction or viewpoint.</p>
</sec>
<sec id="S2-4">
<title>Emotional and attractiveness information</title>
<p>Another example of testing adaptation effects is represented by investigations on the effects of different facial expressions. For instance, after perceiving a happy face, a previously ambiguous happy-angry face appeared distinctly angry, and thus the boundary between happy and angry faces was shifted toward the happy expression (Webster et al., <xref ref-type="bibr" rid="B88">2004</xref>; Fox and Barton, <xref ref-type="bibr" rid="B24">2007</xref>; Juricevic and Webster, <xref ref-type="bibr" rid="B39">2012</xref>). Such shifted emotion categorization was evident even when the adapting stimuli had been suppressed from awareness (Benton and Burgess, <xref ref-type="bibr" rid="B5">2008</xref>; Adams et al., <xref ref-type="bibr" rid="B1">2010</xref>), illustrating the fast and automatic processing of such expressions. Attractiveness adaptation effects demonstrated that viewing consistently distorted faces shifts attractiveness preferences toward the adapting stimuli; for instance, contracted face images (Webster and MacLin, <xref ref-type="bibr" rid="B90">1999</xref>) appeared more attractive after adapting to contracted faces than after adapting to expanded faces (MacLin and Webster, <xref ref-type="bibr" rid="B57">2001</xref>; Rhodes et al., <xref ref-type="bibr" rid="B71">2003</xref>; Carbon et al., <xref ref-type="bibr" rid="B17">2007a</xref>; Anzures et al., <xref ref-type="bibr" rid="B3">2009</xref>). Similarly, faces with left-right asymmetries appeared more attractive when asymmetrical faces were presented during adaptation (Rhodes et al., <xref ref-type="bibr" rid="B67">2009b</xref>).</p>
</sec>
<sec id="S2-5">
<title>Gender information</title>
<p>Adaptation to either masculine or feminine faces increases preferences for novel faces that are (gender-wise) similar to those recently seen (Buckingham et al., <xref ref-type="bibr" rid="B8">2006</xref>), and it increases the femininity and masculinity ratings of test faces, respectively (see also Webster et al., <xref ref-type="bibr" rid="B88">2004</xref>; Ng et al., <xref ref-type="bibr" rid="B61">2006</xref>, <xref ref-type="bibr" rid="B60">2008</xref>; Kov&#x000E1;cs et al., <xref ref-type="bibr" rid="B45">2007</xref>). An alternative measurement of gender-adaptation effects demonstrated that adapting to a male (or female) face selectively enhances discrimination of male (or female) faces (Yang et al., <xref ref-type="bibr" rid="B93">2011</xref>).</p>
</sec>
<sec id="S2-6">
<title>Age and ethnicity information</title>
<p>When participants viewed young or old adult faces (i.e., adults of different ages), their &#x0201C;young/old boundary&#x0201D; was biased toward the age of the adapting face (O&#x02019;Neil and Webster, <xref ref-type="bibr" rid="B62">2011</xref>). Consistently, test faces appeared older or younger when the adapting faces were young or old, respectively (Schweinberger et al., <xref ref-type="bibr" rid="B76">2010</xref>). Therefore, there is evidence for an adaptation bias for facial age as well (see also Lai et al., <xref ref-type="bibr" rid="B46">2012</xref>). An analogous bias exists for face ethnicity, as shown, for example, for Caucasian vs. Asian faces: adaptation to an average Asian or Caucasian face reduced identification thresholds for faces from the adapted relative to the unadapted ethnicity (Webster et al., <xref ref-type="bibr" rid="B88">2004</xref>; Rhodes et al., <xref ref-type="bibr" rid="B72">2010</xref>).</p>
</sec>
<sec id="S2-7">
<title>Identity information</title>
<p>Leopold et al. (<xref ref-type="bibr" rid="B50">2001</xref>) provided evidence for increased sensitivity to particular face identities after adaptation, as investigated in the context of face identity aftereffects (FIAEs). Based on the idea of a &#x0201C;face space&#x0201D; (i.e., a multidimensional representation in which faces are coded by their distance from a prototypical &#x0201C;center&#x0201D; face; Valentine, <xref ref-type="bibr" rid="B83">2001</xref>), the authors were able to explain the effect within the theoretical framework of a computationally derived mental representation (Figure <xref ref-type="fig" rid="F2">2</xref>). After observers perceived an &#x0201C;anti-face&#x0201D; (located opposite an original identity, on the trajectory crossing this original face and the face-space average), adaptation specifically shifted perception along this trajectory away from the adapting anti-face, selectively facilitating recognition of a test face lying on it. Such adaptation effects at the identity level were replicated in a number of studies and variations (Hurlbert, <xref ref-type="bibr" rid="B29">2001</xref>; Anderson and Wilson, <xref ref-type="bibr" rid="B2">2005</xref>; Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes and Jeffery, <xref ref-type="bibr" rid="B69">2006</xref>; Rhodes et al., <xref ref-type="bibr" rid="B66">2009a</xref>, <xref ref-type="bibr" rid="B72">2010</xref>, <xref ref-type="bibr" rid="B68">2011</xref>; Palermo et al., <xref ref-type="bibr" rid="B64">2011</xref>) and are often explicitly referred to as changes of the face space.</p>
<fig position="float" id="F2">
<label>Figure 2</label>
<caption>
<p><bold>Computationally derived face space in which the stimuli were generated to investigate face identity adaptation effects (FIAE)</bold>. The original faces (green ellipses) are connected to the average face (blue ellipse) by an &#x0201C;identity trajectory.&#x0201D; Numbers refer to the &#x0201C;identity strength&#x0201D; possessed by the given face (taken from Leopold et al., <xref ref-type="bibr" rid="B50">2001</xref>).</p></caption>
<graphic xlink:href="fpsyg-04-00318-g002.tif"/>
</fig>
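<p>The face-space geometry behind these FIAE findings can be made concrete with a small numerical sketch. Treating faces as vectors centered on the average face, an &#x0201C;anti-face&#x0201D; is the reflection of an identity through that average, and &#x0201C;identity strength&#x0201D; scales the distance along the identity trajectory. The following Python sketch uses a hypothetical low-dimensional space with invented coordinates; it illustrates the trajectory logic only, not the morphing model actually used by Leopold et al. (<xref ref-type="bibr" rid="B50">2001</xref>).</p>

```python
import numpy as np

# Hypothetical 4-dimensional "face space"; the average face sits at the origin.
average = np.zeros(4)

# An original identity, coded as its position relative to the average face.
original = np.array([0.8, -0.3, 0.5, 0.1])

def scale_identity(face, strength, mean=average):
    """Move a face along its identity trajectory through the average face.

    strength 1.0 reproduces the face, 0.0 yields the average face,
    and negative values produce "anti-faces" on the opposite side.
    """
    return mean + strength * (face - mean)

anti_face = scale_identity(original, -1.0)  # the anti-face of the identity
weak_face = scale_identity(original, 0.4)   # a reduced-identity-strength morph

# The anti-face lies on the same trajectory, opposite the original:
# the average face is exactly midway between the two.
midpoint = (original + anti_face) / 2
```

<p>In these terms, adaptation to <monospace>anti_face</monospace> facilitates recognizing a weak morph such as <monospace>weak_face</monospace> as the original identity, since both lie on the same trajectory through the average face.</p>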
<p>In sum, this section demonstrates adaptation effects for numerous facial attributes. To our knowledge, no facial attribute has failed to show such effects in the research literature, which indicates that adult face coding systems are more flexible than was previously thought (Bruce, <xref ref-type="bibr" rid="B7">1994</xref>). At this point, it is essential to stress again that faces are the only object class for which such a large number of information types can be investigated and reviewed.</p>
</sec>
</sec>
<sec id="S3">
<title>Investigating Temporal Characteristics of Adaptation Effects &#x02013; The Time Dimension</title>
<sec id="S3-8">
<title>Delay</title>
<p>Face adaptation effects are typically tested with a delay interval of only a few seconds or even less (Webster and MacLin, <xref ref-type="bibr" rid="B90">1999</xref>; Leopold et al., <xref ref-type="bibr" rid="B50">2001</xref>; Rhodes et al., <xref ref-type="bibr" rid="B71">2003</xref>; Benton and Burgess, <xref ref-type="bibr" rid="B5">2008</xref>; Bestelmeyer et al., <xref ref-type="bibr" rid="B6">2008</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>). Given these constraints, such studies mainly show that (A) adaptation to face information such as face identity occurs on a perceptual level, and (B) the visual system has not recalibrated within this short delay. In sum, they allow conclusions neither about adaptation effects at the representational level nor about the robustness of visual-system recalibration. One of the first systematic investigations of the delay characteristics of adaptation effects focused on gaze information (Kloth and Schweinberger, <xref ref-type="bibr" rid="B41">2008</xref>); Table <xref ref-type="table" rid="T2">2</xref> gives an overview of studies testing adaptation effects with short and long delays. This study demonstrated a decrease in the gaze adaptation effect over time, but the effect was still measurable up to 385&#x02009;s after the end of the adaptation phase. This is evidence for the idea that adaptation effects are not completely perceptually based. It also suggests that there is no complete &#x0201C;return to normal&#x0201D; (&#x0201C;recalibration,&#x0201D; see Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>) of the visual system associated with gaze processing within this time range of more than 6&#x02009;min. Consequently, adaptation effects are &#x0201C;stickier&#x0201D; than many traditional low-level adaptation effects (e.g., K&#x000F6;hler and Wallach, <xref ref-type="bibr" rid="B43">1944</xref>) and can be initially interpreted as evidence of representation-based effects.</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption>
<p><bold>Overview of examples of face adaptation studies and their delays between adaptation and test phases</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Study</th>
<th align="left">Delay</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Barrett and O&#x02019;Toole (<xref ref-type="bibr" rid="B4">2009</xref>)</td>
<td align="left">100&#x02009;ms</td>
</tr>
<tr>
<td align="left">Benton and Burgess (<xref ref-type="bibr" rid="B5">2008</xref>)</td>
<td align="left">500&#x02009;ms</td>
</tr>
<tr>
<td align="left">Bestelmeyer et al. (<xref ref-type="bibr" rid="B6">2008</xref>)</td>
<td align="left">Not available</td>
</tr>
<tr>
<td align="left">Carbon and Ditye (<xref ref-type="bibr" rid="B13">2011</xref>)</td>
<td align="left">24&#x02009;h, 1&#x02009;week</td>
</tr>
<tr>
<td align="left">Carbon and Ditye (<xref ref-type="bibr" rid="B14">2012</xref>)</td>
<td align="left">1&#x02009;week</td>
</tr>
<tr>
<td align="left">Carbon and Leder (<xref ref-type="bibr" rid="B15">2005</xref>)</td>
<td align="left">4,000&#x02009;ms; 5&#x02009;min</td>
</tr>
<tr>
<td align="left">Carbon and Leder (<xref ref-type="bibr" rid="B16">2006</xref>)</td>
<td align="left">80&#x02009;min</td>
</tr>
<tr>
<td align="left">Carbon et al. (<xref ref-type="bibr" rid="B18">2007b</xref>)</td>
<td align="left">5&#x02009;min; 24&#x02009;h</td>
</tr>
<tr>
<td align="left">Fang et al. (<xref ref-type="bibr" rid="B23">2007</xref>)</td>
<td align="left">1,000&#x02009;ms</td>
</tr>
<tr>
<td align="left">Hills et al. (<xref ref-type="bibr" rid="B27">2010</xref>)</td>
<td align="left">5,000&#x02009;ms</td>
</tr>
<tr>
<td align="left">Hole (<xref ref-type="bibr" rid="B28">2011</xref>)</td>
<td align="left">&#x02264;2&#x02009;min</td>
</tr>
<tr>
<td align="left">Kloth and Schweinberger (<xref ref-type="bibr" rid="B41">2008</xref>)</td>
<td align="left">0&#x02013;6&#x02009;min</td>
</tr>
<tr>
<td align="left">Kov&#x000E1;cs et al. (<xref ref-type="bibr" rid="B45">2007</xref>)</td>
<td align="left">500&#x02009;ms</td>
</tr>
<tr>
<td align="left">Leopold et al. (<xref ref-type="bibr" rid="B52">2005</xref>)</td>
<td align="left">Not available</td>
</tr>
<tr>
<td align="left">Leopold et al. (<xref ref-type="bibr" rid="B50">2001</xref>)</td>
<td align="left">150; 300; 600; 1,200; 2,400&#x02009;ms</td>
</tr>
<tr>
<td align="left">McKone et al. (<xref ref-type="bibr" rid="B58">2005</xref>)</td>
<td align="left">15&#x02009;min</td>
</tr>
<tr>
<td align="left">Rhodes et al. (<xref ref-type="bibr" rid="B70">2007</xref>)</td>
<td align="left">1,000&#x02009;ms</td>
</tr>
<tr>
<td align="left">Rhodes et al. (<xref ref-type="bibr" rid="B71">2003</xref>)</td>
<td align="left">500&#x02009;ms</td>
</tr>
<tr>
<td align="left">Strobach et al. (<xref ref-type="bibr" rid="B77">2011</xref>)</td>
<td align="left">5&#x02009;min; 24&#x02009;h</td>
</tr>
<tr>
<td align="left">Webster et al. (<xref ref-type="bibr" rid="B88">2004</xref>)</td>
<td align="left">250&#x02009;ms</td>
</tr>
<tr>
<td align="left">Webster and MacLin (<xref ref-type="bibr" rid="B90">1999</xref>)</td>
<td align="left">Not available</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Carbon and colleagues systematically extended research on the effects of adaptation to manipulated configural information of famous faces. These studies demonstrated adaptation effects after 5&#x02009;min (Carbon and Leder, <xref ref-type="bibr" rid="B15">2005</xref>; Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>), 80&#x02009;min (Carbon and Leder, <xref ref-type="bibr" rid="B16">2006</xref>), 24&#x02009;h (Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>; Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>), and even 1&#x02009;week (Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>). Adaptation therefore seems to be very robust, pointing to effects at the functional level of representations. According to Carbon et al.&#x02019;s research, it takes at least 1&#x02009;week for the visual system to return to its pre-adaptation state (i.e., to recalibrate), at least in terms of adaptation effects for configural facial information.</p>
<p>To sum up, regarding the delay dimension, a series of recent experiments revealed relatively long-lasting adaptation effects. This evidence relates to the face attributes of gaze information (Kloth and Schweinberger, <xref ref-type="bibr" rid="B41">2008</xref>) and configural information (e.g., Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>). It illustrates aspects of face processing that involve an increased participation of representations and do not rely solely on simple iconic traces or simple visual aftereffects (Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>). Regarding mental representations, these findings of long-lasting adaptation effects demonstrate that the investigated types of facial information (i.e., gaze and configural information) are not exclusively processed and coded at a perceptual level. One may speculate that such processing and coding involves long-term memory functions. However, there are no investigations of &#x0201C;delay&#x0201D; effects for facial age, attractiveness, emotion, ethnicity, gender, FIAEs, or viewpoint information. Such investigations are essential to assess the functional level (representational and/or perceptual level), the robustness/sustainability, and the recalibration time of adaptation effects for these types of information. We will come back to this immense gap in the adaptation literature in a later section.</p>
</sec>
<sec id="S3-9">
<title>Adaptation duration</title>
<p>Increasing the presentation duration of visual adaptation material typically increases simple perceptual adaptation effects, as demonstrated for tilt, motion, and shape information (Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>). In fact, this relation between adaptation duration and effect size is characterized by a logarithmic function. A comparable relation was found for faces. Rhodes, Leopold, Jeffery, and colleagues (Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>) tested FDAEs as well as FIAEs after varying the presentation time of the adapting face stimuli: test stimuli appeared immediately after adapting material that had been presented for between 1,000 and 16,000&#x02009;ms. Independent of the size relation between adaptation and test faces, FDAEs and FIAEs increased with adaptation time. The relationship between adaptation duration and effect size is thus comparable for simple perceptual information and complex face objects, illustrating common coding principles at different levels of the cortical visual hierarchy.</p>
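<p>The logarithmic relation between adaptation duration and aftereffect size can be illustrated with a simple least-squares fit. In the Python sketch below, the durations match the 1,000&#x02013;16,000&#x02009;ms range used by Leopold et al. (<xref ref-type="bibr" rid="B52">2005</xref>), but the effect magnitudes are invented for illustration only.</p>

```python
import numpy as np

# Adaptation durations in ms (doubling steps across the 1-16 s range);
# the aftereffect magnitudes below are hypothetical illustration values.
durations = np.array([1000.0, 2000.0, 4000.0, 8000.0, 16000.0])
effects = np.array([0.10, 0.16, 0.23, 0.29, 0.36])

# Fit effect = a * ln(duration) + b by least squares.
a, b = np.polyfit(np.log(durations), effects, 1)

def predicted_effect(duration_ms):
    """Aftereffect size predicted by the fitted logarithmic law."""
    return a * np.log(duration_ms) + b

# Under a logarithmic law, doubling the adaptation duration adds a
# constant increment to the aftereffect, regardless of the baseline.
step_low = predicted_effect(2000) - predicted_effect(1000)
step_high = predicted_effect(16000) - predicted_effect(8000)
```

<p>The constant-increment property (<monospace>step_low</monospace> equals <monospace>step_high</monospace>) is what distinguishes a logarithmic growth law from a linear one, in which the increment would scale with the absolute duration added.</p>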
<p>However, there were no, or very short, delays between adaptation and test phases in these prior studies (Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>), and thus the adaptation effects were likely investigated at the perceptual level only. In contrast, Carbon et al. (<xref ref-type="bibr" rid="B18">2007b</xref>) introduced delays of 5&#x02009;min to 24&#x02009;h between these phases, allowing investigation of the effects of adaptation delay on the adaptation of memory face representations (in this case, the adapting information was configural information). On the basis of this argument, Strobach et al. (<xref ref-type="bibr" rid="B77">2011</xref>) performed multiple regression analyses on individual adaptation durations and the resulting adaptation effects after both 5&#x02009;min and 24&#x02009;h. Positive relations between the two measures (i.e., longer presentation times of adaptation faces resulted in larger adaptation effects) demonstrated an impact of adaptation time on effect size, extending findings from the perceptual level to mechanisms at a memory level.</p>
<p>There is a lack of studies that explicitly investigate the effects of adaptation time for facial attributes other than FDAEs, FIAEs, and configural information &#x02013; e.g., for age, gender, and attractiveness. Such investigations would provide more elaborate knowledge about the coding principles of face-specific and simple visual information.</p>
</sec>
<sec id="S3-10">
<title>Test duration</title>
<p>Similar to the modulation of adaptation effects by adaptation duration, the time span for which a test stimulus is presented modulates the magnitude of these effects (Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>). In fact, when test faces were presented for 100, 200, 400, 800, or 1,600&#x02009;ms, the adaptation effects, as measured in FDAE and FIAE paradigms, decreased with increasing test duration. Since similar effects are evident for simple aftereffects, face information and simple perceptual information (i.e., tilt, orientation) illustrate common coding principles at different levels of the cortical visual hierarchy. However, what is clearly lacking in this domain is research on whether test duration has an effect when adaptation effects are tested at a memory rather than a perceptual level. That is, it is an open issue in the literature whether the negative relation between test duration and adaptation effect (i.e., increasing test duration, decreasing adaptation effects) is evident after a delay of minutes, hours, or days. Furthermore, this negative relationship was established for FDAEs and FIAEs only; it is lacking for alternative facial information, and such tests should be attempted in future studies. They are essential to establish a more elaborate knowledge of the coding principles of face-specific types of information.</p>
</sec>
</sec>
<sec id="S4">
<title>Investigating the Transferability of Adaptation Effects &#x02013; The Transfer Dimension</title>
<p>After reviewing adaptation effects for different types of face information and their time characteristics (e.g., the sustainability of adaptation effects), it is essential to review the relationship between adapting and test face images, i.e., to test the transfer of adaptation effects to new face images (or new image versions) not presented during adaptation. Here, we review findings on transfer effects between the same image of one identity (identical images, or images differing in viewpoint, orientation, or size between adaptation and test) and between different images of the same identity. Additionally, we discuss adaptation transfer effects between images of different identities. As illustrated in Table <xref ref-type="table" rid="T3">3</xref>, we classify these transfer effects at the <italic>pictorial level</italic> (identical face image during adaptation and test), the <italic>identity level</italic> (different face images of the same identity during adaptation and test), and the <italic>novel level</italic> (different face images of different identities during adaptation and test; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>). In the following section, we primarily discuss types of adapting face information that demonstrate transfer effects. We followed this strategy because, for the remaining types of face information (e.g., gaze, emotion), there was (A) no investigation of transfer effects and/or (B) no conclusive evidence for such effects.</p>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption>
<p><bold>Different transfer levels of adaptation effects as realized in studies of Carbon and colleagues (Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>, <xref ref-type="bibr" rid="B14">2012</xref>; Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>)</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"/>
<th align="center" colspan="3">Transfer of the adaptation effect<hr/></th>
</tr>
<tr>
<th align="left"/>
<th align="left">Picture level</th>
<th align="left">Identity level</th>
<th align="left">Novel level</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Adaptation phase</td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i001.tif"/></td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i002.tif"/></td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i003.tif"/></td>
</tr>
<tr>
<td align="left">Test phase</td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i004.tif"/></td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i005.tif"/></td>
<td align="left"><inline-graphic xlink:href="fpsyg-04-00318-i006.tif"/></td>
</tr>
</tbody>
</table>
</table-wrap>
<p>When focusing on FIAEs, Hole (<xref ref-type="bibr" rid="B28">2011</xref>) demonstrated adaptation effects with identical face images during adaptation and test that, for familiar faces, also transferred onto new image versions with changed viewpoint or orientation and onto vertically stretched versions of the adapting face images; this confirmed Jiang et al.&#x02019;s (<xref ref-type="bibr" rid="B38">2006</xref>) finding of viewpoint invariance of FIAEs, a study which also added evidence of transfer across shape and surface reflectance information. Anderson and Wilson (<xref ref-type="bibr" rid="B2">2005</xref>) supported Hole&#x02019;s finding of size-independent transfer with unfamiliar, synthetic faces, although their study provided no support for a viewpoint-invariant FIAE with this face type. Guo et al. (<xref ref-type="bibr" rid="B26">2009</xref>) revealed limits to the transferability of FIAEs by showing that, with unfamiliar faces, the effect works exclusively from upright to inverted orientation, but not vice versa. With familiar faces, however, Hole showed that the FIAE produced by inverted adapting faces and upright test faces was similar to that produced by upright adapting faces. Furthermore, this type of adaptation effect seems to be gender-specific: there is an effect from adaptation to test faces when these faces are related via a gender-specific prototype, whereas there was no such effect with an androgynous face (i.e., a combined male and female prototype; Rhodes et al., <xref ref-type="bibr" rid="B68">2011</xref>). Leopold et al. (<xref ref-type="bibr" rid="B50">2001</xref>, <xref ref-type="bibr" rid="B52">2005</xref>) showed that the relations between a facial prototype and individual faces in face space could be manipulated by adaptation to different identities. In other words, this manipulation includes adapting face images that are not located on a trajectory crossing an original face and the face-space average (similar to Figure <xref ref-type="fig" rid="F1">1</xref>); thus, there is evidence for FIAEs at the <italic>novel level</italic>. This conclusion is consistent with the fact that FIAEs are rather high-level perceptual effects: composite faces (different views of a composite face comprising the top half of a famous face and the bottom half of a non-famous face) either did or did not produce a FIAE depending on whether or not the famous face was explicitly recognized before the post-adaptation test phase (Laurence and Hole, <xref ref-type="bibr" rid="B47">2012</xref>).</p>
<p>Conversely, adaptation to facial expressions (i.e., emotional information) was partly independent of the represented identities. That is, adaptation effects focused on facial expressions transferred to different faces and thus include at least portions of novel-level processing (Fox and Barton, <xref ref-type="bibr" rid="B24">2007</xref>). Interestingly, the FIAE discussed above is not affected by expressional information: FIAEs were not modulated by the congruency of facial expression between adaptation and test phases (i.e., same vs. different expression; Fox et al., <xref ref-type="bibr" rid="B25">2008</xref>). Thus, the relation between expression adaptation and FIAEs is asymmetric: identity information affects expression adaptation, but there is no reverse effect.</p>
<p>For facial gender, Yang et al. (<xref ref-type="bibr" rid="B93">2011</xref>) demonstrated that the gender discrimination enhancement induced by face adaptation can transfer across a substantial change in three-dimensional facial orientation. Additionally, gender-adaptation effects are position invariant (Kov&#x000E1;cs et al., <xref ref-type="bibr" rid="B45">2007</xref>). These effects also seem to be age-independent, since Barrett and O&#x02019;Toole (<xref ref-type="bibr" rid="B4">2009</xref>) demonstrated an effect of gender adaptation within sets of children&#x02019;s and adults&#x02019; faces and also between these sets of faces. These age-independent effects further demonstrate that gender adaptation may work at a <italic>novel level</italic>, since adaptation and test faces were derived from different identities. However, this novel level is limited by face orientation; that is, adaptation effects operate independently for upright and inverted face presentations (Watson and Clifford, <xref ref-type="bibr" rid="B86">2006</xref>). Analogously, the limit of emotion adaptation effects is set at ethnicity boundaries: adapting to an emotion in, for instance, a Caucasian face affects the later processing of another Caucasian face, but not that of a face of another ethnicity (i.e., black faces; Otten and Banaji, <xref ref-type="bibr" rid="B63">2012</xref>).</p>
<p>The adaptation effect realized in the form of viewpoint adaptation (i.e., adaptation to left- or right-turned faces) occurs at the <italic>novel level</italic>, as demonstrated by transfer effects across different identities, genders, and vertical orientations (Fang et al., <xref ref-type="bibr" rid="B23">2007</xref>). For face normality ratings and their adaptation effects, there is evidence of at least orientation-transferable adaptation effects, i.e., between upright and inverted orientations of face images (Rhodes et al., <xref ref-type="bibr" rid="B71">2003</xref>).</p>
<p>Transfer effects of adapting configural information operate not only at the <italic>pictorial</italic> and <italic>identity</italic> levels, but also at the <italic>novel</italic> level (i.e., different face images of different identities during adaptation and test; Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>), even though the effect was slightly reduced compared with the pictorial and identity conditions. This was demonstrated by transfer of adapted configural information between identical adaptation and test face images, between different face images of the same identity, and to new identities, i.e., to test face images of identities not presented during prior adaptation (see Walton and Hills, <xref ref-type="bibr" rid="B84">2012</xref>, for comparable results with eye shifts restricted to the vertical axis). Little et al. (<xref ref-type="bibr" rid="B56">2008</xref>) assumed that such transfer effects are gender-specific. FDAEs do not transfer to images mirrored after the adaptation phase (Morikawa, <xref ref-type="bibr" rid="B59">2005</xref>), but there is evidence that such effects transfer to different ethnicities (Jaquet et al., <xref ref-type="bibr" rid="B31">2007</xref>), between different viewpoints (Jeffery et al., <xref ref-type="bibr" rid="B35">2006</xref>, <xref ref-type="bibr" rid="B36">2007</xref>), between different orientations of upright and inverted faces (e.g., the adapting face oriented 45&#x000B0; from vertically upright and the test face 45&#x000B0; in the opposite direction; Watson and Clifford, <xref ref-type="bibr" rid="B85">2003</xref>), between different identities and orientations (Webster and MacLin, <xref ref-type="bibr" rid="B90">1999</xref>), and across different facial image sizes (Zhao and Chubb, <xref ref-type="bibr" rid="B95">2001</xref>; Yamashita et al., <xref ref-type="bibr" rid="B92">2005</xref>). Consequently, FDAEs occur up to the <italic>novel level</italic>. Consistently, adaptation effects of facial age at the <italic>novel level</italic> are demonstrated by transfers between different genders (O&#x02019;Neil and Webster, <xref ref-type="bibr" rid="B62">2011</xref>) and identities (Schweinberger et al., <xref ref-type="bibr" rid="B76">2010</xref>).</p>
<p>In sum, there is clear evidence of adaptation effects across different identities for a first set of face information (e.g., gender, age, configural information), i.e., transfer effects at a <italic>novel level</italic>. Adaptation of this set of information seems to affect the high-order visual system and/or memory representations. In contrast to these transferable adaptation effects, there is no clear evidence of transfer effects for other face information, such as attractiveness. Furthermore, there is evidence that some face information is only transferable between different identities when the specific subgroups are not changed simultaneously (e.g., FDAE transfers are gender-specific). Regarding the processing characteristics and mental representations of faces, this section indicates that face coding is hierarchically structured, with an orchestration of common underlying structures. This common structure was demonstrated at the novel level and may be theoretically represented by a prototype in face space (Valentine, <xref ref-type="bibr" rid="B83">2001</xref>). However, the processing of some facial aspects is characterized by specific modules (e.g., gender-specific modules for FDAEs), potentially working in parallel to a general face-space prototype.</p>
</sec>
<sec id="S5">
<title>Future Investigations of Face Adaptation Effects</title>
<p>A summarizing overview of existing and lacking research in the field of face adaptation effects is illustrated in Table <xref ref-type="table" rid="T4">4</xref>. Future studies may apply the present framework&#x02019;s dimensions or operational parameters (adapting information, time, and transfer) to continue investigating face adaptation effects systematically. For instance, for a number of adapting information types (e.g., emotion, age, attractiveness, gender), the robustness of adaptation effects has not been tested; that is, adaptation effects for these types of face information have been demonstrated only after short delays between adaptation and test phases, and no studies have tested these effects after long delays or traced their decay over time. To present one specific example, in accordance with the time intervals applied by Carbon and colleagues (Carbon et al., <xref ref-type="bibr" rid="B18">2007b</xref>; Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>), the adaptation effect of facial age should be tested after intervals of 5&#x02009;min, 24&#x02009;h, and 1&#x02009;week in order to cover a broad range of time periods and to test the robustness of the age adaptation effect. Likewise, testing the impact of adaptation and test duration should be extended to forms of facial information beyond the phenomena investigated in the context of FDAEs and FIAEs (Leopold et al., <xref ref-type="bibr" rid="B52">2005</xref>; Rhodes et al., <xref ref-type="bibr" rid="B70">2007</xref>). As illustrated, a first extension of investigations on adaptation time occurred in the context of adaptation to configural information (Strobach et al., <xref ref-type="bibr" rid="B77">2011</xref>), but other contexts are definitely needed to generate a broader picture of timing aspects in face adaptation. In this way, our framework is able to characterize gaps in the adaptation research literature by combining findings along the dimensions of adapting information and delay.</p>
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption>
<p><bold>Overview of existing and lacking research in the field of face adaptation effects: what types of face information does this research include? What types are neglected?</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Dimension</th>
<th align="left">Existing investigations on</th>
<th align="left">Lacking investigations on</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Adapting information</td>
<td align="left">FDAE</td>
<td align="left">Distinctiveness</td>
</tr>
<tr>
<td align="left"/>
<td align="left">Configural information</td>
<td align="left">Eye color</td>
</tr>
<tr>
<td align="left"/>
<td align="left">Gaze information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Viewpoint information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Emotional information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Attractiveness information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Gender information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Age information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Ethnicity information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">FIAE</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Time</td>
</tr>
<tr>
<td align="left">&#x02003;Delay</td>
<td align="left">Gaze information</td>
<td align="left">Alternative types of information</td>
</tr>
<tr>
<td align="left"/>
<td align="left">Configural information</td>
<td align="left"/>
</tr>
<tr>
<td align="left">&#x02003;Adaptation duration</td>
<td align="left">FDAE</td>
<td align="left">Alternative types of information</td>
</tr>
<tr>
<td align="left"/>
<td align="left">FIAE</td>
<td align="left">Various delays between adaptation and adaptation test phase</td>
</tr>
<tr>
<td align="left">&#x02003;Test duration</td>
<td align="left">FDAE</td>
<td align="left">Alternative types of information</td>
</tr>
<tr>
<td align="left"/>
<td align="left">FIAE</td>
<td align="left">Various delays between adaptation and adaptation test phase</td>
</tr>
<tr>
<td align="left">Transfer</td>
<td align="left">Configural information</td>
<td align="left">Alternative types of information</td>
</tr>
<tr>
<td align="left"/>
<td align="left">Gender information</td>
<td align="left">Temporal characteristics (i.e., delay, adaptation duration, test duration)</td>
</tr>
<tr>
<td align="left"/>
<td align="left">Emotional information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Viewpoint information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">Attractiveness information</td>
<td align="left"/>
</tr>
<tr>
<td align="left"/>
<td align="left">FIAE</td>
<td align="left"/>
</tr>
</tbody>
</table>
</table-wrap>
<p>Similarly, such an expansion of tests should also be performed on the transfer dimension, since this type of test is essential for investigating the functional level of adaptation effects and for increasing the ecological validity of these studies. While transfer tests using the same face image of the same identity (e.g., varying orientation or context between adaptation and test; Carbon and Ditye, <xref ref-type="bibr" rid="B14">2012</xref>) primarily probe picture or &#x0201C;iconic&#x0201D; processing (Carbon, <xref ref-type="bibr" rid="B11">2008</xref>), and thus not necessarily face processing <italic>per se</italic>, transfer tests using different face images of the same identity, as well as face images of different identities, allow conclusions about the characteristics of face representations. So far, these two types of transfer tests [i.e., (1) different face images of the same identity, (2) different face images of different identities] have been conducted separately in most studies. Future studies may combine these two transfer types.</p>
<p>An additional way to continue face adaptation research, in terms of gaining knowledge about the basis of adaptation effects, is to combine aspects (i.e., our framework&#x02019;s dimensions) of time and transfer. For instance, future studies should systematically investigate the effects of manipulating face information across different delays between adaptation and test phases, combined with tests of the different transfer levels between adaptation and test faces. For example, gender adaptation effects could be investigated after relatively short and long delays using faces of the same or different age groups, ethnicities, or gaze directions. A related investigation focused on transfer effects between different emotional expressions and genders in the context of FDAEs (Tillman and Webster, <xref ref-type="bibr" rid="B82">2012</xref>). Such a systematic investigation of the interplay of delay and transfer may allow conclusions about the range of adaptation effects and whether they originate from similar or different mechanisms of neural coding.</p>
<p>Additionally, future studies should relate investigations on the functional level of face adaptation effects to concepts applied to other types of processes or skills. One option might be a relation to improved skills acquired during cognitive training (e.g., working memory training) and their range of transferability. Existing studies on this issue (Li et al., <xref ref-type="bibr" rid="B54">2008b</xref>; Karbach and Kray, <xref ref-type="bibr" rid="B40">2009</xref>; Strobach et al., <xref ref-type="bibr" rid="B78">2012a</xref>,<xref ref-type="bibr" rid="B79">b</xref>) categorize the range of skill transfer into near transfer (transfer between situations with common basic characteristics) and far transfer (transfer between situations with structural differences). Analogously, near transfer tests could investigate the adaptation effects of facial images on other facial images, while far transfer tests could investigate adaptation effects on facial images after prior adaptation to specific properties of cars (Carbon, <xref ref-type="bibr" rid="B12">2010</xref>) or to mental sets induced by the presentation of gender-typical objects (e.g., lipstick vs. motorbike; Javadi and Wee, <xref ref-type="bibr" rid="B33">2012</xref>). For instance, one could test whether adaptation effects of configural information (i.e., spatial relations between features) transfer from face stimuli to car stimuli (i.e., far transfer). Front views of cars with a setting of parts similar to that found in faces are particularly suitable for such transfer tests (see Windhager et al., <xref ref-type="bibr" rid="B91">2010</xref>). This type of future transfer study would cross boundaries between different types of visual objects and may reveal similarities and differences in the processing of these object categories.</p>
</sec>
<sec id="S6">
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>This study was supported by a research grant (FNK) of the University of Bamberg and by a project funded by the Doktor Robert Pfleger-Stiftung.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>W. J.</given-names></name> <name><surname>Gray</surname> <given-names>K. L. H.</given-names></name> <name><surname>Garner</surname> <given-names>M.</given-names></name> <name><surname>Graf</surname> <given-names>E. W.</given-names></name></person-group> (<year>2010</year>). <article-title>High-level face adaptation without awareness</article-title>. <source>Psychol. Sci.</source> <volume>21</volume>, <fpage>205</fpage>&#x02013;<lpage>210</lpage>.<pub-id pub-id-type="doi">10.1177/0956797609359508</pub-id><pub-id pub-id-type="pmid">20424046</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anderson</surname> <given-names>N. D.</given-names></name> <name><surname>Wilson</surname> <given-names>H. R.</given-names></name></person-group> (<year>2005</year>). <article-title>The nature of synthetic face adaptation</article-title>. <source>Vision Res.</source> <volume>45</volume>, <fpage>1815</fpage>&#x02013;<lpage>1828</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2005.01.012</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anzures</surname> <given-names>G.</given-names></name> <name><surname>Mondloch</surname> <given-names>C. J.</given-names></name> <name><surname>Lackner</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>Face adaptation and attractiveness aftereffects in 8-year-olds and adults</article-title>. <source>Child Dev.</source> <volume>80</volume>, <fpage>178</fpage>&#x02013;<lpage>191</lpage>.<pub-id pub-id-type="doi">10.1111/j.1467-8624.2008.01253.x</pub-id><pub-id pub-id-type="pmid">19236400</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barrett</surname> <given-names>S. E.</given-names></name> <name><surname>O&#x02019;Toole</surname> <given-names>A. J.</given-names></name></person-group> (<year>2009</year>). <article-title>Face adaptation to gender: does adaptation transfer across age categories?</article-title> <source>Vis. Cogn.</source> <volume>17</volume>, <fpage>700</fpage>&#x02013;<lpage>715</lpage>.<pub-id pub-id-type="doi">10.1080/13506280802332197</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Benton</surname> <given-names>C. P.</given-names></name> <name><surname>Burgess</surname> <given-names>E. C.</given-names></name></person-group> (<year>2008</year>). <article-title>The direction of measured face aftereffects</article-title>. <source>J. Vis.</source> <volume>8</volume>, <fpage>1</fpage>&#x02013;<lpage>6</lpage>.<pub-id pub-id-type="doi">10.1167/8.15.1</pub-id><pub-id pub-id-type="pmid">19146285</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bestelmeyer</surname> <given-names>P. E. G.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name> <name><surname>Schneider</surname> <given-names>A.</given-names></name> <etal/></person-group> (<year>2008</year>). <article-title>Sex-contingent face aftereffects depend on perceptual category rather than structural encoding</article-title>. <source>Cognition</source> <volume>107</volume>, <fpage>353</fpage>&#x02013;<lpage>365</lpage>.<pub-id pub-id-type="doi">10.1016/j.cognition.2007.07.018</pub-id><pub-id pub-id-type="pmid">17870064</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bruce</surname> <given-names>V.</given-names></name></person-group> (<year>1994</year>). <article-title>Stability from variation: the case of face recognition. The M.D. Vernon memorial lecture</article-title>. <source>Q. J. Exp. Psychol.</source> <volume>47A</volume>, <fpage>5</fpage>&#x02013;<lpage>28</lpage>.<pub-id pub-id-type="doi">10.1080/14640749408401141</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Buckingham</surname> <given-names>G.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Welling</surname> <given-names>L. L. M.</given-names></name> <name><surname>Conway</surname> <given-names>C. A.</given-names></name> <name><surname>Tiddeman</surname> <given-names>B. P.</given-names></name> <etal/></person-group> (<year>2006</year>). <article-title>Visual adaptation to masculine and feminine faces influences generalized preferences and perceptions of trustworthiness</article-title>. <source>Evol. Hum. Behav.</source> <volume>27</volume>, <fpage>381</fpage>&#x02013;<lpage>389</lpage>.<pub-id pub-id-type="doi">10.1016/j.evolhumbehav.2006.03.001</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Burkhardt</surname> <given-names>A.</given-names></name> <name><surname>Blaha</surname> <given-names>L. M.</given-names></name> <name><surname>Jurs</surname> <given-names>B. S.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Wyatte</surname> <given-names>D.</given-names></name> <etal/></person-group> (<year>2010</year>). <article-title>Adaptation modulates the electrophysiological substrates of perceived facial distortion: support for opponent coding</article-title>. <source>Neuropsychologia</source> <volume>48</volume>, <fpage>3743</fpage>&#x02013;<lpage>3756</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2010.08.016</pub-id><pub-id pub-id-type="pmid">20736026</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cabeza</surname> <given-names>R.</given-names></name> <name><surname>Kato</surname> <given-names>T.</given-names></name></person-group> (<year>2000</year>). <article-title>Features are also important: contributions of featural and configural processing to face recognition</article-title>. <source>Psychol. Sci.</source> <volume>11</volume>, <fpage>429</fpage>&#x02013;<lpage>433</lpage>.<pub-id pub-id-type="doi">10.1111/1467-9280.00283</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name></person-group> (<year>2008</year>). <article-title>Last but not least: famous faces as icons. The illusion of being an expert in the recognition of famous faces</article-title>. <source>Perception</source> <volume>37</volume>, <fpage>801</fpage>&#x02013;<lpage>806</lpage>.<pub-id pub-id-type="doi">10.1068/p5789</pub-id><pub-id pub-id-type="pmid">18605151</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name></person-group> (<year>2010</year>). <article-title>The cycle of preference: long-term dynamics of aesthetic appreciation</article-title>. <source>Acta Psychol. (Amst.)</source> <volume>134</volume>, <fpage>233</fpage>&#x02013;<lpage>244</lpage>.<pub-id pub-id-type="doi">10.1016/j.actpsy.2010.02.004</pub-id><pub-id pub-id-type="pmid">20236624</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Ditye</surname> <given-names>T.</given-names></name></person-group> (<year>2011</year>). <article-title>Sustained effects of adaptation on the perception of familiar faces</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>37</volume>, <fpage>615</fpage>&#x02013;<lpage>625</lpage>.<pub-id pub-id-type="doi">10.1037/a0019949</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Ditye</surname> <given-names>T.</given-names></name></person-group> (<year>2012</year>). <article-title>Face adaptation effects show strong and long-lasting transfer from lab to more ecological contexts</article-title>. <source>Front. Psychol.</source> <volume>3</volume>:<issue>3</issue>.<pub-id pub-id-type="doi">10.3389/fpsyg.2012.00003</pub-id><pub-id pub-id-type="pmid">22291676</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name></person-group> (<year>2005</year>). <article-title>Face adaptation: changing stable representations of familiar faces within minutes?</article-title> <source>Adv. Cogn. Psychol.</source> <volume>1</volume>, <fpage>1</fpage>&#x02013;<lpage>7</lpage>.<pub-id pub-id-type="doi">10.2478/v10053-008-0038-8</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name></person-group> (<year>2006</year>). <article-title>The Mona Lisa effect: is &#x02018;our&#x02019; Lisa fame or fake?</article-title> <source>Perception</source> <volume>35</volume>, <fpage>411</fpage>&#x02013;<lpage>414</lpage>.<pub-id pub-id-type="doi">10.1068/p5452</pub-id><pub-id pub-id-type="pmid">16619955</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name> <name><surname>Ditye</surname> <given-names>T.</given-names></name></person-group> (<year>2007a</year>). <article-title>When style matters. Art-specific adaptation effects</article-title>. <source>Perception</source> <volume>36</volume>, <fpage>17</fpage>.</citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Strobach</surname> <given-names>T.</given-names></name> <name><surname>Langton</surname> <given-names>S. R. H.</given-names></name> <name><surname>Hars&#x000E1;nyi</surname> <given-names>G.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name> <name><surname>Kov&#x000E1;cs</surname> <given-names>G.</given-names></name></person-group> (<year>2007b</year>). <article-title>Adaptation effects of highly familiar faces: immediate and long lasting</article-title>. <source>Mem. Cognit.</source> <volume>35</volume>, <fpage>1966</fpage>&#x02013;<lpage>1976</lpage>. <pub-id pub-id-type="pmid">18265612</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Yang</surname> <given-names>H.</given-names></name> <name><surname>Wang</surname> <given-names>A.</given-names></name> <name><surname>Fang</surname> <given-names>F.</given-names></name></person-group> (<year>2010</year>). <article-title>Perceptual consequences of face viewpoint adaptation: face viewpoint aftereffect, changes of differential sensitivity to face view, and their relationship</article-title>. <source>J. Vis.</source> <volume>10</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>.<pub-id pub-id-type="doi">10.1167/10.3.12</pub-id><pub-id pub-id-type="pmid">20377289</pub-id></citation></ref>
<ref id="B20"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2005</year>). <source>Fitting the Mind to the World: Adaptation and Aftereffects in High Level Vision</source>. <publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name> <name><surname>Wenderoth</surname> <given-names>P.</given-names></name> <name><surname>Spehar</surname> <given-names>B.</given-names></name></person-group> (<year>2000</year>). <article-title>A functional angle on some aftereffects in cortical vision</article-title>. <source>Proc. Biol. Sci.</source> <volume>267</volume>, <fpage>1705</fpage>&#x02013;<lpage>1710</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2000.1198</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Daelli</surname> <given-names>V.</given-names></name> <name><surname>van Rijsbergen</surname> <given-names>N. J.</given-names></name> <name><surname>Treves</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>How recent experience affects the perception of ambiguous objects</article-title>. <source>Brain Res.</source> <volume>1322</volume>, <fpage>81</fpage>&#x02013;<lpage>91</lpage>.<pub-id pub-id-type="doi">10.1016/j.brainres.2010.01.060</pub-id><pub-id pub-id-type="pmid">20122901</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fang</surname> <given-names>F.</given-names></name> <name><surname>Ijichi</surname> <given-names>K.</given-names></name> <name><surname>He</surname> <given-names>S.</given-names></name></person-group> (<year>2007</year>). <article-title>Transfer of the face viewpoint aftereffect from adaptation to different and inverted faces</article-title>. <source>J. Vis.</source> <volume>7</volume>, <fpage>1</fpage>&#x02013;<lpage>9</lpage>.<pub-id pub-id-type="doi">10.1167/7.13.6</pub-id><pub-id pub-id-type="pmid">17997634</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>C. J.</given-names></name> <name><surname>Barton</surname> <given-names>J. J. S.</given-names></name></person-group> (<year>2007</year>). <article-title>What is adapted in face adaptation? The neural representations of expression in the human visual system</article-title>. <source>Brain Res.</source> <volume>1127</volume>, <fpage>80</fpage>&#x02013;<lpage>89</lpage>.<pub-id pub-id-type="doi">10.1016/j.brainres.2006.09.104</pub-id><pub-id pub-id-type="pmid">17109830</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>C. J.</given-names></name> <name><surname>Oruc</surname> <given-names>I.</given-names></name> <name><surname>Barton</surname> <given-names>J. J. S.</given-names></name></person-group> (<year>2008</year>). <article-title>It doesn&#x02019;t matter how you feel. The facial identity aftereffect is invariant to changes in facial expression</article-title>. <source>J. Vis.</source> <volume>8</volume>, <fpage>1</fpage>&#x02013;<lpage>13</lpage>.<pub-id pub-id-type="doi">10.1167/8.3.11</pub-id><pub-id pub-id-type="pmid">18484817</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guo</surname> <given-names>X. M.</given-names></name> <name><surname>Oru&#x000E7;</surname> <given-names>I.</given-names></name> <name><surname>Barton</surname> <given-names>J. J. S.</given-names></name></person-group> (<year>2009</year>). <article-title>Cross-orientation transfer of adaptation for facial identity is asymmetric: a study using contrast-based recognition thresholds</article-title>. <source>Vision Res.</source> <volume>49</volume>, <fpage>2254</fpage>&#x02013;<lpage>2260</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2009.06.012</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hills</surname> <given-names>P. J.</given-names></name> <name><surname>Holland</surname> <given-names>A. M.</given-names></name> <name><surname>Lewis</surname> <given-names>M. B.</given-names></name></person-group> (<year>2010</year>). <article-title>Aftereffects for face attributes with different natural variability: children are more adaptable than adolescents</article-title>. <source>Cogn. Dev.</source> <volume>25</volume>, <fpage>278</fpage>&#x02013;<lpage>289</lpage>.<pub-id pub-id-type="doi">10.1016/j.cogdev.2010.01.002</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hole</surname> <given-names>G.</given-names></name></person-group> (<year>2011</year>). <article-title>Identity-specific face adaptation effects: evidence for abstractive face representations</article-title>. <source>Cognition</source> <volume>119</volume>, <fpage>216</fpage>&#x02013;<lpage>228</lpage>.<pub-id pub-id-type="doi">10.1016/j.cognition.2011.01.011</pub-id><pub-id pub-id-type="pmid">21316651</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hurlbert</surname> <given-names>A.</given-names></name></person-group> (<year>2001</year>). <article-title>Trading faces</article-title>. <source>Nat. Neurosci.</source> <volume>4</volume>, <fpage>3</fpage>&#x02013;<lpage>5</lpage>.<pub-id pub-id-type="doi">10.1038/82877</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaquet</surname> <given-names>E.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2008</year>). <article-title>Face aftereffects indicate dissociable, but not distinct, coding of male and female faces</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>34</volume>, <fpage>101</fpage>&#x02013;<lpage>112</lpage>.<pub-id pub-id-type="doi">10.1037/0096-1523.34.1.101</pub-id><pub-id pub-id-type="pmid">18248142</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaquet</surname> <given-names>E.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Hayward</surname> <given-names>W. G.</given-names></name></person-group> (<year>2007</year>). <article-title>Opposite aftereffects for Chinese and Caucasian faces are selective for social category information and not just physical face differences</article-title>. <source>Q. J. Exp. Psychol.</source> <volume>60</volume>, <fpage>1457</fpage>&#x02013;<lpage>1467</lpage>.<pub-id pub-id-type="doi">10.1080/17470210701467870</pub-id><pub-id pub-id-type="pmid">17853233</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaquet</surname> <given-names>E.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Hayward</surname> <given-names>W. G.</given-names></name></person-group> (<year>2008</year>). <article-title>Race-contingent aftereffects suggest distinct perceptual norms for different race faces</article-title>. <source>Vis. Cogn.</source> <volume>16</volume>, <fpage>734</fpage>&#x02013;<lpage>753</lpage>.<pub-id pub-id-type="doi">10.1080/13506280701350647</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Javadi</surname> <given-names>A. H.</given-names></name> <name><surname>Wee</surname> <given-names>N.</given-names></name></person-group> (<year>2012</year>). <article-title>Cross-category adaptation: objects produce gender adaptation in the perception of faces</article-title>. <source>PLoS ONE</source> <volume>9</volume>:<fpage>e46079</fpage>.<pub-id pub-id-type="doi">10.1371/journal.pone.0046079</pub-id><pub-id pub-id-type="pmid">23049942</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2011</year>). <article-title>Insights into the development of face recognition mechanisms revealed by face aftereffects</article-title>. <source>Br. J. Psychol.</source> <volume>102</volume>, <fpage>799</fpage>&#x02013;<lpage>815</lpage>.<pub-id pub-id-type="doi">10.1111/j.2044-8295.2011.02066.x</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Busey</surname> <given-names>T.</given-names></name></person-group> (<year>2006</year>). <article-title>View-specific coding of face shape</article-title>. <source>Psychol. Sci.</source> <volume>17</volume>, <fpage>501</fpage>&#x02013;<lpage>505</lpage>.<pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01735.x</pub-id><pub-id pub-id-type="pmid">16771800</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Busey</surname> <given-names>T.</given-names></name></person-group> (<year>2007</year>). <article-title>Broadly tuned, view-specific coding of face shape: opposing figural aftereffects can be induced in different views</article-title>. <source>Vision Res.</source> <volume>47</volume>, <fpage>3070</fpage>&#x02013;<lpage>3077</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2007.08.018</pub-id><pub-id pub-id-type="pmid">17920099</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jenkins</surname> <given-names>R.</given-names></name> <name><surname>Beaver</surname> <given-names>J. D.</given-names></name> <name><surname>Calder</surname> <given-names>A. J.</given-names></name></person-group> (<year>2006</year>). <article-title>I thought you were looking at me: direction-specific aftereffects in gaze perception</article-title>. <source>Psychol. Sci.</source> <volume>17</volume>, <fpage>506</fpage>&#x02013;<lpage>513</lpage>.<pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01736.x</pub-id><pub-id pub-id-type="pmid">16771801</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jiang</surname> <given-names>F.</given-names></name> <name><surname>Blanz</surname> <given-names>V.</given-names></name> <name><surname>O&#x02019;Toole</surname> <given-names>A. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Probing the visual representation of faces with adaptation: a view from the other side of the mean</article-title>. <source>Psychol. Sci.</source> <volume>17</volume>, <fpage>493</fpage>&#x02013;<lpage>500</lpage>.<pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01734.x</pub-id><pub-id pub-id-type="pmid">16771799</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Juricevic</surname> <given-names>I.</given-names></name> <name><surname>Webster</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>Selectivity of face aftereffects for expressions and anti-expressions</article-title>. <source>Front. Psychol.</source> <volume>3</volume>:<issue>4</issue>.<pub-id pub-id-type="doi">10.3389/fpsyg.2012.00004</pub-id><pub-id pub-id-type="pmid">22291677</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Karbach</surname> <given-names>J.</given-names></name> <name><surname>Kray</surname> <given-names>J.</given-names></name></person-group> (<year>2009</year>). <article-title>How useful is executive control training? Age differences in near and far transfer of task-switching training</article-title>. <source>Dev. Sci.</source> <volume>12</volume>, <fpage>978</fpage>&#x02013;<lpage>990</lpage>.<pub-id pub-id-type="doi">10.1111/j.1467-7687.2009.00846.x</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kloth</surname> <given-names>N.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2008</year>). <article-title>The temporal decay of eye gaze adaptation effects</article-title>. <source>J. Vis.</source> <volume>8</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>.<pub-id pub-id-type="doi">10.1167/8.11.4</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kloth</surname> <given-names>N.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Kov&#x000E1;cs</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural correlates of generic versus gender-specific face adaptation</article-title>. <source>J. Cogn. Neurosci.</source> <volume>22</volume>, <fpage>2345</fpage>&#x02013;<lpage>2356</lpage>.<pub-id pub-id-type="doi">10.1162/jocn.2009.21329</pub-id><pub-id pub-id-type="pmid">19702459</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>K&#x000F6;hler</surname> <given-names>W.</given-names></name> <name><surname>Wallach</surname> <given-names>H.</given-names></name></person-group> (<year>1944</year>). <article-title>Figural aftereffects: an investigation of visual processes</article-title>. <source>Proc. Am. Philos. Soc.</source> <volume>88</volume>, <fpage>269</fpage>&#x02013;<lpage>357</lpage>.</citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kov&#x000E1;cs</surname> <given-names>G.</given-names></name> <name><surname>Zimmer</surname> <given-names>M.</given-names></name> <name><surname>Bank&#x000F3;</surname> <given-names>&#x000C9;.</given-names></name> <name><surname>Harza</surname> <given-names>I.</given-names></name> <name><surname>Antal</surname> <given-names>A.</given-names></name> <name><surname>Vidny&#x000E1;nszky</surname> <given-names>Z.</given-names></name></person-group> (<year>2006</year>). <article-title>Electrophysiological correlates of visual adaptation to faces and body parts in humans</article-title>. <source>Cereb. Cortex</source> <volume>16</volume>, <fpage>742</fpage>&#x02013;<lpage>753</lpage>.<pub-id pub-id-type="doi">10.1093/cercor/bhj020</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kov&#x000E1;cs</surname> <given-names>G.</given-names></name> <name><surname>Zimmer</surname> <given-names>M.</given-names></name> <name><surname>Harza</surname> <given-names>I.</given-names></name> <name><surname>Vidny&#x000E1;nszky</surname> <given-names>Z.</given-names></name></person-group> (<year>2007</year>). <article-title>Adaptation duration affects the spatial selectivity of facial aftereffects</article-title>. <source>Vision Res.</source> <volume>47</volume>, <fpage>3141</fpage>&#x02013;<lpage>3149</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2007.08.019</pub-id><pub-id pub-id-type="pmid">17935749</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lai</surname> <given-names>M.</given-names></name> <name><surname>Oruc</surname> <given-names>I.</given-names></name> <name><surname>Barton</surname> <given-names>J. J. S.</given-names></name></person-group> (<year>2012</year>). <article-title>Facial age after-effects show partial identity invariance and transfer from hands to faces</article-title>. <source>Cortex</source> <volume>48</volume>, <fpage>477</fpage>&#x02013;<lpage>486</lpage>.<pub-id pub-id-type="doi">10.1016/j.cortex.2010.11.014</pub-id><pub-id pub-id-type="pmid">21208612</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Laurence</surname> <given-names>S.</given-names></name> <name><surname>Hole</surname> <given-names>G.</given-names></name></person-group> (<year>2012</year>). <article-title>Identity specific adaptation with composite faces</article-title>. <source>Vis. Cogn.</source> <volume>20</volume>, <fpage>109</fpage>&#x02013;<lpage>120</lpage>.<pub-id pub-id-type="doi">10.1080/13506285.2012.655805</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leder</surname> <given-names>H.</given-names></name> <name><surname>Carbon</surname> <given-names>C.-C.</given-names></name></person-group> (<year>2005</year>). <article-title>When context hinders! Learn-test compatibility in face recognition</article-title>. <source>Q. J. Exp. Psychol. A</source> <volume>58A</volume>, <fpage>235</fpage>&#x02013;<lpage>250</lpage>.<pub-id pub-id-type="pmid">15903116</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leder</surname> <given-names>H.</given-names></name> <name><surname>Carbon</surname> <given-names>C.-C.</given-names></name></person-group> (<year>2006</year>). <article-title>Face-specific configural processing of relational information</article-title>. <source>Br. J. Psychol.</source> <volume>97</volume>, <fpage>19</fpage>&#x02013;<lpage>29</lpage>.<pub-id pub-id-type="doi">10.1348/000712605X54794</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leopold</surname> <given-names>D. A.</given-names></name> <name><surname>O&#x02019;Toole</surname> <given-names>A. J.</given-names></name> <name><surname>Vetter</surname> <given-names>T.</given-names></name> <name><surname>Blanz</surname> <given-names>V.</given-names></name></person-group> (<year>2001</year>). <article-title>Prototype-referenced shape encoding revealed by high-level aftereffects</article-title>. <source>Nat. Neurosci.</source> <volume>4</volume>, <fpage>89</fpage>&#x02013;<lpage>94</lpage>.<pub-id pub-id-type="doi">10.1038/82947</pub-id><pub-id pub-id-type="pmid">11135650</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leopold</surname> <given-names>D. A.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>A comparative view of face perception</article-title>. <source>J. Comp. Psychol.</source> <volume>124</volume>, <fpage>233</fpage>&#x02013;<lpage>251</lpage>.<pub-id pub-id-type="doi">10.1037/a0019460</pub-id><pub-id pub-id-type="pmid">20695655</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leopold</surname> <given-names>D. A.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>M&#x000FC;ller</surname> <given-names>K. M.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name></person-group> (<year>2005</year>). <article-title>The dynamics of visual adaptation to faces</article-title>. <source>Proc. Biol. Sci.</source> <volume>272</volume>, <fpage>897</fpage>&#x02013;<lpage>904</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2004.3022</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>A.</given-names></name> <name><surname>Tzen</surname> <given-names>B.</given-names></name> <name><surname>Yadgarova</surname> <given-names>A.</given-names></name> <name><surname>Zaidi</surname> <given-names>Q.</given-names></name></person-group> (<year>2008a</year>). <article-title>Neural basis of 3-D shape aftereffects</article-title>. <source>Vision Res.</source> <volume>48</volume>, <fpage>244</fpage>&#x02013;<lpage>252</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2007.11.009</pub-id><pub-id pub-id-type="pmid">18166208</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>S.-C.</given-names></name> <name><surname>Schmiedek</surname> <given-names>F.</given-names></name> <name><surname>Huxhold</surname> <given-names>O.</given-names></name> <name><surname>R&#x000F6;cke</surname> <given-names>C.</given-names></name> <name><surname>Smith</surname> <given-names>J.</given-names></name> <name><surname>Lindenberger</surname> <given-names>U.</given-names></name></person-group> (<year>2008b</year>). <article-title>Working memory plasticity in old age: practice gain, transfer, and maintenance</article-title>. <source>Psychol. Aging</source> <volume>23</volume>, <fpage>731</fpage>&#x02013;<lpage>742</lpage>.<pub-id pub-id-type="doi">10.1037/a0014343</pub-id><pub-id pub-id-type="pmid">19140644</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Little</surname> <given-names>A.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name></person-group> (<year>2005</year>). <article-title>Sex-contingent face after-effects suggest distinct neural populations code male and female faces</article-title>. <source>Proc. Biol. Sci.</source> <volume>272</volume>, <fpage>2283</fpage>&#x02013;<lpage>2287</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2005.3220</pub-id><pub-id pub-id-type="pmid">16191641</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Little</surname> <given-names>A.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Waitt</surname> <given-names>C.</given-names></name></person-group> (<year>2008</year>). <article-title>Category contingent aftereffects for faces of different races, ages and species</article-title>. <source>Cognition</source> <volume>106</volume>, <fpage>1537</fpage>&#x02013;<lpage>1547</lpage>.<pub-id pub-id-type="doi">10.1016/j.cognition.2007.06.008</pub-id><pub-id pub-id-type="pmid">17707364</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>MacLin</surname> <given-names>O. H.</given-names></name> <name><surname>Webster</surname> <given-names>M. A.</given-names></name></person-group> (<year>2001</year>). <article-title>Influence of adaptation on the perception of distortions in natural images</article-title>. <source>J. Electron. Imaging</source> <volume>10</volume>, <fpage>100</fpage>&#x02013;<lpage>109</lpage>.<pub-id pub-id-type="doi">10.1117/1.1330573</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McKone</surname> <given-names>E.</given-names></name> <name><surname>Aitkin</surname> <given-names>A.</given-names></name> <name><surname>Edwards</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Categorical and coordinate relations in faces, or Fechner&#x02019;s law and face space instead?</article-title> <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>31</volume>, <fpage>1181</fpage>&#x02013;<lpage>1198</lpage>.<pub-id pub-id-type="doi">10.1037/0096-1523.31.6.1181</pub-id><pub-id pub-id-type="pmid">16366783</pub-id></citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morikawa</surname> <given-names>K.</given-names></name></person-group> (<year>2005</year>). <article-title>Adaptation to asymmetrically distorted faces and its lack of effect on mirror images</article-title>. <source>Vision Res.</source> <volume>45</volume>, <fpage>3180</fpage>&#x02013;<lpage>3188</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2005.08.004</pub-id><pub-id pub-id-type="pmid">16169038</pub-id></citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ng</surname> <given-names>M.</given-names></name> <name><surname>Boynton</surname> <given-names>G. M.</given-names></name> <name><surname>Fine</surname> <given-names>I.</given-names></name></person-group> (<year>2008</year>). <article-title>Face adaptation does not improve performance on search or discrimination tasks</article-title>. <source>J. Vis.</source> <volume>8</volume>, <fpage>1</fpage>&#x02013;<lpage>20</lpage>.<pub-id pub-id-type="doi">10.1167/8.1.1</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ng</surname> <given-names>M.</given-names></name> <name><surname>Ciaramitaro</surname> <given-names>V. M.</given-names></name> <name><surname>Anstis</surname> <given-names>S.</given-names></name> <name><surname>Boynton</surname> <given-names>G. M.</given-names></name> <name><surname>Fine</surname> <given-names>I.</given-names></name> <name><surname>Squire</surname> <given-names>L. R.</given-names></name></person-group> (<year>2006</year>). <article-title>Selectivity for the configural cues that identify the gender, ethnicity, and identity of faces in human cortex</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A.</source> <volume>103</volume>, <fpage>19552</fpage>&#x02013;<lpage>19557</lpage>.<pub-id pub-id-type="doi">10.1073/pnas.0605358104</pub-id><pub-id pub-id-type="pmid">17164335</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>O&#x02019;Neil</surname> <given-names>S. F.</given-names></name> <name><surname>Webster</surname> <given-names>M. A.</given-names></name></person-group> (<year>2011</year>). <article-title>Adaptation and the perception of facial age</article-title>. <source>Vis. Cogn.</source> <volume>19</volume>, <fpage>534</fpage>&#x02013;<lpage>550</lpage>.<pub-id pub-id-type="doi">10.1080/13506285.2011.561262</pub-id><pub-id pub-id-type="pmid">22215952</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Otten</surname> <given-names>M.</given-names></name> <name><surname>Banaji</surname> <given-names>M. R.</given-names></name></person-group> (<year>2012</year>). <article-title>Social categories shape the neural representation of emotion: evidence from a visual face adaptation task</article-title>. <source>Front. Integr. Neurosci.</source> <volume>6</volume>:<issue>9</issue>.<pub-id pub-id-type="doi">10.3389/fnint.2012.00009</pub-id><pub-id pub-id-type="pmid">22403531</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Palermo</surname> <given-names>R.</given-names></name> <name><surname>Rivolta</surname> <given-names>D.</given-names></name> <name><surname>Wilson</surname> <given-names>C. E.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name></person-group> (<year>2011</year>). <article-title>Adaptive face space coding in congenital prosopagnosia: typical figural aftereffects but abnormal identity aftereffects</article-title>. <source>Neuropsychologia</source> <volume>49</volume>, <fpage>3801</fpage>&#x02013;<lpage>3812</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2011.09.039</pub-id><pub-id pub-id-type="pmid">21986295</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>1988</year>). <article-title>Looking at faces: first-order and second-order features as determinants of facial appearance</article-title>. <source>Perception</source> <volume>17</volume>, <fpage>43</fpage>&#x02013;<lpage>63</lpage>.<pub-id pub-id-type="doi">10.1068/p170043</pub-id><pub-id pub-id-type="pmid">3205669</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Evangelista</surname> <given-names>E.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name></person-group> (<year>2009a</year>). <article-title>Orientation-sensitivity of face identity aftereffects</article-title>. <source>Vision Res.</source> <volume>49</volume>, <fpage>2379</fpage>&#x02013;<lpage>2385</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2009.07.010</pub-id><pub-id pub-id-type="pmid">19631682</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Louw</surname> <given-names>K.</given-names></name> <name><surname>Evangelista</surname> <given-names>E.</given-names></name></person-group> (<year>2009b</year>). <article-title>Perceptual adaptation to facial asymmetries</article-title>. <source>Psychon. Bull. Rev.</source> <volume>16</volume>, <fpage>503</fpage>&#x02013;<lpage>508</lpage>.<pub-id pub-id-type="doi">10.3758/PBR.16.3.503</pub-id><pub-id pub-id-type="pmid">19451376</pub-id></citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Jaquet</surname> <given-names>E.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Evangelista</surname> <given-names>E.</given-names></name> <name><surname>Keane</surname> <given-names>J.</given-names></name> <name><surname>Calder</surname> <given-names>A. J.</given-names></name></person-group> (<year>2011</year>). <article-title>Sex-specific norms code face identity</article-title>. <source>J. Vis.</source> <volume>11</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>.<pub-id pub-id-type="doi">10.1167/11.1.1</pub-id><pub-id pub-id-type="pmid">21199895</pub-id></citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name></person-group> (<year>2006</year>). <article-title>Adaptive norm-based coding of facial identity</article-title>. <source>Vision Res.</source> <volume>46</volume>, <fpage>2977</fpage>&#x02013;<lpage>2987</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2006.03.002</pub-id><pub-id pub-id-type="pmid">16647736</pub-id></citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name> <name><surname>Leopold</surname> <given-names>D. A.</given-names></name></person-group> (<year>2007</year>). <article-title>The timecourse of higher-level face aftereffects</article-title>. <source>Vision Res.</source> <volume>47</volume>, <fpage>2291</fpage>&#x02013;<lpage>2296</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2007.05.012</pub-id><pub-id pub-id-type="pmid">17619045</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Watson</surname> <given-names>T. L.</given-names></name> <name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name> <name><surname>Nakayama</surname> <given-names>K.</given-names></name></person-group> (<year>2003</year>). <article-title>Fitting the mind to the world: face adaptation and attractiveness aftereffects</article-title>. <source>Psychol. Sci.</source> <volume>14</volume>, <fpage>558</fpage>&#x02013;<lpage>566</lpage>.<pub-id pub-id-type="doi">10.1046/j.0956-7976.2003.psci_1465.x</pub-id><pub-id pub-id-type="pmid">14629686</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Watson</surname> <given-names>T. L.</given-names></name> <name><surname>Jeffery</surname> <given-names>L.</given-names></name> <name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name></person-group> (<year>2010</year>). <article-title>Perceptual adaptation helps us identify faces</article-title>. <source>Vision Res.</source> <volume>50</volume>, <fpage>963</fpage>&#x02013;<lpage>968</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2010.03.003</pub-id><pub-id pub-id-type="pmid">20214920</pub-id></citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Robbins</surname> <given-names>R.</given-names></name> <name><surname>McKone</surname> <given-names>E.</given-names></name> <name><surname>Edwards</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Aftereffects for face attributes with different natural variability: adapter position effects and neural models</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>33</volume>, <fpage>570</fpage>&#x02013;<lpage>592</lpage>.<pub-id pub-id-type="doi">10.1037/0096-1523.33.3.570</pub-id><pub-id pub-id-type="pmid">17563222</pub-id></citation></ref>
<ref id="B74"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Schwaninger</surname> <given-names>A.</given-names></name> <name><surname>Carbon</surname> <given-names>C.-C.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name></person-group> (<year>2003</year>). <article-title>&#x0201C;Expert face processing: specialization and constraints,&#x0201D;</article-title> in <source>The Development of Face Processing</source>, eds <person-group person-group-type="editor"><name><surname>Schwarzer</surname> <given-names>G.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name></person-group> (<publisher-loc>G&#x000F6;ttingen</publisher-loc>: <publisher-name>Hogrefe &#x00026; Huber</publisher-name>), <fpage>81</fpage>&#x02013;<lpage>97</lpage>.</citation></ref>
<ref id="B75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Kloth</surname> <given-names>N.</given-names></name> <name><surname>Jenkins</surname> <given-names>R.</given-names></name></person-group> (<year>2007</year>). <article-title>Are you looking at me? Neural correlates of gaze adaptation</article-title>. <source>Neuroreport</source> <volume>18</volume>, <fpage>693</fpage>&#x02013;<lpage>696</lpage>.<pub-id pub-id-type="pmid">17426601</pub-id></citation></ref>
<ref id="B76"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Z&#x000E4;ske</surname> <given-names>R.</given-names></name> <name><surname>Walther</surname> <given-names>C.</given-names></name> <name><surname>Golle</surname> <given-names>J.</given-names></name> <name><surname>Kov&#x000E1;cs</surname> <given-names>G.</given-names></name> <name><surname>Wiese</surname> <given-names>H.</given-names></name></person-group> (<year>2010</year>). <article-title>Young without plastic surgery: perceptual adaptation to the age of female and male faces</article-title>. <source>Vision Res.</source> <volume>50</volume>, <fpage>2570</fpage>&#x02013;<lpage>2576</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2010.08.017</pub-id><pub-id pub-id-type="pmid">20800608</pub-id></citation></ref>
<ref id="B77"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Strobach</surname> <given-names>T.</given-names></name> <name><surname>Ditye</surname> <given-names>T.</given-names></name> <name><surname>Carbon</surname> <given-names>C.-C.</given-names></name></person-group> (<year>2011</year>). <article-title>Long-term adaptation effects of highly familiar faces are modulated by adaptation duration</article-title>. <source>Perception</source> <volume>40</volume>, <fpage>1000</fpage>&#x02013;<lpage>1004</lpage>.<pub-id pub-id-type="doi">10.1068/p6986</pub-id></citation></ref>
<ref id="B78"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Strobach</surname> <given-names>T.</given-names></name> <name><surname>Frensch</surname> <given-names>P. A.</given-names></name> <name><surname>Schubert</surname> <given-names>T.</given-names></name></person-group> (<year>2012a</year>). <article-title>Video game practice optimizes executive control skills in dual-task and task switching situations</article-title>. <source>Acta Psychol. (Amst.)</source> <volume>140</volume>, <fpage>13</fpage>&#x02013;<lpage>24</lpage>.<pub-id pub-id-type="doi">10.1016/j.actpsy.2012.02.001</pub-id><pub-id pub-id-type="pmid">22426427</pub-id></citation></ref>
<ref id="B79"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Strobach</surname> <given-names>T.</given-names></name> <name><surname>Frensch</surname> <given-names>P. A.</given-names></name> <name><surname>Soutschek</surname> <given-names>A.</given-names></name> <name><surname>Schubert</surname> <given-names>T.</given-names></name></person-group> (<year>2012b</year>). <article-title>Investigation on the improvement and transfer of dual-task coordination skills</article-title>. <source>Psychol. Res.</source> <volume>76</volume>, <fpage>794</fpage>&#x02013;<lpage>811</lpage>.<pub-id pub-id-type="doi">10.1007/s00426-011-0381-0</pub-id></citation></ref>
<ref id="B80"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tanaka</surname> <given-names>J. W.</given-names></name> <name><surname>Farah</surname> <given-names>M. J.</given-names></name></person-group> (<year>1993</year>). <article-title>Parts and wholes in face recognition</article-title>. <source>Q. J. Exp. Psychol. A</source> <volume>46A</volume>, <fpage>225</fpage>&#x02013;<lpage>245</lpage>.<pub-id pub-id-type="pmid">8316637</pub-id></citation></ref>
<ref id="B81"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tanaka</surname> <given-names>J. W.</given-names></name> <name><surname>Sengco</surname> <given-names>J. A.</given-names></name></person-group> (<year>1997</year>). <article-title>Features and their configuration in face recognition</article-title>. <source>Mem. Cognit.</source> <volume>25</volume>, <fpage>583</fpage>&#x02013;<lpage>592</lpage>.<pub-id pub-id-type="pmid">9337578</pub-id></citation></ref>
<ref id="B82"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tillman</surname> <given-names>M.</given-names></name> <name><surname>Webster</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>Selectivity of face distortion aftereffects for differences in expression or gender</article-title>. <source>Front. Psychol.</source> <volume>3</volume>:<issue>14</issue>.<pub-id pub-id-type="doi">10.3389/fpsyg.2012.00014</pub-id></citation></ref>
<ref id="B83"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Valentine</surname> <given-names>T.</given-names></name></person-group> (<year>2001</year>). <article-title>&#x0201C;Face-space models of face recognition,&#x0201D;</article-title> in <source>Computational, Geometric, and Process Perspectives on Facial Cognition: Contexts and Challenges</source>, eds <person-group person-group-type="editor"><name><surname>Wenger</surname> <given-names>M. J.</given-names></name> <name><surname>Townsend</surname> <given-names>J. T.</given-names></name></person-group> (<publisher-loc>Mahwah, NJ</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates Publishers</publisher-name>), <fpage>83</fpage>&#x02013;<lpage>113</lpage>.</citation></ref>
<ref id="B84"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walton</surname> <given-names>B. R. P.</given-names></name> <name><surname>Hills</surname> <given-names>P. J.</given-names></name></person-group> (<year>2012</year>). <article-title>Face distortion aftereffects in personally familiar and unfamiliar faces</article-title>. <source>Front. Psychol.</source> <volume>3</volume>:<issue>258</issue>.<pub-id pub-id-type="doi">10.3389/fpsyg.2012.00258</pub-id><pub-id pub-id-type="pmid">22870069</pub-id></citation></ref>
<ref id="B85"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watson</surname> <given-names>T. L.</given-names></name> <name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name></person-group> (<year>2003</year>). <article-title>Pulling faces: an investigation of the face-distortion aftereffect</article-title>. <source>Perception</source> <volume>32</volume>, <fpage>1109</fpage>&#x02013;<lpage>1116</lpage>.<pub-id pub-id-type="doi">10.1068/p5082</pub-id></citation></ref>
<ref id="B86"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watson</surname> <given-names>T. L.</given-names></name> <name><surname>Clifford</surname> <given-names>C. W. G.</given-names></name></person-group> (<year>2006</year>). <article-title>Orientation dependence of the orientation-contingent face aftereffect</article-title>. <source>Vision Res.</source> <volume>46</volume>, <fpage>3422</fpage>&#x02013;<lpage>3429</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2006.03.026</pub-id><pub-id pub-id-type="pmid">16723149</pub-id></citation></ref>
<ref id="B87"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Webster</surname> <given-names>M. A.</given-names></name></person-group> (<year>2011</year>). <article-title>Adaptation and visual coding</article-title>. <source>J. Vis.</source> <volume>11</volume>, <fpage>1</fpage>&#x02013;<lpage>23</lpage>.<pub-id pub-id-type="doi">10.1167/11.5.3</pub-id></citation></ref>
<ref id="B88"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Webster</surname> <given-names>M. A.</given-names></name> <name><surname>Kaping</surname> <given-names>D.</given-names></name> <name><surname>Mizokami</surname> <given-names>Y.</given-names></name> <name><surname>Duhamel</surname> <given-names>P.</given-names></name></person-group> (<year>2004</year>). <article-title>Adaptation to natural facial categories</article-title>. <source>Nature</source> <volume>428</volume>, <fpage>557</fpage>&#x02013;<lpage>561</lpage>.<pub-id pub-id-type="doi">10.1038/nature02420</pub-id></citation></ref>
<ref id="B89"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Webster</surname> <given-names>M. A.</given-names></name> <name><surname>MacLeod</surname> <given-names>D. I. A.</given-names></name></person-group> (<year>2011</year>). <article-title>Visual adaptation and the perception of faces</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci.</source> <volume>366</volume>, <fpage>1702</fpage>&#x02013;<lpage>1725</lpage>.<pub-id pub-id-type="doi">10.1098/rstb.2010.0360</pub-id></citation></ref>
<ref id="B90"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Webster</surname> <given-names>M. A.</given-names></name> <name><surname>MacLin</surname> <given-names>O. H.</given-names></name></person-group> (<year>1999</year>). <article-title>Figural aftereffects in the perception of faces</article-title>. <source>Psychon. Bull. Rev.</source> <volume>6</volume>, <fpage>647</fpage>&#x02013;<lpage>653</lpage>. <pub-id pub-id-type="pmid">10682208</pub-id></citation></ref>
<ref id="B91"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Windhager</surname> <given-names>S.</given-names></name> <name><surname>Hutzler</surname> <given-names>F.</given-names></name> <name><surname>Carbon</surname> <given-names>C. C.</given-names></name> <name><surname>Oberzaucher</surname> <given-names>E.</given-names></name> <name><surname>Schaefer</surname> <given-names>K.</given-names></name> <name><surname>Thorstensen</surname> <given-names>T.</given-names></name> <etal/></person-group> (<year>2010</year>). <article-title>Laying eyes on headlights: eye tracking reveals facial features in cars</article-title>. <source>Coll. Antropol.</source> <volume>34</volume>, <fpage>1075</fpage>&#x02013;<lpage>1080</lpage>. <pub-id pub-id-type="pmid">20977106</pub-id></citation></ref>
<ref id="B92"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yamashita</surname> <given-names>J. A.</given-names></name> <name><surname>Hardy</surname> <given-names>J. L.</given-names></name> <name><surname>De Valois</surname> <given-names>K. K.</given-names></name> <name><surname>Webster</surname> <given-names>M. A.</given-names></name></person-group> (<year>2005</year>). <article-title>Stimulus selectivity of figural aftereffects for faces</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>31</volume>, <fpage>420</fpage>&#x02013;<lpage>437</lpage>.<pub-id pub-id-type="doi">10.1037/0096-1523.31.3.420</pub-id><pub-id pub-id-type="pmid">15982123</pub-id></citation></ref>
<ref id="B93"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>H.</given-names></name> <name><surname>Shen</surname> <given-names>J.</given-names></name> <name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Fang</surname> <given-names>F.</given-names></name></person-group> (<year>2011</year>). <article-title>Face adaptation improves gender discrimination</article-title>. <source>Vision Res.</source> <volume>51</volume>, <fpage>105</fpage>&#x02013;<lpage>110</lpage>.<pub-id pub-id-type="doi">10.1016/j.visres.2010.10.006</pub-id><pub-id pub-id-type="pmid">20937298</pub-id></citation></ref>
<ref id="B94"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Young</surname> <given-names>A. W.</given-names></name> <name><surname>Hellawell</surname> <given-names>D.</given-names></name> <name><surname>Hay</surname> <given-names>D. C.</given-names></name></person-group> (<year>1987</year>). <article-title>Configurational information in face perception</article-title>. <source>Perception</source> <volume>16</volume>, <fpage>747</fpage>&#x02013;<lpage>759</lpage>.<pub-id pub-id-type="doi">10.1068/p160747</pub-id></citation></ref>
<ref id="B95"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhao</surname> <given-names>L.</given-names></name> <name><surname>Chubb</surname> <given-names>C.</given-names></name></person-group> (<year>2001</year>). <article-title>The size-tuning of the face-distortion after-effect</article-title>. <source>Vision Res.</source> <volume>41</volume>, <fpage>2979</fpage>&#x02013;<lpage>2994</lpage>.<pub-id pub-id-type="doi">10.1016/S0042-6989(01)00202-4</pub-id><pub-id pub-id-type="pmid">11704237</pub-id></citation></ref>
</ref-list>
<fn-group>
<fn id="fn1">
<p><sup>1</sup>We refer to face adaptation effects as aftereffects (K&#x000F6;hler and Wallach, <xref ref-type="bibr" rid="B43">1944</xref>) at higher perceptual and cognitive levels (Carbon and Ditye, <xref ref-type="bibr" rid="B13">2011</xref>).</p></fn>
<fn id="fn2">
<p><sup>2</sup>In the present work, we assume that different facial images can originate from the same identity; for example, the same identity can produce face images that differ in emotional expression. In contrast, the same facial image of the same identity can yield different versions that vary in size or orientation.</p></fn>
</fn-group>
</back>
</article>
