<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="brief-report" dtd-version="2.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2023.1268972</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Human Neuroscience</subject>
<subj-group>
<subject>Brief Research Report</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Face processing and early event-related potentials: replications and novel findings</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes"><name><surname>Brunet</surname> <given-names>Nicolas M.</given-names></name><xref rid="c001" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/360646/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/formal-analysis/"/>
<role content-type="https://credit.niso.org/contributor-roles/funding-acquisition/"/>
<role content-type="https://credit.niso.org/contributor-roles/investigation/"/>
<role content-type="https://credit.niso.org/contributor-roles/methodology/"/>
<role content-type="https://credit.niso.org/contributor-roles/project-administration/"/>
<role content-type="https://credit.niso.org/contributor-roles/resources/"/>
<role content-type="https://credit.niso.org/contributor-roles/supervision/"/>
<role content-type="https://credit.niso.org/contributor-roles/visualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
</contrib-group>
<aff><institution>Department of Psychology, California State University, San Bernardino</institution>, <addr-line>San Bernardino, CA</addr-line>, <country>United States</country></aff>
<author-notes>
<fn fn-type="edited-by" id="fn0001">
<p>Edited by: Vilfredo De Pascalis, Sapienza University of Rome, Italy</p>
</fn>
<fn fn-type="edited-by" id="fn0002">
<p>Reviewed by: Carlo Lai, Sapienza University of Rome, Italy; Joseph Ciorciari, Swinburne University of Technology, Australia</p>
</fn>
<corresp id="c001">&#x002A;Correspondence: Nicolas M. Brunet, <email>nicolas.brunet@csusb.edu</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>25</day>
<month>10</month>
<year>2023</year>
</pub-date>
<pub-date pub-type="collection">
<year>2023</year>
</pub-date>
<volume>17</volume>
<elocation-id>1268972</elocation-id>
<history>
<date date-type="received">
<day>28</day>
<month>07</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>05</day>
<month>10</month>
<year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2023 Brunet.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Brunet</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>This research explores the sensitivity of early event-related potentials (ERPs) to facial stimuli, manipulating various facial features with the aim of unveiling the underlying neural mechanisms. Two experiments, each involving 15 undergraduate students, utilized a multidimensional stimulus set incorporating race, gender, age, emotional expression, face masks, and stimulus orientation. Findings highlight significant modulations of N170 and P200 amplitudes and latencies for specific attributes, replicating prior research and revealing novel insights. Notably, age-related facial feature variations, facial inversion, and the presence of face masks significantly impact neural responses. Several speculative explanations are proposed to elucidate these results. First, the findings lend support to the idea that the increased N170 amplitude observed with facial inversion is closely tied to the activation of object-sensitive neurons; this is further bolstered by a similar amplitude increase when masks (effectively objects) are added to faces. Second, the absence of an additional amplitude increase when inverting face images with face masks suggests that the relevant neural populations may have reached a saturation point, limiting further enhancement. Third, the latency delay induced by facial inversion in the N170 is even more pronounced in the subsequent ERP component, the P200, indicating that face inversion may impact multiple stages of face processing. Lastly, the significant increase in P200 amplitude, typically associated with face typicality, for masked faces aligns with previous research demonstrating elevated P200 amplitudes for scrambled faces. This suggests that obscured faces may be processed as typical, potentially representing a default state in face processing.</p>
</abstract>
<kwd-group>
<kwd>ERP</kwd>
<kwd>EEG</kwd>
<kwd>N170</kwd>
<kwd>P200</kwd>
<kwd>age</kwd>
<kwd>face inversion</kwd>
<kwd>face mask</kwd>
<kwd>amplitude</kwd>
</kwd-group>
<contract-num rid="cn1">R15NS121788</contract-num>
<contract-sponsor id="cn1">National Institutes of Health (NIH)/National Institute of Neurological Disorders and Stroke (NINDS)</contract-sponsor>
<counts>
<fig-count count="3"/>
<table-count count="1"/>
<equation-count count="0"/>
<ref-count count="58"/>
<page-count count="10"/>
<word-count count="7573"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Cognitive Neuroscience</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="sec1">
<label>1.</label>
<title>Introduction</title>
<p>Electroencephalography (EEG) plays a crucial role in cognitive neuroscience, particularly in the study of face processing (<xref ref-type="bibr" rid="ref41">Rossion, 2014</xref>). The location of the Fusiform Face Area (FFA), a region of the inferior temporal cortex specialized for facial stimuli, makes its activity comparatively accessible to scalp EEG sensors. Researchers use EEG to investigate FFA activity across a range of factors and cognitive processes: participants view facial and non-facial stimuli while performing tasks such as memory and identification. Averaging the EEG data yields event-related potentials (ERPs), characterized by positive and negative voltage deflections (ERP components) (<xref ref-type="bibr" rid="ref43">Rossion and Jacques, 2008</xref>). These components correspond to distinct neural sources and encapsulate different stages of face processing.</p>
<p>Key ERP components commonly addressed in early vision studies include the P1 (P100), a positive deflection occurring approximately 100 ms post-stimulus onset. It primarily reflects occipital lobe activity and exhibits heightened responses to facial stimuli (<xref ref-type="bibr" rid="ref22">Itier and Taylor, 2004</xref>; <xref ref-type="bibr" rid="ref20">Herrmann et al., 2005</xref>; <xref ref-type="bibr" rid="ref26">Kaltwasser et al., 2014</xref>; <xref ref-type="bibr" rid="ref31">Moradi et al., 2017</xref>; <xref ref-type="bibr" rid="ref13">Gantiva et al., 2020</xref>). The N170, a negative deflection within the 130&#x2013;200&#x2009;ms timeframe, originates from FFA, showing larger amplitudes for faces and a pronounced sensitivity to facial features and configurations (<xref ref-type="bibr" rid="ref25">Johnson, 2005</xref>; <xref ref-type="bibr" rid="ref44">Rossion and Jacques, 2011</xref>; <xref ref-type="bibr" rid="ref21">Hinojosa et al., 2015</xref>; <xref ref-type="bibr" rid="ref47">Schindler and Bublatzky, 2020</xref>). Lastly, the P2 (P200) component peaks around 150&#x2013;275&#x2009;ms post-stimulus onset, characterized by its sensitivity to attentional processes (<xref ref-type="bibr" rid="ref7">Carreti&#x00E9; et al., 2013</xref>) and facial prototypicality (<xref ref-type="bibr" rid="ref48">Schweinberger and Neumann, 2016</xref>).</p>
<p>Studying the Fusiform Face Area (FFA) is of utmost importance because it sheds light on a fundamental aspect of brain processing: the brain&#x2019;s specialization in processing specific information categories like faces, places, tools, and body parts, which likely extends to other sensory modalities and cognitive processes (<xref ref-type="bibr" rid="ref35">Pascual-Leone and Hamilton, 2001</xref>).</p>
<p>For a visual stimulus to elicit a face-sensitive N170 response, it must contain enough information, in terms of local elements and their arrangement, to create the perception of a face (<xref ref-type="bibr" rid="ref44">Rossion and Jacques, 2011</xref>). Given the cyclical resurgence of respiratory viruses, face masks are likely to remain in use. A recent study (<xref ref-type="bibr" rid="ref12">Freud et al., 2022</xref>) reports that adding masks significantly impairs face recognition, contradicting the notion of easy adaptation to a masked world. This raises the question of how mask-wearing affects early face processing and the retrieval of vital facial information such as age, gender, race/ethnicity, and facial expression, characteristics usually easily discernible in unmasked faces.</p>
<p>To that end, a set of facial stimuli was carefully curated, varying across five binary dimensions: gender (male/female), race (white/black), facial expression (happy/angry), age (young/old), and the presence or absence of a face mask. This approach not only facilitates the exploration of interactions among these variables but also offers the opportunity to reexamine prior research with divergent findings regarding the influence of these factors on P100 and N170 processes. Some studies, for instance, reported no discrimination by the N170 based on emotional expression (<xref ref-type="bibr" rid="ref9">Eimer and Holmes, 2002</xref>; <xref ref-type="bibr" rid="ref19">Herrmann et al., 2002</xref>; <xref ref-type="bibr" rid="ref10">Eimer et al., 2003</xref>), while others noted larger amplitudes in response to fearful faces (<xref ref-type="bibr" rid="ref3">Batty and Taylor, 2003</xref>; <xref ref-type="bibr" rid="ref55">Williams et al., 2006</xref>; <xref ref-type="bibr" rid="ref4">Blau et al., 2007</xref>; <xref ref-type="bibr" rid="ref29">Luo et al., 2010</xref>), some even at earlier latencies (<xref ref-type="bibr" rid="ref50">Walker et al., 2008</xref>). Social category modulation of the N170 has also produced mixed results, with some studies showing no effect (<xref ref-type="bibr" rid="ref6">Caldara et al., 2004</xref>; <xref ref-type="bibr" rid="ref18">He et al., 2009</xref>; <xref ref-type="bibr" rid="ref54">Wiese et al., 2009</xref>) and others showing increased N170 responses to other-race faces (<xref ref-type="bibr" rid="ref50">Walker et al., 2008</xref>). As for gender, no effects have been reported on the P100 or N170 components (<xref ref-type="bibr" rid="ref33">Mouchetant-Rostaing et al., 2000</xref>; <xref ref-type="bibr" rid="ref32">Mouchetant-Rostaing and Giard, 2003</xref>). 
Regarding facial age, both young and older participants exhibit heightened N170 amplitudes when presented with older faces compared to younger ones (<xref ref-type="bibr" rid="ref53">Wiese et al., 2008</xref>; <xref ref-type="bibr" rid="ref52">Wiese, 2012</xref>). However, the N170&#x2019;s sensitivity to age-related factors diminishes when age and race/ethnicity factors are presented concurrently, suggesting potential modulation by contextual or task-related variables (<xref ref-type="bibr" rid="ref52">Wiese, 2012</xref>). These findings indicate that N170 responsiveness to age differs from its reactivity to race/ethnicity. These discrepancies across studies likely stem from variations in stimulus characteristics, task demands, experimental design, and stimulus presentation, posing challenges for direct comparisons.</p>
<p>The findings from the 15 participants in the first experiment revealed a significant and substantial increase in both N170 and P200 activity for masked compared with unmasked faces, as elaborated in the Results and Discussion sections. This heightened N170 response resembles the pattern observed with inverted faces (<xref ref-type="bibr" rid="ref42">Rossion et al., 2000</xref>), which is thought to involve the additional engagement of object-sensitive neurons, a view supported by evidence from fMRI studies. These studies suggest that inverted faces become more object-like, eliciting stronger responses in object-sensitive brain regions (<xref ref-type="bibr" rid="ref58">Yovel and Kanwisher, 2005</xref>; <xref ref-type="bibr" rid="ref11">Epstein et al., 2006</xref>) while reducing activity in face-selective areas (<xref ref-type="bibr" rid="ref58">Yovel and Kanwisher, 2005</xref>; <xref ref-type="bibr" rid="ref30">Mazard et al., 2006</xref>). Given that face masks perceptually both add an object and reduce facial visibility, it is plausible that the hypothesis regarding inverted faces extends to masked faces. Consequently, it can be hypothesized that inverted and masked faces elicit comparable ERP patterns. To investigate this hypothesis, the original experiment was modified by introducing a new binary dimension: stimulus orientation (upright/inverted). To maintain the same number of trials as in the original experiment, the emotional expression dimension was restricted to &#x201C;happy&#x201D; faces only (see <xref rid="fig1" ref-type="fig">Figure 1A</xref>). For this second experiment, an additional 15 participants were recruited.</p>
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption>
<p>Experimental procedures and stimulus selection. <bold>(A)</bold> Illustration of experimental stimuli. This panel presents representative samples of the stimuli employed in two distinct experiments, each comprising a total of 416 stimuli. These stimuli were strategically chosen to investigate alterations in neural responses to facial stimuli, specifically examining the impact of face masks. The stimulus set was meticulously designed to encompass a balanced representation of various facial attributes, including an equitable distribution of both Caucasian and African American faces (dimension 1), male and female faces (dimension 2), as well as happy and angry facial expressions (dimension 3), drawn from the openly accessible RADIATE face database. To introduce an additional dimension, an &#x201C;older&#x201D; rendition of the faces was created using the photo editing application FaceApp. Furthermore, a fifth dimension (presence or absence of a mask) was introduced by digitally incorporating masks onto every image using Adobe Photoshop. For enhanced interpretability of the study results, a secondary experimental study was devised, which also featured inverted face images. Notably, the &#x201C;emotion&#x201D; dimension was excluded from the stimulus set in this second study to prevent an unwieldy number of trials. <bold>(B)</bold> Stimulus presentation and response task. All 416 stimuli are presented in random order. Participants are instructed to use a button box to indicate whether the displayed face is male or female. Faces are displayed for a minimum of 1&#x2009;s and remain on screen until a participant response is recorded or up to 4&#x2009;s if no response is detected. The intertrial interval lasts 1 s, during which a cross is displayed in the center of the monitor. <bold>(C)</bold> Topographic map with sensor locations. This panel illustrates the topographic map displaying the sensor locations utilized in both experiments. 
The reported study results are based on the averaged signal recorded from the sensors highlighted in yellow. However, topographic maps based on all sensors can be found in the <xref ref-type="supplementary-material" rid="SM1">Supplementary material</xref> section (see <xref ref-type="supplementary-material" rid="SM1">Supplementary Figures S1&#x2013;S5</xref>).</p>
</caption>
<graphic xlink:href="fnhum-17-1268972-g001.tif"/>
</fig>
<p>To address the multiple comparison problem, a concern that frequently affects ERP research and has been discussed thoroughly by <xref ref-type="bibr" rid="ref28">Luck and Gaspelin (2017)</xref>, the analysis relied on a single electrode combination for all assessments and on a robust non-parametric randomization test to evaluate variations in amplitude and latency across conditions (see <xref ref-type="supplementary-material" rid="SM1">Analysis</xref> in the <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref> section for more information).</p>
</sec>
<sec sec-type="materials|methods" id="sec2">
<label>2.</label>
<title>Materials and methods</title>
<sec id="sec3">
<label>2.1.</label>
<title>Subjects</title>
<p>This study comprised two experiments, each with 15 undergraduate students. Experiment 1 included 10 female and 5 male participants (mean age 24&#x2009;years); Experiment 2 included 12 female and 3 male participants (mean age 25&#x2009;years). All participants were psychology majors at California State University, San Bernardino, and received class credit for their participation. Each student provided informed consent prior to participation, and no student took part in more than one experiment.</p>
</sec>
<sec id="sec4">
<label>2.2.</label>
<title>Stimuli</title>
<p>To generate the image dataset for Experiment 1, original face images of 26 white and 26 black models (equally distributed across gender) were selected from the RADIATE database (<xref ref-type="bibr" rid="ref8">Conley et al., 2018</xref>). This database provides open-access face stimuli featuring racially and ethnically diverse models displaying various emotional expressions. Two emotional expressions, &#x201C;happy&#x201D; and &#x201C;angry,&#x201D; were chosen for each model, yielding a set of 104 unique face images. To expand the dataset, an AI aging filter (FaceApp) was employed to create an &#x201C;older&#x201D; version of each model, doubling the number of stimuli. The dataset was then doubled again by digitally adding a face mask to each image using Adobe Photoshop, for a total of 416 face images. For Experiment 2, the Experiment 1 dataset was halved by removing all &#x201C;angry&#x201D; faces and then restored to 416 images by adding an inverted version of each remaining face. In both experiments, the face images were thus equally divided across several attributes, including race (black/white), gender (male/female), age (young/old), and use of face mask (mask/no mask), with emotional expression (happy/angry) as a fifth attribute in Experiment 1 and face orientation (upright/inverted) in its place in Experiment 2. <xref rid="fig1" ref-type="fig">Figure 1A</xref> illustrates examples of the stimuli used in Experiments 1 and 2, featuring one black and one white male and female model and the stimuli derived from them.</p>
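As a sanity check, the factorial structure of the Experiment 1 stimulus set described above can be enumerated in a short Python sketch (the counts come from the text; the dimension names are illustrative placeholders):

```python
from itertools import product

# Hypothetical enumeration of the Experiment 1 stimulus set:
# 52 models (26 white + 26 black, balanced by gender), each crossed with
# two expressions, an AI-aged version, and a digitally added face mask.
models = 52
expressions = ["happy", "angry"]
ages = ["young", "old"]
masks = ["no mask", "mask"]

stimuli = [(m, e, a, k) for m, e, a, k
           in product(range(models), expressions, ages, masks)]
print(len(stimuli))  # 52 * 2 * 2 * 2 = 416
```

Dropping one binary dimension (expression) and adding another (orientation), as in Experiment 2, leaves the total at 416 by the same arithmetic.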
</sec>
<sec id="sec5">
<label>2.3.</label>
<title>Experimental procedure and EEG equipment</title>
<p>Participants wore a 64-channel EEG cap (BrainVision), of which 32 channels were used (sampling rate 500&#x2009;Hz), with the reference electrode placed at the FCz location of the standard 10&#x2013;20 EEG system. The EEG electrodes were connected to a BrainVision actiCHamp active channel amplifier (BrainVision) and checked for proper conductivity (impedance below 5&#x2009;k&#x03A9; for each electrode) before the recording started. During the recording session, participants were seated in a dimly lit, quiet room in front of a 19-inch Dell monitor positioned 50&#x2009;cm from their heads. The experiment presented 416 face images (see &#x201C;Stimuli&#x201D;) sequentially, each displayed at the center of the screen with a visual angle of 17&#x00B0;&#x2009;&#x00D7;&#x2009;23&#x00B0;. The order of the images was randomized to minimize potential biases. To ensure attention, participants were instructed to indicate the gender of each face using a button box (see <xref rid="fig1" ref-type="fig">Figure 1B</xref>). Each face was displayed for at least 1&#x2009;s and disappeared from the screen as soon as it was evaluated, followed by a 1-s inter-trial interval displaying a cross at the center of the screen before the next face appeared (see <xref rid="fig1" ref-type="fig">Figure 1B</xref>). Participants were encouraged to respond as quickly and accurately as possible. Any image not evaluated within 4&#x2009;s disappeared from the screen to maintain the experimental flow. The experimental paradigm was created using Experiment Builder by SR Research.</p>
</sec>
<sec id="sec6">
<label>2.4.</label>
<title>Analysis</title>
<sec id="sec7">
<label>2.4.1.</label>
<title>Stimulus presentation and data acquisition</title>
<p>Each stimulus used in Experiments 1 and 2 contained a small black square embedded in the bottom right corner, allowing precise timing information to be captured by a photodiode positioned on the screen. This setup ensured accurate timestamps for the onset and offset of the 416 face stimuli used in the experiments.</p>
</sec>
<sec id="sec8">
<label>2.4.2.</label>
<title>Data segmentation and preprocessing</title>
<p>The EEG data were analyzed offline using the FieldTrip Matlab toolbox (<xref ref-type="bibr" rid="ref34">Oostenveld et al., 2011</xref>). Noisy trials were removed using the FieldTrip data browser, and the photodiode timestamps were used to generate 416 data segments, each lasting 2&#x2009;s and spanning from 0.5&#x2009;s before stimulus onset to 1.5&#x2009;s after onset. This broader-than-needed range was selected to prevent edge effects caused by preprocessing. Subsequently, the raw data were band-pass filtered (3&#x2013;45&#x2009;Hz) and demeaned. The unusually high high-pass cutoff (3&#x2009;Hz) was selected because it effectively removed slow drifts observed in the EEG signal (<xref ref-type="bibr" rid="ref36">Pinto et al., 2019</xref>). DFT notch filters were applied at 60 and 120&#x2009;Hz.</p>
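The preprocessing chain just described (segment, band-pass 3&#x2013;45&#x2009;Hz, demean, notch at 60/120&#x2009;Hz) was performed in FieldTrip; the numpy-only sketch below is merely an illustrative analogue of those steps, using an idealized FFT-domain filter rather than FieldTrip's actual filter implementations:

```python
import numpy as np

FS = 500  # sampling rate (Hz), as reported


def preprocess(segment):
    """Illustrative analogue of the described pipeline for one 2-s EEG
    segment, shaped (n_channels, n_samples): 3-45 Hz band-pass, 60/120 Hz
    notches (redundant here, since both lie outside the pass-band, but
    mirrored from the text), and demeaning."""
    n = segment.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    spec = np.fft.rfft(segment, axis=-1)
    keep = (freqs >= 3) & (freqs <= 45)          # 3-45 Hz band-pass
    for f0 in (60, 120):                          # DFT notch filters
        keep &= ~np.isclose(freqs, f0)
    spec[..., ~keep] = 0
    out = np.fft.irfft(spec, n=n, axis=-1)
    return out - out.mean(axis=-1, keepdims=True)  # demean each channel
```

With a 2-s segment at 500&#x2009;Hz (1,000 samples), a 1&#x2009;Hz drift component falls below the 3&#x2009;Hz cutoff and is removed, while components inside the pass-band survive unchanged.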
</sec>
<sec id="sec9">
<label>2.4.3.</label>
<title>Electrode selection and grand average ERP waveforms</title>
<p>In alignment with established practices in the field (e.g., <xref ref-type="bibr" rid="ref14">Gao et al., 2009</xref>), ERP waveforms and their associated components were computed by averaging across specific occipito-temporal electrodes, namely P7, P3, O1, O2, P4, and P8 (as illustrated in <xref rid="fig1" ref-type="fig">Figure 1C</xref>, highlighted in yellow). A consistent electrode selection was maintained across all ERP components to prevent multiple implicit comparisons (<xref ref-type="bibr" rid="ref28">Luck and Gaspelin, 2017</xref>). However, data from all recorded sites were utilized to generate topographic maps, which can be found in the <xref ref-type="supplementary-material" rid="SM1">Supplementary material</xref>. Following the averaging of data for each pair of experimental conditions and each participant, grand average ERP waveforms were computed by averaging across the results from all 15 participants for each experiment. The shaded regions in <xref rid="fig2" ref-type="fig">Figures 2</xref>, <xref rid="fig3" ref-type="fig">3</xref> represent the standard error of the grand average, denoted as &#x00B1;SE. To ensure smooth curves and error bands, a 5-point moving average with a window of 0.010&#x2009;s was applied.</p>
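In code, the aggregation just described reduces to a mean and a standard error across participants, followed by light smoothing. The sketch below (numpy, with assumed array shapes and an assumed function name) illustrates the computation; it is not the authors' actual analysis script:

```python
import numpy as np


def grand_average(per_subject_erps, win=5):
    """Average per-subject ERP waveforms (already averaged over the
    P7/P3/O1/O2/P4/P8 cluster) across participants, return the grand
    average and its standard error, each smoothed with a win-point
    moving average (5 points = 10 ms at 500 Hz)."""
    data = np.asarray(per_subject_erps)            # (n_subjects, n_samples)
    ga = data.mean(axis=0)                          # grand average
    se = data.std(axis=0, ddof=1) / np.sqrt(data.shape[0])  # +/- SE band
    kernel = np.ones(win) / win
    smooth = lambda x: np.convolve(x, kernel, mode="same")
    return smooth(ga), smooth(se)
```

For N&#x2009;=&#x2009;15 participants, `se` corresponds to the shaded &#x00B1;SE bands shown in Figures 2, 3.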
<fig position="float" id="fig2">
<label>Figure 2</label>
<caption>
<p>ERP waveforms for Experiment 1. This figure presents the grand mean ERP waveforms, computed from a total of 15 datasets with 416 trials. The waveforms are grouped according to specific conditions: &#x201C;young/old&#x201D; <bold>(A)</bold>, &#x201C;white/black&#x201D; <bold>(B)</bold>, &#x201C;male/female&#x201D; <bold>(C)</bold>, &#x201C;mask/no mask&#x201D; <bold>(D)</bold>, the first and second half of the session <bold>(E)</bold>, and &#x201C;angry/happy&#x201D; <bold>(F)</bold>. To generate the conditions depicted in each panel, the electrode data from the left-side cluster (P7, P3, and O1) and the right-side cluster (P4, P8, and O2) were first averaged across all participants. The shaded areas in the graphs represent the standard error (+/&#x2212;) with <italic>N</italic>&#x2009;=&#x2009;15 participants. Significance testing along the waveforms for differences between the two conditions was conducted using paired <italic>t</italic>-tests (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref>) with an alpha level of 0.01 to account for multiple comparisons. Statistically significant differences are indicated by pink horizontal lines beneath the waveforms. For amplitude comparisons between the P100, N170, and P200 components of both conditions, a non-parametric randomization test was employed (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref> and <xref rid="tab1" ref-type="table">Table 1</xref>). Statistically significant differences are represented by one (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.05), two (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.01), or three (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.001) asterisks at the relevant locations.</p>
</caption>
<graphic xlink:href="fnhum-17-1268972-g002.tif"/>
</fig>
<fig position="float" id="fig3">
<label>Figure 3</label>
<caption>
<p>ERP waveforms for Experiment 2. This figure illustrates the grand mean ERP waveforms, derived from 15 datasets with 416 trials. Like <xref rid="fig2" ref-type="fig">Figure 2</xref>, the waveforms are grouped according to specific conditions: &#x201C;young/old&#x201D; <bold>(A)</bold>, &#x201C;white/black&#x201D; <bold>(B)</bold>, &#x201C;male/female&#x201D; <bold>(C)</bold>, &#x201C;mask/no mask&#x201D; <bold>(D)</bold>, and the first and second half of the session <bold>(E)</bold>. However, in Experiment 2, the analysis was performed separately for upright (left subpanels) and inverted (right subpanels) stimuli. <bold>(F)</bold> Displays the grand mean ERP waveforms for the &#x201C;mask vs. no mask&#x201D; conditions, both for upright stimuli (solid curves) and inverted stimuli (dotted curves). The shaded areas in the graphs represent the standard error (+/&#x2212;) with <italic>N</italic>&#x2009;=&#x2009;15 participants. Significance testing along the waveforms for differences between the two conditions <bold>(A&#x2013;E)</bold> was conducted using paired <italic>t</italic>-tests (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref>) with an alpha level of 0.01 to account for multiple comparisons. Statistically significant differences are indicated by pink horizontal lines beneath the waveforms. For amplitude comparisons between the P100, N170, and P200 components of each pair of conditions <bold>(A&#x2013;E)</bold> and for the two &#x201C;mask&#x201D; conditions or the two &#x201C;no mask&#x201D; conditions <bold>(F)</bold>, a non-parametric randomization test was employed (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref> and <xref rid="tab1" ref-type="table">Table 1</xref>). 
Statistically significant differences are represented by one (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.05), two (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.01), or three (<italic>p</italic>&#x2009;&#x003C;&#x2009;0.001) asterisks at the relevant locations. The asterisks in panel F are color-coded to distinguish between comparisons of the two &#x201C;mask&#x201D; conditions (blue) and the two &#x201C;no mask&#x201D; conditions (red).</p>
</caption>
<graphic xlink:href="fnhum-17-1268972-g003.tif"/>
</fig>
</sec>
<sec id="sec10">
<label>2.4.4.</label>
<title>Statistical analysis</title>
<p>To evaluate statistical distinctions along the ERP waveforms across experimental conditions, the study implemented a &#x201C;running <italic>p</italic>-value&#x201D; approach utilizing a paired <italic>t</italic>-test with a sample size of <italic>N</italic>&#x2009;=&#x2009;15. This method involved assessing variations in intervals of 0.010&#x2009;s (equivalent to 5 samples) throughout the ERP waveforms. Pink horizontal line segments were incorporated to indicate locations where <italic>p</italic>-values&#x2009;&#x003C;&#x2009;0.01 were observed beneath the curves for enhanced visual representation.</p>
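A minimal sketch of this running test, assuming subject-level ERPs stored as (participants &#x00D7; samples) arrays; the hardcoded critical value stands in for a p-value lookup (two-tailed t, df&#x2009;=&#x2009;14, &#x03B1;&#x2009;=&#x2009;0.01 &#x2248; 2.977), and the function and variable names are illustrative assumptions rather than the authors' code:

```python
import numpy as np

T_CRIT = 2.977  # two-tailed critical t for df = 14, alpha = 0.01


def running_significance(cond_a, cond_b, win=5):
    """Paired t-test on per-subject ERP differences in consecutive
    win-sample windows (5 samples = 10 ms at 500 Hz). Returns a boolean
    mask of windows where |t| exceeds the alpha = 0.01 criterion,
    i.e., the pink segments drawn beneath the waveforms."""
    diff = np.asarray(cond_a) - np.asarray(cond_b)  # (n_subjects, n_samples)
    n_subj, n_samp = diff.shape
    sig = np.zeros(n_samp // win, dtype=bool)
    for w in range(sig.size):
        d = diff[:, w * win:(w + 1) * win].mean(axis=1)  # per-subject window mean
        t = d.mean() / (d.std(ddof=1) / np.sqrt(n_subj))
        sig[w] = abs(t) > T_CRIT
    return sig
```

Testing window means rather than every sample keeps the number of comparisons per waveform modest, consistent with the 0.010-s intervals described above.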
<p>Although the study is structured around a factorial design with five independent variables, only three-way ANOVAs (using the Matlab function anovan.m) were used to investigate interactions between different pairs of independent variables with respect to the amplitudes of the P100, N170, and P200 (see <xref ref-type="supplementary-material" rid="SM1">Supplementary material</xref> for results). This choice was made because higher-dimensional ANOVAs would have required averaging over smaller numbers of trials, potentially leading to unstable ERP waveforms and associated components.</p>
<p>For the presentation of topographic maps illustrating the amplitudes of the various ERP components, the Matlab function plottopography.m was employed (available through the Mathworks file exchange). Detailed explanations regarding the derivation of ERP component peak values, utilized for both the ANOVA and the creation of topographic maps, are provided in the subsequent section.</p>
</sec>
<sec id="sec11">
<label>2.4.5.</label>
<title>Non-parametric approach for amplitude comparison</title>
<p>Amplitude differences in the ERP components (P100, N170, P200) between two experimental conditions were explored using a non-parametric approach involving 100,000 iterations. On each iteration, the data from the 416 trials of each participant were randomly divided into two pseudo-conditions, and grand average ERP waveforms were computed for each pseudo-condition by averaging across occipito-temporal electrodes and participants (see above). Amplitude differences for each pair of pseudo-ERP waveforms were then calculated within specific time windows: 80&#x2013;120&#x2009;ms (P100), 100&#x2013;200&#x2009;ms (N170), and 150&#x2013;300&#x2009;ms (P200). The resulting 100,000 amplitude differences were sorted from high to low, allowing the rank of the experimentally observed difference among the randomizations to be computed for each pair of experimental conditions and each ERP component of interest. For instance, a rank of 5 indicated that only 4 out of 100,000 randomizations yielded a greater amplitude difference, giving a probability of 5/100,000 (thus, <italic>p</italic>&#x2009;=&#x2009;0.00005). In Experiment 2, this non-parametric test was performed separately for trials featuring upright and inverted face images, as well as for trials featuring face images with and without masks (see <xref rid="tab1" ref-type="table">Table 1</xref>). An identical approach, using the same time windows for each ERP component, was also used to compute differences in latency between masked and unmasked faces, and between upright and inverted faces.</p>
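The core of this randomization test, including the rank-to-p-value convention (rank 5 &#x2192; p&#x2009;=&#x2009;5/100,000), can be sketched as follows. This is a simplified single-level illustration operating on precomputed component amplitudes per trial; the function name and this simplification are assumptions, not the authors' implementation, which recomputes grand-average waveforms across electrodes and participants on every iteration:

```python
import numpy as np


def randomization_p(trials, labels, n_iter=100_000, rng=None):
    """Rank the observed condition difference among n_iter label
    shuffles. trials: (n_trials,) component amplitudes; labels: boolean
    condition membership. Returns p = rank / n_iter, where rank is
    1 + the number of shuffles exceeding the observed difference."""
    rng = np.random.default_rng(rng)
    trials = np.asarray(trials, float)
    labels = np.asarray(labels, bool)
    observed = abs(trials[labels].mean() - trials[~labels].mean())
    null = np.empty(n_iter)
    for i in range(n_iter):
        perm = rng.permutation(labels)               # pseudo-conditions
        null[i] = abs(trials[perm].mean() - trials[~perm].mean())
    rank = int((null > observed).sum()) + 1          # rank 5 -> p = 0.00005
    return min(rank / n_iter, 1.0)
```

Because the labels are merely shuffled, the test makes no distributional assumptions about the amplitude differences, which is the property that motivates its use here.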
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption>
<p>Non-parametric randomization test results.</p>
</caption>
<table frame="hsides" rules="groups">
<tbody>
<tr>
<td align="left" valign="top">
<inline-graphic xlink:href="fnhum-17-1268972-i001.tif"/>
</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>A non-parametric test (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref> for details) was conducted with 100,000 iterations to investigate potential differences in the amplitudes of key ERP components (P100, N170, and P200) among various conditions. Each iteration involved dividing all trials of all participants into two pseudo-conditions, resulting in 100,000 amplitude differences for each ERP component. These randomized differences were then ranked from highest to lowest, and the experimentally observed differences were positioned among them. A rank of 1 indicated that none of the 100,000 randomizations resulted in a larger amplitude difference than the one measured. Based on these rankings, <italic>p</italic>-values were assigned. The table displays the rankings and <italic>p</italic>-values for each pair of conditions (as listed in the light blue shaded table header). The top part of the table presents the results obtained for Experiment 1, while the detached bottom part shows the results obtained for Experiment 2. In Experiment 2, separate analyses were performed for upright and inverted face stimuli, with the exception of the statistics provided in the (detached) last column, where the analysis was conducted separately for face stimuli with and without a mask. Each comparison in the table corresponds to a specific figure panel (figure references are provided). To enhance clarity, a color code was used: gray shading for <italic>p</italic>-values above 0.05, yellow for <italic>p</italic>-values smaller than 0.05 but larger than 0.01, orange for <italic>p</italic>-values smaller than 0.01 but larger than 0.001, and pink for <italic>p</italic>-values equal to or lower than 0.001. The direction (left/right) of significant <italic>p</italic>-values, although not explicitly reported in the table, can easily be inferred by consulting the associated figure panels.</p>
</table-wrap-foot>
</table-wrap>
</sec>
</sec>
</sec>
<sec sec-type="results" id="sec12">
<label>3.</label>
<title>Results</title>
<sec id="sec13">
<label>3.1.</label>
<title>Experiment 1</title>
<p>Participants viewed 416 images, which were presented one by one in random order. Their sole task was to identify whether each face image depicted a female or a male by using a button box (see <xref rid="fig1" ref-type="fig">Figure 1B</xref>).</p>
<p>The 416-image stimulus set was purposely designed to be split into two sets of 208 images each, based on five distinct face attributes (see <xref rid="fig1" ref-type="fig">Figure 1A</xref> for examples). Consequently, this division resulted in five pairs of grand averaged ERP waveforms. These pairs were categorized as follows: &#x201C;young&#x201D; and &#x201C;old&#x201D; (see <xref rid="fig2" ref-type="fig">Figure 2A</xref>), &#x201C;white Caucasian&#x201D; and &#x201C;Black African American&#x201D; (see <xref rid="fig2" ref-type="fig">Figure 2B</xref>), &#x201C;male&#x201D; and &#x201C;female&#x201D; (see <xref rid="fig2" ref-type="fig">Figure 2C</xref>), &#x201C;mask&#x201D; and &#x201C;no mask&#x201D; (see <xref rid="fig2" ref-type="fig">Figure 2D</xref>), and &#x201C;angry&#x201D; and &#x201C;happy&#x201D; (see <xref rid="fig2" ref-type="fig">Figure 2F</xref>). Additionally, for each participant, the trials were divided into two groups: those presented first (trials 1 to 208) and those presented last (trials 209&#x2013;416) (see <xref rid="fig2" ref-type="fig">Figure 2E</xref>).</p>
<p>To visualize and analyze the differences in the measured EEG signals between the selected pairs of conditions (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref>), the following features were included in each panel: a shaded error band representing &#x00B1; standard error for each waveform, short horizontal lines under each pair of waveforms to indicate the timepoints at which the waveforms statistically differed from each other (with alpha&#x2009;=&#x2009;0.01), and asterisks (see caption <xref rid="fig2" ref-type="fig">Figure 2</xref>) to indicate differences in amplitudes between the waveforms for the key ERP components (P100, N170, and P200) that are relevant to this study, based upon a non-parametric test (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref>).</p>
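<p>As an illustration of this timepoint-by-timepoint comparison, the Python sketch below assumes one average waveform per participant and condition and applies a paired <italic>t</italic>-test at each sample; the article does not state which pointwise test underlies the horizontal significance lines, so that choice is an assumption.</p>

```python
import numpy as np
from scipy import stats

def pointwise_comparison(waves_a, waves_b, alpha=0.01):
    """Timepoint-by-timepoint comparison of two ERP conditions.

    waves_a, waves_b : arrays of shape (n_participants, n_timepoints)
    holding each participant's average waveform for one condition.
    Returns (mean, sem) per condition, for plotting the waveform with a
    +/- standard-error band, plus a boolean mask marking the timepoints
    at which the two conditions differ at the given alpha.
    """
    def grand_average(w):
        mean = w.mean(axis=0)
        sem = w.std(axis=0, ddof=1) / np.sqrt(w.shape[0])  # +/- SEM band
        return mean, sem

    # Paired t-test at every timepoint across participants (assumed test).
    _, p = stats.ttest_rel(waves_a, waves_b, axis=0)
    return grand_average(waves_a), grand_average(waves_b), p < alpha
```

<p>The returned mask corresponds to the short horizontal lines drawn under each pair of waveforms; contiguous runs of significant samples indicate the intervals in which the conditions diverge.</p>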
<p>Results of 3-way ANOVAs exploring interactions among the binary dimensions of &#x201C;age,&#x201D; &#x201C;emotion,&#x201D; and &#x201C;mask,&#x201D; as well as &#x201C;gender,&#x201D; &#x201C;race,&#x201D; and &#x201C;mask,&#x201D; and their impact on the amplitudes of the P100, N170, and P200 components are presented in <xref ref-type="supplementary-material" rid="SM1">Supplementary Tables S1, S2</xref>. Complementing these findings, <xref ref-type="supplementary-material" rid="SM1">Supplementary Figures S1, S2</xref> show topographic maps depicting amplitude distributions for these components under various conditions.</p>
</sec>
<sec id="sec14">
<label>3.2.</label>
<title>Experiment 2</title>
<p>Following Experiment 1, a group of 15 new participants was recruited to replicate the study with notable modifications. Specifically, the &#x201C;emotional expression&#x201D; factor was omitted, and a new &#x201C;stimulus orientation&#x201D; dimension was introduced. These adjustments were informed by the findings of Experiment 1, which highlighted substantial differences, particularly in N170 and P200 amplitudes, between masked and unmasked faces. In contrast, no discernible distinctions were observed between &#x201C;angry&#x201D; and &#x201C;happy&#x201D; faces. This aligns with the concept that emotional effects on early vision may not necessarily signify an influence of cognitive processes (<xref ref-type="bibr" rid="ref39">Raftopoulos, 2023</xref>). Because similar amplitude differences have been reported for inverted faces (<xref ref-type="bibr" rid="ref42">Rossion et al., 2000</xref>), introducing face stimulus orientation provided an opportunity to compare and study the effect of both modulations, and hence learn more about the underlying neural mechanisms. For examples of the stimuli used in Experiment 2, and how they differ from those of Experiment 1, see <xref rid="fig1" ref-type="fig">Figure 1A</xref>.</p>
<p>Similar to the results shown for Experiment 1 (see <xref rid="fig2" ref-type="fig">Figure 2</xref>), the panels in <xref rid="fig3" ref-type="fig">Figures 3A</xref>&#x2013;<xref rid="fig3" ref-type="fig">E</xref> illustrate the results obtained from splitting the data based on one pair of experimental conditions. To examine the effect of inverting the stimuli, upright (left subpanels) and inverted (right subpanels) stimuli were analyzed separately. Additionally, to compare the effects of both masks and stimulus orientation, the data were split into four groups: no masks and masks, either upright or inverted (see <xref rid="fig3" ref-type="fig">Figure 3F</xref>).</p>
<p>The results, derived from a 3-way ANOVA, were directed toward investigating interactions among the binary dimensions of &#x201C;stimulus orientation&#x201D; and &#x201C;mask,&#x201D; along with either &#x201C;age&#x201D; (<xref ref-type="supplementary-material" rid="SM1">Supplementary Table S3</xref>), &#x201C;gender&#x201D; (<xref ref-type="supplementary-material" rid="SM1">Supplementary Table S4</xref>), or &#x201C;race&#x201D; (<xref ref-type="supplementary-material" rid="SM1">Supplementary Table S5</xref>), and their influence on the amplitudes of the P100 and N170 components (see <xref ref-type="supplementary-material" rid="SM1">Supplementary material</xref>). To complement these findings, <xref ref-type="supplementary-material" rid="SM1">Supplementary Figures S3&#x2013;S5</xref> present topographic maps illustrating amplitude distributions for these components under conditions involving masked vs. unmasked stimuli and upright vs. inverted faces.</p>
</sec>
<sec id="sec15">
<label>3.3.</label>
<title>Amplitude and latencies of the P100, N170, and P200</title>
<p><xref rid="tab1" ref-type="table">Table 1</xref> provides a comprehensive summary of amplitude differences for all ERP components considered in both Experiment 1 and Experiment 2. A non-parametric randomization test was employed to compare amplitudes between any two experimental conditions (see <xref ref-type="supplementary-material" rid="SM1">Materials and methods</xref>). Additionally, this test helped identify latency shifts (not shown in <xref rid="tab1" ref-type="table">Table 1</xref>) between different conditions for different ERP components. When analyzing the EEG response separately for inverted or upright face image stimuli, no latency shift was observed between masked and unmasked images. However, the response to inverted unmasked faces showed a significant delay compared to upright unmasked faces for the N170 (6&#x2009;ms, <italic>p</italic>&#x2009;=&#x2009;0.017) and the P200 (20&#x2009;ms, <italic>p</italic>&#x2009;&#x003C;&#x2009;0.00001). Compared to upright masked faces, inverted masked faces likewise showed a delay for the N170, albeit not a statistically significant one (4&#x2009;ms, <italic>p</italic>&#x2009;=&#x2009;0.08), and a significant delay for the P200 (20&#x2009;ms, <italic>p</italic>&#x2009;&#x003C;&#x2009;0.00001). For a more detailed interpretation of these findings, please see the &#x201C;Discussion&#x201D; section below.</p>
</sec>
</sec>
<sec sec-type="discussions" id="sec16">
<label>4.</label>
<title>Discussion</title>
<sec id="sec17">
<label>4.1.</label>
<title>Factors influencing the N170 amplitude and latencies</title>
<p>This study found that N170 amplitudes increased with the apparent complexity of processing faces. Older faces, for instance, known to be more challenging in terms of emotional expression recognition (<xref ref-type="bibr" rid="ref15">Grondhuis et al., 2021</xref>), elicited larger N170 responses, noted in both Experiments 1 and 2, consistent with prior research (<xref ref-type="bibr" rid="ref53">Wiese et al., 2008</xref>). Interestingly, the inversion of faces abolished this age-related difference (see <xref rid="fig3" ref-type="fig">Figure 3</xref>), although a separate study observed larger inversion effects for young faces compared to old faces (<xref ref-type="bibr" rid="ref53">Wiese et al., 2008</xref>). The presence of a face mask, which also imposes additional processing demands, was also found to increase N170 amplitudes and P200 responses in both experiments, aligning with recent studies reporting similar N170 increments due to face masks (<xref ref-type="bibr" rid="ref37">Prete et al., 2022</xref>; <xref ref-type="bibr" rid="ref38">Proverbio and Cerri, 2023</xref>). It&#x2019;s noteworthy, however, that another study (<xref ref-type="bibr" rid="ref1002">&#x017B;ochowska et al., 2022</xref>) did not observe changes in N170 amplitudes with face masks. Remarkably, none of these studies noted a corresponding increase in P200 amplitude, possibly indicative of cognitive task variations.</p>
<p>Another critical aspect impacting face processing is facial inversion, a phenomenon evident across all conditions in our study except for the &#x201C;masked face condition&#x201D; (see below). This effect, resulting in larger N170s, is well-documented (<xref ref-type="bibr" rid="ref42">Rossion et al., 2000</xref>; <xref ref-type="bibr" rid="ref45">Rousselet et al., 2004</xref>; <xref ref-type="bibr" rid="ref46">Sadeh and Yovel, 2010</xref>) and has also been observed to impact the N250 (<xref ref-type="bibr" rid="ref17">Hashemi et al., 2019</xref>; <xref ref-type="bibr" rid="ref1">Abreu et al., 2023</xref>), an ERP component that falls outside the scope of this study. The absence of familiarity with faces (<xref ref-type="bibr" rid="ref23">Ito and Urland, 2003</xref>), giving rise to the &#x201C;own race bias effect,&#x201D; has also been linked to heightened challenges in processing faces, resulting in larger N170 responses to other-race faces (<xref ref-type="bibr" rid="ref49">Sun et al., 2014</xref>; <xref ref-type="bibr" rid="ref57">Yao and Zhao, 2019</xref>). However, it&#x2019;s important to note that the current study did not identify any sensitivity of early ERP components to race or skin color. This absence of sensitivity can be attributed to the predominant representation of Latino participants in the sample, which may not offer the requisite diversity to thoroughly investigate own-race effects. Furthermore, gender-based differences may affect N170 amplitudes, as evidenced by larger N170s for male faces, primarily due to the gender imbalance in our participant pool. Nevertheless, this finding warrants further investigation with a more balanced participant pool.</p>
</sec>
<sec id="sec18">
<label>4.2.</label>
<title>Potential neural mechanisms for increased N170 response to inverted faces</title>
<p>One plausible explanation for the enhanced N170 response to inverted faces lies in the early recruitment of additional neural mechanisms, rather than a simple increase in activity within existing neural populations during the N170 time-window. These findings align with Rossion&#x2019;s hypothesis, which posits the involvement of object-sensitive neurons in augmenting the N170 amplitude observed for inverted faces (<xref ref-type="bibr" rid="ref42">Rossion et al., 2000</xref>). This hypothesis may also be extended to elucidate the increased N170 response observed in our study for masked faces, as masks themselves can be considered objects. Intriguingly, the study demonstrates that the addition of face masks has a comparable effect on N170 amplitude as inverting the maskless face stimulus. However, combining a mask and inversion did not lead to an additive increase, suggesting a potential neural saturation point.</p>
</sec>
<sec id="sec19">
<label>4.3.</label>
<title>Delayed N170 response for inverted faces</title>
<p>The delay in the N170 response to inverted faces is often linked to alterations in the spatial relationships among facial features. Additionally, an amplification of the N170 component and a corresponding shift in latency have been associated with a reduced ability to recognize faces. For instance, a study demonstrated that gradually rotating facial images from an upright to an upside-down position resulted in a declining ability to identify faces (<xref ref-type="bibr" rid="ref24">Jacques and Rossion, 2007</xref>). However, it&#x2019;s important to emphasize that the reduced face recognition ability alone may not completely explain the observed delay in both the N170 and P200 components. This becomes evident in the present study, where no such delay was observed for faces with face masks compared to unmasked faces, despite face masks also significantly impeding face identification (<xref ref-type="bibr" rid="ref12">Freud et al., 2022</xref>).</p>
</sec>
<sec id="sec20">
<label>4.4.</label>
<title>Factors influencing the P200 amplitude and latencies</title>
<p>The P200 component is well-established as being linked to configural face encoding, with more typical faces consistently yielding larger P200 amplitudes (<xref ref-type="bibr" rid="ref56">Wuttke and Schweinberger, 2019</xref>). Conversely, as deviations from the norm increase, relatively smaller P200 amplitudes are typically observed (<xref ref-type="bibr" rid="ref16">Halit et al., 2000</xref>; <xref ref-type="bibr" rid="ref27">Latinus and Taylor, 2006</xref>; <xref ref-type="bibr" rid="ref48">Schweinberger and Neumann, 2016</xref>). An intriguing revelation from our study is that both masked faces, as observed in our current investigation, and scrambled faces (<xref ref-type="bibr" rid="ref27">Latinus and Taylor, 2006</xref>), elicit substantial P200 amplitudes. In contrast, a similar robust increase in the P200 component was not observed in response to facial inversion. This intriguing finding suggests that faces lacking spatial and configurational information may be processed as highly typical or in a default state, with the amplitude modulation being influenced by the presence of spatial face information.</p>
<p>Moreover, the inversion of faces was found to induce delays in both N170 and P200 latencies. This phenomenon has been noted previously (<xref ref-type="bibr" rid="ref27">Latinus and Taylor, 2006</xref>), as well as in the case of Mooney faces compared to other face types. However, it is crucial to highlight that, unlike in our current study, previous research primarily emphasized the delay in N170 latency, as indicated by N170-to-P200 peak analyses. In contrast, this study revealed that the observed latency for inverted faces increased from 4&#x2013;6&#x2009;ms (N170) to approximately 20&#x2009;ms (P200), demonstrating that this delay is not limited to one specific processing stage.</p>
</sec>
<sec id="sec21">
<label>4.5.</label>
<title>Effects of neural adaptation on the N170 and P200 components</title>
<p>The investigation of diminished ERP components resulting from repetition effects typically involves comparing responses to two identical faces or faces sharing common attributes (e.g., identity) vs. responses to two distinct faces. For an in-depth review, see (<xref ref-type="bibr" rid="ref48">Schweinberger and Neumann, 2016</xref>). Previous studies (<xref ref-type="bibr" rid="ref2">Amihai et al., 2011</xref>; <xref ref-type="bibr" rid="ref51">Walther et al., 2013</xref>) have demonstrated that the N170 component is influenced when preceded by another face, regardless of whether the sequentially presented faces represent the same individual or different individuals. The present study provides further insight by showing that the categorical face adaptation effect accumulates over the duration of a session, leading to a gradual decline in N170 amplitude throughout the experimental session, as depicted in <xref rid="fig2" ref-type="fig">Figures 2E</xref>, <xref rid="fig3" ref-type="fig">3E</xref>. Even when upright and inverted face stimuli were interleaved and separately analyzed, the apparent &#x201C;neural adaptation effect&#x201D; remained consistent across sessions. It&#x2019;s noteworthy that this phenomenon has also been observed in data collected from the occipital lobe of rhesus macaque monkeys using non-face stimuli (<xref ref-type="bibr" rid="ref5">Brunet et al., 2014</xref>). Therefore, researchers utilizing block designs to compare experimental conditions should consider this effect to ensure the reliability and validity of their interpretations.</p>
</sec>
<sec id="sec22">
<label>4.6.</label>
<title>Limitations of the study</title>
<p>It is essential to acknowledge certain limitations in our study. The homogeneity of our participant pool, primarily consisting of college students identifying as Hispanic/Latine, may influence our findings. Furthermore, the gender distribution in the sample was skewed toward females, suggesting the need for a more balanced participant pool in future investigations.</p>
</sec>
</sec>
<sec sec-type="data-availability" id="sec23">
<title>Data availability statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p>
</sec>
<sec sec-type="ethics-statement" id="sec24">
<title>Ethics statement</title>
<p>The studies involving humans were approved by Institutional Review Board of the California State University San Bernardino. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.</p>
</sec>
<sec sec-type="author-contributions" id="sec25">
<title>Author contributions</title>
<p>NB: Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Visualization, Writing &#x2013; original draft, Writing &#x2013; review &#x0026; editing.</p>
</sec>
</body>
<back>
<sec sec-type="funding-information" id="sec26">
<title>Funding</title>
<p>The author declares financial support was received for the research, authorship, and/or publication of this article. NB received grant support from the National Institutes of Health (NIH)/National Institute of Neurological Disorders and Stroke (NINDS), R15NS121788.</p>
</sec>
<ack>
<p>NB would like to express appreciation to Krisha Orgo, Camryn Amundsen, Diana Guevara, Britney Aguirre, Emily Mendiola, Candace Perez, Marshina Brown, and Stephanie Villalpando for their valuable assistance with data collection.</p>
</ack>
<sec sec-type="COI-statement" id="sec27">
<title>Conflict of interest</title>
<p>The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="sec100" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<sec sec-type="supplementary-material" id="sec28">
<title>Supplementary material</title>
<p>The Supplementary material for this article can be found online at: <ext-link xlink:href="https://www.frontiersin.org/articles/10.3389/fnhum.2023.1268972/full#supplementary-material" ext-link-type="uri">https://www.frontiersin.org/articles/10.3389/fnhum.2023.1268972/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Data_Sheet_1.PDF" id="SM1" mimetype="application/pdf" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abreu</surname> <given-names>A. L.</given-names></name> <name><surname>Fern&#x00E1;ndez-Aguilar</surname> <given-names>L.</given-names></name> <name><surname>Ferreira-Santos</surname> <given-names>F.</given-names></name> <name><surname>Fernandes</surname> <given-names>C.</given-names></name></person-group> (<year>2023</year>). <article-title>Increased N250 elicited by facial familiarity: an ERP study including the face inversion effect and facial emotion processing</article-title>. <source>Neuropsychologia</source> <volume>188</volume>:<fpage>108623</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2023.108623</pub-id></citation></ref>
<ref id="ref2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amihai</surname> <given-names>I.</given-names></name> <name><surname>Deouell</surname> <given-names>L. Y.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Neural adaptation is related to face repetition irrespective of identity: a reappraisal of the N170 effect</article-title>. <source>Exp. Brain Res.</source> <volume>209</volume>, <fpage>193</fpage>&#x2013;<lpage>204</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00221-011-2546-x</pub-id>, PMID: <pub-id pub-id-type="pmid">21287156</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Batty</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Early processing of the six basic facial emotional expressions</article-title>. <source>Cogn. Brain Res.</source> <volume>17</volume>, <fpage>613</fpage>&#x2013;<lpage>620</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S0926-6410(03)00174-5</pub-id>, PMID: <pub-id pub-id-type="pmid">14561449</pub-id></citation></ref>
<ref id="ref4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blau</surname> <given-names>V. C.</given-names></name> <name><surname>Maurer</surname> <given-names>U.</given-names></name> <name><surname>Tottenham</surname> <given-names>N.</given-names></name> <name><surname>McCandliss</surname> <given-names>B. D.</given-names></name></person-group> (<year>2007</year>). <article-title>The face-specific N170 component is modulated by emotional facial expression</article-title>. <source>Behav. Brain Funct.</source> <volume>3</volume>, <fpage>1</fpage>&#x2013;<lpage>13</lpage>. doi: <pub-id pub-id-type="doi">10.1186/1744-9081-3-7</pub-id></citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brunet</surname> <given-names>N. M.</given-names></name> <name><surname>Bosman</surname> <given-names>C. A.</given-names></name> <name><surname>Vinck</surname> <given-names>M.</given-names></name> <name><surname>Roberts</surname> <given-names>M.</given-names></name> <name><surname>Oostenveld</surname> <given-names>R.</given-names></name> <name><surname>Desimone</surname> <given-names>R.</given-names></name> <etal/></person-group>. (<year>2014</year>). <article-title>Stimulus repetition modulates gamma-band synchronization in primate visual cortex</article-title>. <source>Proc. Natl. Acad. Sci.</source> <volume>111</volume>, <fpage>3626</fpage>&#x2013;<lpage>3631</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.1309714111</pub-id>, PMID: <pub-id pub-id-type="pmid">24554080</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caldara</surname> <given-names>R.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Bovet</surname> <given-names>P.</given-names></name> <name><surname>Hauert</surname> <given-names>C.-A.</given-names></name></person-group> (<year>2004</year>). <article-title>Event-related potentials and time course of the &#x2018;other-race&#x2019; face classification advantage</article-title>. <source>Neuroreport</source> <volume>15</volume>, <fpage>905</fpage>&#x2013;<lpage>910</lpage>. doi: <pub-id pub-id-type="doi">10.1097/00001756-200404090-00034</pub-id>, PMID: <pub-id pub-id-type="pmid">15073540</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carreti&#x00E9;</surname> <given-names>L.</given-names></name> <name><surname>Kessel</surname> <given-names>D.</given-names></name> <name><surname>Carboni</surname> <given-names>A.</given-names></name> <name><surname>L&#x00F3;pez-Mart&#x00ED;n</surname> <given-names>S.</given-names></name> <name><surname>Albert</surname> <given-names>J.</given-names></name> <name><surname>Tapia</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Exogenous attention to facial vs non-facial emotional visual stimuli</article-title>. <source>Soc. Cogn. Affect. Neurosci.</source> <volume>8</volume>, <fpage>764</fpage>&#x2013;<lpage>773</lpage>. doi: <pub-id pub-id-type="doi">10.1093/scan/nss068</pub-id>, PMID: <pub-id pub-id-type="pmid">22689218</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Conley</surname> <given-names>M. I.</given-names></name> <name><surname>Dellarco</surname> <given-names>D. V.</given-names></name> <name><surname>Rubien-Thomas</surname> <given-names>E.</given-names></name> <name><surname>Cohen</surname> <given-names>A. O.</given-names></name> <name><surname>Cervera</surname> <given-names>A.</given-names></name> <name><surname>Tottenham</surname> <given-names>N.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>The racially diverse affective expression (RADIATE) face stimulus set</article-title>. <source>Psychiatry Res.</source> <volume>270</volume>, <fpage>1059</fpage>&#x2013;<lpage>1067</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.psychres.2018.04.066</pub-id>, PMID: <pub-id pub-id-type="pmid">29910020</pub-id></citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Holmes</surname> <given-names>A.</given-names></name></person-group> (<year>2002</year>). <article-title>An ERP study on the time course of emotional face processing</article-title>. <source>Neuroreport</source> <volume>13</volume>, <fpage>427</fpage>&#x2013;<lpage>431</lpage>. doi: <pub-id pub-id-type="doi">10.1097/00001756-200203250-00013</pub-id>, PMID: <pub-id pub-id-type="pmid">11930154</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Holmes</surname> <given-names>A.</given-names></name> <name><surname>McGlone</surname> <given-names>F. P.</given-names></name></person-group> (<year>2003</year>). <article-title>The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>3</volume>, <fpage>97</fpage>&#x2013;<lpage>110</lpage>. doi: <pub-id pub-id-type="doi">10.3758/CABN.3.2.97</pub-id>, PMID: <pub-id pub-id-type="pmid">12943325</pub-id></citation></ref>
<ref id="ref11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Epstein</surname> <given-names>R. A.</given-names></name> <name><surname>Higgins</surname> <given-names>J. S.</given-names></name> <name><surname>Parker</surname> <given-names>W.</given-names></name> <name><surname>Aguirre</surname> <given-names>G. K.</given-names></name> <name><surname>Cooperman</surname> <given-names>S.</given-names></name></person-group> (<year>2006</year>). <article-title>Cortical correlates of face and scene inversion: a comparison</article-title>. <source>Neuropsychologia</source> <volume>44</volume>, <fpage>1145</fpage>&#x2013;<lpage>1158</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2005.10.009</pub-id>, PMID: <pub-id pub-id-type="pmid">16303149</pub-id></citation></ref>
<ref id="ref12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freud</surname> <given-names>E.</given-names></name> <name><surname>Di Giammarino</surname> <given-names>D.</given-names></name> <name><surname>Stajduhar</surname> <given-names>A.</given-names></name> <name><surname>Rosenbaum</surname> <given-names>R. S.</given-names></name> <name><surname>Avidan</surname> <given-names>G.</given-names></name> <name><surname>Ganel</surname> <given-names>T.</given-names></name></person-group> (<year>2022</year>). <article-title>Recognition of masked faces in the era of the pandemic: no improvement despite extensive natural exposure</article-title>. <source>Psychol. Sci.</source> <volume>33</volume>, <fpage>1635</fpage>&#x2013;<lpage>1650</lpage>. doi: <pub-id pub-id-type="doi">10.1177/09567976221105459</pub-id></citation></ref>
<ref id="ref13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gantiva</surname> <given-names>C.</given-names></name> <name><surname>Sotaquir&#x00E1;</surname> <given-names>M.</given-names></name> <name><surname>Araujo</surname> <given-names>A.</given-names></name> <name><surname>Cuervo</surname> <given-names>P.</given-names></name></person-group> (<year>2020</year>). <article-title>Cortical processing of human and emoji faces: an ERP analysis</article-title>. <source>Behav. Inf. Technol.</source> <volume>39</volume>, <fpage>935</fpage>&#x2013;<lpage>943</lpage>. doi: <pub-id pub-id-type="doi">10.1080/0144929X.2019.1632933</pub-id></citation></ref>
<ref id="ref14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gao</surname> <given-names>L.</given-names></name> <name><surname>Xu</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>B.</given-names></name> <name><surname>Zhao</surname> <given-names>L.</given-names></name> <name><surname>Harel</surname> <given-names>A.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name></person-group> (<year>2009</year>). <article-title>Aging effects on early-stage face perception: an ERP study</article-title>. <source>Psychophysiology</source> <volume>46</volume>, <fpage>970</fpage>&#x2013;<lpage>983</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1469-8986.2009.00853.x</pub-id>, PMID: <pub-id pub-id-type="pmid">19558400</pub-id></citation></ref>
<ref id="ref15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grondhuis</surname> <given-names>S. N.</given-names></name> <name><surname>Jimmy</surname> <given-names>A.</given-names></name> <name><surname>Teague</surname> <given-names>C.</given-names></name> <name><surname>Brunet</surname> <given-names>N. M.</given-names></name></person-group> (<year>2021</year>). <article-title>Having difficulties reading the facial expression of older individuals? Blame it on the facial muscles, not the wrinkles</article-title>. <source>Front. Psychol.</source> <volume>12</volume>:<fpage>620768</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2021.620768</pub-id>, PMID: <pub-id pub-id-type="pmid">34149508</pub-id></citation></ref>
<ref id="ref16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Halit</surname> <given-names>H.</given-names></name> <name><surname>de Haan</surname> <given-names>M.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name></person-group> (<year>2000</year>). <article-title>Modulation of event-related potentials by prototypical and atypical faces</article-title>. <source>Neuroreport</source> <volume>11</volume>, <fpage>1871</fpage>&#x2013;<lpage>1875</lpage>. doi: <pub-id pub-id-type="doi">10.1097/00001756-200006260-00014</pub-id>, PMID: <pub-id pub-id-type="pmid">10884035</pub-id></citation></ref>
<ref id="ref17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hashemi</surname> <given-names>A.</given-names></name> <name><surname>Pachai</surname> <given-names>M. V.</given-names></name> <name><surname>Bennett</surname> <given-names>P. J.</given-names></name> <name><surname>Sekuler</surname> <given-names>A. B.</given-names></name></person-group> (<year>2019</year>). <article-title>The role of horizontal facial structure on the N170 and N250</article-title>. <source>Vis. Res.</source> <volume>157</volume>, <fpage>12</fpage>&#x2013;<lpage>23</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.visres.2018.02.006</pub-id>, PMID: <pub-id pub-id-type="pmid">29555299</pub-id></citation></ref>
<ref id="ref18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>He</surname> <given-names>Y.</given-names></name> <name><surname>Johnson</surname> <given-names>M. K.</given-names></name> <name><surname>Dovidio</surname> <given-names>J. F.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>2009</year>). <article-title>The relation between race-related implicit associations and scalp-recorded neural activity evoked by faces from different races</article-title>. <source>Soc. Neurosci.</source> <volume>4</volume>, <fpage>426</fpage>&#x2013;<lpage>442</lpage>. doi: <pub-id pub-id-type="doi">10.1080/17470910902949184</pub-id>, PMID: <pub-id pub-id-type="pmid">19562628</pub-id></citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrmann</surname> <given-names>M. J.</given-names></name> <name><surname>Aranda</surname> <given-names>D.</given-names></name> <name><surname>Ellgring</surname> <given-names>H.</given-names></name> <name><surname>Mueller</surname> <given-names>T. J.</given-names></name> <name><surname>Strik</surname> <given-names>W. K.</given-names></name> <name><surname>Heidrich</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2002</year>). <article-title>Face-specific event-related potential in humans is independent from facial expression</article-title>. <source>Int. J. Psychophysiol.</source> <volume>45</volume>, <fpage>241</fpage>&#x2013;<lpage>244</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S0167-8760(02)00033-8</pub-id>, PMID: <pub-id pub-id-type="pmid">12208530</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrmann</surname> <given-names>M. J.</given-names></name> <name><surname>Ehlis</surname> <given-names>A.-C.</given-names></name> <name><surname>Ellgring</surname> <given-names>H.</given-names></name> <name><surname>Fallgatter</surname> <given-names>A. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Early stages (P100) of face perception in humans as measured with event-related potentials (ERPs)</article-title>. <source>J. Neural Transm.</source> <volume>112</volume>, <fpage>1073</fpage>&#x2013;<lpage>1081</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00702-004-0250-8</pub-id>, PMID: <pub-id pub-id-type="pmid">15583954</pub-id></citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hinojosa</surname> <given-names>J. A.</given-names></name> <name><surname>Mercado</surname> <given-names>F.</given-names></name> <name><surname>Carreti&#x00E9;</surname> <given-names>L.</given-names></name></person-group> (<year>2015</year>). <article-title>N170 sensitivity to facial expression: a meta-analysis</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>55</volume>, <fpage>498</fpage>&#x2013;<lpage>509</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neubiorev.2015.06.002</pub-id>, PMID: <pub-id pub-id-type="pmid">26067902</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2004</year>). <article-title>N170 or N1? Spatiotemporal differences between object and face processing using ERPs</article-title>. <source>Cereb. Cortex</source> <volume>14</volume>, <fpage>132</fpage>&#x2013;<lpage>142</lpage>. doi: <pub-id pub-id-type="doi">10.1093/cercor/bhg111</pub-id></citation></ref>
<ref id="ref23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ito</surname> <given-names>T. A.</given-names></name> <name><surname>Urland</surname> <given-names>G. R.</given-names></name></person-group> (<year>2003</year>). <article-title>Race and gender on the brain: electrocortical measures of attention to the race and gender of multiply categorizable individuals</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>85</volume>, <fpage>616</fpage>&#x2013;<lpage>626</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.85.4.616</pub-id>, PMID: <pub-id pub-id-type="pmid">14561116</pub-id></citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jacques</surname> <given-names>C.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2007</year>). <article-title>Early electrophysiological responses to multiple face orientations correlate with individual discrimination performance in humans</article-title>. <source>NeuroImage</source> <volume>36</volume>, <fpage>863</fpage>&#x2013;<lpage>876</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.04.016</pub-id>, PMID: <pub-id pub-id-type="pmid">17500010</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>M. H.</given-names></name></person-group> (<year>2005</year>). <article-title>Subcortical face processing</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>6</volume>, <fpage>766</fpage>&#x2013;<lpage>774</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nrn1766</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kaltwasser</surname> <given-names>L.</given-names></name> <name><surname>Hildebrandt</surname> <given-names>A.</given-names></name> <name><surname>Recio</surname> <given-names>G.</given-names></name> <name><surname>Wilhelm</surname> <given-names>O.</given-names></name> <name><surname>Sommer</surname> <given-names>W.</given-names></name></person-group> (<year>2014</year>). <article-title>Neurocognitive mechanisms of individual differences in face cognition: a replication and extension</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>14</volume>, <fpage>861</fpage>&#x2013;<lpage>878</lpage>. doi: <pub-id pub-id-type="doi">10.3758/s13415-013-0234-y</pub-id>, PMID: <pub-id pub-id-type="pmid">24379165</pub-id></citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Latinus</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Face processing stages: impact of difficulty and the separation of effects</article-title>. <source>Brain Res.</source> <volume>1123</volume>, <fpage>179</fpage>&#x2013;<lpage>187</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.brainres.2006.09.031</pub-id>, PMID: <pub-id pub-id-type="pmid">17054923</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luck</surname> <given-names>S. J.</given-names></name> <name><surname>Gaspelin</surname> <given-names>N.</given-names></name></person-group> (<year>2017</year>). <article-title>How to get statistically significant effects in any ERP experiment (and why you shouldn&#x2019;t)</article-title>. <source>Psychophysiology</source> <volume>54</volume>, <fpage>146</fpage>&#x2013;<lpage>157</lpage>. doi: <pub-id pub-id-type="doi">10.1111/psyp.12639</pub-id>, PMID: <pub-id pub-id-type="pmid">28000253</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luo</surname> <given-names>W.</given-names></name> <name><surname>Feng</surname> <given-names>W.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name> <name><surname>Wang</surname> <given-names>N.-Y.</given-names></name> <name><surname>Luo</surname> <given-names>Y.-J.</given-names></name></person-group> (<year>2010</year>). <article-title>Three stages of facial expression processing: ERP study with rapid serial visual presentation</article-title>. <source>NeuroImage</source> <volume>49</volume>, <fpage>1857</fpage>&#x2013;<lpage>1867</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.09.018</pub-id>, PMID: <pub-id pub-id-type="pmid">19770052</pub-id></citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mazard</surname> <given-names>A.</given-names></name> <name><surname>Schiltz</surname> <given-names>C.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2006</year>). <article-title>Recovery from adaptation to facial identity is larger for upright than inverted faces in the human occipito-temporal cortex</article-title>. <source>Neuropsychologia</source> <volume>44</volume>, <fpage>912</fpage>&#x2013;<lpage>922</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2005.08.015</pub-id>, PMID: <pub-id pub-id-type="pmid">16229867</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moradi</surname> <given-names>A.</given-names></name> <name><surname>Mehrinejad</surname> <given-names>S. A.</given-names></name> <name><surname>Ghadiri</surname> <given-names>M.</given-names></name> <name><surname>Rezaei</surname> <given-names>F.</given-names></name></person-group> (<year>2017</year>). <article-title>Event-related potentials of bottom-up and top-down processing of emotional faces</article-title>. <source>Basic Clin. Neurosci.</source> <volume>8</volume>, <fpage>27</fpage>&#x2013;<lpage>36</lpage>. doi: <pub-id pub-id-type="doi">10.15412/J.BCN.03080104</pub-id>, PMID: <pub-id pub-id-type="pmid">28446947</pub-id></citation></ref>
<ref id="ref32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mouchetant-Rostaing</surname> <given-names>Y.</given-names></name> <name><surname>Giard</surname> <given-names>M.-H.</given-names></name></person-group> (<year>2003</year>). <article-title>Electrophysiological correlates of age and gender perception on human faces</article-title>. <source>J. Cogn. Neurosci.</source> <volume>15</volume>, <fpage>900</fpage>&#x2013;<lpage>910</lpage>. doi: <pub-id pub-id-type="doi">10.1162/089892903322370816</pub-id>, PMID: <pub-id pub-id-type="pmid">14511542</pub-id></citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mouchetant-Rostaing</surname> <given-names>Y.</given-names></name> <name><surname>Giard</surname> <given-names>M.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Aguera</surname> <given-names>P.</given-names></name> <name><surname>Pernier</surname> <given-names>J.</given-names></name></person-group> (<year>2000</year>). <article-title>Neurophysiological correlates of face gender processing in humans</article-title>. <source>Eur. J. Neurosci.</source> <volume>12</volume>, <fpage>303</fpage>&#x2013;<lpage>310</lpage>. doi: <pub-id pub-id-type="doi">10.1046/j.1460-9568.2000.00888.x</pub-id>, PMID: <pub-id pub-id-type="pmid">10651885</pub-id></citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oostenveld</surname> <given-names>R.</given-names></name> <name><surname>Fries</surname> <given-names>P.</given-names></name> <name><surname>Maris</surname> <given-names>E.</given-names></name> <name><surname>Schoffelen</surname> <given-names>J.-M.</given-names></name></person-group> (<year>2011</year>). <article-title>FieldTrip: open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data</article-title>. <source>Comput. Intell. Neurosci.</source> <volume>2011</volume>, <fpage>1</fpage>&#x2013;<lpage>9</lpage>. doi: <pub-id pub-id-type="doi">10.1155/2011/156869</pub-id>, PMID: <pub-id pub-id-type="pmid">21253357</pub-id></citation></ref>
<ref id="ref35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pascual-Leone</surname> <given-names>A.</given-names></name> <name><surname>Hamilton</surname> <given-names>R.</given-names></name></person-group> (<year>2001</year>). <article-title>The metamodal organization of the brain</article-title>. <source>Prog. Brain Res.</source> <volume>134</volume>, <fpage>427</fpage>&#x2013;<lpage>445</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S0079-6123(01)34028-1</pub-id></citation></ref>
<ref id="ref36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pinto</surname> <given-names>S.</given-names></name> <name><surname>Tremblay</surname> <given-names>P.</given-names></name> <name><surname>Basirat</surname> <given-names>A.</given-names></name> <name><surname>Sato</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>The impact of when, what and how predictions on auditory speech perception</article-title>. <source>Exp. Brain Res.</source> <volume>237</volume>, <fpage>3143</fpage>&#x2013;<lpage>3153</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00221-019-05661-5</pub-id></citation></ref>
<ref id="ref37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Prete</surname> <given-names>G.</given-names></name> <name><surname>D&#x2019;Anselmo</surname> <given-names>A.</given-names></name> <name><surname>Tommasi</surname> <given-names>L.</given-names></name></person-group> (<year>2022</year>). <article-title>A neural signature of exposure to masked faces after 18 months of COVID-19</article-title>. <source>Neuropsychologia</source> <volume>174</volume>:<fpage>108334</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2022.108334</pub-id>, PMID: <pub-id pub-id-type="pmid">35850282</pub-id></citation></ref>
<ref id="ref38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proverbio</surname> <given-names>A. M.</given-names></name> <name><surname>Cerri</surname> <given-names>A.</given-names></name> <name><surname>Gallotta</surname> <given-names>C.</given-names></name></person-group> (<year>2023</year>). <article-title>Facemasks selectively impair the recognition of facial expressions that stimulate empathy. An ERP study</article-title>. <source>Psychophysiology</source> <volume>60</volume>:<fpage>e14280</fpage>. doi: <pub-id pub-id-type="doi">10.1111/psyp.14280</pub-id></citation></ref>
<ref id="ref39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Raftopoulos</surname> <given-names>A.</given-names></name></person-group> (<year>2023</year>). <article-title>Does the emotional modulation of visual experience entail the cognitive penetrability of early vision?</article-title> <source>Rev. Philos. Psychol.</source>, <fpage>1</fpage>&#x2013;<lpage>24</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s13164-023-00695-9</pub-id></citation></ref>
<ref id="ref41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2014</year>). <article-title>Understanding face perception by means of human electrophysiology</article-title>. <source>Trends Cogn. Sci.</source> <volume>18</volume>, <fpage>310</fpage>&#x2013;<lpage>318</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2014.02.013</pub-id>, PMID: <pub-id pub-id-type="pmid">24703600</pub-id></citation></ref>
<ref id="ref42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Gauthier</surname> <given-names>I.</given-names></name> <name><surname>Tarr</surname> <given-names>M. J.</given-names></name> <name><surname>Despland</surname> <given-names>P.</given-names></name> <name><surname>Bruyer</surname> <given-names>R.</given-names></name> <name><surname>Linotte</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2000</year>). <article-title>The N170 occipito-temporal component is delayed and enhanced to inverted faces but not to inverted objects: an electrophysiological account of face-specific processes in the human brain</article-title>. <source>Neuroreport</source> <volume>11</volume>, <fpage>69</fpage>&#x2013;<lpage>72</lpage>. doi: <pub-id pub-id-type="doi">10.1097/00001756-200001170-00014</pub-id>, PMID: <pub-id pub-id-type="pmid">10683832</pub-id></citation></ref>
<ref id="ref43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Jacques</surname> <given-names>C.</given-names></name></person-group> (<year>2008</year>). <article-title>Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170</article-title>. <source>NeuroImage</source> <volume>39</volume>, <fpage>1959</fpage>&#x2013;<lpage>1979</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.10.011</pub-id>, PMID: <pub-id pub-id-type="pmid">18055223</pub-id></citation></ref>
<ref id="ref44"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Jacques</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). &#x201C;<article-title>The N170: understanding the time course of face perception in the human brain</article-title>&#x201D; in <source>The Oxford Handbook of Event-Related Potential Components</source>, <fpage>115</fpage>&#x2013;<lpage>142</lpage>.</citation></ref>
<ref id="ref45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rousselet</surname> <given-names>G. A.</given-names></name> <name><surname>Mac&#x00E9;</surname> <given-names>M. J.-M.</given-names></name> <name><surname>Fabre-Thorpe</surname> <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>Animal and human faces in natural scenes: how specific to human faces is the N170 ERP component?</article-title> <source>J. Vis.</source> <volume>4</volume>, <fpage>13</fpage>&#x2013;<lpage>21</lpage>. doi: <pub-id pub-id-type="doi">10.1167/4.1.2</pub-id></citation></ref>
<ref id="ref46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sadeh</surname> <given-names>B.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Why is the N170 enhanced for inverted faces? An ERP competition experiment</article-title>. <source>NeuroImage</source> <volume>53</volume>, <fpage>782</fpage>&#x2013;<lpage>789</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2010.06.029</pub-id>, PMID: <pub-id pub-id-type="pmid">20558303</pub-id></citation></ref>
<ref id="ref47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schindler</surname> <given-names>S.</given-names></name> <name><surname>Bublatzky</surname> <given-names>F.</given-names></name></person-group> (<year>2020</year>). <article-title>Attention and emotion: an integrative review of emotional face processing as a function of attention</article-title>. <source>Cortex</source> <volume>130</volume>, <fpage>362</fpage>&#x2013;<lpage>386</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cortex.2020.06.010</pub-id>, PMID: <pub-id pub-id-type="pmid">32745728</pub-id></citation></ref>
<ref id="ref48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Neumann</surname> <given-names>M. F.</given-names></name></person-group> (<year>2016</year>). <article-title>Repetition effects in human ERPs to faces</article-title>. <source>Cortex</source> <volume>80</volume>, <fpage>141</fpage>&#x2013;<lpage>153</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cortex.2015.11.001</pub-id></citation></ref>
<ref id="ref49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sun</surname> <given-names>G.</given-names></name> <name><surname>Zhang</surname> <given-names>G.</given-names></name> <name><surname>Yang</surname> <given-names>Y.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Zhao</surname> <given-names>L.</given-names></name></person-group> (<year>2014</year>). <article-title>Mapping the time course of other-race face classification advantage: a cross-race ERP study</article-title>. <source>Brain Topogr.</source> <volume>27</volume>, <fpage>663</fpage>&#x2013;<lpage>671</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s10548-013-0348-0</pub-id>, PMID: <pub-id pub-id-type="pmid">24375283</pub-id></citation></ref>
<ref id="ref50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walker</surname> <given-names>P. M.</given-names></name> <name><surname>Silvert</surname> <given-names>L.</given-names></name> <name><surname>Hewstone</surname> <given-names>M.</given-names></name> <name><surname>Nobre</surname> <given-names>A. C.</given-names></name></person-group> (<year>2008</year>). <article-title>Social contact and other-race face processing in the human brain</article-title>. <source>Soc. Cogn. Affect. Neurosci.</source> <volume>3</volume>, <fpage>16</fpage>&#x2013;<lpage>25</lpage>. doi: <pub-id pub-id-type="doi">10.1093/scan/nsm035</pub-id>, PMID: <pub-id pub-id-type="pmid">19015091</pub-id></citation></ref>
<ref id="ref51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walther</surname> <given-names>C.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Kaiser</surname> <given-names>D.</given-names></name> <name><surname>Kov&#x00E1;cs</surname> <given-names>G.</given-names></name></person-group> (<year>2013</year>). <article-title>Neural correlates of priming and adaptation in familiar face perception</article-title>. <source>Cortex</source> <volume>49</volume>, <fpage>1963</fpage>&#x2013;<lpage>1977</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cortex.2012.08.012</pub-id>, PMID: <pub-id pub-id-type="pmid">23021070</pub-id></citation></ref>
<ref id="ref52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiese</surname> <given-names>H.</given-names></name></person-group> (<year>2012</year>). <article-title>The role of age and ethnic group in face recognition memory: ERP evidence from a combined own-age and own-race bias study</article-title>. <source>Biol. Psychol.</source> <volume>89</volume>, <fpage>137</fpage>&#x2013;<lpage>147</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.biopsycho.2011.10.002</pub-id>, PMID: <pub-id pub-id-type="pmid">22008365</pub-id></citation></ref>
<ref id="ref53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name> <name><surname>Hansen</surname> <given-names>K.</given-names></name></person-group> (<year>2008</year>). <article-title>The age of the beholder: ERP evidence of an own-age bias in face memory</article-title>. <source>Neuropsychologia</source> <volume>46</volume>, <fpage>2973</fpage>&#x2013;<lpage>2985</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2008.06.007</pub-id>, PMID: <pub-id pub-id-type="pmid">18602408</pub-id></citation></ref>
<ref id="ref54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Stahl</surname> <given-names>J.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2009</year>). <article-title>Configural processing of other-race faces is delayed but not decreased</article-title>. <source>Biol. Psychol.</source> <volume>81</volume>, <fpage>103</fpage>&#x2013;<lpage>109</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.biopsycho.2009.03.002</pub-id></citation></ref>
<ref id="ref55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Williams</surname> <given-names>L. M.</given-names></name> <name><surname>Palmer</surname> <given-names>D.</given-names></name> <name><surname>Liddell</surname> <given-names>B. J.</given-names></name> <name><surname>Song</surname> <given-names>L.</given-names></name> <name><surname>Gordon</surname> <given-names>E.</given-names></name></person-group> (<year>2006</year>). <article-title>The &#x2018;when&#x2019; and &#x2018;where&#x2019; of perceiving signals of threat versus non-threat</article-title>. <source>NeuroImage</source> <volume>31</volume>, <fpage>458</fpage>&#x2013;<lpage>467</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.12.009</pub-id>, PMID: <pub-id pub-id-type="pmid">16460966</pub-id></citation></ref>
<ref id="ref56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wuttke</surname> <given-names>S. J.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2019</year>). <article-title>The P200 predominantly reflects distance-to-norm in face space whereas the N250 reflects activation of identity-specific representations of known faces</article-title>. <source>Biol. Psychol.</source> <volume>140</volume>, <fpage>86</fpage>&#x2013;<lpage>95</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.biopsycho.2018.11.011</pub-id>, PMID: <pub-id pub-id-type="pmid">30529289</pub-id></citation></ref>
<ref id="ref57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yao</surname> <given-names>Q.</given-names></name> <name><surname>Zhao</surname> <given-names>L.</given-names></name></person-group> (<year>2019</year>). <article-title>Using spatial frequency scales for processing own-race and other-race faces: an ERP analysis</article-title>. <source>Neurosci. Lett.</source> <volume>705</volume>, <fpage>167</fpage>&#x2013;<lpage>171</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neulet.2019.04.059</pub-id>, PMID: <pub-id pub-id-type="pmid">31051221</pub-id></citation></ref>
<ref id="ref58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yovel</surname> <given-names>G.</given-names></name> <name><surname>Kanwisher</surname> <given-names>N.</given-names></name></person-group> (<year>2005</year>). <article-title>The neural basis of the behavioral face-inversion effect</article-title>. <source>Curr. Biol.</source> <volume>15</volume>, <fpage>2256</fpage>&#x2013;<lpage>2262</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cub.2005.10.072</pub-id>, PMID: <pub-id pub-id-type="pmid">16360687</pub-id></citation></ref>
<ref id="ref1002"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>&#x017B;ochowska</surname> <given-names>A.</given-names></name> <name><surname>Jakuszyk</surname> <given-names>P.</given-names></name> <name><surname>Nowicka</surname> <given-names>M. M.</given-names></name> <name><surname>Nowicka</surname> <given-names>A.</given-names></name></person-group> (<year>2022</year>). <article-title>Are covered faces eye-catching for us? The impact of masks on attentional processing of self and other faces during the COVID-19 pandemic</article-title>. <source>Cortex</source> <volume>149</volume>, <fpage>173</fpage>&#x2013;<lpage>187</lpage>.</citation></ref>
</ref-list>
</back>
</article>