MINI REVIEW article

Front. Hum. Neurosci., 16 February 2022
Sec. Sensory Neuroscience
Volume 15 - 2021 | https://doi.org/10.3389/fnhum.2021.781234

Visual Mismatch Negativity: A Mini-Review of Non-pathological Studies With Special Populations and Stimuli

István Czigler and Petia Kojouharova

  • Institute of Cognitive Neuroscience and Psychology, Research Centre for Natural Sciences, Budapest, Hungary

In this mini-review, we summarized the results of 12 visual mismatch negativity (vMMN) studies that attempted to use this component as a tool for investigating differences between non-clinical samples of participants, as well as the possibility of automatic discrimination of specific categories of visual stimuli. These studies investigated the effects of gender, the effects of long-term differences between groups of participants (fitness, experience in different sports, and Internet addiction), and the effects of short-term states (mental fatigue, hunger, and hypoxia), as well as the vMMN elicited by artworks as a special stimulus category.

Introduction

Stimuli that differ from the representation of expected events automatically elicit the visual mismatch negativity (vMMN)1 component of event-related potentials (ERPs), even if the events are unattended. VMMN emerges at posterior electrode locations within the post-stimulus range of 100–350 ms. This component is usually recorded in passive oddball paradigms in which the vMMN-related stimuli are not involved in the ongoing task, and it is identified as the ERP difference between rare (deviant) and frequent (standard) stimuli. VMMN is elicited by deviance in simple visual features such as color (Athanasopoulos et al., 2010) and spatial frequency (Heslenfeld, 2003), in perceptual categories [e.g., symmetrical vs. random patterns (Kecskés-Kovács et al., 2013)], in complex categories such as facial emotions (Astikainen and Hietanen, 2009), and in lexical processing (Wei et al., 2018) (for comprehensive reviews, refer to Kimura et al., 2011; Stefanics et al., 2014). As in many fields of ERP research, attempts to apply the results of basic research have concentrated on the comparison of various clinical populations (for a review, refer to Kremláček et al., 2016) and on aging effects (Sulykos et al., 2017).

The aim of this mini-review was to show how vMMN can serve as a tool for investigating differences between other, non-clinical samples of participants. Furthermore, vMMN can be a feasible tool for investigating sensitivity to specific categories of visual stimuli, which is another promising application of ERP research. In other words, the aim was to highlight the research areas in which there is emerging evidence of the usefulness of vMMN beyond the study of pathology, aging, single deviant features, sequential deviancies, and categorization per se (i.e., when not used as a means to study non-clinical samples), all of which have already been reviewed (Czigler, 2014; Stefanics et al., 2014; Kremláček et al., 2016), and beyond lexical processing (a topic that deserves a separate review). To this end, we summarized research dealing with vMMN differences between non-clinical samples, i.e., gender and expertise in particular fields, and with vMMN elicited by uncommon types of stimuli. Accordingly, we do not review articles dealing with stimulus features and categorization (as described above), language-related effects, or effects specific to the stimulation (e.g., face-specific effects; Wang et al., 2014).

Studies published within the period of 2000–2021 (i.e., a period of considerable research activity in this field) were collected from the Web of Science and Scopus databases. We used the search term "visual mismatch negativity" (exact phrase search) and then selected all studies describing non-clinical samples or specific categories of visual stimuli according to the "beyond" criteria above. An overall summary of the methods and results of the studies can be found in Table 1. A more detailed summary of the results, including effect sizes, can be found in Table 2.
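
As a concrete illustration of the measure shared by the studies reviewed below, the following minimal sketch (Python with NumPy, using simulated data rather than any study's actual recordings or analysis pipeline) derives a deviant-minus-standard difference wave for a single posterior channel and takes its mean amplitude in a typical vMMN window.

import numpy as np

rng = np.random.default_rng(0)
fs = 500                                     # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1.0 / fs)       # epoch from -100 to 500 ms

# Hypothetical single-channel (e.g., Oz) epochs in microvolts: trials x samples
deviant_epochs = rng.normal(size=(80, times.size))
standard_epochs = rng.normal(size=(480, times.size))

deviant_erp = deviant_epochs.mean(axis=0)     # average across deviant trials
standard_erp = standard_epochs.mean(axis=0)   # average across standard trials
difference_wave = deviant_erp - standard_erp  # deviant-minus-standard

# Mean amplitude in a typical posterior vMMN window (here 150-250 ms)
window = (times >= 0.150) & (times <= 0.250)
vmmn_mean_amplitude = difference_wave[window].mean()
print(f"vMMN mean amplitude: {vmmn_mean_amplitude:.2f} microvolts")

The individual studies below differ mainly in the electrodes analyzed, the time window, and whether a mean-amplitude, peak, or integral measure is taken from this difference wave.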


Table 1. A summary of the reviewed studies.


Table 2. A summary of the effect sizes for intergroup comparisons in the reviewed studies.

Gender-Related Differences in vMMN

The majority of vMMN studies include both female and male participants. Presumably, researchers in these studies would have detected any substantial gender-related differences, yet no such incidental observations have been published. However, several studies have directly targeted possible gender-related differences in vMMN to simple visual features, facial emotions, and attractiveness.

In the first direct study (Langrová et al., 2012), age-matched young women and men participated in a three-stimulus oddball design (within a sequence, there were frequent and infrequent non-target stimuli and infrequent target stimuli). The target stimuli were centrally presented sinusoidal grating patterns, whereas vMMN-related moving sinusoidal gratings were presented at the periphery. The upward movement direction was standard, and the downward movement direction was deviant. VMMN emerged within the latency range of 120–240 ms without any gender-related differences.

Yang et al. (2016) investigated duration-related vMMN in 21 female and 21 male participants. They applied a reverse control procedure in which each stimulus value served as both standard and deviant (in separate sequences); therefore, stimuli with identical physical parameters were compared. Stimulus durations were 50 and 150 ms. Stimuli were black squares presented on the two sides of a fixation cross. Participants had to attend and react to occasional size changes of the cross. VMMN was analyzed at the P3, O1, Pz, Oz, P4, and O2 locations. Gender-related vMMN differences appeared in the mean amplitude within the range of 180–260 ms: vMMN was larger overall in male participants and was asymmetric, with a larger mean amplitude over the right hemisphere.
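
The logic of this reverse (role-swapping) control can be made explicit in a short sketch; the arrays below stand for hypothetical averaged ERPs and are illustrative assumptions, not data from Yang et al. (2016).

import numpy as np

rng = np.random.default_rng(1)
n_samples = 300   # hypothetical epoch length

# Averaged ERPs (one posterior channel) from two sequences with swapped roles
erp_150ms_deviant = rng.normal(size=n_samples)   # sequence A: 150 ms stimulus rare
erp_150ms_standard = rng.normal(size=n_samples)  # sequence B: 150 ms stimulus frequent
erp_50ms_deviant = rng.normal(size=n_samples)    # sequence B: 50 ms stimulus rare
erp_50ms_standard = rng.normal(size=n_samples)   # sequence A: 50 ms stimulus frequent

# Each contrast compares physically identical stimuli across sequences, so the
# difference reflects the deviant role rather than the stimulus itself
vmmn_150ms = erp_150ms_deviant - erp_150ms_standard
vmmn_50ms = erp_50ms_deviant - erp_50ms_standard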

Many studies have indicated gender-related differences in responses to emotional stimuli, at both the behavioral and electrophysiological levels, according to which women seem to be more sensitive to various aspects of emotions (for reviews, refer to Vuilleumier and Pourtois, 2007; Whittle et al., 2011). A reasonable question is whether these differences can also be observed at a preattentive level. So far, the effect of facial emotions has been investigated in four such vMMN studies.

Xu et al. (2013) presented neutral, happy, and sad schematic faces to young female (n = 14) and male (n = 15) participants in a reverse control procedure (only happy and neutral or sad and neutral faces were presented together). A face was presented on each side of a fixation cross. The task was to react to changes of the cross. VMMN was calculated as the average activity of the potential differences in the ranges of 120–230 and 230–350 ms over parietal and occipitoparietal locations. Only the vMMNs to happy and sad faces were compared. In the range of 120–230 ms, the vMMN amplitude was larger over the right hemisphere to sad than to happy faces in the female group, while there were no emotion-related differences in the male group. In the range of 230–350 ms, sad faces elicited a larger vMMN than happy faces in both gender groups.

In a more recent study, Li Q. et al. (2018) presented photographs of happy, fearful, and neutral faces to young female and male participants (n = 19 in both groups) in a reverse control procedure (only happy and neutral or fearful and neutral faces were presented together). Four cropped photographs appeared in the four corners of an imaginary square, and participants had to detect changes in a fixation cross at the center of the screen. VMMN was analyzed at P7, P8, FCz, and Cz locations as the mean activity in the ranges of 100–200 and 250–350 ms and was compared separately for happy vs. neutral and fearful vs. neutral faces. In the happy-neutral context, a larger vMMN to both happy and neutral faces was elicited in the female group compared to the male group in both latency ranges. In the later latency range, this difference appeared in both hemispheres for happy faces but only in the right hemisphere for neutral faces. There were no gender-related differences in the fearful-neutral context in either of the latency ranges.

Zhang J. et al. (2018) investigated functional connectivity related to deviant-minus-standard ERP differences using schematic sad, happy, and neutral faces (only happy and neutral or sad and neutral faces were presented together). The participants were 16 young women and 18 young men. Deviant and standard stimuli were delivered in a reverse control arrangement. Two identical faces appeared on the two sides of a fixation cross. The participants had to detect occasional size changes of the cross. To obtain data on functional connectivity, a time-frequency analysis (delta, theta, alpha, beta, and gamma bands) was followed by the calculation of the phase lag index, which was then used to derive short and long connections. Only the happy and sad face conditions were compared. Gender-related differences in emotional processing appeared in the range of 150–250 ms, with alpha activity being greater to sad than to happy faces in the female group. The connections were stronger to both emotional stimuli in women than in men. The connections in men were confined to occipital areas and a frontal area, while in women, the network involved wider occipital, parietal, and frontal sites. In women, the number of long connections was larger than in men, both within and between hemispheres.
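
For readers unfamiliar with the connectivity measure, the phase lag index (PLI) can be computed roughly as in the following sketch; the channel pair, frequency band, and simulated data are illustrative assumptions and do not reproduce the authors' pipeline.

import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)
# Hypothetical band-pass filtered (e.g., alpha-band) signals from two channels:
# trials x samples
sig_a = rng.normal(size=(40, 500))
sig_b = rng.normal(size=(40, 500))

# Instantaneous phase via the analytic signal
phase_a = np.angle(hilbert(sig_a, axis=1))
phase_b = np.angle(hilbert(sig_b, axis=1))
phase_diff = phase_a - phase_b

# PLI: absolute mean of the sign of the (sine of the) phase differences;
# values near 1 indicate a consistently lagged (non-zero) phase relation
pli_per_trial = np.abs(np.mean(np.sign(np.sin(phase_diff)), axis=1))
pli = pli_per_trial.mean()
print(f"Phase lag index: {pli:.2f}")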

Another evolutionarily important aspect of the face is attractiveness. Zhang S. et al. (2018) investigated vMMN to attractive opposite-gender faces in women and men (photographs were preselected as attractive or less attractive portraits). In Experiment 1 (29 women, 31 men), the female participants were investigated during the menstrual phase, while in Experiment 2 (30 women, 30 men), the female participants were in the ovulation period. The authors hypothesized that the effect of male attractiveness would increase during ovulation. Instead of the traditional oddball design, a cross-modal delayed-matching procedure was applied. The participants had to discriminate the pitch of sounds and respond after the onset of an imperative click. Between the two sounds, 0–2 task-irrelevant photographs were presented. Less attractive photographs were frequent (standard), and attractive ones were rare (deviant). The amplitudes of the deviant-minus-standard potential differences were measured in the ranges of 100–240, 240–380, and 380–520 ms. VMMN was measured as the largest negativity at posterior locations. In the earlier and middle ranges, vMMN was larger in the male participants in both experiments. In Experiment 2, in the late range (380–520 ms), the difference was larger in the female group. Comparing the female participants of the two experiments, vMMN in the early and late ranges was larger for participants in the ovulation phase. Accordingly, attractiveness had a generally larger effect in men, but during the fertile period, the role of attractiveness increased in female participants. Note that the late range extends beyond the usual latency range of vMMN, and because attention control in the cross-modal delayed-response paradigm was not particularly strict, it is doubtful that the ERPs in the late range are signatures of automatic change detection.

Transient States

The three reviewed studies in this category dealt with mental fatigue, hunger, and hypoxia, respectively. Li J. et al. (2016) investigated young participants divided into "fatigue" and "leisure" groups (n = 12 in both groups). Mental fatigue was elicited by performing a continuous detection task for 2.5 h. The control ("leisure") group was free to engage in self-chosen activities. Mental fatigue and mood were assessed by questionnaires pre- and post-manipulation, and these results indicated differences between the groups. VMMN was assessed in a duration oddball task with 50 ms (standard) and 150 ms (deviant) exposure of two black squares on the two sides of a fixation point. The task was to detect occasional size changes of the fixation point. VMMN was measured as the maximum amplitude in the range of 100–300 ms at the Fz, Cz, O1, Oz, and O2 locations. The post-manipulation decrease in vMMN amplitude (relative to the pre-manipulation measurement) was larger in the "fatigue" group. The fatigue-related decrement averaged over the five electrodes was fairly large, i.e., 1.8 μV post- vs. 4.1 μV pre-manipulation. Peak latency in the "fatigue" group also increased between the two measurements.

Sultson et al. (2019) investigated the effect of hunger on vMMN. The stimuli were pictures of food (high-fat savory or high-fat sweet) and pictures of non-food items (each food stimulus had a non-food counterpart). Participants (18 women) were tested in both "hunger" and "fed" conditions. Experiments were conducted in the morning, and the participants were asked to refrain from eating for 10–12 h before the session. In the "fed" session, food was provided in the laboratory before the electroencephalography (EEG) recording. Within the oddball sequences, the standard was a non-food picture, and the deviants were two food pictures (high-fat savory or high-fat sweet, depending on the food condition) and one different non-food picture. Equal-probability control sequences (25% of each of the four stimulus types) were also administered. Four identical pictures were presented in the four corners of an imaginary square, and the task-related stimuli, i.e., items of a 2-back letter-matching task, were presented at the center. VMMN was measured at the O1, Oz, and O2 locations as the mean activity of the deviant-minus-standard and deviant-minus-control potential differences in the ranges of 100–160 and 160–220 ms. The deviant-minus-control differences did not result in a reliable vMMN. No differences between the effects of the two kinds of food deviants were found in the deviant-minus-standard differences; however, food stimuli elicited a larger vMMN in the "hunger" condition. Interestingly, one of the high-fat savory foods, i.e., the hamburger, elicited a larger effect than the rest of the food stimuli.
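
The purpose of the equal-probability control can be summarized in a minimal sketch: the deviant-minus-control contrast compares the same picture in different sequential roles, whereas the deviant-minus-standard contrast also confounds deviance with picture content. The arrays below are simulated placeholders, not data from Sultson et al. (2019).

import numpy as np

rng = np.random.default_rng(3)
n_samples = 300

erp_food_deviant = rng.normal(size=n_samples)      # food picture as oddball deviant
erp_nonfood_standard = rng.normal(size=n_samples)  # non-food picture as oddball standard
erp_food_control = rng.normal(size=n_samples)      # same food picture in the 25% control sequence

traditional_vmmn = erp_food_deviant - erp_nonfood_standard  # confounds deviance with picture content
genuine_vmmn = erp_food_deviant - erp_food_control          # same picture, different sequential role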

Blacker et al. (2021) analyzed 24 young participants under normoxia (20.4% O2) and hypoxia (10.6% O2, corresponding to ~5,300 m altitude). Participants performed a tracking task with centrally presented events, while the vMMN-related stimuli, i.e., isoluminant green-black and red-black checkerboards, appeared on two lateral displays. A reverse control procedure was applied. VMMN was measured at a set of posterior (parieto-occipital and occipital) and anterior (Fz, FCz, and Cz) locations as the peak of the potential difference2 in the range of 150–250 ms. Amplitudes were smaller (negativity at posterior and positivity at anterior locations) in hypoxia compared to normoxia. It should be noted that vMMN amplitudes, and accordingly the differences, were small, i.e., ~0.1 μV.

Enduring Dispositions

In this section, we present data on the effects of physical fitness, expertise in sports with focal vs. distributed attentional demands, and Internet addiction on vMMN. Pesonen et al. (2019) investigated 16 pairs of male twins. One member of each pair was physically more active than the other (physically active more vs. fewer than two times per week). The participants listened to an audio play while oblique bars were presented as vMMN-related stimuli. One orientation of the bars served as the standard, and the other as the deviant. ERPs were analyzed at locations near F3, F4, O1, Pz, and O2. Over the posterior locations, the peak latency of the potential difference was shorter in the physically active group. Total vMMN activation was measured as the integral of the rectified potential differences in the range of 100–300 ms; there was a tendency toward larger activation in the physically active participants.
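
The waveform-integral measure of the rectified potential difference can be approximated as below; the sampling rate and the simulated difference wave are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(4)
fs = 500                                       # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1.0 / fs)
difference_wave = rng.normal(size=times.size)  # hypothetical deviant-minus-standard wave

# Rectify the difference wave and integrate it over 100-300 ms
window = (times >= 0.100) & (times <= 0.300)
rectified = np.abs(difference_wave[window])
total_activation = rectified.sum() / fs        # approximate integral in microvolt-seconds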

Petro et al. (2021) compared vMMN in groups of target shooters and handball players (n = 20 in both groups). While shooting demands well-established focal attention, handball players have to divide attention among events within a wide visual field. Checkerboard stimuli were presented bilaterally, and the locations of the black and white squares were swapped between standard and deviant in a reverse control design. The participants performed a tracking task with moving stimuli within a central field. ERPs were measured at the Fz, P7, P8, PO3, PO4, PO7, PO8, POz, O1, O2, and Oz locations. VMMN was calculated as the mean amplitude in the range of 100–150 ms. Over the posterior electrodes, vMMN was larger in the handball player group, i.e., athletes required to process events from a wide visual field were more sensitive to peripheral events violating sequential regularities. The difference was fairly large over the occipital locations, i.e., −2.3 vs. −1.2 μV.

He et al. (2018) investigated 15 Internet addicts and 15 control participants [Internet addiction is now included in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), but it was not discussed in the review by Kremláček et al. (2016)]. In this study, the stimuli were common network logos and pictures of everyday objects, presented on the two sides of a fixation cross. The task was to react to occasional changes of the cross. Stimuli had either black or red outlines, with the red ones being the deviants. ERPs were measured at the O1, Oz, O2, PO5, and PO6 locations as the average of the range of 200–300 ms. The Internet-related red deviants elicited a larger vMMN than the everyday-object deviants in both groups, but the difference was reliable only in the group of Internet addicts.

Specific Stimuli

Menzel et al. (2018) investigated whether abstract artworks are automatically distinguished from reorganized versions of the same pictures. To this end, the elements of 20 artworks were shuffled; the elements thus remained identical, but the composition became different. The pictures were presented in the lower half of the visual field. A fixation cross was presented at the center, and the task was to react to its occasional changes. Seventeen young participants were investigated in the study. After the EEG session, the participants rated the pictures on the dimensions of harmony, orderliness, and likability. VMMN was identified when the amplitude of the potential differences was different from zero for at least 20 consecutive time points (40 ms) and over at least two adjacent electrode locations. Deviant-related differences appeared to both the original and the shuffled versions, but the deviant originals elicited positive differences, while the shuffled versions elicited negative differences. Such differences appeared in various ranges between 146 and 871 ms, but independently of polarity, they were concentrated in the range of 220–300 ms at parietal and parieto-occipital locations. Although the two picture versions consisted of identical elements, the participants rated the originals as more harmonious and ordered.
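
The consecutive-significance criterion described above can be sketched for a single electrode as follows; the point-wise one-sample t-test, the alpha level, and the simulated data are illustrative assumptions rather than the authors' exact procedure.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical difference waves for one electrode: participants x samples (500 Hz assumed)
diff_waves = rng.normal(size=(17, 300))

# Point-wise one-sample t-test against zero across participants
t_vals, p_vals = stats.ttest_1samp(diff_waves, popmean=0.0, axis=0)
significant = p_vals < 0.05

def longest_run(mask):
    """Length of the longest run of consecutive True values."""
    best = current = 0
    for flag in mask:
        current = current + 1 if flag else 0
        best = max(best, current)
    return best

# 20 consecutive significant samples correspond to 40 ms at a 500 Hz sampling rate
meets_criterion = longest_run(significant) >= 20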

Discussion

We aimed to review studies that used vMMN in non-clinical populations, applied vMMN under specific circumstances, or investigated the processing of specific visual stimuli. In this way, we attempted to demonstrate the wide range of applications of this ERP component. VMMN seems to be sensitive enough to reveal gender differences as well as both long-term and temporary differences between non-clinical samples of participants. However, due to publication bias, it is unknown how many studies without reliable differences have remained unpublished.

As the results of the reviewed studies show, women are more sensitive to emotional events, even if these events are outside the focus of attention. This is in agreement with the results of studies using various methods (e.g., Vuilleumier and Pourtois, 2007; Whittle et al., 2011). It is an open question whether the gender difference is confined to negative (sad; Xu et al., 2013) or positive (happy; Li Q. et al., 2018) emotions. The greater sensitivity is also indicated by the involvement of more widespread brain activity and a larger number of long connections (Zhang J. et al., 2018).

Regarding the attractiveness of opposite-gender faces, Zhang S. et al. (2018) found greater sensitivity in men for the samples of stimuli and participants they investigated. However, the sensitivity of female participants to attractiveness increased during the fertile period.

A considerable vMMN amplitude reduction appeared in the case of mental fatigue (Li J. et al., 2016); accordingly, vMMN seems to be a promising indicator in this field. Hunger increased sensitivity to food-related deviant stimuli (Sultson et al., 2019), but the saliency of particular foods (the hamburger in that study), rather than their nutritional quality, elicited the largest effect. In the case of hypoxia, vMMN amplitude decreased (Blacker et al., 2021), but the difference was small. Regular physical activity had no effect on vMMN amplitude (Pesonen et al., 2019), but it reduced vMMN latency. Athletes from sports that demand the processing of a wider visual field were more sensitive to changes in the periphery than athletes from sports with a strong focal attention demand (Petro et al., 2021); it is unknown whether this is an effect of practice or part of trait differences. Finally, as Menzel et al. (2018) pointed out, vMMN can be a useful tool even in experimental aesthetics.

On a methodological level, the majority of studies presented the vMMN-related stimuli at two or four locations around the center of the visual field. In the most popular task, the participants had to detect changes in the fixation point. In principle, such changes can be introduced at any moment within the sequence; however, in the reviewed studies, the task-related events appeared only during the interstimulus interval. As we argued earlier (Czigler, 2007), in such an arrangement, participants can discover that there are no task-related changes during the presentation of the vMMN-related stimuli; therefore, from time to time, they can observe the task-irrelevant events. Only a few studies applied continuous tasks such as tracking (Blacker et al., 2021; Petro et al., 2021) or a 2-back task (Sultson et al., 2019). What is really needed is research comparing vMMN to identical deviants in sequences with various task-related events.

Finally, what is the desirable progress in the field? In all of the reviewed studies, the differences appeared between samples or between conditions at the group level. The next step is to develop methods appropriate for applied research. Fatigue, hypoxia, and sensitivity to peripheral stimuli are fields that require methods for the objective assessment of individual differences, and vMMN seems to be a promising tool for this purpose.

Author Contributions

IC and PK drafted and wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was financially supported by the OTKA-K funding scheme (No. 119587) provided by the Ministry of Innovation and Technology of Hungary through the National Research, Development, and Innovation Office.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1. ^VMMN is the term found in the relevant literature. However, in several studies, the component has positive polarity (Amado and Kovács, 2016).

2. ^The paper states the difference as standard-minus-deviant, but the authors corrected the text on p. 329, and they subtracted the ERPs to the standard from the ERPs to the deviant.

References

Amado, C., and Kovács, G. (2016). Does surprise enhancement or repetition suppression explain visual mismatch negativity? Eur. J. Neurosci. 43, 1590–1600. doi: 10.1111/ejn.13263

Astikainen, P., and Hietanen, J. K. (2009). Event-related potentials to task-irrelevant changes in facial expressions. Behav. Brain Funct. 5:30. doi: 10.1186/1744-9081-5-30

Athanasopoulos, P., Dering, B., Wiggett, A., Kuipers, J.-R., and Thierry, G. (2010). Perceptual shift in bilingualism: Brain potentials reveal plasticity in pre-attentive colour perception. Cognition 116, 437–443. doi: 10.1016/j.cognition.2010.05.016

Blacker, K. J., Seech, T. R., Funke, M. E., and Kinney, M. J. (2021). Deficits in visual processing during hypoxia as evidenced by visual mismatch negativity. Aerosp. Med. Hum. Perf. 92, 326–332. doi: 10.3357/AMHP.5735.2021

Czigler, I. (2007). Visual mismatch negativity: violation of nonattended environmental regularities. J. Psychophysiol. 21, 224–230. doi: 10.1027/0269-8803.21.34.224

Czigler, I. (2014). Visual mismatch negativity and categorization. Brain Topogr. 27, 590–598. doi: 10.1007/s10548-013-0316-8

He, J., Zheng, Y., Nie, Y., and Zhou, Z. (2018). Automatic detection advantage of network information among Internet addicts: behavioral and ERP evidence. Sci. Rep. 8:1289. doi: 10.1038/s41598-018-25442-4

Heslenfeld, D. J. (2003). “Visual mismatch negativity,” in Detection of Change: Event-related Potential and fMRI Findings, ed J. Polich (Boston, MA: Springer). doi: 10.1007/978-1-4615-0294-4_3

Kecskés-Kovács, K., Sulykos, I., and Czigler, I. (2013). Visual mismatch negativity is sensitive to symmetry as a perceptual category. Eur. J. Neurosci. 37, 662–667. doi: 10.1111/ejn.12061

Kimura, M., Schröger, E., and Czigler, I. (2011). Visual mismatch negativity and its importance in visual cognitive sciences. Neuroreport 22, 669–673. doi: 10.1097/WNR.0b013e32834973ba

Kremláček, J., Kreegipuu, K., Tales, A., Astikainen, P., Poldver, N., Näätänen, R., et al. (2016). Visual mismatch negativity (vMMN): A review and meta-analysis of studies in psychiatric and neurological disorders. Cortex 80, 76–112. doi: 10.1016/j.cortex.2016.03.017

Langrová, J., Kremláček, J., Kuba, M., Kubová, Z., and Szanyi, J. (2012). Gender impact on electrophysiological activity of the brain. Physiol. Res. 61(suppl. 2), S119–S127. doi: 10.33549/physiolres.932421

Li, J., Song, G., and Miao, D. (2016). Effect of mental fatigue on nonattention: a visual mismatch negativity study. Neuroreport 27, 1323–1330. doi: 10.1097/WNR.0000000000000694

Li, Q., Zhou, S., Zheng, Y., and Liu, X. (2018). Female advantage in automatic change detection of facial expressions during a happy-neutral context: an ERP study. Front. Hum. Neurosci. 12:146. doi: 10.3389/fnhum.2018.00146

Menzel, C., Kovács, G., Amado, C., Hayn-Leichsenring, G. U., and Redies, C. (2018). Visual mismatch negativity indicates automatic, task-independent detection of artistic image composition in abstract artworks. Biol. Psychol. 136, 76–86. doi: 10.1016/j.biopsycho.2018.05.005

Pesonen, H., Savić, A. M., Kujala, U. M., and Tarkka, I. M. (2019). Long-term physical activity modifies automatic visual processing. Int. J. Sport. Exerc. Psychol. 17, 275–284. doi: 10.1080/1612197X.2017.1321031

Petro, B., Lénárt, Á., Gaál, Z. A., Kojouharova, P., Kökény, T., Ökrös, C., et al. (2021). Automatic detection of peripheral stimuli in shooters and handball players: an event-related potential study. Exp. Brain Res. 239, 1531–1538. doi: 10.1007/s00221-021-06071-2

Stefanics, G., Kremláček, J., and Czigler, I. (2014). Visual mismatch negativity: a predictive coding view. Front. Hum. Neurosci. 8:666. doi: 10.3389/fnhum.2014.00666

Sultson, H., Vainik, U., and Kreegipuu, K. (2019). Hunger enhances automatic processing of food and non-food stimuli: A visual mismatch negativity study. Appetite 133, 324–336. doi: 10.1016/j.appet.2018.11.031

Sulykos, I., Gaál, Z. A., and Czigler, I. (2017). Visual mismatch negativity to vanishing parts of objects in younger and older adults. PLoS ONE 12:e0188929. doi: 10.1371/journal.pone.0188929

Vuilleumier, P., and Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia 45, 174–194. doi: 10.1016/j.neuropsychologia.2006.06.003

Wang, W., Miao, D., and Zhao, L. (2014). Automatic detection of orientation changes of faces versus non-face objects: a visual MMN study. Biol. Psychol. 100, 71–78. doi: 10.1016/j.biopsycho.2014.05.004

Wei, R., Dowens, M. G., and Guo, T. (2018). Early lexical processing of Chinese words indexed by Visual Mismatch Negativity effects. Sci. Rep. 8:1289. doi: 10.1038/s41598-018-19394-y

Whittle, S., Yücel, M., Yap, M. B., and Allen, N. B. (2011). Sex differences in the neural correlates of emotion: evidence from neuroimaging. Biol. Psychol. 87, 319–333. doi: 10.1016/j.biopsycho.2011.05.003

Xu, Q., Yang, Y., Wang, P., Sun, G., and Zhao, L. (2013). Gender differences in preattentive processing of facial expressions: an ERP study. Brain Topogr. 26, 488–500. doi: 10.1007/s10548-013-0275-0

Yang, X., Yu, Y., Chen, L., Sun, H., Qiao, Z., Qiu, X., et al. (2016). Gender differences in pre-attentive change detection for visual but not auditory stimuli. Clin. Neurophysiol. 127, 431–441. doi: 10.1016/j.clinph.2015.05.013

Zhang, J., Dong, X., Wang, L., Zhao, L., Weng, Z., Zhang, T., et al. (2018). Gender differences in global functional connectivity during facial emotion processing: A visual MMN study. Front. Behav. Neurosci. 12:220. doi: 10.3389/fnbeh.2018.00220

Zhang, S., Wang, H., and Guo, Q. (2018). Sex and physiological cycles affect the automatic perception of attractive opposite-sex faces: a visual mismatch negativity study. Evol. Psychol. 16, 1–18. doi: 10.1177/1474704918812140

Keywords: visual mismatch negativity (vMMN), gender, mental fatigue, hypoxia, Internet addiction, physical exercise, fine arts

Citation: Czigler I and Kojouharova P (2022) Visual Mismatch Negativity: A Mini-Review of Non-pathological Studies With Special Populations and Stimuli. Front. Hum. Neurosci. 15:781234. doi: 10.3389/fnhum.2021.781234

Received: 22 September 2021; Accepted: 21 December 2021;
Published: 16 February 2022.

Edited by:

Vasil Kolev, Bulgarian Academy of Sciences (BAS), Bulgaria

Reviewed by:

Jan Kremláček, Charles University, Czechia
Dawei Wei, Xi'an Jiaotong-Liverpool University, China

Copyright © 2022 Czigler and Kojouharova. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Petia Kojouharova, kojouharova.petia@ttk.hu
