<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neuroergon.</journal-id>
<journal-title>Frontiers in Neuroergonomics</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neuroergon.</abbrev-journal-title>
<issn pub-type="epub">2673-6195</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnrgo.2024.1341790</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroergonomics</subject>
<subj-group>
<subject>Mini Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Ecological decoding of visual aesthetic preference with oscillatory electroencephalogram features&#x02014;A mini-review</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Welter</surname> <given-names>Marc</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/2541424/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/data-curation/"/>
<role content-type="https://credit.niso.org/contributor-roles/investigation/"/>
<role content-type="https://credit.niso.org/contributor-roles/methodology/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Lotte</surname> <given-names>Fabien</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/12838/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/funding-acquisition/"/>
<role content-type="https://credit.niso.org/contributor-roles/methodology/"/>
<role content-type="https://credit.niso.org/contributor-roles/project-administration/"/>
<role content-type="https://credit.niso.org/contributor-roles/supervision/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
</contrib-group>
<aff><institution>Inria Center at the University of Bordeaux/LaBRI</institution>, <addr-line>Talence</addr-line>, <country>France</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: David Perpetuini, University of Studies G. d&#x00027;Annunzio Chieti and Pescara, Italy</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Federica Gioia, University of Pisa, Italy</p>
<p>Sergio Rinella, University of Catania, Italy</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Marc Welter <email>marc.welter&#x00040;inria.fr</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>21</day>
<month>02</month>
<year>2024</year>
</pub-date>
<pub-date pub-type="collection">
<year>2024</year>
</pub-date>
<volume>5</volume>
<elocation-id>1341790</elocation-id>
<history>
<date date-type="received">
<day>20</day>
<month>11</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>19</day>
<month>01</month>
<year>2024</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2024 Welter and Lotte.</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Welter and Lotte</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>In today&#x00027;s digital information age, human exposure to visual artifacts has reached an unprecedented quasi-omnipresence. Some of these cultural artifacts are elevated to the status of artworks which indicates a special appreciation of these objects. For many persons, the perception of such artworks coincides with aesthetic experiences (AE) that can positively affect health and wellbeing. AEs are composed of complex cognitive and affective mental and physiological states. More profound scientific understanding of the neural dynamics behind AEs would allow the development of passive Brain-Computer-Interfaces (BCI) that offer personalized art presentation to improve AE without the necessity of explicit user feedback. However, previous empirical research in visual neuroaesthetics predominantly investigated functional Magnetic Resonance Imaging and Event-Related-Potentials correlates of AE in unnaturalistic laboratory conditions which might not be the best features for practical neuroaesthetic BCIs. Furthermore, AE has, until recently, largely been framed as the experience of beauty or pleasantness. Yet, these concepts do not encompass all types of AE. Thus, the scope of these concepts is too narrow to allow personalized and optimal art experience across individuals and cultures. This narrative mini-review summarizes the state-of-the-art in oscillatory Electroencephalography (EEG) based visual neuroaesthetics and paints a road map toward the development of ecologically valid neuroaesthetic passive BCI systems that could optimize AEs, as well as their beneficial consequences. We detail reported oscillatory EEG correlates of AEs, as well as machine learning approaches to classify AE. We also highlight current limitations in neuroaesthetics and suggest future directions to improve EEG decoding of AE.</p></abstract>
<kwd-group>
<kwd>Electroencephalography (EEG)</kwd>
<kwd>brain-computer-interfaces</kwd>
<kwd>neuroaesthetics</kwd>
<kwd>aesthetic preference</kwd>
<kwd>oscillatory activity</kwd>
</kwd-group>
<counts>
<fig-count count="0"/>
<table-count count="1"/>
<equation-count count="0"/>
<ref-count count="76"/>
<page-count count="7"/>
<word-count count="6155"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Cognitive Neuroergonomics</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>1 Introduction</title>
<p>Modern humans live in environments in which we are almost constantly confronted with artifacts. Many of these artifacts primarily serve instrumental purposes, i.e., they are designed to fulfill specific practical goals. Although the design of many of these artifacts also takes aesthetic considerations into account, a subset of artifacts is especially appreciated for their aesthetic qualities and designated with the label &#x0201C;art&#x0201D; or &#x0201C;artwork&#x0201D;. The field of neuroaesthetics studies the neural correlates of aesthetic experience (AE) (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>). While some of this research looks at aesthetics through the lens of economics and aims to discover sensory object features that increase economic interest such as product sales (Costa-Feito et al., <xref ref-type="bibr" rid="B16">2019</xref>), an increasing number of studies report positive correlations between AEs and health and wellbeing (e.g., Fancourt and Finn, <xref ref-type="bibr" rid="B21">2019</xref>; see Skov and Nadal, <xref ref-type="bibr" rid="B67">2023</xref>, for a critique of such reports). With the internet and social media, many humans now have unprecedented access to artworks. The abundance of available digital art, together with advances in machine learning, could allow for the personalization of AE with brain-computer-interfaces (BCI), and thus enhance a kind of user experience that might positively affect human existence. BCIs are systems that allow direct communication between computers and brain signals (Vidal, <xref ref-type="bibr" rid="B74">1973</xref>). For example, BCIs that decode correlates of AE could be used in combination with recommender systems that curate personalized art exhibitions in virtual museums in order to optimize visitor satisfaction.
Although the personalization of AE based on behavioral data already exists in digital spaces such as social media, such personalization algorithms often require explicit user feedback, which can interrupt the AE and thereby diminish user experience.</p>
<p>In order to alleviate this current limitation of AE personalization, the development of human-machine-interfaces that implicitly measure individual AE in real time offers a promising lead. One type of human-machine-interface that seems particularly suited to this endeavor are passive non-invasive BCIs such as those based on Electroencephalography (EEG). Previous neuroimaging research on AEs focused mostly on correlating beauty/pleasantness ratings with functional Magnetic Resonance Imaging (fMRI) signals (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>) and EEG Event-Related-Potentials (ERPs) (Jacobsen and Klein, <xref ref-type="bibr" rid="B31">2022</xref>) in non-ecological laboratory conditions. While fMRI cannot be used in ecological conditions, EEG is used in many environments, even real museums (King and Parada, <xref ref-type="bibr" rid="B37">2021</xref>). However, aesthetic preference does not necessarily arise time-locked to stimulus onset and might develop over longer timescales (Carbon, <xref ref-type="bibr" rid="B9">2023</xref>). Thus, in addition to ERPs, EEG oscillatory features can contain information about aesthetic preference (Strijbosch et al., <xref ref-type="bibr" rid="B68">2022</xref>). Therefore, EEG oscillatory features could be promising for aesthetic passive BCIs in ecological conditions outside the lab. However, the neuro-cognitive mechanisms of art experience in general, and art preference in particular, remain unclear, and more empirical research on neuro-aesthetics is required. This article summarizes the state-of-the-art in aesthetic preference decoding based on oscillatory EEG features and develops a road map toward the development of ecologically valid passive EEG-based BCI systems that could personalize user AE.</p></sec>
<sec id="s2">
<title>2 Aesthetic experience and art preference</title>
<p>AE can be defined as &#x0201C;a perceptual experience that is evaluative, affectively absorbing and engages comprehension (meaning) processes&#x0201D; (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>). This definition corresponds to the Aesthetic Triad Model (Chatterjee and Vartanian, <xref ref-type="bibr" rid="B11">2014</xref>) which, based mostly on experimental aesthetics fMRI research, considers AE as emerging from the interaction of three brain systems: sensory-motor, emotion-valuation, and knowledge-meaning. Similarly, Schaeffer (<xref ref-type="bibr" rid="B58">2015</xref>) postulates three main components of AE: attention, affect, and reward. To be precise, Schaeffer uses the French word &#x0201C;emotion&#x0201D; instead of &#x0201C;affect,&#x0201D; but following the definitory framework proposed by Schiller et al. (<xref ref-type="bibr" rid="B59">2023</xref>), we use the more general term &#x0201C;affect,&#x0201D; composed of valence, arousal, and motivation. In this framework, &#x0201C;emotions&#x0201D; are considered a subset of conscious affects. Although AEs are often related to the perception of artworks, the definitions above remain neutral with regard to the categorical status of the perceived objects, and it seems plausible that non-art objects, e.g., natural landscapes, can evoke AEs similar to art-related AEs. Nonetheless, the scope of this mini-review is constrained to the decoding of AEs during art watching, which seems the most practical condition for improving AE with a BCI.</p>
<p>It seems evident that attentional information regulation is needed for any kind of experience. And indeed, works of art often invite attention. However, attentional EEG markers alone might not be ideal or sufficient features for AE decoding. Both negatively and positively valenced stimuli draw high amounts of attention due to the evolutionary significance of searching rewards and avoiding threats (Karim et al., <xref ref-type="bibr" rid="B35">2017</xref>). Furthermore, both internally and externally oriented attentional mechanisms are involved in the processing of visual AE (Ansorge et al., <xref ref-type="bibr" rid="B1">2022</xref>).</p>
<p>AEs cover a vast spectrum of different affective experiences (Menninghaus et al., <xref ref-type="bibr" rid="B45">2018</xref>) which makes it very difficult to train a machine learning model to discriminate between all different classes of AE (M&#x000FC;hl et al., <xref ref-type="bibr" rid="B46">2014</xref>). However, as we are primarily concerned with improving AE of art watching, we can simplify the problem to decoding and ranking aesthetic preference for various art stimuli.</p>
<p>Many neuroaesthetic fMRI studies reported activation of reward and pleasure processing areas in the brain during aesthetic experiences (Chuan-Peng et al., <xref ref-type="bibr" rid="B14">2019</xref>). In the aesthetic literature, &#x0201C;pleasantness&#x0201D; often refers to objective stimulus features related to beauty (e.g., harmony, symmetry) (e.g., Babiloni et al., <xref ref-type="bibr" rid="B2">2013</xref>). We, on the other hand, define &#x0201C;pleasantness&#x0201D; as related to an activation of hedonic brain systems that might be independent of objective stimulus properties. As Berridge and Kringelbach (<xref ref-type="bibr" rid="B5">2015</xref>) show, the reward system of the human brain consists of two major independent pathways, one related to pleasure and &#x0201C;liking&#x0201D; generated by opioids and endocannabinoids, and another related to motivation and &#x0201C;wanting&#x0201D; produced by dopamine. Furthermore, activation of these reward circuits may not lead to the conscious experience of &#x0201C;liking&#x0201D; or &#x0201C;wanting&#x0201D; (Berridge and Winkielman, <xref ref-type="bibr" rid="B6">2010</xref>). These types of experiences are difficult to decode, because training labels come from subjective aesthetic ratings. Therefore, preferring one aesthetic stimulus over another is not reducible to feeling a higher amount of pleasure while perceiving this stimulus compared to the other. Nor is beautiful art always liked more than non-beautiful art (Muth et al., <xref ref-type="bibr" rid="B47">2020</xref>). The independence of these two reward systems has led scientists in the tradition of Kantian aesthetics (Kant, <xref ref-type="bibr" rid="B34">1983</xref>) to affirm the nature of aesthetic appreciation as inherently disinterested (e.g., Sarasso et al., <xref ref-type="bibr" rid="B57">2020</xref>).
Indeed, empirical data suggest that aesthetic pleasure need not be correlated with extrinsic motivations such as the desire to own or control the appreciated aesthetic object (Chatterjee and Vartanian, <xref ref-type="bibr" rid="B12">2016</xref>). However, it does not follow that aesthetic appreciation is essentially disinterested: even though the aesthetic object might not be perceived in an instrumental fashion to fulfill an external goal, beholders might be intrinsically motivated, and therefore interested, to interact with an aesthetic object, because they will experience intrinsic rewards through this interaction. It has been shown that motor activity can be suppressed during AEs such as the perception of beauty (Kawabata and Zeki, <xref ref-type="bibr" rid="B36">2004</xref>). When Sarasso et al. (<xref ref-type="bibr" rid="B57">2020</xref>) interpret this phenomenon as evidence for aesthetic disinterest, they miss that movement is often a constitutive part of AE and that moments of stillness can lead to further motor interaction with an artwork (K&#x000FC;hnapfel et al., <xref ref-type="bibr" rid="B40">2023</xref>). Because AEs are intrinsically rewarding, they motivate beholders to prolong the AE and to search for more AEs in the future (Reeves, <xref ref-type="bibr" rid="B53">1989</xref>). Still, the exact role of rewards in AE remains controversial. Some scientists claim that aesthetic appreciation can be reduced to the valuation of sensory objects (Skov and Nadal, <xref ref-type="bibr" rid="B66">2020</xref>), whereas others disagree with this reduction (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>). Nonetheless, we hypothesize that EEG correlates related to reward processing should be discriminative for aesthetic preferences.</p></sec>
<sec id="s3">
<title>3 Oscillatory EEG correlates of visual aesthetic preference</title>
<p>As many Machine Learning classification algorithms require hand-crafted features, the following section will describe oscillatory EEG correlates of visual aesthetic preference that could constitute informative features. We conducted a literature review by searching public databases, as well as references in the neuroaesthetic literature. The search query was: &#x0201C;&#x0002B;<italic>aesthetic</italic>&#x0002A;|<italic>art</italic>&#x0002A;|<italic>paint</italic>&#x0002A;&#x0002B;<italic>EEG</italic>|<italic>brain</italic>|<italic>neur</italic>&#x0002A;&#x0002B;<italic>oscillat</italic>&#x0002A;|<italic>wav</italic>&#x0002A;|<italic>frequen</italic>&#x0002A;|<italic>rhythm</italic>.&#x0201D; We only included studies reporting oscillatory EEG correlates of AE for static visual art stimuli in naturalistic conditions. These findings are discussed as potential correlates of attention, affect and reward during AE (see <xref ref-type="table" rid="T1">Table 1</xref> for a summary).</p>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>This table summarizes EEG rhythm modulations correlated with attentional, affective, and reward components of AE.</p></caption>
<table frame="box" rules="all">
<thead>
<tr style="background-color:#919498;color:#ffffff">
<th valign="top" align="left"><bold>EEG markers</bold></th>
<th valign="top" align="left"><bold>Attention</bold></th>
<th valign="top" align="left"><bold>Affect</bold></th>
<th valign="top" align="left"><bold>Reward</bold></th>
<th valign="top" align="left"><bold>Modulation</bold></th>
<th valign="top" align="left"><bold>Mental correlates</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Frontal alpha asymmetry</td>
<td/>
<td valign="top" align="left">x</td>
<td valign="top" align="left">x</td>
<td valign="top" align="left">&#x02191;</td>
<td valign="top" align="left">Motivation (Wacker et al., <xref ref-type="bibr" rid="B75">2013</xref>) Pleasure (Babiloni et al., <xref ref-type="bibr" rid="B2">2013</xref>)</td>
</tr> <tr>
<td valign="top" align="left">Occipital/parietal alpha</td>
<td valign="top" align="left">x</td>
<td/>
<td/>
<td valign="top" align="left">&#x02193;</td>
<td valign="top" align="left">Attention (Peylo et al., <xref ref-type="bibr" rid="B50">2021</xref>) DMN (Vessel et al., <xref ref-type="bibr" rid="B72">2012</xref>) Pleasure (Rawls et al., <xref ref-type="bibr" rid="B52">2021</xref>)</td>
</tr> <tr>
<td valign="top" align="left">Frontal beta</td>
<td valign="top" align="left">x</td>
<td valign="top" align="left">x</td>
<td/>
<td valign="top" align="left">&#x02193;</td>
<td valign="top" align="left">Engagement (Herrera-Arcos et al., <xref ref-type="bibr" rid="B28">2017</xref>) Empathy (Schubring and Schupp, <xref ref-type="bibr" rid="B61">2019</xref>)</td>
</tr>
<tr>
<td valign="top" align="left">Centroparietal gamma</td>
<td valign="top" align="left">x</td>
<td valign="top" align="left">x</td>
<td valign="top" align="left">x</td>
<td valign="top" align="left">&#x02191;</td>
<td valign="top" align="left">Savoring (Strijbosch et al., <xref ref-type="bibr" rid="B68">2022</xref>)</td>
</tr></tbody>
</table>
<table-wrap-foot>
<p>&#x0201C;x&#x0201D; denotes hypothesized links between EEG markers and components of AE, whereas &#x0201C;&#x02191;&#x0201D; and &#x0201C;&#x02193;&#x0201D; correspond to the directions of oscillatory modulations, i.e., &#x0201C;&#x02191;&#x0201D; indicates that this EEG marker&#x00027;s value increases for preferred artworks. The last column lists possible mental correlates related to these modulations.</p>
</table-wrap-foot>
</table-wrap>
<sec>
<title>3.1 Attention</title>
<p>Although correlates of visual attention are widely studied with EEG and harnessed in BCIs (Nam et al., <xref ref-type="bibr" rid="B48">2018</xref>), to our knowledge only Rawls et al. (<xref ref-type="bibr" rid="B52">2021</xref>) report rhythm modulations commonly associated with visual attention that were informative of art preference. Instructing 44 subjects to give preference ratings to binarized Jackson Pollock paintings and computer-generated Cantor fractals after 4 s of viewing time, they found a suppression of the parietal alpha rhythm correlated with art preference. Parietal alpha modulations have been implicated in attentional processing (Peylo et al., <xref ref-type="bibr" rid="B50">2021</xref>) and in the Default Mode Network (DMN), which has been linked to aesthetically moving art watching (Vessel et al., <xref ref-type="bibr" rid="B72">2012</xref>, <xref ref-type="bibr" rid="B73">2013</xref>). However, the authors noted that this suppression might be related to stimulus properties such as visual complexity and not necessarily to preference. Another brain rhythm associated with the DMN, the theta rhythm, was investigated by Strijbosch et al. (<xref ref-type="bibr" rid="B68">2022</xref>) during aesthetically moving art watching, but they found no relation between theta modulations and AE.</p>
</sec>
<sec>
<title>3.2 Affect</title>
<p>Many of the brain areas involved in affective processing are located subcortically, which makes it difficult to measure their activity with EEG (M&#x000FC;hl et al., <xref ref-type="bibr" rid="B46">2014</xref>). Nonetheless, beta suppression has been reported during empathy and affective processing in general (Schubring and Schupp, <xref ref-type="bibr" rid="B61">2019</xref>). Herrera-Arcos et al. (<xref ref-type="bibr" rid="B28">2017</xref>) conducted a mobile EEG study (&#x00023;subjects = 25) during an Otto Dix exhibition in a real museum with a commercial Muse EEG headset. Their analyses show a correlation between beta suppression and artwork preference, which the authors interpreted as related to emotional engagement. However, it is also possible that the measured beta suppression was generated by motor activity, especially under mobile recording conditions (Pope et al., <xref ref-type="bibr" rid="B51">2022</xref>). Still, motor activity could potentially be related to affective processing and informative of aesthetic preference (K&#x000FC;hnapfel et al., <xref ref-type="bibr" rid="B40">2023</xref>).</p>
</sec>
<sec>
<title>3.3 Reward</title>
<p>fMRI scans during AEs often report activation of brain areas involved in reward processing (e.g., Kawabata and Zeki, <xref ref-type="bibr" rid="B36">2004</xref>). Unfortunately, similarly to affective processing, many reward-related areas in the brain are also subcortical and hard to measure. Still, reward-related signals in the frontal cortex could be measurable by frontal EEG electrodes, a hypothesis strengthened by a multitude of experimental reports of a relation between frontal alpha asymmetry (FAA) in the EEG and reward processing across different reward modalities (see Sabu et al., <xref ref-type="bibr" rid="B56">2022</xref>, for a review). Although FAA has been linked to pleasure and liking, some suggest that it is generated by motivational dopamine release (Wacker et al., <xref ref-type="bibr" rid="B75">2013</xref>). Babiloni et al. (<xref ref-type="bibr" rid="B2">2013</xref>) collected mobile EEG data (&#x00023;subjects = 25) during a Dutch Golden Age exhibition in a museum and found a correlation between FAA and beauty/pleasantness ratings. Babiloni et al. (<xref ref-type="bibr" rid="B3">2015</xref>) reproduced this correlation during a Titian exhibition (&#x00023;subjects = 27). Cheung et al. (<xref ref-type="bibr" rid="B13">2019</xref>) also reported a link between FAA and beauty ratings (&#x00023;subjects = 20) for Western art displayed on a screen. Therefore, FAA could be a good marker for an AE that motivates beholders to prolong it or to reproduce it in the future. However, one has to keep in mind that FAA is related to motivational processing in general and not limited to a particular valence (Lacey and Gable, <xref ref-type="bibr" rid="B41">2022</xref>). Thus, artworks that evoke anger or other negatively valenced, approach-motivation-related experiences could also generate FAA.</p>
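To make the FAA feature concrete, it is conventionally computed as the difference of log-transformed alpha-band power between homologous right and left frontal electrodes. The following minimal sketch uses standard Python scientific libraries; the electrode pairing (F3/F4), sampling rate, and band limits are illustrative assumptions, not parameters taken from the reviewed studies.

```python
import numpy as np
from scipy.signal import welch

def frontal_alpha_asymmetry(left, right, fs=250.0, band=(8.0, 13.0)):
    """FAA = ln(right alpha power) - ln(left alpha power).

    `left` and `right` are 1-D EEG time series from homologous frontal
    electrodes (e.g., F3 and F4); electrode choice, sampling rate and
    band limits are illustrative assumptions."""
    def band_power(signal):
        # Welch power spectral density, then integrate over the alpha band
        freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum()
    return np.log(band_power(right)) - np.log(band_power(left))
```

A positive value indicates relatively more right-hemispheric alpha power, the direction that the cited FAA studies associate with approach motivation and liking.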
</sec>
<sec>
<title>3.4 Mixed component</title>
<p>Strijbosch et al. (<xref ref-type="bibr" rid="B68">2022</xref>) studied the neural dynamics during aesthetically moving experiences with EEG. Showing 35 participants a wide range of diverse artworks for 6 s, they reported a gamma increase from 1 s after stimulus presentation until the end of the trial. This was interpreted as a correlate of savoring, a process of up-regulating positive affect by sustaining attention on an experience. Similar gamma modulations were found with regard to other types of positively affecting AE, such as enjoying the taste of chocolate (Berk et al., <xref ref-type="bibr" rid="B4">2016</xref>; Silver et al., <xref ref-type="bibr" rid="B64">2018</xref>). EEG gamma rhythms are often contaminated by muscle artifacts and are, therefore, commonly filtered out in EEG analysis (Jeunet et al., <xref ref-type="bibr" rid="B33">2018</xref>). And although Strijbosch et al. (<xref ref-type="bibr" rid="B68">2022</xref>) followed a rigorous artifact removal protocol, muscle contamination remains possible.</p>
<p>Now that we have reviewed potential oscillatory EEG markers of aesthetic preference, we will discuss their use as features for machine learning classification of AE.</p></sec>
</sec>
<sec id="s4">
<title>4 Classification of aesthetic preference based on oscillatory EEG markers</title>
<p>To our knowledge, almost no publications report aesthetic preference classification results based on EEG signals. Fraiwan et al. (<xref ref-type="bibr" rid="B22">2023</xref>) reported an aesthetic enjoyment classification accuracy above 98% using a deep neural network and Multiscale Entropy features on a mobile data set from Cruz-Garza et al. (<xref ref-type="bibr" rid="B17">2017</xref>) (&#x00023;subjects = 28). EEG entropy measures have been shown to contain meaningful information for emotion decoding (Patel et al., <xref ref-type="bibr" rid="B49">2021</xref>) and for decoding aesthetic preference for music (Carpentier et al., <xref ref-type="bibr" rid="B10">2019</xref>). However, EEG classification almost never yields such high performance in practice, and we should remain skeptical (Jeunet et al., <xref ref-type="bibr" rid="B33">2018</xref>). The data could contain informative muscular artifacts, as the authors do not report any artifact removal protocol; alternatively, the trained model could have overfitted the data, as deep learning models often do (Goodfellow et al., <xref ref-type="bibr" rid="B26">2016</xref>). Nonetheless, brain entropy measures could be useful for visual aesthetic preference decoding.</p>
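Since the exact multiscale entropy implementation of the cited study is not reproduced here, the following simplified sketch only illustrates the general technique: coarse-grain the signal at several scales, then compute sample entropy at each scale. The tolerance handling and template counting are simplified relative to the canonical definitions, so this is a didactic approximation, not the features used by Fraiwan et al.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts matching template pairs of
    length m and A of length m+1, within tolerance r * std(x) (Chebyshev)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count_matches(k):
        templates = np.array([x[i:i + k] for i in range(len(x) - k)])
        # pairwise Chebyshev distances; exclude self-matches, count each pair once
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        n = len(templates)
        return ((d <= tol).sum() - n) / 2
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """Coarse-grain the signal at each scale (non-overlapping averages),
    then compute sample entropy of the coarse-grained series."""
    out = []
    for s in scales:
        n = (len(x) // s) * s
        coarse = np.asarray(x[:n], dtype=float).reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(coarse, m=m, r=r))
    return np.array(out)
```

Regular signals (e.g., a sinusoid) yield low sample entropy, while unpredictable signals such as white noise yield high values, which is the property entropy-based EEG features exploit.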
<p>Mazzacane et al. (<xref ref-type="bibr" rid="B44">2023</xref>) used a novel classification algorithm based on temporal decision trees (Sciavicco and Stan, <xref ref-type="bibr" rid="B62">2020</xref>) to extract symbolic rules relating EEG amplitudes of specific frequency bands and electrode locations to aesthetic liking or disliking in an ecological museum context (&#x00023;subjects = 16). The authors report high classification performance using features from the beta and gamma bands that could be related to affective and reward processing. However, we remain skeptical of their claim that their temporal decision tree classification algorithm makes muscle artifact removal unnecessary, and we hypothesize that the beta and gamma activity used by the classifier could have been generated by muscular and movement artifacts, since these were neither controlled for nor removed.</p>
<p>Surprisingly, more established EEG classification algorithms (see Lotte et al., <xref ref-type="bibr" rid="B43">2018</xref>, for a review) have, to our knowledge, not been applied to AE decoding and should be explored in the neuroaesthetic domain. Deep learning classifiers (e.g., Lawhern et al., <xref ref-type="bibr" rid="B42">2016</xref>; Schirrmeister et al., <xref ref-type="bibr" rid="B60">2017</xref>) might be worth investigating as well, because our theoretical and empirical knowledge about which EEG features contain discriminant information for AE decoding is lacking. However, the large public datasets required to benchmark decoding algorithms for AE are missing (Jayaram and Barachant, <xref ref-type="bibr" rid="B32">2018</xref>). For now, BCI classification algorithms based on Riemannian Geometry seem to work best and often outperform deep learning methods (Congedo et al., <xref ref-type="bibr" rid="B15">2017</xref>; Roy et al., <xref ref-type="bibr" rid="B55">2022</xref>), which is why some state-of-the-art EEG BCI decoding approaches implement Riemannian Geometry in deep neural network architectures (e.g., Kobler et al., <xref ref-type="bibr" rid="B38">2022</xref>).</p></sec>
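To illustrate the minimum-distance-to-mean (MDM) principle underlying many Riemannian BCI classifiers, the sketch below classifies EEG epochs by the distance between their spatial covariance matrices and per-class mean covariances. For brevity it uses the log-Euclidean metric and arithmetic class means rather than the full affine-invariant geometry implemented in dedicated libraries such as pyRiemann; data shapes and the simulation are purely illustrative.

```python
import numpy as np
from scipy.linalg import logm

def cov_features(X):
    """Spatial covariance per epoch; X has shape (epochs, channels, samples)."""
    return np.array([np.cov(epoch) for epoch in X])

def log_euclidean_dist(A, B):
    """Distance between symmetric positive-definite matrices via matrix logarithms."""
    return np.linalg.norm(logm(A) - logm(B), ord="fro")

def mdm_predict(covs_train, y_train, covs_test):
    """Minimum distance to mean: assign each test covariance to the class
    whose mean covariance is closest (simplified log-Euclidean variant)."""
    classes = np.unique(y_train)
    means = [covs_train[y_train == c].mean(axis=0) for c in classes]
    dists = np.array([[log_euclidean_dist(C, M) for M in means]
                      for C in covs_test])
    return classes[np.argmin(dists, axis=1)]
```

Because covariance matrices capture channel variances and their couplings, such classifiers exploit exactly the kind of band-limited power differences reviewed above, without hand-selecting individual electrodes.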
<sec sec-type="discussion" id="s5">
<title>5 Discussion</title>
<p>Our literature search yielded six publications investigating oscillatory EEG markers of AE, which reported EEG modulations in the alpha, beta, and gamma frequency bands. This variability could be explained by differences in experimental protocols, e.g., the artworks used and the aesthetic ratings collected. Thus, robust EEG features for decoding aesthetic preference that generalize to all types of AE remain to be identified.</p>
<sec>
<title>5.1 Pitfalls</title>
<p>We related specific EEG frequency bands to attentional, affective, and reward-related components of AE; however, neural rhythms cannot be mapped one-to-one onto mental states (Brouwer et al., <xref ref-type="bibr" rid="B7">2015</xref>). Furthermore, EEG measurements suffer from high variability within and between subjects, and even data from the same subject might be very different if recorded at different times (Fairclough and Lotte, <xref ref-type="bibr" rid="B20">2020</xref>). Additionally, it has been shown that context influences AE, and AEs in laboratory conditions are quite different from AEs in the wild (Carbon, <xref ref-type="bibr" rid="B8">2020</xref>). As such, we do not know whether correlates of AE discovered during a laboratory experiment will generalize to natural contexts outside the laboratory. To our knowledge, only FAA has been reported in both conditions. However, in experimental conditions, even in real museum settings, subjects often have to look at an affective stimulus even if they would prefer not to, which influences FAA (Lacey and Gable, <xref ref-type="bibr" rid="B41">2022</xref>).</p>
<p>Due to the need for a preference ground truth, and without direct access to reward processing information in the brain, aesthetic preference decoding has to rely on explicit rating tasks that assign a subjective value to an artwork. Such explicit aesthetic judgment tasks can introduce confounds, as shown by ERPs that only appear with aesthetic rating tasks (H&#x000F6;fel and Jacobsen, <xref ref-type="bibr" rid="B29">2017</xref>). Similarly, some EEG oscillations might be related to aesthetic judgment and not to AE <italic>per se</italic>. Therefore, these might not appear in natural art watching conditions.</p>
<p>Different artworks might evoke very different AEs and, consequently, different EEG signals. Additionally, various visual stimulus features are known to affect the EEG signal, such as luminance (Ero&#x0011F;lu et al., <xref ref-type="bibr" rid="B19">2020</xref>) or complexity (Rawls et al., <xref ref-type="bibr" rid="B52">2021</xref>). Therefore, it remains unclear whether empirical results gathered with one type of art stimuli will generalize to other types of art. Last but not least, natural visual AE is an embodied experience that involves motor processes such as eye movements. Therefore, EEG data from such experiences should be subjected to artifact removal protocols as rigorous as those used in other mobile EEG imaging scenarios (Gorjan et al., <xref ref-type="bibr" rid="B27">2022</xref>).</p>
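<p>While state-of-the-art mobile EEG pipelines typically rely on ICA-based decomposition, the underlying principle can be sketched with a simpler, regression-based approach: a simultaneously recorded ocular reference channel is regressed out of every EEG channel. The sketch below uses simulated data; the channel count, mixing coefficients, and single-reference setup are illustrative assumptions:</p>

```python
import numpy as np

def regress_out_eog(eeg, eog):
    """Attenuate ocular artifacts by subtracting, from each EEG channel,
    its least-squares projection onto the demeaned EOG reference.
    Shapes: eeg (n_channels, n_samples), eog (n_samples,)."""
    eog = eog - eog.mean()
    weights = eeg @ eog / (eog @ eog)  # one propagation coefficient per channel
    return eeg - np.outer(weights, eog)

rng = np.random.default_rng(1)
n_samples = 1000
blink = np.zeros(n_samples)
blink[400:450] = 50.0                    # one large, slow ocular deflection
eog = blink + rng.standard_normal(n_samples)
brain = rng.standard_normal((4, n_samples))
mixing = np.array([0.9, 0.7, 0.4, 0.2])  # how strongly the blink leaks into each channel
eeg = brain + np.outer(mixing, blink)
cleaned = regress_out_eog(eeg, eog)
print(np.abs(cleaned[:, 400:450]).mean())  # far smaller than in the raw eeg
```

<p>Regression assumes a clean artifact reference and a linear, stationary propagation; ICA-based pipelines relax these assumptions, which is why they are preferred for mobile recordings.</p>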
</sec>
<sec>
<title>5.2 Future work</title>
<p>As mentioned above, the subcortical location of many reward- and affect-processing areas involved in aesthetic processing makes EEG decoding difficult. Nonetheless, their activity can be estimated using computational modeling. Singer et al. (<xref ref-type="bibr" rid="B65">2023</xref>) used fMRI-informed EEG models of reward processing to decode ventral striatum activity from EEG and demonstrated good model performance across aesthetic and non-aesthetic domains, as well as across subjects. Another possibility for improving the decoding of aesthetic appreciation would be to use EEG source localization algorithms with realistic head models and to define reward areas as regions of interest. Although EEG source localization is limited in its accuracy, empirical results have shown that extracting information from source-localized regions of interest can improve classification performance (Edelman et al., <xref ref-type="bibr" rid="B18">2019</xref>). One limitation of this approach is that source localization algorithms are computationally expensive. Investigating functional connectivity between EEG electrodes (Gonzalez-Astudillo et al., <xref ref-type="bibr" rid="B25">2022</xref>) might thus be an alternative and/or a complement to anatomical source-space feature extraction. Brain networks such as the DMN are activated during AE (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>). Kontson et al. (<xref ref-type="bibr" rid="B39">2015</xref>) reported increased functional connectivity between frontal and parieto-occipital EEG electrodes during art watching compared to a baseline. However, they did not investigate functional connectivity for different aesthetic preference levels.</p>
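<p>As a minimal sketch of one such sensor-space connectivity feature, the phase-locking value (PLV) between two electrodes can be computed from the instantaneous phases of their band-limited signals. The signals and their coupling below are simulated purely for illustration:</p>

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equally long, band-limited signals:
    1 = perfectly phase-locked; values near 0 = no consistent phase relation."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(2)
locked_a = np.sin(2 * np.pi * 10 * t)
locked_b = np.sin(2 * np.pi * 10 * t + 0.5)    # constant phase lag: high PLV
drift = np.cumsum(rng.standard_normal(t.size)) * 0.3
unlocked = np.sin(2 * np.pi * 10 * t + drift)  # drifting phase: low PLV
print(phase_locking_value(locked_a, locked_b))  # close to 1
print(phase_locking_value(locked_a, unlocked))  # markedly lower
```

<p>A full connectivity feature vector would apply this to all electrode pairs within each frequency band of interest, after band-pass filtering.</p>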
<p>Finally, AE is not exhausted by attention, affect, and reward; it also includes semantic and motor processes. While decoding semantic content from EEG signals is very difficult (e.g., Gifford et al., <xref ref-type="bibr" rid="B24">2022</xref>), EEG motor correlates are relatively well studied in BCI research (Yuan and He, <xref ref-type="bibr" rid="B76">2014</xref>). Neuroaesthetic research has shown that art watching can activate motor-related brain areas, which could reflect motor simulation (e.g., Umilt&#x000E1; et al., <xref ref-type="bibr" rid="B70">2012</xref>) and empathy (Gallese, <xref ref-type="bibr" rid="B23">2017</xref>). Unfortunately, the link between embodied cognition and art appreciation has not yet been demonstrated with neuroimaging. However, correlations between motor priming and art appreciation have been reported (e.g., Ticini et al., <xref ref-type="bibr" rid="B69">2014</xref>). We hypothesize that EEG correlates of motor control, such as mu rhythm modulation, could therefore be informative of AE.</p></sec>
</sec>
<sec id="s6">
<title>6 Concluding remarks</title>
<p>Neuroaesthetic research on AE decoding is still in its infancy. We reviewed sparse and inconsistent reports of EEG oscillatory correlates of AE. Although we focused on the AE of visual art here, empirical data suggest that other AEs are not fundamentally different (Vessel, <xref ref-type="bibr" rid="B71">2022</xref>). Thus, some of the neural correlates of art preference reviewed here could generalize to other types of preference. Still, our focus on EEG for AE decoding constitutes a limitation. Other physiological signals, e.g., heart rate and skin conductance, are informative of affective states in general (Shu et al., <xref ref-type="bibr" rid="B63">2018</xref>; Rinella et al., <xref ref-type="bibr" rid="B54">2022</xref>) and of AE in particular (K&#x000FC;hnapfel et al., <xref ref-type="bibr" rid="B40">2023</xref>). Indeed, combining EEG with these signals can improve mental state classification (Hogervorst et al., <xref ref-type="bibr" rid="B30">2014</xref>) and should be explored for AE decoding. Overall, a number of challenges remain to be solved before BCIs can reliably and rigorously decode AE from EEG across different artworks and contexts. We hope this mini-review modestly contributes to identifying these challenges and to proposing relevant directions for future work.</p></sec>
<sec sec-type="author-contributions" id="s7">
<title>Author contributions</title>
<p>MW: Conceptualization, Data curation, Investigation, Methodology, Writing &#x02014; original draft. FL: Conceptualization, Funding acquisition, Methodology, Project administration, Supervision, Writing &#x02014; review &#x00026; editing.</p></sec>
</body>
<back>
<sec sec-type="funding-information" id="s8">
<title>Funding</title>
<p>The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This research was funded by CHISTERA project BITSCOPE (grant ANR-21-CHRA-0003-01).</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.</p>
</sec>
<sec sec-type="disclaimer" id="s9">
<title>Publisher&#x00027;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ansorge</surname> <given-names>U.</given-names></name> <name><surname>Pelowski</surname> <given-names>M.</given-names></name> <name><surname>Quigley</surname> <given-names>C.</given-names></name> <name><surname>Peschl</surname> <given-names>M.</given-names></name> <name><surname>Leder</surname> <given-names>H.</given-names></name></person-group> (<year>2022</year>). <article-title>Art and perception: using empirical aesthetics in research on consciousness</article-title>. <source>Front. Psychol</source>. <volume>13</volume>:<fpage>895985</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2022.895985</pub-id><pub-id pub-id-type="pmid">35756216</pub-id></citation></ref>
<ref id="B2">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Babiloni</surname> <given-names>F.</given-names></name> <name><surname>Cherubino</surname> <given-names>P.</given-names></name> <name><surname>Graziani</surname> <given-names>I.</given-names></name> <name><surname>Trettel</surname> <given-names>A.</given-names></name> <name><surname>Infarinato</surname> <given-names>F.</given-names></name> <name><surname>Picconi</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>&#x0201C;Neuroelectric brain imaging during a real visit of a fine arts gallery: a neuroaesthetic study of XVII century dutch painters,&#x0201D;</article-title> in <source>Proceedings of the Annual International Conference of the IEEE/EMBS</source> (<publisher-loc>Osaka</publisher-loc>: <publisher-name>IEEE</publisher-name>), <fpage>6179</fpage>&#x02013;<lpage>6182</lpage>. <pub-id pub-id-type="doi">10.1109/EMBC.2013.6610964</pub-id><pub-id pub-id-type="pmid">24111151</pub-id></citation></ref>
<ref id="B3">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Babiloni</surname> <given-names>F.</given-names></name> <name><surname>Rossi</surname> <given-names>D.</given-names></name> <name><surname>Cherubino</surname> <given-names>P.</given-names></name> <name><surname>Trettel</surname> <given-names>A.</given-names></name> <name><surname>Picconi</surname> <given-names>D.</given-names></name> <name><surname>Maglione</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>&#x0201C;The first impression is what matters: a neuroaesthetic study of the cerebral perception and appreciation of paintings by titian,&#x0201D;</article-title> in <source>Proceedings of the Annual International Conference of the IEEE/EMBS</source> (<publisher-loc>Milan</publisher-loc>: <publisher-name>IEEE</publisher-name>), <fpage>7990</fpage>&#x02013;<lpage>7993</lpage>. <pub-id pub-id-type="doi">10.1109/EMBC.2015.7320246</pub-id><pub-id pub-id-type="pmid">26738146</pub-id></citation></ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berk</surname> <given-names>L.</given-names></name> <name><surname>Lee</surname> <given-names>J.</given-names></name> <name><surname>Mali</surname> <given-names>D.</given-names></name> <name><surname>Lohman</surname> <given-names>E.</given-names></name> <name><surname>Bains</surname> <given-names>G.</given-names></name> <name><surname>Daher</surname> <given-names>N.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Chocolate and the brain: Cacao increases power spectral density (&#x003BC;v2) of EEG gamma wave band activity (31&#x02013;40hz) which is associated with neuronal synchronization, enhanced cognition, memory, recall and physiological benefits</article-title>. <source>FASEB J</source>. <volume>30</volume>. <pub-id pub-id-type="doi">10.1096/fasebj.30.1_supplement.679.14</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berridge</surname> <given-names>K.</given-names></name> <name><surname>Kringelbach</surname> <given-names>M. L.</given-names></name></person-group> (<year>2015</year>). <article-title>Pleasure systems in the brain</article-title>. <source>Neuron</source> <volume>86</volume>, <fpage>646</fpage>&#x02013;<lpage>664</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2015.02.018</pub-id><pub-id pub-id-type="pmid">25950633</pub-id></citation></ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berridge</surname> <given-names>K.</given-names></name> <name><surname>Winkielman</surname> <given-names>P.</given-names></name></person-group> (<year>2010</year>). <article-title>What is an unconscious emotion? (the case of unconscious &#x0201C;liking&#x0201D;)</article-title>. <source>Cogn. Emot</source>. <volume>17</volume>, <fpage>181</fpage>&#x02013;<lpage>211</lpage>. <pub-id pub-id-type="doi">10.1080/02699930302289</pub-id></citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brouwer</surname> <given-names>A.</given-names></name> <name><surname>Zander</surname> <given-names>T.</given-names></name> <name><surname>van Erp</surname> <given-names>J.</given-names></name> <name><surname>Korteling</surname> <given-names>J.</given-names></name> <name><surname>Bronkhorst</surname> <given-names>A.</given-names></name></person-group> (<year>2015</year>). <article-title>Using neurophysiological signals that reflect cognitive or affective state: six recommendations to avoid common pitfalls</article-title>. <source>Front. Neurosci</source>. <volume>9</volume>:<fpage>136</fpage>. <pub-id pub-id-type="doi">10.3389/fnins.2015.00136</pub-id><pub-id pub-id-type="pmid">25983676</pub-id></citation></ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C. C.</given-names></name></person-group> (<year>2020</year>). <article-title>Ecological art experience: how we can gain experimental control while preserving ecologically valid settings and contexts</article-title>. <source>Front. Psychol</source>. <volume>11</volume>:<fpage>800</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2020.00800</pub-id><pub-id pub-id-type="pmid">32499736</pub-id></citation></ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carbon</surname> <given-names>C. C.</given-names></name></person-group> (<year>2023</year>). <article-title>About the need for a more adequate way to get an understanding of the experiencing of aesthetic items</article-title>. <source>Behav. Sci</source>. <volume>13</volume>. <pub-id pub-id-type="doi">10.3390/bs13110907</pub-id><pub-id pub-id-type="pmid">37998654</pub-id></citation></ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carpentier</surname> <given-names>S.</given-names></name> <name><surname>McCulloch</surname> <given-names>A.</given-names></name> <name><surname>Brown</surname> <given-names>T.</given-names></name> <name><surname>Faber</surname> <given-names>S.</given-names></name> <name><surname>Ritter</surname> <given-names>P.</given-names></name> <name><surname>Wang</surname> <given-names>Z.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Complexity matching: brain signals mirror environment information patterns during music listening and reward</article-title>. <source>J. Cogn. Neurosci</source>. <volume>32</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1101/693531</pub-id><pub-id pub-id-type="pmid">31820677</pub-id></citation></ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chatterjee</surname> <given-names>A.</given-names></name> <name><surname>Vartanian</surname> <given-names>O.</given-names></name></person-group> (<year>2014</year>). <article-title>Neuroaesthetics</article-title>. <source>Trends Cogn. Sci</source>. <volume>18</volume>, <fpage>370</fpage>&#x02013;<lpage>375</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2014.03.003</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chatterjee</surname> <given-names>A.</given-names></name> <name><surname>Vartanian</surname> <given-names>O.</given-names></name></person-group> (<year>2016</year>). <article-title>Neuroscience of aesthetics</article-title>. <source>Ann. N.Y. Acad. Sci</source>. <volume>1369</volume>, <fpage>172</fpage>&#x02013;<lpage>194</lpage>. <pub-id pub-id-type="doi">10.1111/nyas.13035</pub-id></citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cheung</surname> <given-names>M. C.</given-names></name> <name><surname>Law</surname> <given-names>D.</given-names></name> <name><surname>Yip</surname> <given-names>J.</given-names></name> <name><surname>Wong</surname> <given-names>C. W. Y.</given-names></name></person-group> (<year>2019</year>). <article-title>Emotional responses to visual art and commercial stimuli: implications for creativity and aesthetics</article-title>. <source>Front. Psychol</source>. <volume>10</volume>:<fpage>14</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2019.00014</pub-id><pub-id pub-id-type="pmid">30723437</pub-id></citation></ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chuan-Peng</surname> <given-names>H.</given-names></name> <name><surname>Huang</surname> <given-names>Y.</given-names></name> <name><surname>Eickhoff</surname> <given-names>S. B.</given-names></name> <name><surname>Peng</surname> <given-names>K.</given-names></name> <name><surname>Sui</surname> <given-names>J.</given-names></name></person-group> (<year>2019</year>). <article-title>Seeking the beauty center in the brain: a meta-analysis of fmri studies of beautiful human faces and visual art</article-title>. <source>Cogn. Affect. Behav. Neurosci</source>. <volume>20</volume>, <fpage>1200</fpage>&#x02013;<lpage>1215</lpage>. <pub-id pub-id-type="doi">10.3758/s13415-020-00827-z</pub-id><pub-id pub-id-type="pmid">33089442</pub-id></citation></ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Congedo</surname> <given-names>M.</given-names></name> <name><surname>Barachant</surname> <given-names>A.</given-names></name> <name><surname>Bhatia</surname> <given-names>R.</given-names></name></person-group> (<year>2017</year>). <article-title>Riemannian geometry for EEG-based brain-computer interfaces; a primer and a review</article-title>. <source>Brain-Comput. Int.</source> <volume>4</volume>, <fpage>1</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1080/2326263X.2017.1297192</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Costa-Feito</surname> <given-names>A.</given-names></name> <name><surname>Gonz&#x000E1;lez-Fern&#x000E1;ndez</surname> <given-names>A.M.</given-names></name> <name><surname>Rodr&#x000ED;guez-Santos</surname> <given-names>C.</given-names></name> <name><surname>Cervantes-Blanco</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>Electroencephalography in consumer behaviour and marketing: a science mapping approach</article-title>. <source>Humanit. Soc. Sci. Commun</source>. <volume>10</volume>:<fpage>474</fpage>. <pub-id pub-id-type="doi">10.1057/s41599-023-01991-6</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cruz-Garza</surname> <given-names>J.</given-names></name> <name><surname>Brantley</surname> <given-names>J.</given-names></name> <name><surname>Nakagome</surname> <given-names>S.</given-names></name> <name><surname>Kontson</surname> <given-names>K.</given-names></name> <name><surname>Robleto</surname> <given-names>D.</given-names></name> <name><surname>Contreras-Vidal</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2017</year>). <source>Mobile EEG recordings in an art museum setting</source>. IEEE Dataport.</citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Edelman</surname> <given-names>B. J.</given-names></name> <name><surname>Meng</surname> <given-names>J.</given-names></name> <name><surname>Suma</surname> <given-names>D.</given-names></name> <name><surname>Zurn</surname> <given-names>C. A.</given-names></name> <name><surname>Nagarajan</surname> <given-names>E.</given-names></name> <name><surname>Baxter</surname> <given-names>B. S.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Noninvasive neuroimaging enhances continuous neural tracking for robotic device control</article-title>. <source>Sci. Robot</source>. <volume>4</volume>:<fpage>aaw6844</fpage>. <pub-id pub-id-type="doi">10.1126/scirobotics.aaw6844</pub-id><pub-id pub-id-type="pmid">31656937</pub-id></citation></ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ero&#x0011F;lu</surname> <given-names>K.</given-names></name> <name><surname>Kayik&#x000E7;io&#x0011F;lu</surname> <given-names>T.</given-names></name> <name><surname>Osman</surname> <given-names>O.</given-names></name></person-group> (<year>2020</year>). <article-title>Effect of brightness of visual stimuli on EEG signals</article-title>. <source>Behav. Brain Res</source>. <volume>382</volume>:<fpage>112486</fpage>. <pub-id pub-id-type="doi">10.1016/j.bbr.2020.112486</pub-id><pub-id pub-id-type="pmid">31958517</pub-id></citation></ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fairclough</surname> <given-names>S.</given-names></name> <name><surname>Lotte</surname> <given-names>F.</given-names></name></person-group> (<year>2020</year>). <article-title>Grand challenges in neurotechnology and system neuroergonomics</article-title>. <source>Front. Neuroergonom</source>. <volume>1</volume>:<fpage>602504</fpage>. <pub-id pub-id-type="doi">10.3389/fnrgo.2020.602504</pub-id><pub-id pub-id-type="pmid">38234311</pub-id></citation></ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fancourt</surname> <given-names>D.</given-names></name> <name><surname>Finn</surname> <given-names>S.</given-names></name></person-group> (<year>2019</year>). <article-title>What is the evidence on the role of the arts in improving health and well-being? A scoping review</article-title>. <source>Nord. J. Arts Cult. Health</source> <volume>2</volume>, <fpage>77</fpage>&#x02013;<lpage>83</lpage>. <pub-id pub-id-type="doi">10.18261/issn.2535-7913-2020-01-08</pub-id></citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fraiwan</surname> <given-names>M.</given-names></name> <name><surname>Alafeef</surname> <given-names>M.</given-names></name> <name><surname>Almomani</surname> <given-names>F.</given-names></name></person-group> (<year>2023</year>). <article-title>Gauging human visual interest using multiscale entropy analysis of EEG signals</article-title>. <source>J. Ambient. Intell. Humaniz. Comput</source>. <volume>12</volume>, <fpage>2435</fpage>&#x02013;<lpage>2447</lpage>. <pub-id pub-id-type="doi">10.1007/s12652-020-02381-5</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Gallese</surname> <given-names>V.</given-names></name></person-group> (<year>2017</year>). <article-title>&#x0201C;The empathic body in experimental aesthetics-embodied simulation and art,&#x0201D;</article-title> in <source>Empathy</source>, ed. W. Lux (<publisher-loc>London</publisher-loc>: <publisher-name>Palgrave Macmillan</publisher-name>), <fpage>181</fpage>&#x02013;<lpage>199</lpage>. <pub-id pub-id-type="doi">10.1057/978-1-137-51299-4_7</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gifford</surname> <given-names>A. T.</given-names></name> <name><surname>Dwivedi</surname> <given-names>K.</given-names></name> <name><surname>Roig</surname> <given-names>G.</given-names></name> <name><surname>Cichy</surname> <given-names>R. M.</given-names></name></person-group> (<year>2022</year>). <article-title>A large and rich EEG dataset for modeling human visual object recognition</article-title>. <source>Neuroimage</source> <volume>264</volume>:<fpage>119754</fpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2022.119754</pub-id><pub-id pub-id-type="pmid">36400378</pub-id></citation></ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gonzalez-Astudillo</surname> <given-names>J.</given-names></name> <name><surname>Cattai</surname> <given-names>T.</given-names></name> <name><surname>Bassignana</surname> <given-names>G.</given-names></name> <name><surname>Corsi</surname> <given-names>M. C.</given-names></name> <name><surname>Fallani</surname> <given-names>F. D. V.</given-names></name></person-group> (<year>2022</year>). <article-title>Network-based brain-computer interfaces: principles and applications</article-title>. <source>J. Neural Eng</source>. <volume>18</volume>. <pub-id pub-id-type="doi">10.1088/1741-2552/abc760</pub-id></citation>
</ref>
<ref id="B26">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Goodfellow</surname> <given-names>I.</given-names></name> <name><surname>Bengio</surname> <given-names>Y.</given-names></name> <name><surname>Courville</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <source>Deep Learning</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gorjan</surname> <given-names>D.</given-names></name> <name><surname>Gramann</surname> <given-names>K.</given-names></name> <name><surname>De Pauw</surname> <given-names>K.</given-names></name> <name><surname>Marusic</surname> <given-names>U.</given-names></name></person-group> (<year>2022</year>). <article-title>Removal of movement-induced EEG artifacts: current state of the art and guidelines</article-title>. <source>J. Neural Eng</source>. <volume>19</volume>. <pub-id pub-id-type="doi">10.1088/1741-2552/ac542c</pub-id></citation>
</ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrera-Arcos</surname> <given-names>G.</given-names></name> <name><surname>Tamez-Duque</surname> <given-names>J.</given-names></name> <name><surname>Acosta</surname> <given-names>E.</given-names></name> <name><surname>Kwan-Loo</surname> <given-names>K.</given-names></name> <name><surname>de Alba</surname> <given-names>M.</given-names></name> <name><surname>Tamez-Duque</surname> <given-names>U.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Modulation of neural activity during guided viewing of visual art</article-title>. <source>Front. Hum. Neurosci</source>. <volume>11</volume>:<fpage>581</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2017.00581</pub-id><pub-id pub-id-type="pmid">29249949</pub-id></citation></ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>H&#x000F6;fel</surname> <given-names>L.</given-names></name> <name><surname>Jacobsen</surname> <given-names>T.</given-names></name></person-group> (<year>2017</year>). <article-title>Electrophysiological indices of processing symmetry and aesthetics</article-title>. <source>J. Psychophysiol</source>. <volume>21</volume>, <fpage>9</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1027/0269-8803.21.1.9</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hogervorst</surname> <given-names>M.</given-names></name> <name><surname>Brouwer</surname> <given-names>A.</given-names></name> <name><surname>Erp</surname> <given-names>J.</given-names></name></person-group> (<year>2014</year>). <article-title>Combining and comparing EEG, peripheral physiology and eye-related measures for the assessment of mental workload</article-title>. <source>Front. Neurosci</source>. <volume>8</volume>:<fpage>322</fpage>. <pub-id pub-id-type="doi">10.3389/fnins.2014.00322</pub-id><pub-id pub-id-type="pmid">25352774</pub-id></citation></ref>
<ref id="B31">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Jacobsen</surname> <given-names>T.</given-names></name> <name><surname>Klein</surname> <given-names>S.</given-names></name></person-group> (<year>2022</year>). <article-title>&#x0201C;Electrophysiology,&#x0201D;</article-title> in <source>The Oxford Handbook of Empirical Aesthetics</source>, ed. V. Nadal (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>), <fpage>291</fpage>&#x02013;<lpage>307</lpage>. <pub-id pub-id-type="doi">10.1093/oxfordhb/9780198824350.013.13</pub-id></citation>
</ref>
<ref id="B32">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jayaram</surname> <given-names>V.</given-names></name> <name><surname>Barachant</surname> <given-names>A.</given-names></name></person-group> (<year>2018</year>). <article-title>MOABB: trustworthy algorithm benchmarking for BCIs</article-title>. <source>J. Neural Eng</source>. <volume>15</volume>, <issue>6</issue>.<pub-id pub-id-type="pmid">30177583</pub-id></citation></ref>
<ref id="B33">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Jeunet</surname> <given-names>C.</given-names></name> <name><surname>Debener</surname> <given-names>S.</given-names></name> <name><surname>Lotte</surname> <given-names>F.</given-names></name> <name><surname>Mattout</surname> <given-names>J.</given-names></name> <name><surname>Scherer</surname> <given-names>R.</given-names></name> <name><surname>Zich</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>&#x0201C;Mind the traps! Design guidelines for rigorous BCI experiments,&#x0201D;</article-title> in <source>Brain-computer Interfaces Handbook: Technological and Theoretical Advances</source>, eds C. S. Nam, A. Nijholt, and F. Lotte (<publisher-loc>Baton Rouge, FL</publisher-loc>: <publisher-name>CRC Press</publisher-name>), <fpage>613</fpage>&#x02013;<lpage>634</lpage>. <pub-id pub-id-type="doi">10.1201/9781351231954-32</pub-id></citation>
</ref>
<ref id="B34">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Kant</surname> <given-names>I.</given-names></name></person-group> (<year>1983</year>). <article-title>&#x0201C;Kritik der urteilskraft,&#x0201D;</article-title> in <source>Kant Werke</source>, ed. W. Weischedel (<publisher-loc>Darmstadt</publisher-loc>: <publisher-name>Wissenschaftliche Buchgesellschaft</publisher-name>).</citation>
</ref>
<ref id="B35">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Karim</surname> <given-names>A. K. M. R.</given-names></name> <name><surname>Proulx</surname> <given-names>M.</given-names></name> <name><surname>de Sousa</surname> <given-names>A. A.</given-names></name> <name><surname>Likova</surname> <given-names>L. T.</given-names></name></person-group> (<year>2017</year>). <article-title>Do we enjoy what we sense and perceive? A dissociation between aesthetic appreciation and basic perception of environmental objects or events</article-title>. <source>Cogn. Affect. Behav. Neurosci</source>. <volume>22</volume>, <fpage>904</fpage>&#x02013;<lpage>951</lpage>. <pub-id pub-id-type="doi">10.3758/s13415-022-01004-0</pub-id><pub-id pub-id-type="pmid">35589909</pub-id></citation></ref>
<ref id="B36">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kawabata</surname> <given-names>H.</given-names></name> <name><surname>Zeki</surname> <given-names>S.</given-names></name></person-group> (<year>2004</year>). <article-title>Neural correlates of beauty</article-title>. <source>J. Neurophysiol</source>. <volume>91</volume>, <fpage>1699</fpage>&#x02013;<lpage>1705</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00696.2003</pub-id></citation>
</ref>
<ref id="B37">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>King</surname> <given-names>J. L.</given-names></name> <name><surname>Parada</surname> <given-names>F. J.</given-names></name></person-group> (<year>2021</year>). <article-title>Using mobile brain/body imaging to advance research in arts, health, and related therapeutics</article-title>. <source>Eur. J. Neurosci</source>. <volume>54</volume>, <fpage>8364</fpage>&#x02013;<lpage>8380</lpage>. <pub-id pub-id-type="doi">10.1111/ejn.15313</pub-id><pub-id pub-id-type="pmid">33999462</pub-id></citation></ref>
<ref id="B38">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kobler</surname> <given-names>R.</given-names></name> <name><surname>Hirayama</surname> <given-names>J. I.</given-names></name> <name><surname>Zhao</surname> <given-names>Q.</given-names></name> <name><surname>Kawanabe</surname> <given-names>M.</given-names></name></person-group> (<year>2022</year>). <article-title>SPD domain-specific batch normalization to crack interpretable unsupervised domain adaptation in EEG</article-title>. <source>Adv. Neural Inf. Process. Syst</source>. <volume>35</volume>, <fpage>6219</fpage>&#x02013;<lpage>6235</lpage>.</citation>
</ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kontson</surname> <given-names>K.</given-names></name> <name><surname>Megjhani</surname> <given-names>M.</given-names></name> <name><surname>Brantley</surname> <given-names>J.</given-names></name> <name><surname>Cruz-Garza</surname> <given-names>J.</given-names></name> <name><surname>Nakagome</surname> <given-names>S.</given-names></name> <name><surname>Robleto</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>Your brain on art: emergent cortical dynamics during aesthetic experiences</article-title>. <source>Front. Hum. Neurosci</source>. <volume>9</volume>:<fpage>626</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2015.00626</pub-id><pub-id pub-id-type="pmid">26635579</pub-id></citation></ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>K&#x000FC;hnapfel</surname> <given-names>C.</given-names></name> <name><surname>Fingerhut</surname> <given-names>J.</given-names></name> <name><surname>Brinkmann</surname> <given-names>H.</given-names></name> <name><surname>Ganster</surname> <given-names>V.</given-names></name> <name><surname>Tanaka</surname> <given-names>T.</given-names></name> <name><surname>Specker</surname> <given-names>E.</given-names></name> <etal/></person-group>. (<year>2023</year>). <article-title>How do we move in front of art? How does this relate to art experience? Linking movement, eye tracking, emotion, and evaluations in a gallery-like setting</article-title>. <source>Empir. Stud. Arts</source> <volume>42</volume>, <fpage>86</fpage>&#x02013;<lpage>146</lpage>. <pub-id pub-id-type="doi">10.1177/02762374231160000</pub-id></citation>
</ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lacey</surname> <given-names>M.</given-names></name> <name><surname>Gable</surname> <given-names>P.</given-names></name></person-group> (<year>2022</year>). <article-title>Frontal asymmetry as a neural correlate of motivational conflict</article-title>. <source>Symmetry</source> <volume>14</volume>:<fpage>507</fpage>. <pub-id pub-id-type="doi">10.3390/sym14030507</pub-id></citation>
</ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lawhern</surname> <given-names>V.</given-names></name> <name><surname>Solon</surname> <given-names>A.</given-names></name> <name><surname>Waytowich</surname> <given-names>N.</given-names></name> <name><surname>Gordon</surname> <given-names>S.</given-names></name> <name><surname>Hung</surname> <given-names>C.</given-names></name> <name><surname>Lance</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces</article-title>. <source>J. Neural Eng</source>. <volume>15</volume>:<fpage>056013</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/aace8c</pub-id><pub-id pub-id-type="pmid">29932424</pub-id></citation></ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lotte</surname> <given-names>F.</given-names></name> <name><surname>Bougrain</surname> <given-names>L.</given-names></name> <name><surname>Cichocki</surname> <given-names>A.</given-names></name> <name><surname>Clerc</surname> <given-names>M.</given-names></name> <name><surname>Congedo</surname> <given-names>M.</given-names></name> <name><surname>Rakotomamonjy</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update</article-title>. <source>J. Neural Eng</source>. <volume>15</volume>:<fpage>031005</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/aab2f2</pub-id><pub-id pub-id-type="pmid">29488902</pub-id></citation></ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mazzacane</surname> <given-names>S.</given-names></name> <name><surname>Coccagna</surname> <given-names>M.</given-names></name> <name><surname>Manzella</surname> <given-names>F.</given-names></name> <name><surname>Pagliarini</surname> <given-names>G.</given-names></name> <name><surname>Sironi</surname> <given-names>V.</given-names></name> <name><surname>Gatti</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2023</year>). <article-title>Towards an objective theory of subjective liking: a first step in understanding the sense of beauty</article-title>. <source>PLoS ONE</source> <volume>18</volume>:<fpage>e0287513</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0287513</pub-id><pub-id pub-id-type="pmid">37352316</pub-id></citation></ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Menninghaus</surname> <given-names>W.</given-names></name> <name><surname>Wagner</surname> <given-names>V.</given-names></name> <name><surname>Wassiliwizky</surname> <given-names>E.</given-names></name> <name><surname>Schindler</surname> <given-names>I.</given-names></name> <name><surname>Hanich</surname> <given-names>J.</given-names></name> <name><surname>Jacobsen</surname> <given-names>T.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>What are aesthetic emotions?</article-title> <source>Psychol. Rev</source>. <volume>126</volume>, <fpage>171</fpage>&#x02013;<lpage>195</lpage>. <pub-id pub-id-type="doi">10.1037/rev0000135</pub-id></citation>
</ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>M&#x000FC;hl</surname> <given-names>C.</given-names></name> <name><surname>Allison</surname> <given-names>B.</given-names></name> <name><surname>Nijholt</surname> <given-names>A.</given-names></name> <name><surname>Chanel</surname> <given-names>G.</given-names></name></person-group> (<year>2014</year>). <article-title>A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges</article-title>. <source>Brain Comput. Interfaces</source> <volume>1</volume>, <fpage>66</fpage>&#x02013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1080/2326263X.2014.912881</pub-id></citation>
</ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Muth</surname> <given-names>C.</given-names></name> <name><surname>Briesen</surname> <given-names>J.</given-names></name> <name><surname>Carbon</surname> <given-names>C. C.</given-names></name></person-group> (<year>2020</year>). <article-title>&#x0201C;I like how it looks but it is not beautiful&#x0201D;</article-title>. <source>Poetics</source> <volume>79</volume>:<fpage>101376</fpage>. <pub-id pub-id-type="doi">10.1016/j.poetic.2019.101376</pub-id></citation>
</ref>
<ref id="B48">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Nam</surname> <given-names>C. S.</given-names></name> <name><surname>Nijholt</surname> <given-names>A.</given-names></name> <name><surname>Lotte</surname> <given-names>F.</given-names></name></person-group> (<year>2018</year>). <source>Brain-computer Interfaces Handbook: Technological and Theoretical Advances</source>. <publisher-loc>Boca Raton, FL</publisher-loc>: <publisher-name>CRC Press</publisher-name>. <pub-id pub-id-type="doi">10.1201/9781351231954</pub-id></citation>
</ref>
<ref id="B49">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Patel</surname> <given-names>P.</given-names></name> <name><surname>Rajkumar</surname> <given-names>R.</given-names></name> <name><surname>Annavarapu</surname> <given-names>R. N.</given-names></name></person-group> (<year>2021</year>). <article-title>EEG-based human emotion recognition using entropy as a feature extraction measure</article-title>. <source>Brain Inform</source>. <volume>8</volume>:<fpage>20</fpage>. <pub-id pub-id-type="doi">10.1186/s40708-021-00141-5</pub-id><pub-id pub-id-type="pmid">34609639</pub-id></citation></ref>
<ref id="B50">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peylo</surname> <given-names>C.</given-names></name> <name><surname>Hilla</surname> <given-names>Y.</given-names></name> <name><surname>Sauseng</surname> <given-names>P.</given-names></name></person-group> (<year>2021</year>). <article-title>Cause or consequence? alpha oscillations in visuospatial attention</article-title>. <source>Trends Neurosci</source>. <volume>44</volume>, <fpage>705</fpage>&#x02013;<lpage>713</lpage>. <pub-id pub-id-type="doi">10.1016/j.tins.2021.05.004</pub-id><pub-id pub-id-type="pmid">34167840</pub-id></citation></ref>
<ref id="B51">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pope</surname> <given-names>K.</given-names></name> <name><surname>Lewis</surname> <given-names>T.</given-names></name> <name><surname>Fitzgibbon</surname> <given-names>S.</given-names></name> <name><surname>Janani</surname> <given-names>A.</given-names></name> <name><surname>Grummett</surname> <given-names>T.</given-names></name> <name><surname>Williams</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>Managing electromyogram contamination in scalp recordings: an approach identifying reliable beta and gamma EEG features of psychoses or other disorders</article-title>. <source>Brain Behav</source>. <volume>12</volume>:<fpage>e2721</fpage>. <pub-id pub-id-type="doi">10.1002/brb3.2721</pub-id><pub-id pub-id-type="pmid">35919931</pub-id></citation></ref>
<ref id="B52">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rawls</surname> <given-names>E.</given-names></name> <name><surname>White</surname> <given-names>R.</given-names></name> <name><surname>Kane</surname> <given-names>S.</given-names></name> <name><surname>Stevens</surname> <given-names>C. E. J.</given-names></name></person-group> (<year>2021</year>). <article-title>Parametric cortical representations of complexity and preference for artistic and computer-generated fractal patterns revealed by single-trial EEG power spectral analysis</article-title>. <source>Neuroimage</source> <volume>236</volume>:<fpage>118092</fpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2021.118092</pub-id><pub-id pub-id-type="pmid">33895307</pub-id></citation></ref>
<ref id="B53">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Reeves</surname> <given-names>J.</given-names></name></person-group> (<year>1989</year>). <article-title>The interest-enjoyment distinction in intrinsic motivation</article-title>. <source>Motiv. Emot</source>. <volume>13</volume>, <fpage>83</fpage>&#x02013;<lpage>103</lpage>. <pub-id pub-id-type="doi">10.1007/BF00992956</pub-id></citation>
</ref>
<ref id="B54">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rinella</surname> <given-names>S.</given-names></name> <name><surname>Massimino</surname> <given-names>S.</given-names></name> <name><surname>Fallica</surname> <given-names>P. G.</given-names></name> <name><surname>Giacobbe</surname> <given-names>A.</given-names></name> <name><surname>Donato</surname> <given-names>N.</given-names></name> <name><surname>Coco</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>Emotion recognition: photoplethysmography and electrocardiography in comparison</article-title>. <source>Biosensors</source> <volume>12</volume>:<fpage>811</fpage>. <pub-id pub-id-type="doi">10.3390/bios12100811</pub-id><pub-id pub-id-type="pmid">36290948</pub-id></citation></ref>
<ref id="B55">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roy</surname> <given-names>R. N.</given-names></name> <name><surname>Hinss</surname> <given-names>M. F.</given-names></name> <name><surname>Darmet</surname> <given-names>L.</given-names></name> <name><surname>Ladouce</surname> <given-names>S.</given-names></name> <name><surname>Jahanpour</surname> <given-names>E. S.</given-names></name> <name><surname>Somon</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>Retrospective on the first passive brain-computer interface competition on cross-session workload estimation</article-title>. <source>Front. Neuroergonom</source>. <volume>3</volume>:<fpage>838342</fpage>. <pub-id pub-id-type="doi">10.3389/fnrgo.2022.838342</pub-id><pub-id pub-id-type="pmid">38235453</pub-id></citation></ref>
<ref id="B56">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sabu</surname> <given-names>P.</given-names></name> <name><surname>Stuldreher</surname> <given-names>I.</given-names></name> <name><surname>Kaneko</surname> <given-names>D.</given-names></name> <name><surname>Brouwer</surname> <given-names>A.</given-names></name></person-group> (<year>2022</year>). <article-title>A review on the role of affective stimuli in event-related frontal alpha asymmetry</article-title>. <source>Front. Comput. Sci</source>. <volume>4</volume>:<fpage>869123</fpage>. <pub-id pub-id-type="doi">10.3389/fcomp.2022.869123</pub-id></citation>
</ref>
<ref id="B57">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sarasso</surname> <given-names>P.</given-names></name> <name><surname>Neppi-Modona</surname> <given-names>M.</given-names></name> <name><surname>Sacco</surname> <given-names>K.</given-names></name> <name><surname>Ronga</surname> <given-names>I.</given-names></name></person-group> (<year>2020</year>). <article-title>Stopping for knowledge: the sense of beauty in the perception-action cycle</article-title>. <source>Neurosci. Biobehav. Rev</source>. <volume>118</volume>, <fpage>723</fpage>&#x02013;<lpage>738</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2020.09.004</pub-id><pub-id pub-id-type="pmid">32926914</pub-id></citation></ref>
<ref id="B58">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Schaeffer</surname> <given-names>J.</given-names></name></person-group> (<year>2015</year>). <source>L&#x00027;Exp&#x000E9;rience Esth&#x000E9;tique</source>. <publisher-loc>Paris</publisher-loc>: <publisher-name>Gallimard</publisher-name>.</citation>
</ref>
<ref id="B59">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schiller</surname> <given-names>D.</given-names></name> <name><surname>Yu</surname> <given-names>A. N. C.</given-names></name> <name><surname>Alia-Klein</surname> <given-names>N.</given-names></name> <name><surname>Becker</surname> <given-names>S.</given-names></name> <name><surname>Cromwell</surname> <given-names>H. C.</given-names></name> <name><surname>Dolcos</surname> <given-names>F.</given-names></name> <etal/></person-group>. (<year>2023</year>). <article-title>The human affectome</article-title>. <source>Neurosci. Biobehav. Rev</source>. <volume>158</volume>:<fpage>105450</fpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2023.105450</pub-id></citation>
</ref>
<ref id="B60">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schirrmeister</surname> <given-names>R.</given-names></name> <name><surname>Springenberg</surname> <given-names>J.</given-names></name> <name><surname>Fiederer</surname> <given-names>L.</given-names></name> <name><surname>Glasstetter</surname> <given-names>M.</given-names></name> <name><surname>Eggensperger</surname> <given-names>K.</given-names></name> <name><surname>Tangermann</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Deep learning with convolutional neural networks for EEG decoding and visualization: convolutional neural networks in EEG analysis</article-title>. <source>Hum. Brain Mapp</source>. <volume>38</volume>, <fpage>5391</fpage>&#x02013;<lpage>5420</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.23730</pub-id><pub-id pub-id-type="pmid">28782865</pub-id></citation></ref>
<ref id="B61">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schubring</surname> <given-names>D.</given-names></name> <name><surname>Schupp</surname> <given-names>H.</given-names></name></person-group> (<year>2019</year>). <article-title>Affective picture processing: alpha- and lower beta-band desynchronization reflects emotional arousal</article-title>. <source>Psychophysiology</source> <volume>56</volume>:<fpage>e13386</fpage>. <pub-id pub-id-type="doi">10.1111/psyp.13386</pub-id><pub-id pub-id-type="pmid">31026079</pub-id></citation></ref>
<ref id="B62">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Sciavicco</surname> <given-names>G.</given-names></name> <name><surname>Stan</surname> <given-names>I. E.</given-names></name></person-group> (<year>2020</year>). <article-title>&#x0201C;Knowledge extraction with interval temporal logic decision trees,&#x0201D;</article-title> in <source>Proc. of 27th International Symposium on Temporal Representation and Reasoning (TIME), Leibniz International Proceedings in Informatics</source> (<publisher-loc>Dagstuhl</publisher-loc>).</citation>
</ref>
<ref id="B63">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shu</surname> <given-names>L.</given-names></name> <name><surname>Xie</surname> <given-names>J.</given-names></name> <name><surname>Yang</surname> <given-names>M.</given-names></name> <name><surname>Li</surname> <given-names>Z.</given-names></name> <name><surname>Li</surname> <given-names>Z.</given-names></name> <name><surname>Liao</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>A review of emotion recognition using physiological signals</article-title>. <source>Sensors</source> <volume>18</volume>:<fpage>2074</fpage>. <pub-id pub-id-type="doi">10.3390/s18072074</pub-id><pub-id pub-id-type="pmid">29958457</pub-id></citation></ref>
<ref id="B64">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Silver</surname> <given-names>S.</given-names></name> <name><surname>Al-Tikriti</surname> <given-names>R.</given-names></name> <name><surname>Jin</surname> <given-names>N.</given-names></name></person-group> (<year>2018</year>). <article-title>Dark chocolate (70% Cacao) modulates gamma wave frequencies in vigorously active individuals</article-title>. <source>FASEB J</source>. <volume>32</volume>:<fpage>878.9</fpage>. <pub-id pub-id-type="doi">10.1096/fasebj.2018.32.1_supplement.878.9</pub-id></citation>
</ref>
<ref id="B65">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Singer</surname> <given-names>N.</given-names></name> <name><surname>Poker</surname> <given-names>G.</given-names></name> <name><surname>Dunsky-Moran</surname> <given-names>N.</given-names></name> <name><surname>Nemni</surname> <given-names>S.</given-names></name> <name><surname>Reznik Balter</surname> <given-names>S.</given-names></name> <name><surname>Doron</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2023</year>). <article-title>Development and validation of an fMRI-informed EEG model of reward-related ventral striatum activation</article-title>. <source>Neuroimage</source> <volume>276</volume>:<fpage>120183</fpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2023.120183</pub-id><pub-id pub-id-type="pmid">37225112</pub-id></citation></ref>
<ref id="B66">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Skov</surname> <given-names>M.</given-names></name> <name><surname>Nadal</surname> <given-names>M.</given-names></name></person-group> (<year>2020</year>). <article-title>A farewell to art: aesthetics as a topic in psychology and neuroscience</article-title>. <source>Perspect. Psychol. Sci</source>. <volume>15</volume>, <fpage>630</fpage>&#x02013;<lpage>642</lpage>. <pub-id pub-id-type="doi">10.1177/1745691619839867</pub-id><pub-id pub-id-type="pmid">32027577</pub-id></citation></ref>
<ref id="B67">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Skov</surname> <given-names>M.</given-names></name> <name><surname>Nadal</surname> <given-names>M.</given-names></name></person-group> (<year>2023</year>). <source>Can arts-based Interventions Improve Health? A Conceptual and Methodological Critique of Art Therapy</source> [preprint]. <pub-id pub-id-type="doi">10.31234/osf.io/sp9y3</pub-id></citation>
</ref>
<ref id="B68">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Strijbosch</surname> <given-names>W.</given-names></name> <name><surname>Vessel</surname> <given-names>E. A.</given-names></name> <name><surname>Welke</surname> <given-names>D.</given-names></name> <name><surname>Mita</surname> <given-names>O.</given-names></name> <name><surname>Gelissen</surname> <given-names>J.</given-names></name> <name><surname>Bastiaansen</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>On the neuronal dynamics of aesthetic experience: evidence from electroencephalographic oscillatory dynamics</article-title>. <source>J. Cogn. Neurosci</source>. <volume>34</volume>, <fpage>461</fpage>&#x02013;<lpage>479</lpage>. <pub-id pub-id-type="doi">10.1162/jocn_a_01812</pub-id><pub-id pub-id-type="pmid">35015884</pub-id></citation></ref>
<ref id="B69">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ticini</surname> <given-names>L.</given-names></name> <name><surname>Rachman</surname> <given-names>L.</given-names></name> <name><surname>Pelletier</surname> <given-names>J.</given-names></name> <name><surname>Dubal</surname> <given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Enhancing aesthetic appreciation by priming canvases with actions that match the artist&#x00027;s painting style</article-title>. <source>Front. Hum. Neurosci</source>. <volume>8</volume>:<fpage>391</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2014.00391</pub-id><pub-id pub-id-type="pmid">24917808</pub-id></citation></ref>
<ref id="B70">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Umilt&#x000E0;</surname> <given-names>M.</given-names></name> <name><surname>Berchio</surname> <given-names>C.</given-names></name> <name><surname>Sestito</surname> <given-names>M.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name></person-group> (<year>2012</year>). <article-title>Abstract art and cortical motor activation: an EEG study</article-title>. <source>Front. Hum. Neurosci</source>. <volume>6</volume>:<fpage>311</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2012.00311</pub-id><pub-id pub-id-type="pmid">23162456</pub-id></citation></ref>
<ref id="B71">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Vessel</surname> <given-names>E. A.</given-names></name></person-group> (<year>2022</year>). <article-title>&#x0201C;Neuroaesthetics,&#x0201D;</article-title> in <source>Encyclopedia of Behavioral Neuroscience</source>, ed. D. Sala (<publisher-loc>Amsterdam</publisher-loc>: <publisher-name>Elsevier</publisher-name>), <fpage>661</fpage>&#x02013;<lpage>670</lpage>. <pub-id pub-id-type="doi">10.1016/B978-0-12-809324-5.24104-7</pub-id></citation>
</ref>
<ref id="B72">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vessel</surname> <given-names>E. A.</given-names></name> <name><surname>Starr</surname> <given-names>G. G.</given-names></name> <name><surname>Rubin</surname> <given-names>N.</given-names></name></person-group> (<year>2012</year>). <article-title>The brain on art: intense aesthetic experience activates the default mode network</article-title>. <source>Front. Hum. Neurosci</source>. <volume>6</volume>:<fpage>66</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2012.00066</pub-id><pub-id pub-id-type="pmid">22529785</pub-id></citation></ref>
<ref id="B73">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vessel</surname> <given-names>E. A.</given-names></name> <name><surname>Starr</surname> <given-names>G. G.</given-names></name> <name><surname>Rubin</surname> <given-names>N.</given-names></name></person-group> (<year>2013</year>). <article-title>Art reaches within: aesthetic experience, the self and the default-mode network</article-title>. <source>Front. Neurosci</source>. <volume>7</volume>:<fpage>258</fpage>. <pub-id pub-id-type="doi">10.3389/fnins.2013.00258</pub-id><pub-id pub-id-type="pmid">24415994</pub-id></citation></ref>
<ref id="B74">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vidal</surname> <given-names>J.</given-names></name></person-group> (<year>1973</year>). <article-title>Toward direct brain-computer communication</article-title>. <source>Annu. Rev. Biophys. Bioeng</source>. <volume>2</volume>, <fpage>157</fpage>&#x02013;<lpage>180</lpage>. <pub-id pub-id-type="doi">10.1146/annurev.bb.02.060173.001105</pub-id><pub-id pub-id-type="pmid">4583653</pub-id></citation></ref>
<ref id="B75">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wacker</surname> <given-names>J.</given-names></name> <name><surname>Mueller</surname> <given-names>E.</given-names></name> <name><surname>Pizzagalli</surname> <given-names>D.</given-names></name> <name><surname>Hennig</surname> <given-names>J.</given-names></name> <name><surname>Stemmler</surname> <given-names>G.</given-names></name></person-group> (<year>2013</year>). <article-title>Dopamine-d2-receptor blockade reverses the association between trait approach motivation and frontal asymmetry in an approach-motivation context</article-title>. <source>Psychol. Sci</source>. <volume>24</volume>, <fpage>489</fpage>&#x02013;<lpage>497</lpage>. <pub-id pub-id-type="doi">10.1177/0956797612458935</pub-id><pub-id pub-id-type="pmid">23447558</pub-id></citation></ref>
<ref id="B76">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yuan</surname> <given-names>H.</given-names></name> <name><surname>He</surname> <given-names>B.</given-names></name></person-group> (<year>2014</year>). <article-title>Brain-computer interfaces using sensorimotor rhythms: current state and future perspectives</article-title>. <source>IEEE Trans. Biomed. Eng</source>. <volume>61</volume>, <fpage>1425</fpage>&#x02013;<lpage>1435</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2014.2312397</pub-id><pub-id pub-id-type="pmid">24759276</pub-id></citation></ref>
</ref-list>
</back>
</article>