<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="2.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2024.1250781</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Color-taste correspondence tested by the Stroop task</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Yang</surname>
<given-names>Yidie</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Chen</surname>
<given-names>Na</given-names>
</name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/181167/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kobayashi</surname>
<given-names>Maiko</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Watanabe</surname>
<given-names>Katsumi</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/74360/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Faculty of Science and Engineering, Waseda University</institution>, <addr-line>Tokyo</addr-line>, <country>Japan</country></aff>
<aff id="aff2"><sup>2</sup><institution>The Gonda Multidisciplinary Brain Research Center, Bar-Ilan University</institution>, <addr-line>Ramat Gan</addr-line>, <country>Israel</country></aff>
<author-notes>
<fn fn-type="edited-by" id="fn0004">
<p>Edited by: Karl Schweizer, Goethe University Frankfurt, Germany</p>
</fn>
<fn fn-type="edited-by" id="fn0005">
<p>Reviewed by: Olivier Penacchio, University of St Andrews, United Kingdom; Givago Silva Souza, Federal University of Par&#x00E1;, Brazil</p>
</fn>
<corresp id="c001">&#x002A;Correspondence: Na Chen, <email>imminana7@gmail.com</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>24</day>
<month>01</month>
<year>2024</year>
</pub-date>
<pub-date pub-type="collection">
<year>2024</year>
</pub-date>
<volume>15</volume>
<elocation-id>1250781</elocation-id>
<history>
<date date-type="received">
<day>30</day>
<month>06</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>11</day>
<month>01</month>
<year>2024</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2024 Yang, Chen, Kobayashi and Watanabe.</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Yang, Chen, Kobayashi and Watanabe</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>People consistently associate colors with tastes (e.g., pink-sweet, yellow-sour). However, little is known about the strength of these color-taste correspondences. The current study examined the congruency effect of color-taste correspondence using two Stroop word categorization tasks. The visual stimuli consisted of food names associated with sweet and sour tastes, presented in different shades of pink and yellow font colors. Participants were instructed to categorize the taste (sweet or sour) of the words in the Stroop word-taste categorization task and to discriminate the font color (pink or yellow) of the words in the Stroop word-color discrimination task. Results showed that participants responded faster in congruent conditions (sweet-pink and sour-yellow) than in incongruent conditions (sweet-yellow and sour-pink) in both tasks. Specifically, yellow font colors facilitated the categorization of sour taste words relative to pink font colors, whereas sweet taste words facilitated the discrimination of pink font colors relative to sour taste words. These results provide further evidence that the congruency effect of color-taste correspondence facilitates the processing of taste-related words and colors. Furthermore, the congruency effect operated bidirectionally, influencing both the conceptual processing of taste words and the perceptual processing of font colors. This study highlights the significant interference effect of color-taste correspondence on cognitive processing as assessed by the Stroop task.</p>
</abstract>
<kwd-group>
<kwd>color-taste correspondence</kwd>
<kwd>Stroop</kwd>
<kwd>congruency effect</kwd>
<kwd>sweet-pink</kwd>
<kwd>sour-yellow</kwd>
</kwd-group>
<counts>
<fig-count count="3"/>
<table-count count="3"/>
<equation-count count="0"/>
<ref-count count="35"/>
<page-count count="7"/>
<word-count count="5522"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Cognitive Science</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="sec1">
<label>1</label>
<title>Introduction</title>
<p>Taste plays a crucial role in our daily lives, not only for identifying different foods, but also for experiencing the pleasure of gourmet flavors. Taste perception is known to be psychologically sensitive and can be influenced by various factors, such as food color (<xref ref-type="bibr" rid="ref29">Spence et al., 2015</xref>). Studies have found that color can significantly affect how people perceive the taste of foods and beverages, and that different colors can enhance or alter the perception of different taste qualities (<xref ref-type="bibr" rid="ref30">Stillman, 1993</xref>; <xref ref-type="bibr" rid="ref8">Fiszman and Spence, 2015</xref>). For example, red and pink colors can enhance sweetness and decrease bitterness, while green and blue colors can have the opposite effect (<xref ref-type="bibr" rid="ref18">Maga, 1974</xref>; <xref ref-type="bibr" rid="ref10">Hidaka and Shimoda, 2014</xref>). This integration of color perception into the evaluation of taste suggests that humans have developed certain associations between color and taste.</p>
<p>Crossmodal correspondence between color and taste refers to the phenomenon that people consistently associate the basic tastes (i.e., sweet, bitter, salty, sour, and umami) with specific colors (for a review, see: <xref ref-type="bibr" rid="ref28">Spence and Levitan, 2021</xref>). Previous studies have found that pink and red are commonly associated with a sweet taste; yellow and green are associated with a sour taste; black, green, purple, and brown have been linked to a bitter taste; and blue, white, and gray have been associated with a salty taste (<xref ref-type="bibr" rid="ref13">Koch and Koch, 2003</xref>; <xref ref-type="bibr" rid="ref29">Spence et al., 2015</xref>; <xref ref-type="bibr" rid="ref35">Woods and Spence, 2016</xref>; <xref ref-type="bibr" rid="ref25">Saluja and Stevenson, 2018</xref>; <xref ref-type="bibr" rid="ref5">Chen et al., 2021</xref>). These color-taste associations have been observed using both actual food samples and taste words. <xref ref-type="bibr" rid="ref25">Saluja and Stevenson (2018)</xref> examined color-taste correspondences using actual tastes, and they found that the color-taste mappings were similar to those obtained using taste words. Moreover, color-taste correspondences are observed in different populations and are influenced by cultural and contextual factors (<xref ref-type="bibr" rid="ref34">Wan et al., 2014</xref>).</p>
<p>Color-taste correspondence can be explained by the statistical learning of co-occurrences in the environment, whereby people associate certain colors with certain tastes based on the colors of the foods and beverages they encounter (<xref ref-type="bibr" rid="ref29">Spence et al., 2015</xref>; <xref ref-type="bibr" rid="ref11">Higgins and Hayes, 2019</xref>; <xref ref-type="bibr" rid="ref27">Spence, 2019</xref>). This learned color-taste mapping allows for the efficient integration of colors and tastes (<xref ref-type="bibr" rid="ref22">Piqueras-Fiszman and Spence, 2011</xref>; <xref ref-type="bibr" rid="ref1001">Spence, 2011</xref>; <xref ref-type="bibr" rid="ref21">Parise et al., 2013</xref>; <xref ref-type="bibr" rid="ref4">Chen et al., 2015</xref>). For instance, participants made fewer errors in discriminating the taste of real solutions when the solutions were colored according to a color-taste mapping in congruent conditions than in incongruent conditions (<xref ref-type="bibr" rid="ref37">Zampini et al., 2007</xref>). <xref ref-type="bibr" rid="ref33">Velasco et al. (2015)</xref> examined the congruency effect of the color and flavor label of crisp packages using visual search and Go/No-go tasks, and found that participants responded faster when the color of the package was congruent with the flavor label (e.g., lemon flavor in yellow and tomato flavor in red) than when it was incongruent. However, to our knowledge, no study has examined the congruency effect of color-taste correspondence using the Stroop task.</p>
<p>The Stroop word task is a widely used neuropsychological test that probes attentional control and cognitive processing to assess the strength and automaticity of implicit associations, known as the Stroop effect (<xref ref-type="bibr" rid="ref31">Stroop, 1935</xref>; <xref ref-type="bibr" rid="ref24">Richter and Zwaan, 2009</xref>; <xref ref-type="bibr" rid="ref26">Scarpina and Tagini, 2017</xref>). In a classic Stroop task, participants are presented with words that name colors but whose font color is different. Participants are instructed to either read the word or name the font color, which requires them to inhibit cognitive interference. Cognitive interference occurs when the processing of one stimulus attribute, such as the meaning of the word, influences the concurrent processing of another attribute, such as the font color (<xref ref-type="bibr" rid="ref31">Stroop, 1935</xref>). The Stroop task relies on the strongly overlearned tendency of experienced readers to automatically attend to the meaning of a word, making it challenging to ignore the meaning when instructed to focus on the font color. Longer response times and lower accuracy in naming the font color may indicate difficulties in inhibiting the automatic tendency to process the word&#x2019;s meaning. This interference effect, known as the Stroop cost, is consistently observed and demonstrates the robustness of the task. Thus, the Stroop word categorization task can be a useful tool for testing the automaticity and strength of color-taste correspondence, and it has been used in several previous studies investigating implicit associations (<xref ref-type="bibr" rid="ref36">Xiao et al., 2014</xref>; <xref ref-type="bibr" rid="ref3">Chen et al., 2023</xref>).</p>
<p>The current study aimed to examine the congruency effect of color-taste correspondence using the Stroop task. In a previous study of Japanese participants, sweet-pink (86.59%) and sour-yellow (80.49%) were the most frequently chosen correspondences, against a chance level of 20% (<xref ref-type="bibr" rid="ref5">Chen et al., 2021</xref>). Thus, the sweet-pink and sour-yellow correspondences were selected to be tested with the Stroop task. By manipulating the levels of font color and taste words with congruent and incongruent color-taste pairs, this study examined whether the interference effect of color-taste correspondence would influence behavioral performance in terms of response times and accuracy. Ten words (5 sweet food names and 5 sour food names) in ten font colors (5 levels each of pink and yellow) were presented as visual stimuli. Through the two Stroop word tasks, we explored the strength of the sweet-pink and sour-yellow correspondences in both directions: the effect of font color on taste-word categorization and the effect of word meaning on font-color discrimination. Specifically, participants were presented with color-word stimuli in a modified Stroop task paradigm, in which they were required to categorize the taste of the presented words as sweet or sour while ignoring the font colors (Stroop word taste categorization task) and to identify the font color of the presented words as pink or yellow while ignoring their taste-related semantic content (Stroop word color discrimination task). We hypothesized that the congruency effect of color-taste correspondence would influence both taste-word categorization and font-color processing.</p>
</sec>
<sec sec-type="methods" id="sec2">
<label>2</label>
<title>Methods</title>
<sec id="sec3">
<label>2.1</label>
<title>Participants</title>
<p>Thirty-eight Japanese undergraduate students (23 males and 15 females, <italic>M</italic>age&#x2009;=&#x2009;20.4, <italic>SD</italic>&#x2009;=&#x2009;1.5) took part in this study. All of the participants had normal or corrected-to-normal visual acuity and normal color vision. The sample size was set <italic>a priori</italic> at 33, based on a power analysis targeting 0.8 power with a medium effect size (Cohen&#x2019;s <italic>d</italic>&#x2009;=&#x2009;0.5; G&#x002A;Power 3.0; <xref ref-type="bibr" rid="ref7">Faul et al., 2007</xref>). We collected additional data because some participants performed poorly in the task (e.g., error rate&#x2009;&#x003E;&#x2009;20%). This experiment was approved by the institutional review board of Waseda University and was conducted in accordance with the ethical standards of the 1964 Declaration of Helsinki.</p>
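The a priori sample-size target can be approximated in a few lines. This is a normal-approximation sketch for a two-tailed paired t-test, not the authors' G*Power computation; the exact noncentral-t solution is typically one or two participants larger than the approximation below.

```python
from math import ceil
from statistics import NormalDist

def n_for_paired_t(d: float, power: float = 0.80, alpha: float = 0.05) -> int:
    """Normal-approximation sample size for a two-tailed paired t-test.

    n ~ ((z_{1-alpha/2} + z_{power}) / d)^2. The exact noncentral-t
    answer (as computed by G*Power) is slightly larger.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for two-tailed alpha
    z_beta = z.inv_cdf(power)           # quantile corresponding to target power
    return ceil(((z_alpha + z_beta) / d) ** 2)

n = n_for_paired_t(d=0.5)  # medium effect size, 0.8 power
```

With d = 0.5 the approximation lands in the low thirties, consistent with the target of 33 reported above.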
</sec>
<sec id="sec4">
<label>2.2</label>
<title>Apparatus and stimuli</title>
<p>The experiment was programmed in E-Prime 3.0 (Psychology Software Tools<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref>). The stimuli were displayed on a 24-inch LCD monitor (EIZO FG2421, EIZO Corp, Hakusan, Japan) with a resolution of 1,920&#x2009;&#x00D7;&#x2009;1,080 pixels and a refresh rate of 100&#x2009;Hz. Participants viewed the monitor at a distance of approximately 60&#x2009;cm.</p>
<p>Ten Japanese food names, ranked by sweetness and sourness, were used as visual stimuli. Five of the words were typically associated with two levels of sweetness [high: &#x30C1;&#x30E7;&#x30B3;&#x30EC;&#x30FC;&#x30C8;(chocolate), &#x3042;&#x3093;&#x3053;(red bean paste); low: &#x30D1;&#x30D5;&#x30A7;(parfait), &#x30C9;&#x30FC;&#x30CA;&#x30C4;(donut), &#x7802;&#x7CD6;(sugar)], while the remaining five were associated with two levels of sourness [high: &#x3046;&#x3081;&#x307C;&#x3057;(dried plum), &#x30B0;&#x30EC;&#x30C3;&#x30D7;&#x30D5;&#x30EB;&#x30FC;&#x30C4;(grapefruit); low: &#x30D1;&#x30A4;&#x30CA;&#x30C3;&#x30D7;&#x30EB;(pineapple), &#x30AD;&#x30A6;&#x30A4;(kiwi), &#x304A;&#x9162;(vinegar)]. The food names were selected from the top twenty foods in the rankings of sweet<xref ref-type="fn" rid="fn0002"><sup>2</sup></xref> and sour foods<xref ref-type="fn" rid="fn0003"><sup>3</sup></xref> obtained from the Japanese &#x201C;Everyone&#x2019;s Ranking&#x201D; website. The word stimuli were presented in 40-pt MS Gothic font in the center of the screen. Seven of the words were two to four letters long, and three were six or eight letters long; in total, the sour words were five letters longer than the sweet words. Each Japanese letter subtended approximately 1.3&#x00B0; of visual angle.</p>
<p>The font colors of the word stimuli were 10 colors grading gradually from pink (P5 in <xref ref-type="table" rid="tab1">Table 1</xref>) to yellow (Y5 in <xref ref-type="table" rid="tab1">Table 1</xref>). They were grouped into pink font colors (high-level pink: P5 and P4; low-level pink: P3, P2, and P1 in <xref ref-type="table" rid="tab1">Table 1</xref>) and yellow font colors (high-level yellow: Y5 and Y4; low-level yellow: Y3, Y2, and Y1 in <xref ref-type="table" rid="tab1">Table 1</xref>). The ten colors were measured with a PR-655 spectroradiometer (Photo Research, Chatsworth, CA, USA); each color was measured 10 times and the averages were calculated. The color information is shown in <xref ref-type="table" rid="tab1">Table 1</xref>.</p>
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption>
<p>Color values in the L<sup>&#x002A;</sup>a<sup>&#x002A;</sup>b<sup>&#x002A;</sup> color space.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">ID</th>
<th align="center" valign="top">L<sup>&#x002A;</sup></th>
<th align="center" valign="top">a<sup>&#x002A;</sup></th>
<th align="center" valign="top">b<sup>&#x002A;</sup></th>
<th align="center" valign="top">Color</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="middle">P5</td>
<td align="center" valign="middle">74.99</td>
<td align="center" valign="middle">42.02</td>
<td align="center" valign="middle">&#x2212;2.64</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i001.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">P4</td>
<td align="center" valign="middle">77.44</td>
<td align="center" valign="middle">34.74</td>
<td align="center" valign="middle">4.60</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i002.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">P3</td>
<td align="center" valign="middle">78.59</td>
<td align="center" valign="middle">30.61</td>
<td align="center" valign="middle">10.35</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i003.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">P2</td>
<td align="center" valign="middle">80.10</td>
<td align="center" valign="middle">26.34</td>
<td align="center" valign="middle">14.84</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i004.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">P1</td>
<td align="center" valign="middle">81.27</td>
<td align="center" valign="middle">22.33</td>
<td align="center" valign="middle">20.71</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i005.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">Y1</td>
<td align="center" valign="middle">81.34</td>
<td align="center" valign="middle">20.13</td>
<td align="center" valign="middle">27.77</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i006.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">Y2</td>
<td align="center" valign="middle">82.86</td>
<td align="center" valign="middle">15.21</td>
<td align="center" valign="middle">35.58</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i007.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">Y3</td>
<td align="center" valign="middle">84.37</td>
<td align="center" valign="middle">11.35</td>
<td align="center" valign="middle">39.72</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i008.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">Y4</td>
<td align="center" valign="middle">85.21</td>
<td align="center" valign="middle">9.79</td>
<td align="center" valign="middle">39.88</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i009.tif"/></td>
</tr>
<tr>
<td align="left" valign="middle">Y5</td>
<td align="center" valign="middle">87.38</td>
<td align="center" valign="middle">2.52</td>
<td align="center" valign="middle">54.28</td>
<td align="center" valign="middle">
<inline-graphic xlink:href="fpsyg-15-1250781-i010.tif"/></td>
</tr>
</tbody>
</table>
</table-wrap>
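Because Table 1 reports CIELAB coordinates, the perceptual spacing of the font colors can be checked directly. The sketch below uses the simple CIE76 color difference (Euclidean distance in L*a*b* space) with the endpoint values from Table 1; this is an illustration, not a metric the authors report using.

```python
from math import sqrt

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Coordinates taken from Table 1
P5 = (74.99, 42.02, -2.64)  # strongest pink
P4 = (77.44, 34.74, 4.60)   # adjacent pink shade
Y5 = (87.38, 2.52, 54.28)   # strongest yellow

gap = delta_e_cie76(P5, Y5)   # distance across the whole pink-yellow gradient
step = delta_e_cie76(P5, P4)  # distance between neighboring shades
```

The endpoint-to-endpoint difference is roughly seven times the step between adjacent shades, consistent with a gradual pink-to-yellow gradient.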
</sec>
<sec id="sec5">
<label>2.3</label>
<title>Procedure</title>
<p>The experiment was carried out in a laboratory under dimmed lighting (1 lux at the wall). All participants performed two Stroop word categorization tasks. In the Stroop word taste categorization task, participants were asked to categorize the taste of the word stimulus as sweet or sour; in the Stroop word color categorization task, participants were asked to discriminate the font color of the word stimulus as pink or yellow. At the beginning of each task, written instructions were provided on the computer screen. During the experiment, one of the word stimuli was presented in one of the 10 font colors. The background of the screen was always gray (<italic>L</italic><sup>&#x002A;</sup> =&#x2009;20.24, <italic>a</italic><sup>&#x002A;</sup> =&#x2009;0.33, <italic>b</italic><sup>&#x002A;</sup> =&#x2009;0.36). In the taste categorization task, participants categorized the word stimulus as a &#x201C;sweet&#x201D; or &#x201C;sour&#x201D; word by pressing a labeled key (e.g., <italic>z</italic> or <italic>m</italic>) on the keyboard with their two index fingers. In the color categorization task, participants discriminated the font color of the word stimulus as pink or yellow by pressing a labeled key (e.g., <italic>z</italic> or <italic>m</italic>). Key assignments were counterbalanced between participants and tasks. Participants were required to respond as quickly and accurately as possible. At the beginning of each trial, a fixation cross appeared for 300&#x2009;ms, followed by a target word stimulus that remained on screen until a response was made. Trials were separated by an inter-stimulus interval (ISI) of 500&#x2009;ms (see <xref ref-type="fig" rid="fig1">Figure 1</xref>). The experimental task had a 2 (taste of words: sweet vs. sour)&#x2009;&#x00D7;&#x2009;5 (words for each taste group)&#x2009;&#x00D7;&#x2009;2 (font color groups: pink vs. yellow)&#x2009;&#x00D7;&#x2009;5 (colors for each font group)&#x2009;&#x00D7;&#x2009;2 (repetitions) repeated-measures design, resulting in a total of 200 trials. Twenty practice trials with feedback preceded the experiment. The main experiment was divided into 5 blocks of 40 trials each, with a self-timed break at the end of each block. Each Stroop word categorization task lasted 5&#x2009;~&#x2009;10&#x2009;min. Participants were randomly assigned to perform either the taste categorization task or the color categorization task first, with a 5-min rest period between the two tasks. The entire experiment lasted approximately 20&#x2009;~&#x2009;30&#x2009;min.</p>
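The factorial structure of the trial list can be sketched by enumerating all factor combinations; the labels below are placeholders for the actual word and color stimuli.

```python
from itertools import product

tastes = ["sweet", "sour"]
words_per_taste = range(5)    # 5 food names per taste
color_groups = ["pink", "yellow"]
colors_per_group = range(5)   # 5 shades per color group
repetitions = range(2)

# 2 (taste) x 5 (word) x 2 (color group) x 5 (shade) x 2 (repetition)
trials = list(product(tastes, words_per_taste,
                      color_groups, colors_per_group, repetitions))

# Congruency follows the taste-color pairing: sweet-pink and sour-yellow
congruent = [t for t in trials
             if (t[0], t[2]) in {("sweet", "pink"), ("sour", "yellow")}]
```

This reproduces the 200-trial total reported above, split evenly into 100 congruent and 100 incongruent trials.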
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption>
<p>Example of the trial sequence. Participants were asked to categorize the taste of the word stimulus [e.g., &#x201C;&#x30C1;&#x30E7;&#x30B3;&#x30EC;&#x30FC;&#x30C8;(chocolate)&#x201D;] as sweet or sour by pressing labeled keys in the taste categorization task. Participants were asked to categorize the font color of the word stimulus [e.g., &#x201C;&#x30C1;&#x30E7;&#x30B3;&#x30EC;&#x30FC;&#x30C8;(chocolate)&#x201D; in yellow font color] as pink or yellow by pressing labeled keys in the color categorization task.</p>
</caption>
<graphic xlink:href="fpsyg-15-1250781-g001.tif"/>
</fig>
</sec>
<sec id="sec6">
<label>2.4</label>
<title>Data analysis</title>
<p>Data from three participants who made more than 20% errors on either task were excluded; thus, data from 35 participants were used in the analysis. Response times (RTs) for correct trials and errors were recorded. Data were analyzed using the R 4.0.2 statistical software (<xref ref-type="bibr" rid="ref23">R Core Team, 2020</xref>). Statistical analyses of response times and response accuracy were computed with generalized linear mixed-effects models (GLMMs) using the function <italic>glmer</italic> from the lme4 package in R (<xref ref-type="bibr" rid="ref2">Bates et al., 2015</xref>). The distributions of response times were positively skewed and not normally distributed; therefore, generalized mixed-effects models were used to obtain unbiased parameter estimates without the need for data transformation (<xref ref-type="bibr" rid="ref16">Lo and Andrews, 2015</xref>). We used the <italic>fitdistrplus</italic> package to find the best-fitting distribution for the present RT data (<xref ref-type="bibr" rid="ref6">Delignette-Muller and Dutang, 2015</xref>), and selected the inverse Gaussian distribution (<xref ref-type="bibr" rid="ref12">Johnson et al., 1995</xref>). Thus, an inverse Gaussian distribution with an identity link function was specified for RTs in the GLMM models. For the error analysis, we used a GLMM with a binomial distribution and logit link function. Our main interest was the effect of congruency condition (congruent vs. incongruent) and its interactions with the levels of taste and font color on RTs and errors; therefore, the fixed factors were congruency condition (congruent vs. incongruent), taste (sweet vs. sour), taste level (high vs. low), font color (pink vs. yellow), font color level (high vs. low), and their interactions. 
We included a by-participant random intercept (we did not fit the maximal random-effects structure; to simplify the model, the random effects were restricted to a by-participant intercept for all analyses; <xref ref-type="bibr" rid="ref1">Barr et al., 2013</xref>; <xref ref-type="bibr" rid="ref19">Meteyard and Davies, 2020</xref>). To test the main and interaction effects, we compared models with and without the fixed effect of interest using likelihood ratio tests. The <italic>Anova</italic> function (type III Wald chi-square tests) from the <italic>car</italic> package (<xref ref-type="bibr" rid="ref9">Fox et al., 2013</xref>) was used to test the significance of the fixed factors. Significant effects were compared using the <italic>emmeans</italic> package (<xref ref-type="bibr" rid="ref15">Lenth et al., 2018</xref>). We ran post-hoc analyses using paired-sample <italic>t</italic>-tests with Bonferroni correction to determine the simple main effects. Further, Bayes factors (BF10) were used to quantify the evidence in favor of the alternative (H1) or null (H0) hypothesis (<xref ref-type="bibr" rid="ref20">Morey and Rouder, 2018</xref>). A value of 1 indicates that the null and alternative hypotheses are equally likely, larger values indicate that the data support the alternative hypothesis, and smaller values indicate that the data support the null hypothesis.</p>
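The model-comparison step (likelihood-ratio tests between nested models with and without a single fixed effect) can be sketched without any statistics library, using the fact that a 1-df chi-square variable is the square of a standard normal. This illustrates the test logic only, not the authors' lme4/car pipeline; the log-likelihood values are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def lrt_chisq_1df(loglik_full, loglik_reduced):
    """Likelihood-ratio test for nested models differing by one parameter.

    Returns (chi-square statistic, p-value). For 1 df,
    P(chi2 > x) = P(|Z| > sqrt(x)) with Z ~ N(0, 1).
    """
    stat = 2.0 * (loglik_full - loglik_reduced)
    p = 2.0 * (1.0 - NormalDist().cdf(sqrt(stat)))
    return stat, p

# Hypothetical fits: dropping the effect of interest costs log-likelihood
stat, p = lrt_chisq_1df(loglik_full=-100.0, loglik_reduced=-101.92)
```

A chi-square statistic of about 3.84 with 1 df corresponds to p near 0.05, the familiar significance threshold.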
</sec>
</sec>
<sec sec-type="results" id="sec7">
<label>3</label>
<title>Results</title>
<sec id="sec8">
<label>3.1</label>
<title>Results of the Stroop word taste categorization task</title>
<sec id="sec9">
<label>3.1.1</label>
<title>Response times</title>
<p>Error trials (5.31%) and trials with RTs longer than 3,000&#x2009;ms or shorter than 100&#x2009;ms (0.35%) were removed. Trials in which the response time fell outside the outlier range (mean RT&#x2009;&#x00B1;&#x2009;2.5&#x2009;<italic>SD</italic>) for each participant in each color and taste condition were also excluded from the data analysis (3.38%). Results of the GLMM analysis are shown in <xref ref-type="table" rid="tab2">Table 2</xref>. The GLMM analysis revealed a main effect of congruency condition, with strong evidence for faster RTs in congruent trials (mean&#x2009;RT&#x2009;=&#x2009;585.16&#x2009;ms, <italic>sd</italic> =&#x2009;160.94) than in incongruent trials [mean RT&#x2009;=&#x2009;595.09&#x2009;ms, <italic>sd</italic> =&#x2009;170.76; <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;61.06, <italic>p</italic> &#x003C;&#x2009;0.001]. A significant interaction between congruency condition and font color was observed, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;63.00, <italic>p</italic> &#x003C;&#x2009;0.001. Multiple comparison analysis showed that when categorizing the sour food names, participants responded faster to yellow (congruent condition; 609.38&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;96.48) than to pink font colors [incongruent condition; 623.70&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;98.90; <italic>t</italic>(34)&#x2009;=&#x2009;3.02, Bonferroni corrected <italic>p</italic>&#x2009;=&#x2009;0.0096, Cohen&#x2019;s <italic>d</italic>&#x2009;=&#x2009;0.51, BF10&#x2009;=&#x2009;7.98]. 
For the categorization of the sweet food names, there was no significant difference in RTs between pink (congruent condition; 563.26&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;74.65) and yellow (incongruent condition; 571.13&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;78.09) font colors [<italic>t</italic>(34)&#x2009;=&#x2009;1.91, Bonferroni corrected <italic>p</italic>&#x2009;=&#x2009;0.12, Cohen&#x2019;s <italic>d</italic>&#x2009;=&#x2009;0.32, BF10&#x2009;=&#x2009;0.93; see <xref ref-type="fig" rid="fig2">Figure 2</xref>]. There was no significant interaction between congruency condition and level of font color, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;0.56, <italic>p</italic> =&#x2009;0.45, and no significant three-way interaction among congruency condition, font color, and level of font color, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;1.87, <italic>p</italic> =&#x2009;0.17.</p>
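The two-stage RT cleaning described above (a priori cutoffs at 100 ms and 3,000 ms, then a mean &#x00B1; 2.5 SD criterion applied within each participant-by-condition cell) can be sketched as follows; the function names are illustrative.

```python
from statistics import mean, stdev

def remove_extreme(rts, lo=100, hi=3000):
    """A priori cutoffs: drop RTs shorter than 100 ms or longer than 3,000 ms."""
    return [rt for rt in rts if lo <= rt <= hi]

def trim_rts(rts, k=2.5):
    """Keep RTs within mean +/- k*SD of one design cell
    (one participant in one color and taste condition)."""
    m, s = mean(rts), stdev(rts)
    return [rt for rt in rts if abs(rt - m) <= k * s]

# Apply the a priori cutoffs first, then the per-cell trimming
cell = remove_extreme([50, 500, 520, 540, 560, 580, 3500])
trimmed = trim_rts(cell)
```

Note that the order matters: removing the a priori extremes first keeps a single very long RT from inflating the cell's SD and masking other outliers.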
<table-wrap position="float" id="tab2">
<label>Table 2</label>
<caption>
<p>Summary of fixed effects for RTs in Task 1.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Predictor</th>
<th align="center" valign="top">Estimate</th>
<th align="center" valign="top">SE</th>
<th align="center" valign="top"><italic>t</italic> value</th>
<th align="center" valign="top"><italic>p</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">(Intercept)</td>
<td align="center" valign="top">595.23</td>
<td align="center" valign="top">20.60</td>
<td align="center" valign="top">28.89</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency</td>
<td align="center" valign="top">48.94</td>
<td align="center" valign="top">6.26</td>
<td align="center" valign="top">7.81</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Font color</td>
<td align="center" valign="top">38.17</td>
<td align="center" valign="top">6.38</td>
<td align="center" valign="top">5.98</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Color level</td>
<td align="center" valign="top">&#x2212;1.55</td>
<td align="center" valign="top">5.55</td>
<td align="center" valign="top">&#x2212;0.28</td>
<td align="center" valign="top">0.78</td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Font color</td>
<td align="center" valign="top">&#x2212;75.04</td>
<td align="center" valign="top">9.45</td>
<td align="center" valign="top">&#x2212;7.94</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Color level</td>
<td align="center" valign="top">6.41</td>
<td align="center" valign="top">8.52</td>
<td align="center" valign="top">0.75</td>
<td align="center" valign="top">0.45</td>
</tr>
<tr>
<td align="left" valign="top">Font color &#x00D7; Color level</td>
<td align="center" valign="top">1.97</td>
<td align="center" valign="top">8.67</td>
<td align="center" valign="top">0.23</td>
<td align="center" valign="top">0.82</td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Font color &#x00D7; Color level</td>
<td align="center" valign="top">&#x2212;17.34</td>
<td align="center" valign="top">12.67</td>
<td align="center" valign="top">&#x2212;1.37</td>
<td align="center" valign="top">0.17</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><sup>&#x002A;&#x002A;&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.001; <sup>&#x002A;&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.01; <sup>&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.05; <sup>+</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.1.</p>
</table-wrap-foot>
</table-wrap>
<fig position="float" id="fig2">
<label>Figure 2</label>
<caption>
<p>Violin plot showing the distribution of individual RTs in categorizing sweet and sour taste words in congruent (white) and incongruent (gray) conditions. The horizontal line represents the mean. Error bars represent the standard errors of the mean (<sup>&#x002A;&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.01, Bonferroni corrected).</p>
</caption>
<graphic xlink:href="fpsyg-15-1250781-g002.tif"/>
</fig>
</sec>
<sec id="sec10">
<label>3.1.2</label>
<title>Accuracy</title>
<p>Participants made errors in 5.31% of all trials. The GLMM analysis showed no significant effect of congruency condition, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;2.30, <italic>p</italic>&#x2009;=&#x2009;0.13. Thus, error rates showed no significant effect of font color on taste categorization.</p>
</sec>
</sec>
<sec id="sec11">
<label>3.2</label>
<title>Results of the Stroop word color categorization task</title>
<sec id="sec12">
<label>3.2.1</label>
<title>Response times</title>
<p>Error trials (11.57%) and trials in which response times fell outside the range (mean RT&#x2009;&#x00B1;&#x2009;2.5&#x2009;<italic>sd</italic>) for each participant in each color and taste condition were excluded from data analysis (3.44%). No trials had response times slower than 3,000&#x2009;ms or faster than 100&#x2009;ms.</p>
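The trimming rule described above (per participant and per condition, mean&#x2009;&#x00B1;&#x2009;2.5&#x2009;sd, with absolute bounds of 100&#x2013;3,000&#x2009;ms) can be sketched as follows. This is a minimal illustration, not the authors&#x2019; analysis code; the function name, data layout, and defaults are our own assumptions:

```python
from statistics import mean, stdev

def trim_rts(rts, sd_cutoff=2.5, floor=100, ceiling=3000):
    """Trim RTs (ms) for one participant-condition cell: drop trials
    outside mean +/- sd_cutoff * sd, plus absolute outliers.
    Illustrative sketch only; names and defaults are assumptions."""
    # Absolute bounds first (the paper reports no trials beyond 100-3,000 ms).
    rts = [rt for rt in rts if floor <= rt <= ceiling]
    if len(rts) < 2:          # stdev() needs at least two observations
        return rts
    m, s = mean(rts), stdev(rts)
    lo, hi = m - sd_cutoff * s, m + sd_cutoff * s
    return [rt for rt in rts if lo <= rt <= hi]
```

Applied separately to each participant&#x2009;&#x00D7;&#x2009;color&#x2009;&#x00D7;&#x2009;taste cell, this reproduces the criterion described above.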
<p>Results of the GLMM analysis are shown in <xref ref-type="table" rid="tab3">Table 3</xref>. The model showed a significant main effect of congruency condition, such that RTs in congruent conditions (mean RT&#x2009;=&#x2009;535.93&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;177.49) were faster than in incongruent conditions [mean RT&#x2009;=&#x2009;549.53&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;188.47, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;51.42, <italic>p</italic>&#x2009;&#x003C;&#x2009;0.001]. A significant interaction between congruency condition and taste was observed, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;64.26, <italic>p</italic>&#x2009;&#x003C;&#x2009;0.001. Further analysis showed that, when discriminating the pink font color, participants responded faster to sweet taste words (congruent condition; 519.36&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;79.13) than to sour taste words [incongruent condition; 537.67&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;96.58; <italic>t</italic>(34)&#x2009;=&#x2009;2.88, Bonferroni-corrected <italic>p</italic>&#x2009;=&#x2009;0.014, Cohen&#x2019;s <italic>d</italic>&#x2009;=&#x2009;0.49, BF<sub>10</sub>&#x2009;=&#x2009;5.93]. When discriminating the yellow font color, there was no significant difference in RTs between sweet (incongruent condition; 561.96&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;94.59) and sour (congruent condition; 559.95&#x2009;ms, <italic>sd</italic>&#x2009;=&#x2009;101.89) taste words [<italic>t</italic>(34)&#x2009;=&#x2009;0.33, Bonferroni-corrected <italic>p</italic>&#x2009;=&#x2009;1, Cohen&#x2019;s <italic>d</italic>&#x2009;=&#x2009;0.06, BF<sub>10</sub>&#x2009;=&#x2009;0.19; <xref ref-type="fig" rid="fig3">Figure 3</xref>]. There was a marginally significant interaction between congruency condition and level of taste, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;3.64, <italic>p</italic>&#x2009;=&#x2009;0.056, and a significant three-way interaction between congruency condition, taste, and level of taste, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;6.09, <italic>p</italic>&#x2009;=&#x2009;0.013.</p>
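The Bonferroni-corrected <italic>p</italic> values reported in both tasks follow the standard rule: multiply each raw <italic>p</italic> by the number of comparisons and cap the result at 1, since a probability cannot exceed 1 (which is how corrected values can reach 1 exactly). A minimal sketch, assuming two pairwise comparisons per task; the function name is ours:

```python
def bonferroni(p_raw, n_comparisons):
    """Bonferroni correction: scale the raw p value by the number of
    comparisons, capping at 1 because a probability cannot exceed 1.
    (Illustrative sketch; not the authors' analysis code.)"""
    return min(1.0, p_raw * n_comparisons)
```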
<table-wrap position="float" id="tab3">
<label>Table 3</label>
<caption>
<p>Summary of fixed effects for RTs in Task 2.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Predictor</th>
<th align="center" valign="top">Estimate</th>
<th align="center" valign="top">SE</th>
<th align="center" valign="top"><italic>t</italic> value</th>
<th align="center" valign="top"><italic>p</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">(Intercept)</td>
<td align="center" valign="top">547.86</td>
<td align="center" valign="top">18.79</td>
<td align="center" valign="top">29.16</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency</td>
<td align="center" valign="top">43.23</td>
<td align="center" valign="top">6.03</td>
<td align="center" valign="top">7.71</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Taste</td>
<td align="center" valign="top">44.58</td>
<td align="center" valign="top">6.27</td>
<td align="center" valign="top">7.11</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Taste level</td>
<td align="center" valign="top">8.77</td>
<td align="center" valign="top">5.29</td>
<td align="center" valign="top">1.66</td>
<td align="center" valign="top">0.098<sup>+</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Taste</td>
<td align="center" valign="top">&#x2212;68.96</td>
<td align="center" valign="top">8.60</td>
<td align="center" valign="top">&#x2212;8.02</td>
<td align="center" valign="top">&#x003C;0.001<sup>&#x002A;&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Taste level</td>
<td align="center" valign="top">&#x2212;14.18</td>
<td align="center" valign="top">7.43</td>
<td align="center" valign="top">&#x2212;1.91</td>
<td align="center" valign="top">0.056<sup>+</sup></td>
</tr>
<tr>
<td align="left" valign="top">Taste &#x00D7; Taste level</td>
<td align="center" valign="top">&#x2212;22.36</td>
<td align="center" valign="top">7.82</td>
<td align="center" valign="top">&#x2212;2.86</td>
<td align="center" valign="top">0.004<sup>&#x002A;&#x002A;</sup></td>
</tr>
<tr>
<td align="left" valign="top">Congruency &#x00D7; Taste &#x00D7; Taste level</td>
<td align="center" valign="top">25.64</td>
<td align="center" valign="top">10.39</td>
<td align="center" valign="top">2.47</td>
<td align="center" valign="top">0.01<sup>&#x002A;</sup></td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><sup>&#x002A;&#x002A;&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.001; <sup>&#x002A;&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.01; <sup>&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.05; <sup>+</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.1.</p>
</table-wrap-foot>
</table-wrap>
<fig position="float" id="fig3">
<label>Figure 3</label>
<caption>
<p>Violin plot showing the distribution of individual RTs in discriminating pink and yellow font colors in congruent (white) and incongruent (gray) conditions. The horizontal line represents the mean. Error bars represent the standard errors of the mean (<sup>&#x002A;</sup><italic>p</italic>&#x2009;&#x003C;&#x2009;0.05, Bonferroni corrected).</p>
</caption>
<graphic xlink:href="fpsyg-15-1250781-g003.tif"/>
</fig>
</sec>
<sec id="sec13">
<label>3.2.2</label>
<title>Accuracy</title>
<p>Participants made errors in 11.57% of trials, indicating some difficulty in discriminating the low-level pink and yellow font colors. The GLMM analysis showed no significant effect of congruency condition on errors, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;1.84, <italic>p</italic>&#x2009;=&#x2009;0.18. Thus, error rates showed no significant effect of taste on color discrimination.</p>
</sec>
</sec>
<sec id="sec14">
<label>3.3</label>
<title>Comparison between the two tasks</title>
<p>The GLMM analysis showed no significant interaction between task and congruency condition for either response times, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;0.13, <italic>p</italic>&#x2009;=&#x2009;0.72, or errors, <italic>&#x03C7;</italic><sup>2</sup>(1)&#x2009;=&#x2009;3.51, <italic>p</italic>&#x2009;=&#x2009;0.06. Thus, the color-taste congruency effect differed little between taste categorization and font color discrimination.</p>
</sec>
</sec>
<sec sec-type="discussion" id="sec15">
<label>4</label>
<title>Discussion</title>
<p>In two Stroop word categorization tasks, we manipulated the font color (pink and yellow) and the taste of food names (sweet and sour) to examine the congruency effect of associations between color and taste (pink-sweet and yellow-sour). The results showed that color-taste congruency influenced the speed of categorizing both taste-related conceptual words and perceptual font colors. Specifically, the yellow font color facilitated the categorization of sour taste words compared to the pink font color, while sweet taste words facilitated the discrimination of the pink font color compared to sour taste words. The high and low levels of taste words and font colors showed no consistent interaction with the congruency effect. These findings provide further evidence for the congruency effect of color-taste correspondences on cognitive processing and behavioral performance (<xref ref-type="bibr" rid="ref33">Velasco et al., 2015</xref>; <xref ref-type="bibr" rid="ref5">Chen et al., 2021</xref>).</p>
<p>When participants were instructed to categorize food names as sweet or sour, the font colors were processed simultaneously and interfered with taste categorization. Our participants responded faster when the font color was congruent with the taste category (sweet-pink/sour-yellow) than when it was incongruent (sweet-yellow/sour-pink). Specifically, sour taste words in yellow font were recognized faster than those in pink font, indicating a congruency effect of the sour-yellow correspondence on word taste categorization. Meanwhile, when participants were asked to discriminate the font color as pink or yellow, the associated taste (sweet or sour) of the words was also automatically activated, such that discriminating font colors in congruent conditions (pink-sweet/yellow-sour) was faster than in incongruent conditions (pink-sour/yellow-sweet). Specifically, sweet-related words facilitated the discrimination of the pink font color compared to sour-related words, suggesting a sweet-pink correspondence. Thus, congruent color-taste correspondences facilitated perceptual color discrimination. Taken together, the results indicate a congruency effect of automatically activated color-taste correspondences that modulates the processing of both conceptual word taste categorization and perceptual font color discrimination.</p>
<p>Indeed, it is worth noting that foods in the natural environment that represent sweet and sour tastes often have colors that differ from the commonly associated pink and yellow. For example, dried plum, a popular sour-tasting tsukemono (pickled vegetable) in Japan, is usually red. In contrast, chocolate, which is sweet and bitter, is predominantly brown. Similarly, donuts, which are known for their sweetness, are typically brown and yellow/orange. These examples illustrate how learned color-taste correspondences can override natural color associations, potentially influencing perception and behavior.</p>
<p>Previous research suggests that individuals learn the relationships between sensory features/dimensions across different modalities, leading to the establishment of crossmodal correspondences (<xref ref-type="bibr" rid="ref1001">Spence, 2011</xref>). These correspondences aid in the efficient binding and processing of multisensory information, allowing the brain to integrate and combine sensory inputs into a coherent perceptual environment (<xref ref-type="bibr" rid="ref21">Parise et al., 2013</xref>). In particular, color-taste correspondences can be acquired through mere exposure to the co-occurrence of tastes and colors in foods and beverages (e.g., sweet-pink from images of ripe, sweet red fruits, and sour-yellow from yellow lemons). Importantly, studies have shown that similar color-taste correspondences exist across different cultural backgrounds, and some are influenced by cultural contexts (<xref ref-type="bibr" rid="ref34">Wan et al., 2014</xref>). For example, Japanese individuals show high congruency for sweet-pink (86.59%) and sour-yellow (80.49%) correspondences (at a chance level of 20%), which may be due to regional food practices.</p>
<p>Furthermore, color-taste correspondences can also be mediated by the emotional responses evoked by colors and tastes (e.g., hedonics; <xref ref-type="bibr" rid="ref33">Velasco et al., 2015</xref>). For example, the color pink is often associated with happiness and romance, which aligns with the positive emotions evoked by sweet tastes. Similarly, the color yellow is associated with stimulation and excitement, which aligns with the sensory experience of sour tastes. These overlapping emotional experiences between colors and tastes could contribute to the observed correspondences between them. Thus, the emotional correspondence hypothesis offers a potential explanation for color-taste correspondences.</p>
<p>One limitation of the current study is that it focused only on the sweet-pink and sour-yellow correspondences. Future research could extend this by testing the Stroop effect for correspondences between other basic tastes and colors; in this way, the colors implicitly associated with each basic taste, and the strength of those associations, could be revealed using the Stroop task. Another limitation is the selection of food names representing different levels of sweet and sour tastes from online ranking websites. Future studies could adopt a more systematic approach to examining and ranking the basic tastes of different foods to ensure a more comprehensive representation of taste profiles. Moreover, the levels of taste (i.e., high and low levels of sweetness and sourness) were not well controlled. Because the main interest of this study was the congruency effect, we did not ask participants to rate the sweetness/sourness of the words. Future studies could further explore the strength of these associations using the Stroop effect, such that the stronger the color-taste correspondence, the stronger the congruency effect. Finally, participants reported only subjectively whether they had abnormal color vision; future studies should use the Ishihara color vision test and a chemosensory disorder questionnaire to assess color and taste perception. In addition, the luminance of the colors was not controlled; future studies should use isoluminant colors. Our participants were all university students; future studies should consider the effect of age differences on the formation and congruency effect of color-taste correspondences.</p>
<p>In conclusion, the present study demonstrated a congruency effect of color-taste correspondences (pink-sweet and yellow-sour) on both word taste categorization and font color discrimination, tested with Stroop tasks. This highlights the role of color-taste correspondences in influencing the cognitive processing of perceptual and conceptual information. Future research could investigate the statistical nature of these correspondences by examining learning effects and by training participants to establish color associations with novel tastes.</p>
</sec>
<sec sec-type="data-availability" id="sec16">
<title>Data availability statement</title>
<p>The datasets presented in this study can be found in online repositories. The names of the repository/repositories and accession number(s) can be found in the article/supplementary material.</p>
</sec>
<sec sec-type="ethics-statement" id="sec17">
<title>Ethics statement</title>
<p>The studies involving humans were approved by The institutional review board of Waseda University. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.</p>
</sec>
<sec sec-type="author-contributions" id="sec18">
<title>Author contributions</title>
<p>NC coded the experiment. YY and MK conducted the experiments. YY and NC analyzed the data, confirmed the statistical analyses, and drafted the manuscript. KW provided the critical revisions on the introduction and discussion. All authors designed the study, read and approved the final manuscript.</p>
</sec>
</body>
<back>
<sec sec-type="funding-information" id="sec19">
<title>Funding</title>
<p>This work was supported by JSPS Grants-in-Aid for Scientific Research (19K20591, 22KJ2877, 22H00090) from Japan Society for the Promotion of Science.</p>
</sec>
<sec sec-type="COI-statement" id="sec20">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="sec100" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<fn-group>
<fn id="fn0001">
<p><sup>1</sup><ext-link xlink:href="http://pstnet.com/products/e-prime/" ext-link-type="uri">http://pstnet.com/products/e-prime/</ext-link></p>
</fn>
<fn id="fn0002">
<p><sup>2</sup><ext-link xlink:href="https://ranking.net/rankings/best-sweet-foods" ext-link-type="uri">https://ranking.net/rankings/best-sweet-foods</ext-link></p>
</fn>
<fn id="fn0003">
<p><sup>3</sup><ext-link xlink:href="https://ranking.net/rankings/best-sour-foods" ext-link-type="uri">https://ranking.net/rankings/best-sour-foods</ext-link></p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barr</surname> <given-names>D. J.</given-names></name> <name><surname>Levy</surname> <given-names>R.</given-names></name> <name><surname>Scheepers</surname> <given-names>C.</given-names></name> <name><surname>Tily</surname> <given-names>H. J.</given-names></name></person-group> (<year>2013</year>). <article-title>Random effects structure for confirmatory hypothesis testing: keep it maximal</article-title>. <source>J. Mem. Lang.</source> <volume>68</volume>, <fpage>255</fpage>&#x2013;<lpage>278</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jml.2012.11.001</pub-id>, PMID: <pub-id pub-id-type="pmid">24403724</pub-id></citation></ref>
<ref id="ref2"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Bates</surname> <given-names>D.</given-names></name> <name><surname>M&#x00E4;chler</surname> <given-names>M.</given-names></name> <name><surname>Bolker</surname> <given-names>B.</given-names></name> <name><surname>Walker</surname> <given-names>S.</given-names></name></person-group> (<year>2015</year>). <article-title>Fitting linear mixed-effects models using {lme4}</article-title>. <source>J. Stat. Softw.</source> <volume>67</volume>, <fpage>1</fpage>&#x2013;<lpage>48</lpage>. doi: <pub-id pub-id-type="doi">10.18637/jss.v067.i01</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>N.</given-names></name> <name><surname>Nakamura</surname> <given-names>K.</given-names></name> <name><surname>Watanabe</surname> <given-names>K.</given-names></name></person-group> (<year>2023</year>). <article-title>An automatic red-female association tested by the Stroop task</article-title>. <source>Acta Psychol.</source> <volume>238</volume>:<fpage>103982</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.actpsy.2023.103982</pub-id>, PMID: <pub-id pub-id-type="pmid">37478774</pub-id></citation></ref>
<ref id="ref4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>N.</given-names></name> <name><surname>Tanaka</surname> <given-names>K.</given-names></name> <name><surname>Watanabe</surname> <given-names>K.</given-names></name></person-group> (<year>2015</year>). <article-title>Color-shape associations revealed with implicit association tests</article-title>. <source>PLoS One</source> <volume>10</volume>:<fpage>e0116954</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0116954</pub-id>, PMID: <pub-id pub-id-type="pmid">25625717</pub-id></citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>N.</given-names></name> <name><surname>Watanabe</surname> <given-names>K.</given-names></name> <name><surname>Wada</surname> <given-names>M.</given-names></name></person-group> (<year>2021</year>). <article-title>People with high autistic traits show fewer consensual crossmodal correspondences between visual features and tastes</article-title>. <source>Front. Psychol.</source> <volume>12</volume>:<fpage>714277</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2021.714277</pub-id>, PMID: <pub-id pub-id-type="pmid">34566793</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Delignette-Muller</surname> <given-names>M. L.</given-names></name> <name><surname>Dutang</surname> <given-names>C.</given-names></name></person-group> (<year>2015</year>). <article-title>Fitdistrplus: an R package for fitting distributions</article-title>. <source>J. Stat. Softw.</source> <volume>64</volume>, <fpage>1</fpage>&#x2013;<lpage>34</lpage>. doi: <pub-id pub-id-type="doi">10.18637/jss.v064.i04</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Faul</surname> <given-names>F.</given-names></name> <name><surname>Erdfelder</surname> <given-names>E.</given-names></name> <name><surname>Lang</surname> <given-names>A. G.</given-names></name> <name><surname>Buchner</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>G&#x002A;Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences</article-title>. <source>Behav. Res. Methods</source> <volume>39</volume>, <fpage>175</fpage>&#x2013;<lpage>191</lpage>. doi: <pub-id pub-id-type="doi">10.3758/BF03193146</pub-id>, PMID: <pub-id pub-id-type="pmid">17695343</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Fiszman</surname> <given-names>B. P.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2015</year>). &#x201C;<article-title>Colour correspondences in chemosensation: the case of food and drinks</article-title>&#x201D; in <source>Nutrition and sensation</source>. ed. <person-group person-group-type="editor"><name><surname>Hirsh</surname> <given-names>A. L.</given-names></name></person-group> (<publisher-loc>Boca Raton</publisher-loc>: <publisher-name>CRC Press</publisher-name>), <fpage>139</fpage>&#x2013;<lpage>158</lpage>.</citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>J.</given-names></name> <name><surname>Friendly</surname> <given-names>M.</given-names></name> <name><surname>Weisberg</surname> <given-names>S.</given-names></name></person-group> (<year>2013</year>). <article-title>Hypothesis tests for multivariate linear models using the car package</article-title>. <source>R J.</source> <volume>5</volume>, <fpage>39</fpage>&#x2013;<lpage>52</lpage>. doi: <pub-id pub-id-type="doi">10.32614/RJ-2013-004</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hidaka</surname> <given-names>S.</given-names></name> <name><surname>Shimoda</surname> <given-names>K.</given-names></name></person-group> (<year>2014</year>). <article-title>Investigation of the effects of color on judgments of sweetness using a taste adaptation method</article-title>. <source>Multisens. Res.</source> <volume>27</volume>, <fpage>189</fpage>&#x2013;<lpage>205</lpage>. doi: <pub-id pub-id-type="doi">10.1163/22134808-00002455</pub-id></citation></ref>
<ref id="ref11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Higgins</surname> <given-names>M. J.</given-names></name> <name><surname>Hayes</surname> <given-names>J. E.</given-names></name></person-group> (<year>2019</year>). <article-title>Learned color taste associations in a repeated brief exposure paradigm</article-title>. <source>Food Qual. Prefer.</source> <volume>71</volume>, <fpage>354</fpage>&#x2013;<lpage>365</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.foodqual.2018.08.013</pub-id></citation></ref>
<ref id="ref12"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>N. L.</given-names></name> <name><surname>Kotz</surname> <given-names>S.</given-names></name> <name><surname>Balakrishnan</surname> <given-names>N.</given-names></name></person-group> (<year>1995</year>). <source>Continuous univariate distributions</source>, vol. <volume>289</volume>. New York: <publisher-name>John Wiley &#x0026; Sons</publisher-name>.</citation></ref>
<ref id="ref13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Koch</surname> <given-names>C.</given-names></name> <name><surname>Koch</surname> <given-names>E. C.</given-names></name></person-group> (<year>2003</year>). <article-title>Preconceptions of taste based on color</article-title>. <source>J. Psychol.</source> <volume>137</volume>, <fpage>233</fpage>&#x2013;<lpage>242</lpage>. doi: <pub-id pub-id-type="doi">10.1080/00223980309600611</pub-id>, PMID: <pub-id pub-id-type="pmid">12795546</pub-id></citation></ref>
<ref id="ref15"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Lenth</surname> <given-names>R.</given-names></name> <name><surname>Singmann</surname> <given-names>H.</given-names></name> <name><surname>Love</surname> <given-names>J.</given-names></name> <name><surname>Buerkner</surname> <given-names>P.</given-names></name> <name><surname>Herve</surname> <given-names>M.</given-names></name></person-group> (<year>2018</year>). emmeans: estimated marginal means, aka least-squares means. R package version 1.5.0. Available at: <ext-link xlink:href="https://rdrr.io/cran/emmeans/" ext-link-type="uri">https://rdrr.io/cran/emmeans/</ext-link>.</citation></ref>
<ref id="ref16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lo</surname> <given-names>S.</given-names></name> <name><surname>Andrews</surname> <given-names>S.</given-names></name></person-group> (<year>2015</year>). <article-title>To transform or not to transform: using generalized linear mixed models to analyse reaction time data</article-title>. <source>Front. Psychol.</source> <volume>6</volume>:<fpage>1171</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2015.01171</pub-id></citation></ref>
<ref id="ref18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maga</surname> <given-names>J. A.</given-names></name></person-group> (<year>1974</year>). <article-title>Influence of color on taste thresholds</article-title>. <source>Chem. Senses</source> <volume>1</volume>, <fpage>115</fpage>&#x2013;<lpage>119</lpage>. doi: <pub-id pub-id-type="doi">10.1093/chemse/1.1.115</pub-id></citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meteyard</surname> <given-names>L.</given-names></name> <name><surname>Davies</surname> <given-names>R. A.</given-names></name></person-group> (<year>2020</year>). <article-title>Best practice guidance for linear mixed-effects models in psychological science</article-title>. <source>J. Mem. Lang.</source> <volume>112</volume>:<fpage>104092</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jml.2020.104092</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Morey</surname> <given-names>R. D.</given-names></name> <name><surname>Rouder</surname> <given-names>J. N.</given-names></name></person-group> (<year>2018</year>). BayesFactor: computation of Bayes factors for common designs. R package version 0.9.12&#x2013;4.2. Available at: <ext-link xlink:href="https://cran.r-project.org/web/packages/BayesFactor/index.html" ext-link-type="uri">https://cran.r-project.org/web/packages/BayesFactor/index.html</ext-link></citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Parise</surname> <given-names>C. V.</given-names></name> <name><surname>Harrar</surname> <given-names>V.</given-names></name> <name><surname>Ernst</surname> <given-names>M. O.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2013</year>). <article-title>Cross-correlation between auditory and visual signals promotes multisensory integration</article-title>. <source>Multisens. Res.</source> <volume>26</volume>, <fpage>307</fpage>&#x2013;<lpage>316</lpage>. doi: <pub-id pub-id-type="doi">10.1163/22134808-00002417</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Piqueras-Fiszman</surname> <given-names>B.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Crossmodal correspondences in product packaging. Assessing color&#x2013;flavor correspondences for potato chips (crisps)</article-title>. <source>Appetite</source> <volume>57</volume>, <fpage>753</fpage>&#x2013;<lpage>757</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.appet.2011.07.012</pub-id>, PMID: <pub-id pub-id-type="pmid">21824502</pub-id></citation></ref>
<ref id="ref23"><citation citation-type="other"><person-group person-group-type="author"><collab id="coll1">R Core Team</collab></person-group>. (<year>2020</year>). R: a language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Available at: <ext-link xlink:href="https://www.R-project.org/" ext-link-type="uri">https://www.R-project.org/</ext-link></citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Richter</surname> <given-names>T.</given-names></name> <name><surname>Zwaan</surname> <given-names>R. A.</given-names></name></person-group> (<year>2009</year>). <article-title>Processing of color words activates color representations</article-title>. <source>Cognition</source> <volume>111</volume>, <fpage>383</fpage>&#x2013;<lpage>389</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cognition.2009.02.011</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saluja</surname> <given-names>S.</given-names></name> <name><surname>Stevenson</surname> <given-names>R. J.</given-names></name></person-group> (<year>2018</year>). <article-title>Cross-modal associations between real tastes and colors</article-title>. <source>Chem. Senses</source> <volume>43</volume>, <fpage>475</fpage>&#x2013;<lpage>480</lpage>. doi: <pub-id pub-id-type="doi">10.1093/chemse/bjy033</pub-id>, PMID: <pub-id pub-id-type="pmid">29868904</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scarpina</surname> <given-names>F.</given-names></name> <name><surname>Tagini</surname> <given-names>S.</given-names></name></person-group> (<year>2017</year>). <article-title>The Stroop color and word test</article-title>. <source>Front. Psychol.</source> <volume>8</volume>:<fpage>557</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2017.00557</pub-id></citation></ref>
<ref id="ref1001"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Crossmodal correspondences: a tutorial review</article-title>. <source>Atten. Percept. Psychophys.</source> <volume>73</volume>, <fpage>971</fpage>&#x2013;<lpage>995</lpage>. doi: <pub-id pub-id-type="doi">10.3758/s13414-010-0073-7</pub-id></citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). <article-title>On the relationship(s) between color and taste/flavor</article-title>. <source>Exp. Psychol.</source> <volume>66</volume>, <fpage>99</fpage>&#x2013;<lpage>111</lpage>. doi: <pub-id pub-id-type="doi">10.1027/1618-3169/a000439</pub-id>, PMID: <pub-id pub-id-type="pmid">30895915</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spence</surname> <given-names>C.</given-names></name> <name><surname>Levitan</surname> <given-names>C. A.</given-names></name></person-group> (<year>2021</year>). <article-title>Explaining crossmodal correspondences between colours and tastes</article-title>. <source>i-Perception</source> <volume>12</volume>:<fpage>20416695211018223</fpage>. doi: <pub-id pub-id-type="doi">10.1177/20416695211018223</pub-id>, PMID: <pub-id pub-id-type="pmid">34211685</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spence</surname> <given-names>C.</given-names></name> <name><surname>Wan</surname> <given-names>X.</given-names></name> <name><surname>Woods</surname> <given-names>A.</given-names></name> <name><surname>Velasco</surname> <given-names>C.</given-names></name> <name><surname>Deng</surname> <given-names>J.</given-names></name> <name><surname>Youssef</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>On tasty colours and colourful tastes? Assessing, explaining, and utilizing crossmodal correspondences between colours and basic tastes</article-title>. <source>Flavour</source> <volume>4</volume>, <fpage>1</fpage>&#x2013;<lpage>17</lpage>. doi: <pub-id pub-id-type="doi">10.1186/s13411-015-0033-1</pub-id></citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stillman</surname> <given-names>J. A.</given-names></name></person-group> (<year>1993</year>). <article-title>Color influences flavor identification in fruit-flavored beverages</article-title>. <source>J. Food Sci.</source> <volume>58</volume>, <fpage>810</fpage>&#x2013;<lpage>812</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1365-2621.1993.tb09364.x</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stroop</surname> <given-names>J. R.</given-names></name></person-group> (<year>1935</year>). <article-title>Studies of interference in serial verbal reactions</article-title>. <source>J. Exp. Psychol.</source> <volume>18</volume>, <fpage>643</fpage>&#x2013;<lpage>662</lpage>. doi: <pub-id pub-id-type="doi">10.1037/h0054651</pub-id></citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Velasco</surname> <given-names>C.</given-names></name> <name><surname>Woods</surname> <given-names>A.</given-names></name> <name><surname>Deroy</surname> <given-names>O.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2015</year>). <article-title>Hedonic mediation of the crossmodal correspondence between taste and shape</article-title>. <source>Food Qual. Prefer.</source> <volume>41</volume>, <fpage>151</fpage>&#x2013;<lpage>158</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.foodqual.2014.11.010</pub-id></citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wan</surname> <given-names>X.</given-names></name> <name><surname>Woods</surname> <given-names>A. T.</given-names></name> <name><surname>van den Bosch</surname> <given-names>J. J.</given-names></name> <name><surname>McKenzie</surname> <given-names>K. J.</given-names></name> <name><surname>Velasco</surname> <given-names>C.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2014</year>). <article-title>Cross-cultural differences in crossmodal correspondences between basic tastes and visual features</article-title>. <source>Front. Psychol.</source> <volume>5</volume>:<fpage>1365</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2014.01365</pub-id></citation></ref>
<ref id="ref35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Woods</surname> <given-names>A. T.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2016</year>). <article-title>Using single colors and color pairs to communicate basic tastes</article-title>. <source>i-Perception</source> <volume>7</volume>:<fpage>2041669516658817</fpage>. doi: <pub-id pub-id-type="doi">10.1177/2041669516658817</pub-id>, PMID: <pub-id pub-id-type="pmid">27698979</pub-id></citation></ref>
<ref id="ref36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xiao</surname> <given-names>X.</given-names></name> <name><surname>Dupuis-Roy</surname> <given-names>N.</given-names></name> <name><surname>Yang</surname> <given-names>X. L.</given-names></name> <name><surname>Qiu</surname> <given-names>J. F.</given-names></name> <name><surname>Zhang</surname> <given-names>Q. L.</given-names></name></person-group> (<year>2014</year>). <article-title>The taste-visual cross-modal Stroop effect: an event-related brain potential study</article-title>. <source>Neuroscience</source> <volume>263</volume>, <fpage>250</fpage>&#x2013;<lpage>256</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroscience.2014.01.002</pub-id>, PMID: <pub-id pub-id-type="pmid">24418613</pub-id></citation></ref>
<ref id="ref37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zampini</surname> <given-names>M.</given-names></name> <name><surname>Sanabria</surname> <given-names>D.</given-names></name> <name><surname>Phillips</surname> <given-names>N.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2007</year>). <article-title>The multisensory perception of flavor: assessing the influence of color cues on flavor discrimination responses</article-title>. <source>Food Qual. Prefer.</source> <volume>18</volume>, <fpage>975</fpage>&#x2013;<lpage>984</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.foodqual.2007.04.001</pub-id></citation></ref>
</ref-list>
</back>
</article>