ORIGINAL RESEARCH article

Front. Psychol., 11 February 2022
Sec. Emotion Science

Interoceptive Accuracy Did Not Affect Moral Decision-Making, but Affect Regret Rating for One’s Moral Choices

  • 1Department of Cognitive and Psychological Sciences, Graduate School of Informatics, Nagoya University, Nagoya, Japan
  • 2Department of Human Sciences, Toyo Eiwa University, Yokohama, Japan

Previous studies have revealed that interoceptive accuracy (IAcc), a behavioral measure of the ability to perceive and regulate the physiological states from which emotions arise, affects decision-making such as gambling. Given that decision-making in moral dilemma situations is affected by emotion, IAcc may also affect moral decision-making. The present study preliminarily investigates whether IAcc affects decision-making and emotional ratings, such as regret, for one’s own choices in moral dilemma situations. IAcc did not affect moral choice (deontological or utilitarian option) but did affect regret ratings for one’s moral choice in a subset of the dilemma scenarios. Moreover, people with higher IAcc made deontological choices more rapidly than those with lower IAcc in self-related dilemma scenarios. These results suggest that, depending on the moral dilemma scenario, people with higher IAcc feel stronger emotional conflict about utilitarian choices but weaker conflict about deontological choices than people with lower IAcc.

Introduction

Moral dilemma studies have been conducted in a broad range of areas, such as psychology, neuroscience, philosophy, and ethics. These moral dilemma contexts require people to judge whether actions that involve sacrificing a few lives to save many lives are appropriate, or to choose one of two actions in a given dilemma situation (Greene et al., 2001). Typically, a moral dilemma is constructed so that one option is utilitarian and emphasizes the number of people saved, such as sacrificing a few lives to save many. The other option is deontological and emphasizes the moral status of the means, such as not sacrificing innocent people. The decisions that people face in daily life often have no single correct answer (Nakao et al., 2012); hence, people face dilemmas and feel conflict about which choice they should make. Given the nature of the decisions that people make in real-life situations, it is important to focus on decision-making in moral dilemma situations.

Moral dilemma studies have mostly investigated which situations make people inclined to judge utilitarian action as acceptable and what type of people are prone to judge this action as acceptable. These studies have revealed a distinct neural basis for utilitarian or deontological judgments depending on the type of scenario (Greene et al., 2001, 2004; Koenigs et al., 2012). The most famous dilemma scenarios are the trolley and footbridge dilemmas. Each scenario supposes a runaway, uncontrollable trolley: if the trolley continues straight, it will strike and kill five workers on the track ahead. In the trolley dilemma, the trolley can be diverted onto a side track, saving the five workers; however, one worker on that side track will die instead of the five. A switch that changes the direction of the trolley is in front of the protagonist, who must judge whether to divert the trolley. In the footbridge dilemma, by contrast, there is no side track onto which the trolley can be diverted. The five workers can be saved only by pushing a man off a footbridge onto the track, where his body will stop the trolley. In this situation, the protagonist can save the five workers, but the man will die instead (Thomson, 1986). It is well known that people are inclined to judge the utilitarian option as acceptable in the trolley dilemma but unacceptable in the footbridge dilemma. Greene et al. (2001) classified the footbridge dilemma as a personal moral dilemma and the trolley dilemma as an impersonal moral dilemma. Personal moral dilemmas are defined as follows (Greene et al., 2004): the action causes serious bodily harm to a particular party, and the harm does not result from deflecting an existing threat onto a different party. Differences in judgments between the two types of moral dilemma have been explained by dual process theory. In impersonal moral dilemmas, the harmful action required by the utilitarian option elicits little negative emotion; therefore, people judge the utilitarian option as acceptable. In personal moral dilemmas, on the other hand, the harmful action elicits strong negative emotions, and many people judge the utilitarian option to be unacceptable (Greene, 2013). In particular, people with emotional insensitivity, such as psychopathy or alexithymia, have been reported to be more prone to judge utilitarian actions as acceptable in personal moral dilemmas (Koenigs et al., 2012; Patil and Silani, 2014). Moreover, patients with ventromedial prefrontal cortex (VMPFC) lesions, a region related to physiological regulation and the representation of value (Hänsel and von Känel, 2008; Padoa-Schioppa, 2011), are also prone to judge utilitarian actions as acceptable in personal moral dilemmas, accompanied by decreased physiological arousal as indexed by low-amplitude skin conductance responses (SCRs) (Moretto et al., 2009). In contrast, in healthy participants, the amplitude of SCRs before moral judgments correlates negatively with the number of utilitarian judgments (Moretto et al., 2009); in other words, people with higher SCRs make fewer utilitarian choices. These results suggest that sensitivity to emotion arising from physiological arousal makes people refrain from utilitarian judgments and thus lean toward deontological judgments in personal moral dilemmas, whereas insensitivity to emotion makes people prone to utilitarian judgments because they lack emotion-based aversion to utilitarian action.

Previous studies have mainly focused on participants’ moral judgments as bystanders. Unlike these studies, we are especially concerned with moral decision-making rather than judgment in moral dilemma situations, because people are required to make not only judgments but also decisions, which influence their later lives. Therefore, it is important to investigate the effect of emotion on moral decision-making as well as on moral judgment in moral dilemma situations. However, moral dilemma studies that examine decision-making are still scarce compared to those that examine judgment.

Interoception is the process by which organisms integrate, sense, and interpret signals originating from within the body (Chen et al., 2021). In particular, interoceptive accuracy (IAcc) reflects performance on objective behavioral measures (Garfinkel et al., 2015) and contributes to individual differences in emotional sensitivity. People with high IAcc, as measured by the heartbeat counting task (Schandry, 1981), experience more intense emotional arousal in daily life than people with low IAcc (Barrett et al., 2004). Schandry (1981) showed that IAcc correlates with questionnaire-measured emotional sensitivity. In addition, research has examined IAcc in people with emotional deficits. People with higher psychopathic traits show lower IAcc than people without psychopathic traits (Nentjes et al., 2013). Moreover, people with alexithymia, who have difficulty feeling emotion, also show lower IAcc (Shah et al., 2016). These results suggest that higher IAcc relates to sensitivity, and lower IAcc to insensitivity, to emotional experience. It is plausible that the ability to feel physiological states correlates with emotional experience, because the two share a common neural basis in the anterior insula (Critchley et al., 2004; Zaki et al., 2012; Terasawa et al., 2013). In addition, IAcc affects not only emotional sensitivity but also decision-making. Interoception contributes to the maintenance of homeostasis (Craig, 2009) and leads to better motivated behavior, such as decision-making, to satisfy physiological needs (Gu and FitzGerald, 2014). In light of this, Werner et al. (2009) first examined the relationship between interoception and decision-making and showed that people with high IAcc choose disadvantageous options less frequently than those with low IAcc in the Iowa gambling task (IGT). Moreover, individuals with higher IAcc show greater aversion to monetary losses than individuals with lower IAcc in a risky decision-making task (Sokol-Hessner et al., 2015). These results suggest that the IAcc-related intensity of emotional experience affects emotion-based decision-making.

Given the relation between IAcc and emotion-based decision-making, it is likely that IAcc also affects decision-making in moral dilemma situations. The relationship between IAcc and social decision-making, which is deeply related to morality, has been investigated (Lenggenhager et al., 2013; Piech et al., 2017); however, few studies have examined the direct link between IAcc and moral decision-making in dilemma situations such as the trolley problem. The present study thus aims to examine whether and to what extent IAcc affects moral decision-making in dilemma situations. First, we predict that people with higher IAcc are prone to make deontological choices in personal moral dilemmas. In light of dual process theory (Greene and Haidt, 2002), people with lower IAcc, who might be insensitive to the negative emotions evoked by sacrificing innocent people, can choose utilitarian options without hesitation because they feel little conflict between the alternatives, particularly in personal moral dilemmas. People with higher IAcc, on the other hand, who are sensitive to emotions, might feel strong negative emotions in response to utilitarian action and be averse to taking such actions in personal moral dilemmas. In addition, we predict that IAcc correlates with the speed of moral decision-making in personal moral dilemmas. When people with higher IAcc make utilitarian choices in personal moral dilemmas, they should take longer because they feel more conflict and require more effort to cognitively regulate their emotions in order to override the deontological option. Hence, IAcc and reaction time should correlate positively among people who make utilitarian choices in personal moral dilemmas.

Decision-making in moral dilemmas, in particular, elicits conflict between alternatives and negative emotions; thus, a second goal of this study is to investigate the effect of IAcc on the intensity of the negative emotion accompanying decision-making. Among the emotions that people feel in daily life, regret is a negative emotion that is deeply related to, and sometimes attached to, people’s own decision-making (Zeelenberg et al., 2002). In fact, moral dilemma research has confirmed that people report feeling regret about their moral choices (Szekely and Miu, 2015). Therefore, we measure regret ratings for participants’ own moral choices. However, in imaginary experimental moral dilemma situations, it is difficult for participants to feel actual regret following their own moral decisions; we therefore measure forecasted regret for one’s own moral choice as the regret rating. Given that IAcc contributes to the intensity of emotional experience (Barrett et al., 2004), we also predict that people with higher IAcc regret the harmful actions resulting from their utilitarian choices more intensely than those with lower IAcc.

Materials and Methods

Participants

Thirty-three female undergraduate students participated in this study. Their ages ranged from 18 to 23 years (mean = 20.84, SD = 1.39). The participants gave written informed consent and were free to withdraw from the study at any time. They completed a heartbeat counting task and a moral dilemma task. The study was approved by the ethics committee of Nagoya University. The data were collected before the COVID-19 pandemic.

Heartbeat Counting Task

Interoceptive accuracy was assessed with a heartbeat counting task (Schandry, 1981). A sensor (emWave PC system, HeartMath) was attached to each participant’s left earlobe to record the participant’s actual heartbeats. The participants were asked to concentrate on their internal bodily states and to count their own heartbeats during a period delimited by auditory “start” and “stop” signals. After the stop signal, they reported the number of heartbeats they had counted. The task contained three trials of varying duration (25, 35, and 45 s), and the participants were not informed of these durations. During the task, they were forbidden to take their pulse or to use any other strategy to detect their heartbeats, and they were asked to stay as still as possible to avoid contaminating the recordings. Each participant’s IAcc score was calculated with the following formula: 1/3 Σ (1 − |actual heartbeats − reported heartbeats| / actual heartbeats) (Montoya et al., 1993; Pollatos and Schandry, 2010). The score was then multiplied by 10 for the logistic regression analyses, so that IAcc ranged from 0 to 10, with 0 indicating the least accurate and 10 the most accurate heartbeat counting; that is, higher scores indicate higher IAcc.
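To make the scoring concrete, the short Python sketch below computes this score for one participant; it is our illustration with hypothetical example counts, not the authors’ analysis code.

```python
# Minimal sketch of the Schandry (1981) heartbeat-counting score described above.
# Example values are hypothetical; this is not the authors' analysis code.

def iacc_score(recorded, reported, scale=10.0):
    """Mean counting accuracy over trials, rescaled to the 0-10 range used in the text."""
    accuracies = [
        1 - abs(actual - counted) / actual
        for actual, counted in zip(recorded, reported)
    ]
    return scale * sum(accuracies) / len(accuracies)

# Example: recorded heartbeats vs. reported counts for the 25-, 35-, and 45-s trials.
print(iacc_score(recorded=[28, 40, 52], reported=[25, 34, 49]))  # ~8.95
```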

Moral Dilemma Task

The six scenarios of the moral dilemma task were selected from the set of scenarios collected and modified by Christensen et al. (2014); one scenario was further modified for the purposes of our study (see Supplementary Material). We selected only dilemma scenarios that are feasible in real life and could be used to measure people’s tendency to make moral decisions, following Tassy et al. (2013), who stated that “an action is obviously what the participants think their action could be if they were to make the decision in real life.” With this point in mind, three personal moral dilemma scenarios and three impersonal moral dilemma scenarios were selected: “Bus plunge,” “Orphanage,” “Crying baby,” “Vaccine test,” “Donation,” and “Trolley.” In addition to personal force, these scenarios involve three conceptual factors (Christensen et al., 2014): “Benefit recipient,” “Evitability,” and “Intentionality.” “Benefit recipient” concerns whether the protagonist’s life is at stake in the moral dilemma: scenarios in which the protagonist’s life is at stake are “Self-beneficial,” and those in which another person’s life is at stake instead are “Other-beneficial.” “Evitability” concerns whether the people to be sacrificed by the utilitarian action are destined to die anyway. “Intentionality” concerns whether the harm is willed and used instrumentally or occurs as a side effect. These four factors are mixed within the moral dilemma scenarios because, by their nature, the scenarios inevitably involve more than one factor. The factors included in each scenario are shown in Table 1.

Table 1. Each dilemma scenario included one of the two factors in each category: either self or other in the Benefit recipient category, personal or impersonal in the Personal force category, inevitable or avoidable in the Evitability category, and accidental or instrumental in the Intentionality category.

The participants sat in front of a monitor. PsychoPy3 software (Peirce et al., 2019) installed on a laptop computer was used to present the stimuli and to record the participants’ responses. In each trial, a moral dilemma scenario was presented on the monitor after a rest period. The participants read the scenario and pressed a key to advance to the next page, where the questions about the dilemma scenario were presented; they responded to the questions by pressing keys with their right hands.

Previous studies on moral dilemmas have mainly focused on judgments about the acceptability or appropriateness of an action made as a bystander rather than on decisions. In contrast, the present study was especially concerned with binary decision-making (i.e., deontological or utilitarian choices) about whether to take an action, rather than the appropriateness of an observed action as conventionally investigated. Hence, in our experiment, the first question asked whether the participants would sacrifice a few lives to save many lives. An affirmative answer corresponds to the utilitarian choice, whereas a negative answer corresponds to the deontological choice. The second question asked how much regret they would feel if they acted in accordance with their answer to the first question; the participants reported the extent of the forecasted regret they would experience on a seven-point scale ranging from 1 (no regret at all) to 7 (regret to an extreme degree). The third question asked to what extent their own choice was appropriate; the participants reported the appropriateness of their chosen action on a seven-point scale ranging from 1 (not appropriate at all) to 7 (completely appropriate). The dilemma scenario remained on the screen until the participants answered the third question. The flow of each trial is shown in Figure 1. There was no time limit, and reaction times were recorded; the participants were instructed to read each scenario and answer the questions at their own pace.
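As an illustration of this trial structure, the following sketch shows how a single trial of this kind could be implemented in PsychoPy. It is a simplified reconstruction under assumptions: the key mappings, question wording, and window settings are hypothetical, and it is not the authors’ experiment script.

```python
# Simplified sketch of one moral-dilemma trial (hypothetical reconstruction,
# not the authors' experiment script).
from psychopy import visual, core, event

win = visual.Window(fullscr=False, color="grey", units="height")
clock = core.Clock()

def show_and_wait(text, keys):
    """Display text, then return (key, reaction time) for the first valid keypress."""
    visual.TextStim(win, text=text, height=0.04, wrapWidth=1.2).draw()
    win.flip()
    clock.reset()
    key, rt = event.waitKeys(keyList=keys, timeStamped=clock)[0]
    return key, rt

# Scenario reading phase (self-paced).
show_and_wait("Dilemma scenario text ... Press space to continue.", ["space"])

# Q1: binary moral choice; RT measured from question onset to keypress.
choice, choice_rt = show_and_wait(
    "Would you sacrifice a few lives to save many? (f = yes, j = no)", ["f", "j"]
)
utilitarian = (choice == "f")

# Q2 and Q3: 7-point regret and appropriateness ratings.
regret, _ = show_and_wait("How much would you regret this choice? (1-7)",
                          [str(i) for i in range(1, 8)])
appropriate, _ = show_and_wait("How appropriate was your choice? (1-7)",
                               [str(i) for i in range(1, 8)])

win.close()
core.quit()
```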

Figure 1. The flow of a moral-dilemma trial. Each trial required the participants to answer three questions about one moral dilemma scenario. The first question asked whether the participants would sacrifice a few lives to save many lives (affirmative answer, utilitarian; negative answer, deontological). The second question asked the participants to rate the extent of the regret they would experience over their answer to the first question. The third question asked the participants to rate the appropriateness of their choice in the first question. The participants’ SCRs and reaction times were recorded during the task.

To measure the participants’ SCRs while they performed the task, Ag-AgCl electrodes (Vitrode F-150S, Nihon Kohden, Japan) were attached to the middle and ring fingers of their left hands. SCRs were recorded with a dedicated unit (DA-3, Vega Systems, Japan). After AD conversion, the SCR waveforms were displayed on a laptop computer screen and stored. The data were analyzed offline at 10 Hz. For the analyses, we used the maximum SCR value within the 15-s window preceding the end of the reading period (see Figure 1).
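To make this SCR feature concrete, the sketch below extracts the maximum value within the 15-s window preceding the end of the reading period from a 10-Hz trace; the data and variable names are hypothetical, and this is not the authors’ preprocessing code.

```python
# Sketch of the SCR feature described above: the maximum value within the 15 s
# preceding the end of the scenario-reading period, using 10-Hz offline data.
# Fabricated example data; not the authors' preprocessing code.
import numpy as np

FS = 10  # sampling rate in Hz after offline processing

def scr_window_max(scr, reading_end_s, window_s=15.0):
    """Return the maximum SCR value in [reading_end - window_s, reading_end]."""
    end = int(round(reading_end_s * FS))
    start = max(0, end - int(round(window_s * FS)))
    return float(np.max(scr[start:end]))

# Example: 60 s of fabricated SCR samples, reading period ending at 42.5 s.
rng = np.random.default_rng(0)
scr_trace = np.cumsum(rng.normal(0, 0.01, 60 * FS)) + 2.0
print(scr_window_max(scr_trace, reading_end_s=42.5))
```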

Statistical Analyses

Regression analysis was conducted to examine the effect of IAcc on the number of deontological choices across all trials. Subsequently, the six scenarios were analyzed individually because each dilemma scenario involved different situational factors. First, logistic regression analyses assessed the effects of IAcc and SCR on moral choice (deontological or utilitarian). Second, Spearman’s rank correlation coefficients were calculated between IAcc and the reaction time for the first question, that is, the time from the end of the scenario-reading phase to the moment a deontological or utilitarian choice was made, to assess whether IAcc is related to the speed of emotion-based intuitive decision-making. Third, hierarchical multiple regression analyses examined the effects of IAcc and moral choice on the regret rating; the deontological choice was coded as 0 and the utilitarian choice as 1. Finally, unpaired t-tests examined whether the participants’ appropriateness ratings of their own choices varied with the content of the moral choice.
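As a rough illustration of how these per-scenario analyses could be run, the sketch below uses a hypothetical data frame (the file and column names are assumptions); it is not the authors’ analysis script.

```python
# Illustrative versions of the per-scenario analyses (hypothetical data frame
# and file name; not the authors' analysis scripts).
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

# df: one row per participant for a given scenario, with columns
# 'choice' (0 = deontological, 1 = utilitarian), 'iacc', 'scr', 'rt', 'regret'.
df = pd.read_csv("scenario_busplunge.csv")  # hypothetical file

# 1) Logistic regression: do IAcc and SCR predict the moral choice?
logit = smf.logit("choice ~ iacc + scr", data=df).fit()
print(logit.summary())

# 2) Spearman rank correlation between IAcc and reaction time, split by choice.
for label, grp in df.groupby("choice"):
    rho, p = spearmanr(grp["iacc"], grp["rt"])
    print(f"choice={label}: rho={rho:.3f}, p={p:.3f}")

# 3) Hierarchical regression on regret: main effects first, then the interaction.
step1 = smf.ols("regret ~ iacc + choice", data=df).fit()
step2 = smf.ols("regret ~ iacc * choice", data=df).fit()
print(step1.rsquared, step2.rsquared)
```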

Results

One participant was excluded from the data analysis due to a technical problem. Moreover, SCR data were omitted for two participants due to a different technical problem.

Linear regression analysis was used to reveal the relation between IAcc and the general tendency to make deontological choices. The result showed no significant coefficient (B = −0.265, R2 = 0.027, p = 0.369), as shown in Figure 2. Moreover, logistic regression analyses revealed that IAcc and SCR had no significant effect on moral choice for any of the dilemma scenarios. These results are presented in Table 2.

Figure 2. Relation between IAcc and number of deontological choices.

Table 2. Effect of IAcc and SCR on moral choice.

The results of Spearman’s rank correlation between IAcc and the time until a moral choice was made are shown in Table 3. When the participants made a utilitarian choice, there was no significant correlation between IAcc and reaction time in any moral-dilemma scenario. In contrast, when the participants made a deontological choice, a negative correlation between IAcc and reaction time was observed only in Bus plunge (ρ = −0.618, p < 0.01), a scenario reported to induce a high-arousal state (Christensen et al., 2014).

Table 3. Rank correlation between IAcc and reaction time for each moral choice.

Hierarchical multiple regression analyses were conducted to reveal the effects of IAcc and moral choice on regret. In the first step, IAcc and moral choice were entered; in the second step, the IAcc × moral choice interaction term was added. Deontological choices were coded as 0 and utilitarian choices as 1. In five of the six scenarios, there was a significant main effect of moral choice (Bus plunge: B = 2.272, p < 0.001; Vaccine test: B = 2.317, p < 0.01; Orphanage: B = 2.946, p < 0.001; Donation: B = 2.420, p < 0.001; Crying baby: B = 1.781, p < 0.01). That is, the participants who made a utilitarian choice rated their regret higher than those who made a deontological choice. Moreover, a significant interaction between IAcc and moral choice was observed in Orphanage. These results are presented in Table 4. Given this interaction, a simple slope analysis was conducted for Orphanage. When the participants made a deontological choice in Orphanage, the higher their IAcc, the lower they rated their regret; no such tendency was observed when they made a utilitarian choice (utilitarian choice B = 0.257, n.s.; deontological choice B = −0.613, p < 0.001). A simple slope analysis was also conducted for Bus plunge because the interaction between IAcc and moral choice was marginally significant there (B = 0.456, p < 0.1). When the participants made a utilitarian choice in Bus plunge, those with higher IAcc rated their regret higher; no such tendency was observed when they made a deontological choice (utilitarian choice B = 0.447, p < 0.01; deontological choice B = −0.009, n.s.). The results of the simple slope analyses are shown in Figures 3, 4.
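To clarify what the simple slope analysis does here: with the binary choice coded 0/1, the coefficient on IAcc in the interaction model is the slope of regret on IAcc for the group coded 0, and re-fitting the model with the coding reversed yields the slope for the other group. The sketch below illustrates this with a hypothetical data frame; it is not the authors’ analysis code.

```python
# Sketch of the simple-slope logic for the IAcc x choice interaction on regret
# (hypothetical data frame and file name; not the authors' analysis code).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("scenario_orphanage.csv")  # hypothetical file with columns
# 'regret', 'iacc', and 'choice' (0 = deontological, 1 = utilitarian)

# With choice coded 0/1, the 'iacc' coefficient is the simple slope for the
# deontological group; reversing the coding makes it the utilitarian slope.
m_deont = smf.ols("regret ~ iacc * choice", data=df).fit()
df["choice_rev"] = 1 - df["choice"]
m_util = smf.ols("regret ~ iacc * choice_rev", data=df).fit()

print("slope of regret on IAcc | deontological:", m_deont.params["iacc"])
print("slope of regret on IAcc | utilitarian:  ", m_util.params["iacc"])
```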

Table 4. Effects of IAcc and moral choice on regret.

Figure 3. Effect of the interaction between IAcc and choice on regret in Orphanage. The higher the participants’ interoceptive accuracy, the lower they rated regret when they made a deontological choice, but not when they made a utilitarian choice (utilitarian choice B = 0.257, n.s.; deontological choice B = –0.613, p < 0.001).

Figure 4. Effect of the interaction between IAcc and choice on regret in Bus plunge. The higher the participants’ interoceptive accuracy, the higher they rated regret when they made a utilitarian choice, but not when they made a deontological choice (utilitarian choice B = 0.447, p < 0.01; deontological choice B = –0.009, n.s.).

The differences in appropriateness ratings between the choices the participants made were assessed with unpaired t-tests; the results are shown in Table 5. The participants who made deontological choices in Crying baby rated their choices as significantly more appropriate than those who made utilitarian choices (t = 2.938, p < 0.01). In the other scenarios, there were no significant differences in appropriateness ratings between those who made deontological choices and those who made utilitarian choices.

Table 5. Appropriateness rating by deontological and utilitarian choice for each dilemma scenario.

Discussion

The results of the present study, in contrast to our main hypothesis, did not indicate an effect of IAcc on moral decision-making. Below, we address the findings for each hypothesis in turn.

Our first hypothesis was that people with higher IAcc would be prone to deontological choices in personal moral dilemmas. However, this hypothesis was not confirmed in any moral dilemma scenario: participants with higher IAcc were not more inclined to make deontological choices. There are several possible explanations. First, it is well established that moral choices are affected by many factors: situational features of the dilemmas, participants’ personalities, and individual differences in cognitive style or preferred mode of thinking (Greene et al., 2009; Koenigs et al., 2012; Christensen et al., 2014; Da Silva et al., 2016). Any choice tendency induced by interoceptive intensity may have been overridden by tendencies induced by these factors; alternatively, interoceptive intensity alone may be too weak to affect moral choices regardless of these factors. Further research is needed to control the influence of these factors on moral decision-making. Second, Awad et al. (2018) reported differences in moral preferences between people from individualistic and collectivistic cultures, with people from individualistic cultures showing a preference for utilitarian responses. The present study was conducted with Japanese participants, who thus might have no general preference for utilitarian responses. Because such a bias in the sample might obscure an effect of IAcc on moral decision-making, further studies should also target participants from individualistic cultures, not only collectivistic cultures such as Japan. Third, it remains unclear how interoception and emotions are actually related to each other. Some studies have reported that IAcc is related to self-reported emotional sensitivity or anxiety (Schandry, 1981; Domschke et al., 2010). For example, Schandry (1981) reported that individuals with higher IAcc have higher levels of anxiety and emotional changeability than those with lower IAcc, and Domschke et al. (2010) reviewed the correlation between IAcc and anxiety levels. Considering these studies, it is plausible that higher IAcc is associated with higher emotional sensitivity. On the other hand, Weiss et al. (2014) reported that IAcc is associated with self-regulation capacity as measured by a self-report questionnaire. Given these findings, people with high IAcc might vary in their levels of emotional sensitivity and cognitive regulation of emotion. If so, high IAcc alone does not determine whether people will make a deontological choice: some people with high IAcc may have high emotional sensitivity and lean toward deontological choices, whereas others may have high cognitive regulation ability and lean toward utilitarian choices. Further studies should consider all the relevant factors behind moral decision-making and examine how each contributes to the choices people make.

Moreover, we predicted that when people with higher IAcc make utilitarian choices in personal moral dilemmas, IAcc and reaction time would be positively correlated owing to the longer time needed for cognitive regulation. However, the predicted tendency was not generally observed across the moral dilemma scenarios. Instead, a negative relationship between IAcc and choice reaction time among the participants who made deontological choices was found in only one scenario, Bus plunge. This scenario involves many factors that evoke high emotional arousal (Christensen et al., 2014). Given that IAcc correlates with the strength of salience-network functional connectivity in the posterior insula (Chong et al., 2017), individuals with higher IAcc might readily detect self-relevant salient stimuli in self-beneficial moral dilemmas. Thus, in Bus plunge, participants with higher IAcc may have made a deontological choice without hesitation because somatic sensations served as a somatic marker promoting that decision (Damasio, 1994).

Our third hypothesis was that when individuals with higher IAcc make a utilitarian choice in a personal moral dilemma, they regret it more intensely than individuals with lower IAcc. This hypothesis was not confirmed in general but was partially supported: IAcc was weakly related to the degree of regret in at least two of the three self-beneficial dilemmas. When participants with higher IAcc made a deontological choice in Orphanage, they reported lower regret; similarly, when participants with higher IAcc made a utilitarian choice in Bus plunge, they reported higher regret. These results may suggest that in self-beneficial moral dilemmas, rather than personal moral dilemmas, higher IAcc is responsible for lower regret over deontological choices and higher regret over utilitarian choices. This suggestion is plausible in light of a recent finding: self-beneficial scenarios, in which the self is emotionally salient and feels relevant, elicit self-reported high-arousal states (Christensen et al., 2014). Moral dilemma situations with a self-beneficial factor may increase emotional arousal; therefore, people with higher IAcc may perceive their aroused physiological state more strongly than people with lower IAcc (Barrett et al., 2004). The increase in perceived arousal may, in turn, intensify emotions such as regret. If this is correct, it explains why the correlation between IAcc and regret was observed in self-beneficial dilemmas. On the other hand, the relation between IAcc and regret was not observed in all three self-beneficial dilemmas. Moreover, the dilemma scenarios used in the present study contained factors of various types, and these factors were confounded. In addition, the classification of personal and impersonal dilemmas is controversial because the classification criteria were based on Greene’s intuition and no evidence-based criterion exists (McGuire et al., 2009). Dual process theory was initially proposed to explain the difference in judgment between dilemma situations (Greene et al., 2001, 2004), but the theory does not depend on whether the moral dilemma is personal or impersonal (Greene, 2009): Greene (2009) subsequently noted that in both personal and impersonal moral dilemmas, deontological action involves emotion and utilitarian action involves cognitive regulation of emotion. Hence, future research should focus on the relation between IAcc and the self-beneficial factor in moral dilemmas and should investigate the potentially influential factors separately to determine which of them, especially the self-beneficial factor, underlie the relation between IAcc and regret ratings in moral dilemma situations.

One unexpected result was that when people made a utilitarian choice, they reported high regret regardless of their IAcc. Kahneman (2011) noted that people feel more intense regret when taking an action that deviates from the default, where the “default” is the generally expected or desirable action in a given situation. In moral dilemma situations, the utilitarian action involves harm, sacrificing a few people’s lives to save many others; the deontological choice therefore appears to be the default and the utilitarian choice the deviation. Hence, the utilitarian choice departs from the default action and may arouse stronger regret than the deontological choice.

The present study investigated the forecasted regret accompanying moral choices but did not directly examine the effect of regret avoidance on moral choice. People tend to estimate the degree of regret associated with each option before making decisions and to avoid the options that would arouse greater anticipated regret (Coricelli et al., 2005). Additionally, people tend to overestimate post-decisional regret (Gilbert et al., 2004; Sevdalis and Harvey, 2007). It would be useful to examine the effects of such affective forecasting errors (Gilbert and Wilson, 2009) on moral decision-making in dilemma situations.

Moreover, important limitations of the IAcc measurement should be mentioned. The present study used Schandry’s heartbeat counting task, which has been criticized because the heartbeat counting score may reflect time-estimation ability and prior knowledge of one’s own heart rate (Desmedt et al., 2018; Zamariola et al., 2018). In addition, given the small number of trials, the IAcc scores in the present study may contain substantial measurement error; interpretations involving IAcc should therefore be made with caution. Furthermore, the present study included only 33 participants. The study suggests that IAcc may affect forecasted regret ratings depending on the moral dilemma situation; however, future studies should investigate this effect with a larger sample.

In summary, the present study sheds light on the complex relationships among IAcc, moral decision-making, and the strength of regret over one’s choices. In particular, our results showed that people with higher IAcc were not more inclined to make deontological choices in moral dilemma situations. However, in some self-beneficial dilemmas, they were more likely to feel stronger regret after making a utilitarian choice and weaker regret after making a deontological choice. These results suggest that, compared with people with lower IAcc, people with higher IAcc are inclined to have more negative feelings about utilitarian choices and to face stronger moral conflict in self-beneficial dilemma situations. This study is the first to investigate the link between IAcc, emotion, and moral decision-making in dilemma situations.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by Nagoya University. The participants provided their written informed consent to participate in this study.

Author Contributions

KT conceived the research idea, provided the experimental design, collected and analyzed the data, and wrote the original draft. YK provided feedback on the experimental design and data analysis. HO provided feedback for the data analysis and supervised the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This research was supported by JSPS KAKENHI Grant Number 21H04420 and JST CREST Grant Number JPMJCR21P1.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2021.746897/full#supplementary-material

References

Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., et al. (2018). The moral machine experiment. Nature 563, 59–64. doi: 10.1038/s41586-018-0637-6

Barrett, L. F., Quigley, K. S., Bliss-Moreau, E., and Aronson, K. R. (2004). Interoceptive sensitivity and self-reports of emotional experience. J. Personal. Soc. Psychol. 87, 684–697. doi: 10.1037/0022-3514.87.5.684

Chen, W. G., Schloesser, D., Arensdorf, A. M., Simmons, J. M., Cui, C., Valentino, R., et al. (2021). The emerging science of interoception: sensing, integrating, interpreting, and regulating signals within the self. Trends Neurosci. 44, 3–16. doi: 10.1016/j.tins.2020.10.007

Chong, J. S. X., Ng, G. J. P., Lee, S. C., and Zhou, J. (2017). Salience network connectivity in the insula is associated with individual differences in interoceptive accuracy. Brain Struct. Funct. 222, 1635–1644. doi: 10.1007/s00429-016-1297-7

Christensen, J. F., Flexas, A., Calabrese, M., Gut, N. K., and Gomila, A. (2014). Moral judgment reloaded: a moral dilemma validation study. Front. Psychol. 5:607. doi: 10.3389/fpsyg.2014.00607

Coricelli, G., Critchley, H. D., Joffily, M., O’Doherty, J. P., Sirigu, A., and Dolan, R. J. (2005). Regret and its avoidance: a neuroimaging study of choice behavior. Nat. Neurosci. 8, 1255–1262. doi: 10.1038/nn1514

Craig, A. D. (2009). How do you feel – now? The anterior insula and human awareness. Nat. Rev. Neurosci. 10, 59–70. doi: 10.1038/nrn2555

Critchley, H. D., Wiens, S., Rotshtein, P., Ohman, A., and Dolan, R. J. (2004). Neural systems supporting interoceptive awareness. Nat. Neurosci. 7, 189–195. doi: 10.1038/nn1176

Da Silva, S., Matsushita, R., and De Sousa, M. (2016). Utilitarian moral judgments are cognitively too demanding. Open Access Library J. 3, 1–9. doi: 10.4236/oalib.1102380

Damasio, A. R. (1994). Descartes’ Error: Emotion, Reason, and the Human Brain. New York: Putnam.

Desmedt, O., Luminet, O., and Corneille, O. (2018). The heartbeat counting task largely involves non-interoceptive processes: evidence from both the original and an adapted counting task. Biol. Psychol. 138, 185–188. doi: 10.1016/j.biopsycho.2018.09.004

Domschke, K., Stevens, S., Pfleiderer, B., and Gerlach, A. L. (2010). Interoceptive sensitivity in anxiety and anxiety disorders: an overview and integration of neurobiological findings. Clin. Psychol. Rev. 30, 1–11. doi: 10.1016/j.cpr.2009.08.008

Garfinkel, S. N., Seth, A. K., Barrett, A. B., Suzuki, K., and Critchley, H. D. (2015). Knowing your own heart; distinguishing interoceptive accuracy from interoceptive awareness. Biol. Psychol. 104, 65–74. doi: 10.1016/j.biopsycho.2014.11.004

Gilbert, D. T., and Wilson, T. D. (2009). Why the brain talks to itself: sources of error in emotional prediction. Philos. Trans. R. Soc. B Biol. Sci. 364, 1335–1341. doi: 10.1098/rstb.2008.0305

Gilbert, D. T., Morewedge, C. K., Risen, J. L., and Wilson, T. D. (2004). Looking forward to looking backward: the misprediction of regret. Psychol. Sci. 15, 346–350. doi: 10.1111/j.0956-7976.2004.00681.x

Greene, J. D. (2009). Dual-process morality and the personal/impersonal distinction: a reply to McGuire, Langdon, Coltheart, and Mackenzie. J. Exp. Soc. Psychol. 45, 581–584. doi: 10.1016/j.jesp.2009.01.003

Greene, J. D. (2013). Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. New York: Penguin Press.

Greene, J. D., Cushman, F. A., Stewart, L. E., Lowenberg, K., Nystrom, L. E., and Cohen, J. D. (2009). Pushing moral buttons: the interaction between personal force and intention in moral judgment. Cognition 111, 364–371. doi: 10.1016/j.cognition.2009.02.001

Greene, J., and Haidt, J. (2002). How (and where) does moral judgment work? Trends Cogn. Sci. 6, 517–523. doi: 10.1016/S1364-6613(02)02011-9

Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., and Cohen, J. D. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron 44, 389–400. doi: 10.1016/j.neuron.2004.09.027

Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., and Cohen, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science 293, 2105–2108. doi: 10.1126/science.1062872

Gu, X., and FitzGerald, T. H. B. (2014). Interoceptive inference: homeostasis and decision-making. Trends Cogn. Sci. 18, 269–270. doi: 10.1016/j.tics.2014.02.001

Hänsel, A., and von Känel, R. (2008). The ventro-medial prefrontal cortex: a major link between the autonomic nervous system, regulation of emotion, and stress reactivity? BioPsychoSocial Med. 2:21. doi: 10.1186/1751-0759-2-21

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Koenigs, M., Kruepke, M., Zeier, J., and Newman, J. P. (2012). Utilitarian moral judgment in psychopathy. Soc. Cogn. Affect. Neurosci. 7, 708–714. doi: 10.1093/scan/nsr048

Lenggenhager, B., Azevedo, R. T., Mancini, A., and Aglioti, S. M. (2013). Listening to your heart and feeling yourself: effects of exposure to interoceptive signals during the ultimatum game. Exp. Brain Res. 230, 233–241. doi: 10.1007/s00221-013-3647-5

McGuire, J., Langdon, R., Coltheart, M., and Mackenzie, C. (2009). A reanalysis of the personal/impersonal distinction in moral psychology research. J. Exp. Soc. Psychol. 45, 577–580. doi: 10.1016/j.jesp.2009.01.002

Montoya, P., Schandry, R., and Müller, A. (1993). Heartbeat evoked potentials (HEP): topography and influence of cardiac awareness and focus of attention. Electroencephal. Clin. Neurophysiol. 88, 163–172. doi: 10.1016/0168-5597(93)90001-6

Moretto, G., Làdavas, E., Mattioli, F., and Di Pellegrino, G. (2009). A psychophysiological investigation of moral judgment after ventromedial prefrontal damage. J. Cogn. Neurosci. 22, 1888–1899. doi: 10.1162/jocn.2009.21367

Nakao, T., Ohira, H., and Northoff, G. (2012). Distinction between externally vs. internally guided decision-making: operational differences, meta-analytical comparisons and their theoretical implications. Front. Neurosci. 6:31.

Nentjes, L., Meijer, E., Bernstein, D., Arntz, A., and Medendorp, W. (2013). Brief communication: investigating the relationship between psychopathy and interoceptive awareness. J. Personal. Dis. 27, 617–624. doi: 10.1521/pedi_2013_27_105

Padoa-Schioppa, C. (2011). Neurobiology of economic choice: a good-based model. Ann. Rev. Neurosci. 34, 333–359. doi: 10.1146/annurev-neuro-061010-113648

Patil, I., and Silani, G. (2014). Reduced empathic concern leads to utilitarian moral judgments in trait alexithymia. Front. Psychol. 5:501. doi: 10.3389/fpsyg.2014.00501

Peirce, J. W., Gray, J. R., Simpson, S., MacAskill, M. R., Höchenberger, R., Sogo, H., et al. (2019). PsychoPy2: experiments in behavior made easy. Behav. Res. Methods 51, 195–203. doi: 10.3758/s13428-018-01193-y

Piech, R. M., Strelchuk, D., Knights, J., Hjälmheden, J. V., Olofsson, J. K., and Aspell, J. E. (2017). People with higher interoceptive sensitivity are more altruistic, but improving interoception does not increase altruism. Sci. Rep. 7, 1–5. doi: 10.1038/s41598-017-14318-8

Pollatos, O., and Schandry, R. (2010). Accuracy of heartbeat perception is reflected in the amplitude of the heartbeat-evoked brain potential. Psychophysiology 41, 476–482. doi: 10.1111/1469-8986.2004.00170.x

Schandry, R. (1981). Heart beat perception and emotional experience. Psychophysiology 18, 483–488. doi: 10.1111/j.1469-8986.1981.tb02486.x

Sevdalis, N., and Harvey, N. (2007). Biased forecasting of post decisional affect. Psychol. Sci. 18, 678–681. doi: 10.1111/j.1467-9280.2007.01958.x

Shah, P., Hall, R., Catmur, C., and Bird, G. (2016). Alexithymia, not autism, is associated with impaired interoception. Cortex 81, 215–220. doi: 10.1016/j.cortex.2016.03.021

Sokol-Hessner, P., Hartley, C. A., Hamilton, J. R., and Phelps, E. A. (2015). Interoceptive ability predicts aversion to losses. Cogn. Emotion 29, 695–701. doi: 10.1080/02699931.2014.925426

Szekely, R. D., and Miu, A. C. (2015). Incidental emotions in moral dilemmas: the influence of emotion regulation. Cogn. Emotion 29, 64–75. doi: 10.1080/02699931.2014.895300

Tassy, S., Oullier, O., Mancini, J., and Wicker, B. (2013). Discrepancies between judgment and choice of action in moral dilemmas. Front. Psychol. 4:250. doi: 10.3389/fpsyg.2013.00250

Terasawa, Y., Fukushima, H., and Umeda, S. (2013). How does interoceptive awareness interact with the subjective experience of emotion? An fMRI study. Hum. Brain Mapp. 34, 598–612. doi: 10.1002/hbm.21458

Thomson, J. J. (1986). Rights, Restitution, and Risk: Essays in Moral Theory. Cambridge, MA: Harvard University Press.

Weiss, S., Sack, M., Henningsen, P., and Pollatos, O. (2014). On the interaction of self-regulation, interoception and pain perception. Psychopathology 47, 377–382. doi: 10.1159/000365107

Werner, N. S., Jung, K., Duschek, S., and Schandry, R. (2009). Enhanced cardiac perception is associated with benefits in decision-making. Psychophysiology 46, 1123–1129. doi: 10.1111/j.1469-8986.2009.00855.x

Zaki, J., Davis, J. I., and Ochsner, K. N. (2012). Overlapping activity in anterior insula during interoception and emotional experience. Neuroimage 62, 493–499. doi: 10.1016/j.neuroimage.2012.05.012

Zamariola, G., Maurage, P., Luminet, O., and Corneille, O. (2018). Interoceptive accuracy scores from the heartbeat counting task are problematic: evidence from simple bivariate correlations. Biol. Psychol. 137, 12–17.

Zeelenberg, M., van de Bos, K., van Dijk, E., and Pieters, R. (2002). The inaction effect in the psychology of regret. J. Personal. Soc. Psychol. 82, 314–327. doi: 10.1037/0022-3514.82.3.314

Keywords: moral decision-making, choice, interoception, interoceptive accuracy (IAcc), emotion, regret

Citation: Tamura K, Kobayashi Y and Ohira H (2022) Interoceptive Accuracy Did Not Affect Moral Decision-Making, but Affect Regret Rating for One’s Moral Choices. Front. Psychol. 12:746897. doi: 10.3389/fpsyg.2021.746897

Received: 25 July 2021; Accepted: 20 December 2021;
Published: 11 February 2022.

Edited by:

Michio Nomura, Kyoto University, Japan

Reviewed by:

Jane Elizabeth Aspell, Anglia Ruskin University, United Kingdom
Sergio Da Silva, Federal University of Santa Catarina, Brazil

Copyright © 2022 Tamura, Kobayashi and Ohira. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Kaho Tamura, tamuraka4@gmail.com
