REVIEW article

Front. Psychol., 11 October 2017
Sec. Theoretical and Philosophical Psychology

Why and How. The Future of the Central Questions of Consciousness

  • 1National Institute of Mental Health, Klecany, Czechia
  • 2Third Faculty of Medicine, Charles University, Prague, Czechia

In this review, we deal with the two central questions of consciousness, how and why, and we outline their possible future development. The question how refers to the empirical endeavor to reveal the neural correlates and mechanisms that form consciousness. The question why, on the other hand, generally refers to the “hard problem” of consciousness, which claims that empirical science will always fail to provide a satisfactory answer to the question of why there is conscious experience at all. Unfortunately, the hard problem of consciousness will probably never completely disappear, because it will always have its most committed supporters. However, there is a good chance that its weight and importance will be greatly reduced by empirically tackling consciousness in the near future. We expect that the future empirical study of consciousness will be based on a unifying brain theory and will answer the question of what the function of conscious experience is, which will in turn displace the implications of the hard problem. The candidate for such a unifying brain theory is predictive coding, which will have to explain both perceptual consciousness and conscious mind-wandering in order to become a truly unifying theory of brain functioning.

Central Questions of Consciousness

Most articles and books dealing with consciousness begin with a variation of the same sentence, which roughly claims that conscious experience is the most intimate part of ourselves and also, probably, the most elusive problem of empirical science (e.g., Chalmers, 1996, 1999). Over time, variations of this statement and its emphasis on elusiveness have almost become the universal definition of consciousness, accompanied by methodological and conceptual skepticism (e.g., Levine, 1983; Chalmers, 1996). Even though there are many questions regarding the nature of consciousness, only the questions why and how are considered the central questions of consciousness.

The question of why there is consciousness is closely connected to the philosophy of mind, or more precisely, to the ideas of non-reductive or even dualistic philosophers of mind. Through several epistemic arguments, such philosophers introduced a major epistemic gap between physical and phenomenal properties, which, according to Chalmers, represents “the central mystery of consciousness” (Chalmers, 2003, p. 104). This epistemic gap, which concerns the nature of conscious experience, has been profoundly discussed by philosophers since Nagel (1974) used the metaphor of a bat’s mind in his essential article defending the irreducibility of conscious experience. Reduction is one of the fundamental principles of empirical science, and since conscious experience defies it, the idea of an explanatory gap (Levine, 1983) between consciousness and empirical science began to take root. The idea of the explanatory gap deepened with the help of several thought experiments, such as the famous knowledge argument (Jackson, 1982), inverted spectrum scenarios (Shoemaker, 1982; Block, 1990) and philosophical zombies (Chalmers, 1996), and it finally resulted in the formulation of a distinction between the easy problems of consciousness and the hard problem of consciousness (Chalmers, 1996).

The easy problems of consciousness are those that seem directly susceptible to the standard methods of cognitive science, whereby a phenomenon is explained in terms of computational or neural mechanisms. The hard problem, on the other hand, refers to the nature, mechanism and role of subjective experience, such as the quality of deep blue (Chalmers, 1996). Hence, the hard problem of consciousness is the equivalent of the philosophically oriented question why. Why is the neural activity of the brain accompanied by conscious experience, despite the fact that philosophical zombies (beings with adaptive behavior but without consciousness) are logically possible? Empirical science cannot hope to give a satisfactory answer to this question because it only operates within the scope of neural mechanisms (the easy problems), which are incapable of explaining why brain activity is accompanied by subjective experience (Chalmers, 1996).

Even though philosophical work in this area is interesting and quite popular, it can lead to the very dangerous conclusion that an empirical explanation of consciousness is utterly impossible. Against this conclusion stand physicalist or even eliminativist philosophers, such as Churchland (1989), Dennett (1993, 2003, 2005), Bickle (2003) and many others. For philosophers of this orientation there is no such thing as an epistemic gap between physical and phenomenal properties, and everything about consciousness can and will be explained by empirical science in the future. Some thinkers even attempted to save consciousness as a topic from the philosophical restrictions of dualism (Brook, 2005) and reoriented themselves to the question how.

How consciousness is realized by neural mechanisms, or how we become conscious (Prinz, 2005, 2012), became the central question for the empirically oriented science of consciousness, whose main goal is to identify the correlates of consciousness (Chalmers, 2000; Metzinger, 2000) as the first step toward describing the neural mechanisms directly responsible for the emergence of conscious experience.

The history of the empirical science of consciousness is full of proposed mechanisms. To name a few, Crick and Koch (1990) believed that the synchrony of gamma oscillations in the cerebral cortex is the key to binding and conscious experience. Edelman (1990, 2005) proposed thalamo-cortical reentrant loops as mechanisms of consciousness. Bogen (1995) argued for the activity of the intralaminar nuclei of the thalamus. According to Llinas et al. (1994), consciousness emerges due to synchronization of thalamocortical activity below 40 Hz. Newman and Baars (1993) promoted the thalamus and its reticular nucleus as the central hub for consciousness and conscious attention. Baars’s early ideas were also further developed by Dehaene and Naccache (2001), who support the global neuronal workspace model, which claims that consciousness results from the activity of the primary sensory cortex together with frontal and parietal areas. Prinz (2005, 2012) recently proposed his AIR theory of consciousness, based on the idea that consciousness emerges when information becomes available to working memory through attention, which is associated with gamma synchrony.

From the above it seems that the thalamus and gamma synchrony have been considered the primary candidates for the correlates of consciousness: the thalamus as the hot-spot location for consciousness, and gamma as the neural mechanism that renders neural representations conscious. Both concepts seem plausible. The thalamus is crucial in transmitting information from the external environment to the cortex, which itself receives relatively little other input (Saalmann and Kastner, 2011). Gamma synchrony accompanies stimulus selection under a binocular rivalry paradigm (Engel et al., 1999) and was reported during conscious detection in masking experiments (Summerfield et al., 2002; Gaillard et al., 2009). Moreover, synchronous gamma oscillations were also reported to accompany conscious auditory perception (Steinmann et al., 2014), and some even propose that gamma stands behind olfactory consciousness (Mori et al., 2013). Large-scale gamma-band synchrony was also proposed as the critical process behind conscious perception by Doesburg et al. (2009), Melloni and Singer (2010), Prinz (2012), and others. Another proposed candidate, related to the global neuronal workspace theory, is the P300. The P300 can be elicited in tasks in which subjects attend to and discriminate stimuli that differ from one another in some dimension. Such discrimination produces a large positive waveform with a modal latency of about 300 ms (Sutton et al., 1965), reflecting cortical post-synaptic neuroelectric activity related to cognitive processes such as attention, stimulus evaluation and categorization. In global neuronal workspace theory, the P300 is considered a correlate of conscious access (Babiloni et al., 2006; Del Cul et al., 2007; Lamy et al., 2009; Dehaene and Changeux, 2011; Salti et al., 2012; King et al., 2014).

However, all of these candidates (the thalamus, gamma synchrony and the P300) have been severely criticized lately (Aru and Bachmann, 2015; Koch et al., 2016). Conscious experience can occur without the thalamus in the form of hypnagogic hallucinations or hypnagogic experiences, which are present while falling asleep. During this phase, thalamic deactivation precedes cortical deactivation, leaving the cortex still immersed in complex activity with direct relevance to conscious experience (hypnagogic experiences) (Magnin et al., 2010). Gamma-band synchronization can occur during phases of suppressed conscious perception (Vidal et al., 2014). The P300 and gamma synchrony are not reported during task-irrelevant trials but are present and robust for task-relevant stimuli, which apparently means that these candidates are responsible for post-perceptual processes and not for visual consciousness (Pitts et al., 2014a).

This gradual development is central to the question how. Most conclusions about consciousness are based on paradigms directed mainly toward visual consciousness, such as binocular rivalry and visual masking. Binocular rivalry refers to the phenomenon of ongoing competition for dominance between two different visual patterns, each perceived by one eye separately. Both are represented in the brain, but only one is exclusively dominant for a few seconds and is thus thrown into the stream of consciousness (Klink et al., 2013). Visual masking is another tool often used in consciousness studies. Masking refers to a specially created condition under which the perception of a target stimulus is impaired by another (masking) stimulus presented in temporal and spatial proximity. Without the masking stimulus, the target stimulus would be perceived clearly and consciously (Ansorge et al., 2007; Bachmann and Francis, 2013). The major limitation of both approaches lies in the fact that it is not clear to what extent their results can be generalized to the entire field of conscious experience.

Moreover, the science of consciousness is now experiencing the emergence of new topics that aim to isolate the genuine correlates of consciousness from related unconscious or other brain processing. These new topics include the neural prerequisites and consequences of consciousness (Aru et al., 2012), as well as the reporting of conscious perception and its isolation from the actual correlates of consciousness (Pitts et al., 2014a,b). The notion of neural prerequisites and consequences of consciousness emerged from the critical idea that classical paradigms for studying consciousness relying on the contrast between conscious and unconscious perception (e.g., binocular rivalry) do not reveal genuine correlates of consciousness, since they do not take into account possible unconscious prerequisites and consequences. Classical paradigms can therefore, from a methodological point of view, never reveal the correlates of consciousness, because they confound the NCC with other processes (Aru et al., 2012; Aru and Bachmann, 2015). Another topic that aims to clarify the possible confounding of the NCC is reporting. Subjective experience is private by definition, and therefore conscious perception requires a report, such as a button-press. Unfortunately, the neural processing of such reports can also be confounded with the genuine correlates of consciousness (Pitts et al., 2014a,b).

The science of consciousness is now experiencing a proliferation of topics that will hopefully clarify the study of conscious experience and support a materialistically oriented understanding of conscious experience without epistemic gaps. However, another new topic that cannot be neglected is conscious spontaneous cognition, or so-called mind-wandering.

Mind-Wandering

Mind-wandering refers to internally directed cognition that is stimulus-independent and task-unrelated, emerges spontaneously, and represents a significant part of our daily lives (Killingsworth and Gilbert, 2010). An interesting description, which defines mind-wandering more closely in relation to other types of internally directed cognition, was presented by Christoff et al. (2016):

“…mind-wandering can be defined as a special case of spontaneous thought that tends to be more-deliberately constrained than dreaming, but less-deliberately constrained than creative thinking and goal-directed thought. In addition, mind-wandering can be clearly distinguished from rumination and other types of thought that are marked by a high degree of automatic constraints, such as obsessive thought” (Christoff et al., 2016).

The scientific interest in this type of internally oriented mental state is not completely new (e.g., Singer, 1966; Antrobus, 1968; Antrobus et al., 1970; Giambra, 1989). However, the topic of mind-wandering has experienced a new wave of interest due to the fact that this type of cognition is accompanied by the intrinsic activity of the now well-known default mode network (DMN).

At the end of the 20th century, a series of unexpected discoveries of intrinsic brain activity was made (Biswal et al., 1995; Shulman et al., 1997). Several years later, intrinsic brain activity was attributed to the influential concept of the DMN (Greicius et al., 2003; Raichle and Snyder, 2007). The DMN is a large-scale brain network coupling the medial prefrontal cortex, ventral medial prefrontal cortex, medial and lateral temporal cortex, posterior inferior parietal lobule, and precuneus/posterior cingulate cortex. This network not only caused an essential scientific revolution in fMRI (Havlík, 2017) but also raised a new topic for studies of the conscious mental contents associated with DMN activity. To name a few: self-referential thinking (Gusnard and Raichle, 2001), imagining the future (Buckner et al., 2008), remembering and imagining the future (Addis et al., 2007), social cognition (Mars et al., 2012), theory of mind (Buckner et al., 2008; Nekovarova et al., 2014), spontaneous cognition (Andrews-Hanna et al., 2010), mental time travel (Østby et al., 2012), the internal train of thought (Smallwood et al., 2012) and mind-wandering (Mason et al., 2007), which is now considered the umbrella term for all of the above (Callard et al., 2013).

This new ‘era of wandering minds’ (Callard et al., 2013) has not yet broken into the science of consciousness. However, in the future it will be necessary to unify mind-wandering with consciousness in order to satisfactorily answer the question how. According to some, we spend about 20–50% of our time in mind-wandering experiences (Killingsworth and Gilbert, 2010; Song and Wang, 2012), a substantial part of our subjective experience that cannot be neglected. Mind-wandering represents the other side of the coin of conscious experience, alongside perceptual (mainly visual) consciousness, which has been the most studied topic in the science of consciousness. Identifying the mechanisms that accompany mind-wandering and render spontaneous mental states conscious will be necessary for any new universal theory of consciousness that would explain both perceptual consciousness and conscious mind-wandering.

One of the first methodological steps toward unifying consciousness with mind-wandering would be to examine whether candidates for conscious experience such as the thalamus, gamma band and P300 are also detectable under mind-wandering conditions. Interestingly, the thalamus, the best-known cerebral region of perceptual consciousness, is not part of the DMN (from a functional connectivity point of view) and is not reported as being active during states of mind-wandering. The celebrity of intrinsic brain activity and the DMN is the precuneus/PCC, which is considered the central region of the DMN (Fransson and Marrelec, 2008; Utevsky et al., 2014). The precuneus/PCC, a major connectivity hub of brain organization, is responsible for a high degree of information integration due to its great number of connections with other regions, namely association and paralimbic areas (Leech and Sharp, 2014). The precuneus/PCC was proposed as a main candidate for conscious cognitive information processing (Vogt and Laureys, 2005), and it was more recently suggested that it could be very important for balancing internal and external attentional focus (Leech and Sharp, 2014). This could be interesting for proponents of the idea that attention is necessary and sufficient for consciousness. The precuneus/PCC is also close to the ‘posterior cortical hot zone’ (Koch et al., 2016), which was recently proposed as the correlate of conscious experience.

It may be tempting to suggest that the precuneus/PCC is a ‘new candidate’ for consciousness. However, that alone could be misleading and would certainly not help extinguish the elusive nature of consciousness. To clear up this elusiveness, it would be necessary to gradually categorize the neural correlates (and later mechanisms) of both sides of conscious experience: perceptual consciousness and conscious mind-wandering. Given the dichotomy between externally oriented perceptual consciousness and conscious mind-wandering, it is necessary to mention that both these types of our conscious life consist of the same forms of mental states, such as thoughts, emotions, beliefs, sensory sensations, and body feelings. The only (but principal) difference is that perceptual consciousness results from the instant processing of external stimuli, whereas mind-wandering is formed by internal manipulations of previous experiences and different levels of their neural representations. This dichotomy is also supported by the antagonistic nature of DMN activity toward the activity of the central executive network (CEN), which represents attentional systems oriented toward the external environment (Gusnard and Raichle, 2001; Raichle et al., 2001; Raichle and Snyder, 2007) and thus relates to perception. If the DMN is active, then the CEN is partially disengaged, and vice versa. This antagonism between the DMN and the CEN occurs under natural conditions; however, these networks can be coupled under demanding conditions, such as goal-directed cognition (Spreng et al., 2010).

Also, determining the mechanisms of ‘attention switching’ between perceptual consciousness and mind-wandering would be very helpful for clarifying this elusiveness and the question how. The recently described salience network (SN) is a hot candidate for this mechanism. The SN consists of the anterior insula and the anterior cingulate cortex, and its main functions are the detection of salient stimuli and dynamic switching between the DMN and the central executive network (Menon and Uddin, 2010).

Conscious mind-wandering is an important part of conscious experience, and finding its mechanisms will be essential for a universal theory of consciousness, one that provides a satisfactory answer to the central question how.

Even though an answer to the question how would be a great scientific discovery, it alone would not be the complete package. To get the ‘complete package of consciousness’, the second central question of consciousness, why, should not be neglected.

Future of the Question Why

The future development of the question how is clear, and the question itself does not need any serious revision. However, the same does not hold for the question why. Unfortunately, we believe there is little hope that the most committed proponents of the hard problem will ever be completely satisfied with the conclusions of empirical science, since the core argument of the hard problem is aimed at the endeavors of empirical science in the first place:

“If we show how a neural or computational mechanism does the job, we have explained learning. We can say the same for other cognitive phenomena, such as perception, memory, and language. (…) When it comes to conscious experience, this sort of explanation fails” (Chalmers, 1999, p. 12).

Proponents of the hard problem are caught in their own intuitions, which are “invulnerable (as) bedrock, an insight so obvious and undeniable that no argument could make it tremble, let alone shatter it” (Dennett, 2013, p. 312).

Even though the hard problem is very seductive for some philosophers, we believe that its strength will be seriously reduced in the near future. We have already mentioned that the question how needs no revision, but this does not apply to the question why. The first step toward this revision is to reject the ‘logical possibility of zombies’ when asking why there is conscious experience. The second step is to redirect the question of why there is conscious experience toward the function that conscious experience plays in the overall neural machinery. Speaking of function, we should immediately clarify what we mean: the function of consciousness should be understood strictly in the sense of physicalism or even eliminativism (Type-A materialism; see Chalmers, 2003), without any teleological implication. Such an account places the function of conscious experience among the other neural functions, without any privilege or special importance, and thus closes or even eliminates the supposed epistemic gap. Unfortunately, there is no clear understanding or general consensus among philosophers and neuroscientists about the function of consciousness. This is one of the main reasons why consciousness still represents such an elusive problem, which has its roots in the fact that there has never been an articulated function of consciousness based on, and supported by, a unifying brain theory.

Such a unifying theory is emerging in the form of the predictive coding framework (PC) (Rao and Ballard, 1999) and is gaining more and more support (Hohwy, 2012, 2013; Clark, 2013, 2016; Hobson and Friston, 2014). This line of thought, which can be traced back to Immanuel Kant (Kant, 1996; Swanson, 2016), is based on Hermann von Helmholtz’s idea that the brain is mainly a predictive inference machine (Von Helmholtz, 1866/1925; Dayan et al., 1995). Over time, this predictive framework has been given several names, such as the Bayesian brain (Knill and Pouget, 2004), the free-energy principle (Friston, 2009) and prediction error minimization (Hohwy, 2013). PC claims that perception is not ‘directly’ related to the external environment but is created endogenously (Gregory, 1968), implying that the brain does not have to exhaustively reconstruct the environment solely by a bottom–up analysis of sensory inputs (Panichello et al., 2012). Top–down predictions, known as priors or prior knowledge, are tested against sensory inputs, which are considered sensory evidence in the framework of PC. The differences between top–down predictions (priors) and bottom–up sensory evidence are called prediction errors. If a prediction error is encountered, the prior has to be updated according to the sensory evidence, which results in posterior beliefs. The brain of a living organism creates and uses these internal models to predict sensory inputs in order to minimize its free-energy (entropy), which is essentially a measure of surprise about sensory inputs (Friston, 2010; Friston et al., 2012; Clark, 2016).
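The prior–error–posterior cycle just described can be sketched in a few lines. The following toy example assumes simple scalar Gaussian beliefs (a common textbook simplification, not a claim about the actual neural implementation); the function name and numbers are ours, chosen only for illustration:

```python
# Toy sketch of one predictive-coding belief update under Gaussian
# assumptions: a top-down prediction (prior) is compared with bottom-up
# sensory evidence, and the mismatch (prediction error) moves the prior
# toward the evidence in proportion to their relative reliabilities.

def update_belief(prior_mean, prior_var, evidence_mean, evidence_var):
    """One precision-weighted update; returns the posterior belief."""
    prediction_error = evidence_mean - prior_mean
    # Precision = 1/variance; the gain says how much of the error
    # the agent should "trust" and use to correct its prior.
    gain = prior_var / (prior_var + evidence_var)
    posterior_mean = prior_mean + gain * prediction_error
    posterior_var = 1.0 / (1.0 / prior_var + 1.0 / evidence_var)
    return posterior_mean, posterior_var

# An uncertain prior repeatedly exposed to the same sensory evidence:
mu, var = 0.0, 1.0
for _ in range(5):
    mu, var = update_belief(mu, var, evidence_mean=2.0, evidence_var=0.5)
print(round(mu, 3), round(var, 3))
```

Repeated exposure to the same input shrinks both the prediction error and the posterior variance, which is one simple sense in which such an agent becomes progressively less "surprised" by its environment.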

The PC account of brain function explains perception, attention, action and other brain functions (Hohwy, 2013), and also stands as the unifying theory for other global theories of the brain (Friston, 2010). Nevertheless, for PC to be a complete and truly unifying theory of the brain, it will have to include both parts of conscious experience – perceptual consciousness and conscious mind-wandering, and explain their function in the terms of prediction error minimization. Only this will provide a satisfactory answer to the revised question why. A similar vision is shared by Clark (2016):

“Using this integrated apparatus might we, inch-by-inch and phenomenon-by-phenomenon, begin to solve the so-called ‘hard problem’ of conscious experience (…).”

Authors now widely accept that inferences and prior knowledge are crucial to understanding consciousness. It is believed that priors shape and determine the contents of consciousness and can even accelerate their access to the stream of consciousness (Melloni et al., 2011). These claims are not exaggerated. Evidence is being gathered that PC has consequences for conscious experience: priors help resolve ambiguous visual inputs (Panichello et al., 2012), and predictions accelerate the time of conscious access (Melloni et al., 2011; Chang et al., 2015). In addition, well-known visual paradigms such as continuous flash suppression (Hohwy, 2013) and binocular rivalry (Hohwy et al., 2008; Denison et al., 2011; Clark, 2013, 2016) are now being reconsidered from the methodological and explanatory perspective of PC.

However, these authors have so far been very reticent about how exactly consciousness fits into the PC framework. For now, there is an emerging idea that understands consciousness as the final result of unconscious processing. According to Hohwy (2013), perceptual consciousness fits into PC theory as the ‘upshot’ or ‘conclusion’ of unconscious perceptual inferences. Melloni (2014) also expects that inferential processes are conducted behind the curtain of consciousness and that what we experience are the ‘outcomes’ or ‘results’ of an unconscious inferential process. Lamme (2015) agrees with the idea that consciousness is based on the relation, or more precisely the conjunction, of current inputs and priors, which together produce posterior beliefs. Karl Friston agrees to a certain extent with this interpretation: according to Hobson and Friston (2016), consciousness is not a phenomenon but a process, a process of optimizing beliefs through inference.

Consciousness may emerge from the interaction of unconscious streams – top–down predictive processing and unconscious bottom–up sensory processing. But what is the nature of the contents of conscious experience?

There are two possible but strictly opposed accounts of the relation between prediction error and consciousness. At first glance, the contents of subjective experience could be understood as composed solely of prediction errors, which matter most for the updating of priors (adaptive learning). What is repeatedly and correctly predicted is fully adaptive and thus would not need to be perceived or be part of conscious experience. This would, however, mean that conscious agents subjectively operate in constant surprise and constant entropy. It is therefore more likely that the contents of conscious perception are formed by the representations that best predict the environment and sensory inputs. Correctly predicted/inferred states are fully adaptive; such contents remain part of conscious experience, but there is no need for them to be specially addressed within it. Because correctly predicted states are not needed for the updating of priors, they are evaluated as dull and not salient. What stands out in the stream of conscious experience as the most salient states are prediction errors. Conscious experience is not composed purely of prediction errors; rather, prediction errors that cannot be resolved unconsciously are the most salient inputs and are privileged in the stream of conscious experience. Prediction errors that stand out from other inferred states mark priors about to be updated. This could be the future answer to the question of the function of conscious experience.
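To make this speculation concrete, the following sketch is purely illustrative (the labels, numbers, and the simple absolute-error salience measure are invented for the example, not an established model): every inferred content remains in experience, but contents are ranked by the magnitude of their residual prediction error, so perfectly predicted states stay dull while large unresolved errors claim the foreground.

```python
# Purely illustrative: rank conscious contents by residual prediction
# error, as the second account in the text speculates.

def rank_by_salience(contents):
    """contents: iterable of (label, predicted, observed) tuples."""
    scored = [(label, abs(observed - predicted))
              for label, predicted, observed in contents]
    # Larger unresolved prediction error -> more salient in experience.
    return sorted(scored, key=lambda item: item[1], reverse=True)

scene = [
    ("wall colour", 0.9, 0.9),    # perfectly predicted: present but dull
    ("ticking clock", 0.5, 0.6),  # small error: barely noticed
    ("loud bang", 0.1, 1.0),      # large error: claims the foreground
]
for label, salience in rank_by_salience(scene):
    print(label, round(salience, 2))
```

Note that nothing is dropped from the ranking: all contents remain "in" experience, and only their ordering by salience changes, matching the claim that consciousness is not composed purely of prediction errors.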

Even though conscious perception already has its place in PC theory, conscious mind-wandering is still mostly left out of this discussion. On the one hand, this is understandable, since the resurrection of this theme occurred only a few years ago thanks to the DMN; on the other hand, a future synthesis of these themes will be necessary. If PC does not include conscious mind-wandering in its explanation of brain behavior, it will not only fail to explain consciousness, it will also fail as a global unifying brain theory.

In PC theory, every mental state of an agent is considered a result of self-preservation processes, and such a theory cannot neglect inner conscious states, which represent a large part of an agent’s mental life. Recently, PC was put into the context of sleep and dreaming consciousness, whose function in this view is the optimization of the internal model of the world by removing its redundant parameters and thus minimizing its complexity (Hobson and Friston, 2012, 2014). The PC framework has also been applied to interoception, i.e., the “sense of internal physiological condition of the body” (Seth, 2014, p. 1). In this view, emotional feelings can be understood as interoceptive inferences: “Emotional content is determined by a suite of hierarchically organized generative models that predict interoceptive responses to both external stimuli and the internal signals controlling bodily physiology” (Seth and Critchley, 2013, p. 48). Such a combination of top–down inferences on the external (exteroceptive) and internal (interoceptive) causes of changes in bodily signals constitutes the basis of the experience of selfhood (Seth, 2013). These authors propose a mechanistic model of the subjective sense of presence, a basic property of normal conscious experience, which is related to the activity of the anterior insular cortex (Seth et al., 2012).

Another recent hypothesis says that, in order to reduce free-energy, the brain uses spontaneous activity to obtain maximum information transfer while minimizing metabolic costs as much as possible (Tozzi et al., 2016). The basis of PC is that the brain is directed toward the construction of its own prior expectations of the world in order to reduce its free-energy. Apparently, the same goes for neural systems that are hierarchically higher: these systems also try to suppress the free-energy of hierarchically lower neural systems. The DMN is supposed to occupy the highest point in this neural hierarchy and, therefore, provides top–down control of so-called attentional networks (in gamma power) and minimizes their free-energy (Carhart-Harris and Friston, 2010).

There are two ways agents can suppress free-energy (surprise): by changing sensory input through acting on external states, or by modifying their internal states (Friston, 2010; Tozzi et al., 2016). The DMN, consisting of memory regions (Buckner et al., 2008) and mainly active during resting states, is known to be associated with many mind-wandering mental states and mental time travel (e.g., Andrews-Hanna et al., 2010; Østby et al., 2012; Smallwood et al., 2012). The function of conscious mind-wandering could be the active, conscious modification and optimization of internal models of the world, not by reducing their complexity (Hobson and Friston, 2012), but by creating plausible priors/predictions (based on memories) to reduce possible free-energy in the future. In this respect, mind-wandering would have the opposite role to sleep and dream consciousness, which are thought to minimize the complexity of inner models (Friston, 2012; Hobson and Friston, 2014).

Daniel Dennett once spoke about so-called Popperian creatures, who possess an inner environment that permits them to sacrifice their hypotheses in their stead and, on this basis, to take the best possible future actions (Dennett, 1997). This seems close to the idea of the brain as a predictive machine that can “generate virtual realities (hypotheses) to test against sensory evidence” (Hobson and Friston, 2012). Conscious mind-wandering may be the by-product of a brain that does not want to wait for a future surprise and, therefore, tries to minimize it during the resting states (periods of training) that accompany endogenous activity.

Karl Friston said, “the future of imaging neuroscience is to refine models of the brain to minimize free-energy, where the brain refines models of the world to minimize free-energy” (Friston, 2012, p. 1230). If free-energy minimization is the brain’s main function, how else should the brain spend its resting states than in active inference, testing its hypotheses/models against possible scenarios (future external states) and thereby optimizing its priors? In PC, the brain is understood as a good model of its environment, but this model should be flexible rather than rigid, so the brain should actively infer the many possibilities that could occur in its environment. Rehearsing learned facts about the world and testing new hypotheses against them in the safety of conscious mind-wandering states seems quite plausible. The possible function of conscious mind-wandering would therefore be the conscious optimization and maximization of the complexity of priors.

Conclusion and Future Directions

Above, we have sketched a possible development of the science of consciousness. First, we propose that the initial step should be to replace the methodologically intractable research on the explanatory gap between consciousness and empirical neuroscience with the question of the adaptive role of consciousness (“why”). Second, we argue that empirically oriented studies of consciousness should address not only perceptual consciousness but also mind-wandering, which represents the other side of conscious experience. Consequently, any current or future candidates for the neuronal correlates of consciousness should be plausible and testable for both modes of conscious experience. Third, prediction error minimization represents a strong candidate both for a unifying theory of brain function and for an account of the development of conscious experience. However, future empirical research in this field should primarily answer the open question of whether the content of consciousness represents the best predictions (priors) about the environment/inputs, or whether it corresponds to constant surprise (a set of prediction errors). This answer will elucidate the adaptive function of consciousness (why) and the underlying neuronal mechanism (how), but only if it is valid for both perceptual consciousness and mind-wandering. We assume that empirical approaches to these questions may substantially contribute to our understanding of the nature of consciousness. Hopefully, we will then be able to refrain from beginning our articles by mentioning the elusive nature of consciousness, and this elusiveness will be replaced by a sentence about an old philosophical relic of consciousness: the hard problem.

Author Contributions

MH wrote the article. EK and JH revised it and contributed critical comments and suggestions. All authors have approved the final content of the manuscript.

Funding

This work was supported by the Czech Science Foundation (GACR) grant no. 17-23718S, and by the NPU I project no. LO1611 from MEYS CR.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors would like to acknowledge the support of the Department of Applied Neurosciences and Brain Imaging at the National Institute of Mental Health. The authors would also like to thank all of the reviewers for their insightful comments and valuable suggestions, which significantly improved the manuscript.

References

Addis, D. R., Wong, A. T., and Schacter, D. L. (2007). Remembering the past and imagining the future: common and distinct neural substrates during event construction and elaboration. Neuropsychologia 45, 1363–1377. doi: 10.1016/j.neuropsychologia.2006.10.016

Andrews-Hanna, J. R., Reidler, J. S., Huang, C., and Buckner, R. L. (2010). Evidence for the default network’s role in spontaneous cognition. J. Neurophysiol. 104, 322–335. doi: 10.1152/jn.00830.2009

Ansorge, U., Francis, G., Herzog, M. H., and Öǧmen, H. (2007). Visual masking and the dynamics of human perception, cognition, and consciousness: a century of progress, a contemporary synthesis, and future directions. Adv. Cogn. Psychol. 3, 1–8. doi: 10.2478/v10053-008-0009-0

Antrobus, J. S. (1968). Information theory and stimulus-independent thought. Br. J. Psychol. 59, 423–430. doi: 10.1111/j.2044-8295.1968.tb01157.x

Antrobus, J. S., Singer, J. L., Goldstein, S., and Fortgang, M. (1970). Section of psychology: mindwandering and cognitive structure. Trans. N. Y. Acad. Sci. 32, 242–252. doi: 10.1111/j.2164-0947.1970.tb02056.x

Aru, J., and Bachmann, T. (2015). Still wanted—the mechanisms of consciousness! Front. Psychol. 6:5. doi: 10.3389/fpsyg.2015.00005

Aru, J., Bachmann, T., Singer, W., and Melloni, L. (2012). Distilling the neural correlates of consciousness. Neurosci. Biobehav. Rev. 36, 737–746. doi: 10.1016/j.neubiorev.2011.12.003

Babiloni, C., Vecchio, F., Bultrini, A., Romani, G. L., and Rossini, P. M. (2006). Pre-and poststimulus alpha rhythms are related to conscious visual perception: a high-resolution EEG study. Cereb. Cortex 16, 1690–1700. doi: 10.1093/cercor/bhj104

Bachmann, T., and Francis, G. (2013). Visual Masking: Studying Perception, Attention, and Consciousness. Cambridge, MA: Academic Press.

Bickle, J. (2003). Philosophy and Neuroscience: A Ruthlessly Reductive Account. Dordrecht: Kluwer. doi: 10.1007/978-94-010-0237-0

Biswal, B., Zerrin Yetkin, F., Haughton, V. M., and Hyde, J. S. (1995). Functional connectivity in the motor cortex of resting human brain using echo-planar MRI. Magn. Reson. Med. 34, 537–541. doi: 10.1002/mrm.1910340409

Block, N. (1990). Inverted earth. Philos. Perspect. 4, 53–79. doi: 10.2307/2214187

Bogen, J. E. (1995). On the neurophysiology of consciousness: 1. An overview. Conscious. Cogn. 4, 52–62. doi: 10.1006/ccog.1995.1003

Brook, A. (2005). “Making consciousness safe for neuroscience,” in Cognition and the Brain: The Philosophy and Neuroscience Movement, eds A. Brook and K. Akins (Cambridge: Cambridge University Press), 397–422. doi: 10.1017/CBO9780511610608.013

Buckner, R. L., Andrews-Hanna, J. R., and Schacter, D. L. (2008). The brain’s default network. Ann. N. Y. Acad. Sci. 1124, 1–38. doi: 10.1196/annals.1440.011

Callard, F., Smallwood, J., Golchert, J., and Margulies, D. S. (2013). The era of the wandering mind? Twenty-first century research on self-generated mental activity. Front. Psychol. 4:891. doi: 10.3389/fpsyg.2013.00891

Carhart-Harris, R. L., and Friston, K. J. (2010). The default-mode, ego-functions and free-energy: a neurobiological account of Freudian ideas. Brain 133(Pt 4), 1265–1283. doi: 10.1093/brain/awq010

Chalmers, D. J. (1996). The Conscious Mind: In Search of a Fundamental Theory. Oxford: Oxford University Press.

Chalmers, D. J. (1999). Explaining Consciousness: The Hard Problem, ed. J. Shear. Cambridge, MA: MIT Press.

Chalmers, D. J. (2000). “What is a neural correlate of consciousness,” in Neural Correlates of Consciousness: Empirical and Conceptual Questions, ed. T. Metzinger (Cambridge, MA: MIT Press), 17–40.

Chalmers, D. J. (2003). “Consciousness and its place in nature,” in Blackwell Guide to the Philosophy of Mind, eds S. P. Stich and T. A. Warfield (Oxford: Blackwell), 102–142.

Chang, A. Y.-C., Kanai, R., and Seth, A. K. (2015). Cross-modal prediction changes the timing of conscious access during the motion-induced blindness. Conscious. Cogn. 31, 139–147. doi: 10.1016/j.concog.2014.11.005

Christoff, K., Irving, Z. C., Fox, K. C. R., Spreng, R. N., and Andrews-Hanna, J. R. (2016). Mind-wandering as spontaneous thought: a dynamic framework. Nat. Rev. Neurosci. 17, 718–731. doi: 10.1038/nrn.2016.113

Churchland, P. M. (ed.). (1989). “Knowing qualia: a reply to Jackson,” in A Neurocomputational Perspective: The Nature of Mind and the Structure of Science (Cambridge, MA: MIT Press), 67–76.

Clark, A. (2013). Whatever next? Predictive brains, situated agents, and the future of cognitive science. Behav. Brain Sci. 36, 181–204. doi: 10.1017/S0140525X12000477

Clark, A. (2016). Surfing Uncertainty: Prediction, Action, and the Embodied Mind. Oxford: Oxford University Press. doi: 10.1093/acprof:oso/9780190217013.001.0001

Crick, F., and Koch, C. (1990). Towards a neurobiological theory of consciousness. Semin. Neurosci. 2, 263–275.

Dayan, P., Hinton, G. E., Neal, R. M., and Zemel, R. S. (1995). The Helmholtz machine. Neural Comput. 7, 889–904. doi: 10.1162/neco.1995.7.5.889

Dehaene, S., and Changeux, J.-P. (2011). Experimental and theoretical approaches to conscious processing. Neuron 70, 200–227. doi: 10.1016/j.neuron.2011.03.018

Dehaene, S., and Naccache, L. (2001). Towards a cognitive neuroscience of consciousness: basic evidence and a workspace framework. Cognition 79, 1–37. doi: 10.1016/S0010-0277(00)00123-2

Del Cul, A., Baillet, S., and Dehaene, S. (2007). Brain dynamics underlying the nonlinear threshold for access to consciousness. PLOS Biol. 5:e260. doi: 10.1371/journal.pbio.0050260

Denison, R. N., Piazza, E. A., and Silver, M. A. (2011). Predictive context influences perceptual selection during binocular rivalry. Front. Hum. Neurosci. 5:166. doi: 10.3389/fnhum.2011.00166

Dennett, D. (1993). Consciousness Explained. London: Penguin Books.

Dennett, D. (1997). Kinds of Minds: Toward an Understanding of Consciousness, 4th Edn. New York, NY: Basic Books.

Dennett, D. (2003). Explaining the “magic” of consciousness. J. Cult. Evol. Psychol. 1, 7–19. doi: 10.1556/JCEP.1.2003.1.2

Dennett, D. (2005). Sweet Dreams: Philosophical Obstacles to a Science of Consciousness. Cambridge, MA: MIT Press.

Dennett, D. (2013). Intuition Pumps and other Tools for Thinking. New York City, NY: WW Norton & Company.

Doesburg, S. M., Green, J. J., McDonald, J. J., and Ward, L. M. (2009). Rhythms of consciousness: binocular rivalry reveals large-scale oscillatory network dynamics mediating visual perception. PLOS ONE 4:e6142. doi: 10.1371/journal.pone.0006142

Edelman, G. (1990). The Remembered Present: A Biological Theory of Consciousness, 1st Edn. New York, NY: Basic Books.

Edelman, G. (2005). Wider Than the Sky: The Phenomenal Gift of Consciousness. New Haven, CT: Yale University Press.

Engel, A. K., Fries, P., König, P., Brecht, M., and Singer, W. (1999). Temporal binding, binocular rivalry, and consciousness. Conscious. Cogn. 8, 128–151. doi: 10.1006/ccog.1999.0389

Fransson, P., and Marrelec, G. (2008). The precuneus/posterior cingulate cortex plays a pivotal role in the default mode network: evidence from a partial correlation network analysis. Neuroimage 42, 1178–1184. doi: 10.1016/j.neuroimage.2008.05.059

Friston, K. (2009). The free-energy principle: a rough guide to the brain? Trends Cogn. Sci. 13, 293–301. doi: 10.1016/j.tics.2009.04.005

Friston, K. (2010). The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138. doi: 10.1038/nrn2787

Friston, K. (2012). The history of the future of the Bayesian brain. Neuroimage 62, 1230–1233. doi: 10.1016/j.neuroimage.2011.10.004

Friston, K., Thornton, C., and Clark, A. (2012). Free-energy minimization and the dark-room problem. Front. Psychol. 3:130. doi: 10.3389/fpsyg.2012.00130

Gaillard, R., Dehaene, S., Adam, C., Clémenceau, S., Hasboun, D., Baulac, M., et al. (2009). Converging intracranial markers of conscious access. PLOS Biol. 7:e1000061. doi: 10.1371/journal.pbio.1000061

Giambra, L. M. (1989). Task-unrelated thought frequency as a function of age: a laboratory study. Psychol. Aging 4, 136–143. doi: 10.1037/0882-7974.4.2.136

Gregory, R. L. (1968). Perceptual illusions and brain models. Proc. R. Soc. Lond. B Biol. Sci. 171, 279–296. doi: 10.1098/rspb.1968.0071

Greicius, M. D., Krasnow, B., Reiss, A. L., and Menon, V. (2003). Functional connectivity in the resting brain: a network analysis of the default mode hypothesis. Proc. Natl. Acad. Sci. U.S.A. 100, 253–258. doi: 10.1073/pnas.0135058100

Gusnard, D. A., and Raichle, M. E. (2001). Searching for a baseline: functional imaging and the resting human brain. Nat. Rev. Neurosci. 2, 685–694. doi: 10.1038/35094500

Havlík, M. (2017). From anomalies to essential scientific revolution? Intrinsic brain activity in the light of Kuhn’s philosophy of science. Front. Syst. Neurosci. 11:7. doi: 10.3389/fnsys.2017.00007

Hobson, J., and Friston, K. J. (2016). A response to our theatre critics. J. Conscious. Stud. 23, 245–254.

Hobson, J. A., and Friston, K. J. (2012). Waking and dreaming consciousness: neurobiological and functional considerations. Prog. Neurobiol. 98, 82–98. doi: 10.1016/j.pneurobio.2012.05.003

Hobson, J. A., and Friston, K. J. (2014). Consciousness, dreams, and inference: the Cartesian theatre revisited. J. Conscious. Stud. 21, 6–32.

Hohwy, J. (2012). Attention and conscious perception in the hypothesis testing brain. Front. Psychol. 3:96. doi: 10.3389/fpsyg.2012.00096

Hohwy, J. (2013). The Predictive Mind. Oxford: Oxford University Press. doi: 10.1093/acprof:oso/9780199682737.001.0001

Hohwy, J., Roepstorff, A., and Friston, K. (2008). Predictive coding explains binocular rivalry: an epistemological review. Cognition 108, 687–701. doi: 10.1016/j.cognition.2008.05.010

Jackson, F. (1982). Epiphenomenal qualia. Philos. Q. 32, 127–136. doi: 10.2307/2960077

Kant, I. (1996). Critique of Pure Reason. Indianapolis, IN: Hackett Publishing.

Killingsworth, M. A., and Gilbert, D. T. (2010). A wandering mind is an unhappy mind. Science 330, 932. doi: 10.1126/science.1192439

King, J.-R., Gramfort, A., Schurger, A., Naccache, L., and Dehaene, S. (2014). Two distinct dynamic modes subtend the detection of unexpected sounds. PLOS ONE 9:e85791. doi: 10.1371/journal.pone.0085791

Klink, P. C., van Wezel, R. J., and van Ee, R. (2013). “The future of binocular rivalry research,” in The Constitution of Visual Consciousness: Lessons from Binocular Rivalry, Vol. 90, ed. S. M. Miller (Amsterdam: John Benjamins Publishing), 305–332. doi: 10.1075/aicr.90.12kli

Knill, D. C., and Pouget, A. (2004). The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosci. 27, 712–719. doi: 10.1016/j.tins.2004.10.007

Koch, C., Massimini, M., Boly, M., and Tononi, G. (2016). Neural correlates of consciousness: progress and problems. Nat. Rev. Neurosci. 17, 307–321. doi: 10.1038/nrn.2016.22

Lamme, V. (2015). “Predictive coding is unconscious, so that consciousness happens now,” in Open MIND (Frankfurt am Main: MIND Group). Available at: http://open-mind.net/papers/predictive-coding-is-unconscious-so-that-consciousness-happens-now2014a-reply-to-lucia-melloni/at_download/paperPDF

Lamy, D., Salti, M., and Bar-Haim, Y. (2009). Neural correlates of subjective awareness and unconscious processing: an ERP study. J. Cogn. Neurosci. 21, 1435–1446. doi: 10.1162/jocn.2009.21064

Leech, R., and Sharp, D. J. (2014). The role of the posterior cingulate cortex in cognition and disease. Brain 137, 12–32. doi: 10.1093/brain/awt162

Levine, J. (1983). Materialism and qualia: the explanatory gap. Pac. Philos. Q. 64, 354–361. doi: 10.1111/j.1468-0114.1983.tb00207.x

Llinas, R., Ribary, U., Joliot, M., and Wang, X.-J. (1994). “Content and context in temporal thalamocortical binding,” in Temporal Coding in the Brain, eds G. Buzsáki and Y. Christen (Berlin: Springer), 251–272.

Magnin, M., Rey, M., Bastuji, H., Guillemant, P., Mauguière, F., and Garcia-Larrea, L. (2010). Thalamic deactivation at sleep onset precedes that of the cerebral cortex in humans. Proc. Natl. Acad. Sci. U.S.A. 107, 3829–3833. doi: 10.1073/pnas.0909710107

Mars, R. B., Neubert, F.-X., Noonan, M. P., Sallet, J., Toni, I., and Rushworth, M. F. (2012). On the Relationship between the “Default Mode Network” and the “Social Brain”. Available at: http://repository.ubn.ru.nl/handle/2066/102753

Mason, M. F., Norton, M. I., Van Horn, J. D., Wegner, D. M., Grafton, S. T., and Macrae, C. N. (2007). Wandering minds: the default network and stimulus-independent thought. Science 315, 393–395. doi: 10.1126/science.1131295

Melloni, L. (2014). “Consciousness as inference in time,” in Open MIND (Frankfurt am Main: MIND Group). Available at: http://open-mind.net/papers/consciousness-as-inference-in-time-a-commentary-on-victor-lamme/at_download/paperPDF

Melloni, L., Schwiedrzik, C. M., Müller, N., Rodriguez, E., and Singer, W. (2011). Expectations change the signatures and timing of electrophysiological correlates of perceptual awareness. J. Neurosci. 31, 1386–1396. doi: 10.1523/JNEUROSCI.4570-10.2011

Melloni, L., and Singer, W. (2010). “Distinct characteristics of conscious experience are met by large scale neuronal synchronization,” in New Horizons in the Neuroscience of Consciousness, eds E. K. Perry, D. Collerton, F. E. N. LeBeau, and H. Ashton (Amsterdam: John Benjamins), 17–28.

Menon, V., and Uddin, L. Q. (2010). Saliency, switching, attention and control: a network model of insula function. Brain Struct. Funct. 214, 655–667. doi: 10.1007/s00429-010-0262-0

Metzinger, T. (2000). Neural Correlates of Consciousness: Empirical and Conceptual Questions. Cambridge, MA: MIT Press.

Mori, K., Manabe, H., Narikiyo, K., and Onisawa, N. (2013). Olfactory consciousness and gamma oscillation couplings across the olfactory bulb, olfactory cortex, and orbitofrontal cortex. Front. Psychol. 4:743. doi: 10.3389/fpsyg.2013.00743

Nagel, T. (1974). What is it like to be a bat? Philos. Rev. 83, 435–450. doi: 10.2307/2183914

Nekovarova, T., Fajnerova, I., Horacek, J., and Spaniel, F. (2014). Bridging disparate symptoms of schizophrenia: a triple network dysfunction theory. Front. Behav. Neurosci. 8:171. doi: 10.3389/fnbeh.2014.00171

Newman, J., and Baars, B. J. (1993). A neural attentional model for access to consciousness: a global workspace perspective. Concepts Neurosci. 4, 255–290.

Østby, Y., Walhovd, K. B., Tamnes, C. K., Grydeland, H., Westlye, L. T., and Fjell, A. M. (2012). Mental time travel and default-mode network functional connectivity in the developing brain. Proc. Natl. Acad. Sci. U.S.A. 109, 16800–16804. doi: 10.1073/pnas.1210627109

Panichello, M. F., Cheung, O. S., and Bar, M. (2012). Predictive feedback and conscious visual experience. Front. Psychol. 3:620. doi: 10.3389/fpsyg.2012.00620

Pitts, M. A., Metzler, S., and Hillyard, S. A. (2014a). Isolating neural correlates of conscious perception from neural correlates of reporting one’s perception. Front. Psychol. 5:1078. doi: 10.3389/fpsyg.2014.01078

Pitts, M. A., Padwal, J., Fennelly, D., Martínez, A., and Hillyard, S. A. (2014b). Gamma band activity and the P3 reflect post-perceptual processes, not visual awareness. Neuroimage 101, 337–350. doi: 10.1016/j.neuroimage.2014.07.024

Prinz, J. (2005). “A neurofunctional theory of consciousness,” in Cognition and the Brain: Philosophy and Neuroscience Movement, eds A. Brook and K. Akins (New York, NY: Cambridge University Press), 381–396. doi: 10.1017/CBO9780511610608.012

Prinz, J. (2012). The Conscious Brain. Oxford: Oxford University Press. doi: 10.1093/acprof:oso/9780195314595.001.0001

Raichle, M. E., MacLeod, A. M., Snyder, A. Z., Powers, W. J., Gusnard, D. A., and Shulman, G. L. (2001). A default mode of brain function. Proc. Natl. Acad. Sci. U.S.A. 98, 676–682. doi: 10.1073/pnas.98.2.676

Raichle, M. E., and Snyder, A. Z. (2007). A default mode of brain function: a brief history of an evolving idea. Neuroimage 37, 1083–1090. doi: 10.1016/j.neuroimage.2007.02.041

Rao, R. P., and Ballard, D. H. (1999). Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nat. Neurosci. 2, 79–87. doi: 10.1038/4580

Saalmann, Y. B., and Kastner, S. (2011). Cognitive and perceptual functions of the visual thalamus. Neuron 71, 209–223. doi: 10.1016/j.neuron.2011.06.027

Salti, M., Bar-Haim, Y., and Lamy, D. (2012). The P3 component of the ERP reflects conscious perception, not confidence. Conscious. Cogn. 21, 961–968. doi: 10.1016/j.concog.2012.01.012

Seth, A. K. (2013). Interoceptive inference, emotion, and the embodied self. Trends Cogn. Sci. 17, 565–573. doi: 10.1016/j.tics.2013.09.007

Seth, A. K. (2014). Response to Gu and FitzGerald: interoceptive inference: from decision-making to organism integrity. Trends Cogn. Sci. 18, 270–271. doi: 10.1016/j.tics.2014.03.006

Seth, A. K., and Critchley, H. D. (2013). Extending predictive processing to the body: emotion as interoceptive inference. Behav. Brain Sci. 36, 227–228. doi: 10.1017/S0140525X12002270

Seth, A. K., Suzuki, K., and Critchley, H. D. (2012). An interoceptive predictive coding model of conscious presence. Front. Psychol. 2:395. doi: 10.3389/fpsyg.2011.00395

Shoemaker, S. (1982). The inverted spectrum. J. Philos. 79, 357–381. doi: 10.2307/2026213

Shulman, G. L., Fiez, J. A., Corbetta, M., Buckner, R. L., Miezin, F. M., Raichle, M. E., et al. (1997). Common blood flow changes across visual tasks: II. Decreases in cerebral cortex. J. Cogn. Neurosci. 9, 648–663. doi: 10.1162/jocn.1997.9.5.648

Singer, J. L. (1966). Daydreaming: An Introduction to the Experimental Study of Inner Experience. Available at: http://psycnet.apa.org/psycinfo/1966-06292-000

Smallwood, J., Brown, K., Baird, B., and Schooler, J. W. (2012). Cooperation between the default mode network and the frontal–parietal network in the production of an internal train of thought. Brain Res. 1428, 60–70. doi: 10.1016/j.brainres.2011.03.072

Song, X., and Wang, X. (2012). Mind wandering in Chinese daily lives–an experience sampling study. PLOS ONE 7:e44423. doi: 10.1371/journal.pone.0044423

Spreng, R. N., Stevens, W. D., Chamberlain, J. P., Gilmore, A. W., and Schacter, D. L. (2010). Default network activity, coupled with the frontoparietal control network, supports goal-directed cognition. Neuroimage 53, 303–317. doi: 10.1016/j.neuroimage.2010.06.016

Steinmann, S., Leicht, G., Ertl, M., Andreou, C., Polomac, N., Westerhausen, R., et al. (2014). Conscious auditory perception related to long-range synchrony of gamma oscillations. Neuroimage 100, 435–443. doi: 10.1016/j.neuroimage.2014.06.012

Summerfield, C., Jack, A. I., and Burgess, A. P. (2002). Induced gamma activity is associated with conscious awareness of pattern masked nouns. Int. J. Psychophysiol. 44, 93–100. doi: 10.1016/S0167-8760(02)00003-X

Sutton, S., Braren, M., Zubin, J., and John, E. R. (1965). Evoked-potential correlates of stimulus uncertainty. Science 150, 1187–1188. doi: 10.1126/science.150.3700.1187

Swanson, L. R. (2016). The predictive processing paradigm has roots in Kant. Front. Syst. Neurosci. 10:79. doi: 10.3389/fnsys.2016.00079

Tozzi, A., Zare, M., and Benasich, A. A. (2016). New perspectives on spontaneous brain activity: dynamic networks and energy matter. Front. Hum. Neurosci. 10:247. doi: 10.3389/fnhum.2016.00247

Utevsky, A. V., Smith, D. V., and Huettel, S. A. (2014). Precuneus is a functional core of the default-mode network. J. Neurosci. 34, 932–940. doi: 10.1523/JNEUROSCI.4227-13.2014

Vidal, J. R., Perrone-Bertolotti, M., Levy, J., De Palma, L., Minotti, L., Kahane, P., et al. (2014). Neural repetition suppression in ventral occipito-temporal cortex occurs during conscious and unconscious processing of frequent stimuli. Neuroimage 95, 129–135. doi: 10.1016/j.neuroimage.2014.03.049

Vogt, B. A., and Laureys, S. (2005). Posterior cingulate, precuneal and retrosplenial cortices: cytology and components of the neural network correlates of consciousness. Prog. Brain Res. 150, 205–217. doi: 10.1016/S0079-6123(05)50015-3

Von Helmholtz, H. (1866/1925). Concerning the Perceptions in General. Treatise on Physiological Optics, 3rd Edn, Vol. III, trans. J. P. C. Southall. New York, NY: Dover.

Keywords: consciousness, predictive coding, mind-wandering, default mode network (DMN), neural correlates of consciousness (NCCs)

Citation: Havlík M, Kozáková E and Horáček J (2017) Why and How. The Future of the Central Questions of Consciousness. Front. Psychol. 8:1797. doi: 10.3389/fpsyg.2017.01797

Received: 09 May 2017; Accepted: 28 September 2017;
Published: 11 October 2017.

Edited by:

Lieven Decock, VU University Amsterdam, Netherlands

Reviewed by:

Tomer Fekete, Ben-Gurion University of the Negev, Israel
Robert William Clowes, Universidade Nova de Lisboa, Portugal

Copyright © 2017 Havlík, Kozáková and Horáček. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Marek Havlík, mmshavlik@gmail.com
