Abstract
This article reviews thermodynamic relationships in the brain in an attempt to consolidate current research in systems neuroscience. The present synthesis supports proposals that thermodynamic information in the brain can be quantified to an appreciable degree of objectivity, that many qualitative properties of information in systems of the brain can be inferred by observing changes in thermodynamic quantities, and that many features of the brain’s anatomy and architecture illustrate relatively simple information-energy relationships. The brain may provide a unique window into the relationship between energy and information.
Introduction
That information is physical has been suggested by evidence since the founding of classical thermodynamics (Lloyd, 2006; Gleick, 2011). In recent years, Landauer’s principle (Landauer, 1996; Bennett, 2003), which relates information-theoretic entropy to thermodynamic entropy, has been confirmed (Parrondo et al., 2015), and the experimental demonstration of a form of information-energy equivalence (Alfonso-Faus, 2013) has verified that Maxwell’s demon cannot violate any known laws of thermodynamics (Maruyama et al., 2009). The theoretical finding that entropy is conserved as event horizon area is leading to the resolution of the black hole information paradox (Davies, 2010; Moskowitz, 2015), and there is a fundamental relationship between information and the geometry of spacetime itself (Bousso, 2002; Eling et al., 2006). Current formulations of quantum theory are revealing properties of physical information (Wheeler, 1986; Brukner and Zeilinger, 2003; Lloyd, 2006; Vedral, 2010), and information-interpretive attempts to show that gravity is quantized (Smolin, 2001; Lee et al., 2013) could even lead to the unification of quantum mechanics and the theories of relativity. Although similar approaches are increasingly influential in biology (Schneider and Sagan, 2005; England, 2013; Flack, 2014), “a formalization of the relationship between information and energy is currently lacking in neuroscience” (Collell and Fauquet, 2015). The purpose of this article is to explore a few different sides of this relationship and, along the way, to suggest that many hypotheses and theories in neuroscience can be unified by the physics of information.
Information Bounds
“How can the events in space and time which take place within the spatial boundary of a living organism be accounted for by physics and chemistry?” – (Schrödinger, 1944, from Friston, 2013).
As a fundamental physical entity (Lloyd, 2015), information is not fully understood, and there is currently significant disagreement over different definitions of information and entropy in the literature (Poirier, 2014; Ben-Naim, 2015). In thermodynamics, however, information can be defined as a negation of thermodynamic entropy (Beck, 2009):

I = −S

A bit of thermodynamic entropy represents the distinction between two alternative states in a physical system (Stone, 2015). As a result, the total thermodynamic entropy of a system is proportional to the total number of distinguishable states contained in the system (Bekenstein, 2001, 2007). Because thermodynamic entropy is potential information relative to an observer (Lloyd, 2006), and an observer in a physical system is a component of the system itself, the total thermodynamic entropy of a system includes the portion of entropy that is accessible to the observer as relative thermodynamic information (Wheeler, 1989; Collell and Fauquet, 2015):

S_total = I + S

Since entropy in any physical system is finite (Lloyd, 2006; Rovelli, 2015), the total thermodynamic entropy of any system of the brain can be quantified by applying the traditional form of the universal (Bekenstein, 1981, 1984, 2001, 2004, 2007) information-entropy bound:

S_total ≤ ζ · 2πkRE/(ℏc),  with R = √(A/4π)

where A is area, E is energy including matter, ℏ is the reduced Planck constant, c is the speed of light, k is Boltzmann’s constant, and ζ is a factor such that 0 ≤ ζ ≤ 1.
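As a minimal numerical sketch (my illustration, not a calculation from the article), the bound can be evaluated for a brain-sized system. Bekenstein’s spherical form S ≤ ζ·2πkRE/(ℏc) with R = √(A/4π) is assumed, ζ is set to 1, and the ~1.4 kg mass and ~7 cm radius are illustrative assumptions, with E taken as rest-mass energy:

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23      # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_entropy_bound(area_m2, energy_J, zeta=1.0):
    """Upper bound on thermodynamic entropy (J/K) for a bounded system
    of surface area A and total energy E (Bekenstein, 1981), using the
    spherical-radius form S <= zeta * 2*pi*k*R*E / (hbar*c)."""
    radius = math.sqrt(area_m2 / (4.0 * math.pi))
    return zeta * 2.0 * math.pi * k_B * radius * energy_J / (hbar * c)

def entropy_bound_bits(area_m2, energy_J, zeta=1.0):
    """The same bound expressed in bits (divide by k_B * ln 2)."""
    return bekenstein_entropy_bound(area_m2, energy_J, zeta) / (k_B * math.log(2))

# Illustrative only: a sphere of radius ~7 cm enclosing ~1.4 kg,
# with E taken as rest-mass energy (E = m * c^2)
E = 1.4 * c**2
A = 4.0 * math.pi * 0.07**2
print(f"{entropy_bound_bits(A, E):.2e} bits")
```

For these assumed figures the bound comes out on the order of 10^42 bits; the point is only that the quantity is finite and computable from A and E alone.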
Setting this factor to 1 in order to quantify the total thermodynamic entropy of a system at a certain level of structure now allows us to quantify thermodynamic information by partitioning the factor into a relative information component (ζ_I = 1 − ζ_S) and a relative entropy component (ζ_S = 1 − ζ_I),

I = ζ_I · 2πkRE/(ℏc),  S = ζ_S · 2πkRE/(ℏc)

Because a maximal level of energy corresponds to a maximal level of thermodynamic information, and a minimal level of energy corresponds to a minimal level of thermodynamic information (Duncan and Semura, 2004), any transitions between energy levels occur as transitions between informational extrema. So, in the event that information enters a system of the brain,

ΔE_in ≥ kT ln(2) · ΔI

where T is temperature. And, in the case that information exits a system,

ΔE_out ≥ kT ln(2) · ΔI
Various forms of these relationships, including information-entropy bounds, have been applied in neuroscience (Friston, 2010; Sengupta et al., 2013a,c, 2016; Collell and Fauquet, 2015; Sterling and Laughlin, 2015). The contribution of this review is simply to show that these relationships can be united into a common theoretical framework.
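To give a sense of the energy scale these relationships involve, Landauer’s principle (a minimum of kT ln 2 of energy dissipated per bit erased) can be evaluated at physiological temperature; this is a generic sketch, not a calculation from any of the cited studies:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_min_energy(bits, T=310.0):
    """Minimum energy (J) that must be dissipated when `bits` of
    information are erased at temperature T: dE >= k * T * ln(2) per bit."""
    return k_B * T * math.log(2) * bits

# Erasing a single bit at ~310 K (body temperature)
print(f"{landauer_min_energy(1):.2e} J")  # ~3.0e-21 J
```

The limit is minuscule per bit, which is why it constrains real devices, including neurons, only as a floor several orders of magnitude below their actual costs.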
Neurobiology
“… classical thermodynamics… is the only physical theory of universal content concerning which I am convinced that, within the framework of applicability of its basic concepts, it will never be overthrown.” – (Einstein, 1949, from Bekenstein, 2001).
This section reviews thermodynamic relationships in systems neuroscience with a focus on information and energy. Beginning with neurons, moving to neural networks, and concluding at the level of the brain as a whole, I discuss the energetics of processes such as learning and memory, excitation and inhibition, and the production of noise in neurobiological systems.
The central role of energy in determining the activity of neurons exposes the close connection between information and thermodynamics at the level of the cell. For instance, the process of depolarization, which occurs as a transition to E_max from a resting state E_min, clearly shows that cellular information content is correlated with energy levels. In this respect, the resemblance between ion concentration gradients in neurons and temperature gradients in thermodynamic demons (i.e., agents that use information from their surroundings to decrease their thermodynamic entropy) is not a coincidence – in order to acquire information, neurons must expend energy to establish proper membrane potentials. Recall that Landauer’s principle (Plenio and Vitelli, 2001; Parrondo et al., 2015) places a lower bound on the quantity of energy released into the surroundings with the removal of information from a system. Thus, reestablishing membrane potentials after depolarization – the neuronal equivalent of resetting a demon’s memory – dissipates energy. Landauer’s principle applies at all levels of structure, and because cells process large quantities of information, neurons use energy efficiently despite operating several orders of magnitude above the nominal limit. Parameters including membrane area, spiking frequency, and axon length have all been optimized over the course of evolution to allow neurons to process information efficiently (Sterling and Laughlin, 2015). Examining the energetics of information processing in neurons reinforces the notion that, while it is often convenient to imagine the neuron to be a simple binary element, these cells are intricate computational structures that process more than one bit of information.
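The claim that neurons operate several orders of magnitude above the Landauer limit can be made concrete with a back-of-the-envelope comparison. The figures below are illustrative assumptions, not values from this article: roughly 5 × 10⁻²⁰ J of free energy per ATP hydrolysis, and ~10⁴ ATP molecules consumed per bit transmitted, an order of magnitude in the range reported for insect photoreceptors:

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 310.0                     # physiological temperature, K
landauer_J_per_bit = k_B * T * math.log(2)   # ~3e-21 J per bit erased

# Illustrative assumptions (not values from this article): free energy of
# ATP hydrolysis ~5e-20 J per molecule, and ~1e4 ATP molecules consumed
# per bit transmitted.
atp_J_per_molecule = 5e-20
atp_molecules_per_bit = 1e4
neural_J_per_bit = atp_J_per_molecule * atp_molecules_per_bit

orders_above_limit = math.log10(neural_J_per_bit / landauer_J_per_bit)
print(f"~{orders_above_limit:.0f} orders of magnitude above the Landauer limit")
```

Under these assumptions the gap is roughly five orders of magnitude, consistent with the qualitative statement in the text.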
Relationships between information and energy can also be seen at the level of neural networks. Attractor networks naturally stabilize by seeking energy minima, and the relative positions of basins of attraction define the geometry of an energy landscape (Amit, 1992). As a result, the transition into an active attractor state occurs as a transition into an information-energy maximum. These transitions correspond to the generation of informational entities such as memories, decisions, and perceptual events (Rolls, 2012). In this way, the energy basins of attractor networks may be analogous to lower-level cellular and molecular energy gradients; a transition between any number of distinguishable energy levels follows the passage of a finite quantity of information. Since processing information requires the expenditure of energy, competitive network features also underscore the need to minimize unnecessary information processing. Lateral inhibition at this level may optimize thermodynamic efficiency by allowing networks to respond less robustly to incoming signals, reducing the associated metabolic expense. Another interesting thermodynamic property of networks concerns macrostates: the functional states of large-scale neural networks rest emergently on the states of neuronal assemblies (Yuste, 2015). As a result, new computational properties may arise with the addition of new layers of network structure. Finally, the energetic cost of information has influenced network connectivity by imposing selective pressures to save energy by minimizing path length between network nodes (Bullmore and Sporns, 2009).
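The energy-landscape picture can be sketched with a toy attractor network. The following Hopfield-style model is a standard construction assumed here for illustration, not a model from the cited works: it stores one pattern with a Hebbian rule and shows that asynchronous updates only ever lower the network energy, so a corrupted cue settles into the stored basin of attraction:

```python
import numpy as np

rng = np.random.default_rng(0)

def hopfield_energy(W, s):
    """Energy of state s (a +/-1 vector) in a Hopfield attractor network."""
    return -0.5 * s @ W @ s

# Store one pattern with the Hebbian outer-product rule
pattern = rng.choice([-1, 1], size=64)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Start from a corrupted cue: flip 16 of the 64 units
state = pattern.copy()
flip = rng.choice(64, size=16, replace=False)
state[flip] *= -1

# Asynchronous updates descend the energy landscape
energies = [hopfield_energy(W, state)]
for _ in range(5):
    for i in rng.permutation(64):
        state[i] = 1 if W[i] @ state >= 0 else -1
    energies.append(hopfield_energy(W, state))

assert all(e2 <= e1 for e1, e2 in zip(energies, energies[1:]))  # energy never increases
print(np.array_equal(state, pattern))  # True: the stored basin is reached
```

The monotone energy descent is what makes the landscape metaphor literal: recall is a relaxation toward a minimum, just as the surrounding paragraph describes.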
Again, in accordance with Landauer’s principle, the displacement of information from any system releases energy into the surroundings (Plenio and Vitelli, 2001; Duncan and Semura, 2004). This principle can be understood by imagining an idealized memory device, such as the brain of a thermodynamic demon. Since information is conserved (Susskind and Hrabovsky, 2014), and clearing a memory erases information, the thermodynamic entropy of the surroundings must increase when a demon refreshes its memory to update information. This fundamental connection between information, entropy, and energy appears in many areas of the neurobiology of learning. For example, adjusting a firing threshold in order to change the probability that a system will respond to a conditioned stimulus (Takeuchi et al., 2014; Choe, 2015) optimizes engram fitness by minimizing the quantity of energy needed for its activation (Still et al., 2012). Recurrent collateral connections further increase engram efficiency by enabling a minimal nodal stimulus to elicit its full energetic activation (Rolls, 2012). Experimental evidence also shows that restricting synaptic energy supply impairs the formation of stable engrams (Harris et al., 2012). Because the formation and disassembly of engrams during learning and forgetting optimizes the growth and pruning of networks in response to external conditions, the process of learning is itself a mechanism for minimizing entropy in the brain (Friston, 2003).
As another example of a multiscale process integrated across many levels by thermodynamics, consider the active balance between excitation and inhibition in neurobiological systems. Maintaining proper membrane potentials and adequate concentrations of signaling molecules requires the expenditure of energy, so it is advantageous for systems of the brain to minimize the processing of unnecessary information – to “send only what is needed” (Sterling and Laughlin, 2015). Balancing excitation and inhibition is therefore a crucial mechanism for saving energy. Theoretical evidence that this balancing maximizes the thermodynamic efficiency of processing Shannon information (Sengupta et al., 2013b) is consistent with experimental findings in several areas of research on inhibition. For instance, constant inhibitory modulation is needed to stabilize internal states, and hyperexcitation (e.g., in epilepsy, intoxication syndromes, or trauma) can decrease relative information by reducing levels of consciousness (Haider et al., 2006; Lehmann et al., 2012). Likewise, selective attention is mediated by the activation of inhibitory interneurons (Houghton and Tipper, 1996), and sensory inhibition appears to sharpen internal perceptual states (Isaacson and Scanziani, 2011). The need to balance excitation and inhibition at all levels of structure highlights the energetic cost of information.
A final example worth discussing is the relationship between thermodynamics and the production of noise in neurobiological systems. Noise is present in every system of the brain, and influences all aspects of the organ’s function (Faisal et al., 2008; Rolls and Deco, 2010; Destexhe and Rudolph-Lilith, 2012). Even in the absence of any potential forms of classical stochastic resonance, the noise-driven exploration of different states may optimize thermodynamic efficiency by allowing a system to randomly sample different accessible configurations. Indeed, theoretical arguments suggest that noise enables neural networks to respond more quickly to detected signals (Rolls, 2012), and empirical evidence implicates noise as a beneficial means of optimizing the performance of diverse neurobiological processes (McDonnell and Ward, 2011). For example, noise in the form of neuronal DNA breaking (Guo et al., 2011; Herrup et al., 2013; Tognini et al., 2015) could enhance plasticity, since any stochastically optimized configuration would be more likely to survive over time as, in this case, a strengthened connection in a modifiable network. Because noise is a form of relative entropy, optimizing the signal-to-noise ratio in any neurobiological system promotes the efficient use of energy.
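The noise benefit mentioned above can be demonstrated with a minimal threshold-detector model (a generic stochastic-resonance sketch, not a model from the cited studies): a subthreshold signal is never detected without noise, moderate noise improves discrimination, and excessive noise degrades it again:

```python
import random

random.seed(1)

def discriminability(noise_sd, threshold=1.0, signal=0.6, trials=4000):
    """Hit rate minus false-alarm rate for a hard-threshold detector
    receiving a subthreshold signal plus additive Gaussian noise."""
    hits = sum(signal + random.gauss(0.0, noise_sd) >= threshold
               for _ in range(trials)) / trials
    false_alarms = sum(random.gauss(0.0, noise_sd) >= threshold
                       for _ in range(trials)) / trials
    return hits - false_alarms

quiet, moderate, loud = (discriminability(sd) for sd in (0.0, 0.5, 3.0))
print(moderate > quiet and moderate > loud)  # True: performance peaks at intermediate noise
```

The inverted-U dependence on noise amplitude is the signature of stochastic resonance: some randomness lets weak signals cross the threshold without drowning them in false alarms.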
At the level of the brain as a whole, the connection between information and thermodynamics is readily apparent in the organ’s functional reliance on energy (Magistretti and Allaman, 2015), its seemingly disproportionate consumption of oxygen and energy substrates (e.g., ATP, glucose, ketones, etc.; Raichle and Gusnard, 2002; Herculano-Houzel, 2011), its vulnerability to hypoxic-ischemic damage (Lutz et al., 2003; Dreier et al., 2013), and the reduction of consciousness often conferred by the onset of energy restrictions (Shulman et al., 2009; Stender et al., 2016). All fMRI, PET, and EEG interpretation rests on the foundational assumption that changes in the information content of neurobiological systems can be inferred by observing energy changes (Attwell and Iadecola, 2002; Collell and Fauquet, 2015), and it is well known that the information processing capacities of neurobiological systems are limited by energy supply (Howarth et al., 2012; Fox, 2015). Overall, these relationships are consistent with the form of information-energy equivalence predicted by Landauer’s principle and information-entropy bounds. The living brain appears to maintain a state of thermodynamic optimization.
Consciousness and Free Will
“… science appears completely to lose from sight the large and general questions; but all the more splendid is the success when, groping in the thicket of special questions, we suddenly find a small opening that allows a hitherto undreamt of outlook on the whole.” – (Boltzmann, 1892, from Von Baeyer, 1999).
Although neuroscience has yet to explain consciousness or free will at any satisfactory level of detail, relationships between information and energy seem to be recognizable even at this level of analysis. This section reviews attempts to conceptualize major properties of consciousness (unity, continuity, complexity, and self-awareness) as features of information processing in the brain, and concludes with a discussion of free will.
At any given moment, awareness is experienced as a unified whole. Physical information is the substrate of consciousness (Annila, 2016), and the law of conservation of information requires any minimal unit of information to be transferred into a thermodynamic system as a temporally unitary quantity. As a result, it is possible that the passage of perceptual time itself occurs secondarily to the transfer of information, and that the information present in any integrated system of the brain at any observed time is necessarily cohesive and temporally unified. In this framework, the passage of time would vary in proportion to a system’s rate of energy dissipation. Although it is possible that physical systems in general exchange information in temporally unitary quantities, it is likely that many of the familiar features of the perceptual unity of consciousness require the structure and activity of neural networks in the brain. The biological basis of this unity may be the active temporal consolidation of observed events by integrated higher-order networks (Revonsuo, 1999; Varela et al., 2001; Greenfield and Collins, 2005; Dehaene and Changeux, 2011). An informational structure generated by the claustrum has been speculated to contribute to this experiential unity (Crick and Koch, 2005; Koubeissi et al., 2014), but it has also been reported that complete unilateral resection of the structure, performed in patients with neoplastic lesions of the region, produces no externally observable changes in subjective awareness (Duffau et al., 2007). Overall, it appears unlikely that the presence of information in any isolated or compartmentalized network of the brain is responsible for generating the unified nature of conscious experience.
While perceptual time is likely the product of a collection of related informational processes rather than a single, globalized function mediated by any one specific system of the brain, some of the perceptual continuity of consciousness may result from the effectively continuous flow of thermodynamic information into and out of integrated systems of the brain. In this framework, the quantum (Prokopenko and Lizier, 2014) of perceptual time would be the minimal acquisition of information, and the entrance of information into neurobiological systems would occur alongside the entrance of energy. This relationship is implicit in the simple observation that the transition of a large-scale attractor network is progressively less discrete and smoother in time than the activation of a small-scale engram, the propagation of a cellular potential, the docking of a vesicle, the release of an ion, and so forth. Likewise, electroencephalography shows that the summation of a large number of discrete cellular potentials can accumulate into an effectively continuous wave as a network field potential (Nunez and Srinivasan, 2006), disruptions of which are often correlated with decreases in levels of consciousness (Blumenfeld and Taylor, 2003). It is also well known that higher frequency network oscillations tend to indicate states of wakefulness and active awareness, while lower frequency oscillations tend to be associated with internal states of lesser passage of perceptual time, such as dreamless sleep or unconsciousness. The possibility that the experiential arrow of time and the thermodynamic arrow of time share a common origin in the flow of information is supported both by general models of time in neuroscience and the physical interpretation of time as an entropy gradient (Stoica, 2008; Mlodinow and Brun, 2014).
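The observation that many discrete cellular events sum to an effectively continuous field potential can be illustrated numerically. This is a toy model with an arbitrary spike rate and synaptic-like kernel (my assumptions, not parameters from the cited works): superposing independent smoothed spike trains yields a population signal whose relative fluctuations shrink as cells are added:

```python
import math
import random

random.seed(0)

def field_potential_cv(n_cells, steps=230, rate=0.2):
    """Superpose n_cells independent spike trains, each smoothed by an
    exponentially decaying postsynaptic kernel; return the coefficient
    of variation (fluctuation / mean) of the summed signal, ignoring
    the onset transient."""
    kernel = [math.exp(-t / 5.0) for t in range(30)]
    signal = [0.0] * steps
    for _ in range(n_cells):
        for t in range(steps):
            if random.random() < rate:              # discrete spike
                for dt, k in enumerate(kernel):
                    if t + dt < steps:
                        signal[t + dt] += k
    tail = signal[len(kernel):]                     # drop the startup transient
    mean = sum(tail) / len(tail)
    var = sum((x - mean) ** 2 for x in tail) / len(tail)
    return math.sqrt(var) / mean

# Summation makes the population signal smoother: relative fluctuation
# falls roughly as 1 / sqrt(number of cells)
print(field_potential_cv(1) > field_potential_cv(100))  # True
```

A single train is dominated by its discrete events, while the 100-cell sum approaches the effectively continuous wave that electroencephalography records.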
The subjective complexity of consciousness may show that extensive network integration is needed for maximizing the mutual thermodynamic information and internal energy content of systems of the brain (Torday and Miller, 2016). An exemplary structure enabling such experience, likely one of many that together account for the subjective complexity of consciousness, is the thalamocortical complex (Calabrò et al., 2015; Hannawi et al., 2015). The functional architecture of such a network may show that, at any given moment in the internal model of a living brain, a wide range of integrated systems are sharing mutual sources of thermodynamic information. This pattern of structure may reveal that the perceptual depth and complexity of conscious experience is a direct product of recognizable features of the physical brain. However, it also seems that extensive local cortical processing of information is necessary for producing a refined and coherent sensorium within a system, and that both the thalamocortical complex and the brain stem are involved in generating the subjective complexity of consciousness (Edelman et al., 2011; Ward, 2011). The dynamics of attractor networks at higher levels of network structure may show that quantities of complex internal information can be observed as changes in cortical energy landscapes (Rolls, 2012), with a transition between attractor states following the transfer of information. The degree of subjective complexity of information enclosed by such a transition would be proportional to the degree of structural integration of underlying networks.
Self-awareness likely arose as a survival necessity rather than as an accident of evolution (Fabbro et al., 2015), and rudimentary forms of self-awareness likely began to appear early in the course of brain evolution as various forms of perceptual self-environment separation. As a simple example, consider the tickle response (Linden, 2007), which requires the ability to differentiate self-produced tactile sensations from those produced by external systems. The early need to distinguish between self-produced tactile states and those produced by more threatening non-self sources may be reflected by the observation that this recognition process is mediated to a great extent by the cerebellum (Blakemore et al., 2000). While it is possible that other similar developments began occurring very early on, the evolutionary acquisition of the refined syntactical and conceptual self present in the modern brain likely required the merging of pre-existing self networks with higher-level cortical systems. The eventual integration of language and self-awareness would have been advantageous for coordinating social groups (Graziano, 2013), since experiencing self-referential thought as inner speech facilitates verbal communication. Likewise, the coupling of self-awareness to internal sensory, cognitive, and motor states (Metzinger, 2004; Northoff et al., 2006) may be advantageous for maximizing information between systems within an individual brain. Neuropsychological conditions involving different forms of agnosia, neglect, and self-awareness deficits do show that a reduced awareness of self-ownership of motor skills, body parts, or perceptual states can result in significant disability (Parton et al., 2004; Morin, 2006; Orfei et al., 2007; Prigatano, 2009; Tsakiris, 2010; Overgaard, 2011; Fabbro et al., 2015; Chokron et al., 2016). 
Since experiencing self-awareness optimizes levels of mutual information between the external world and the brain’s internal model (Apps and Tsakiris, 2014), and this activity decreases thermodynamic entropy (Torday and Miller, 2016), self-awareness may be a mechanism for optimizing the brain’s consumption of energy.
Thermodynamic information is also interesting to consider in the context of free will. The brain is predictable within reason, and the performance of an action can be predicted before a decision is reported to have been made (Haggard, 2008). Entities such as ideas, feelings, and beliefs seem to exist as effectively deterministic evaluations of information processed in the brain. Whether or not the flow of information is subject to the brain’s volitional alteration, neuroscience also shows that information can be internally real to a system of the brain, even if this information is inconsistent with an external reality. That the brain can generate an externally inconsistent internal reality is demonstrated by phenomena such as confabulation, agnosia, blindsight, neglect, commissurotomy and hemispherectomy effects, placebo and nocebo effects, reality monitoring deficits, hallucinations, prediction errors, the suspension of disbelief during dreaming, the function of communication in minimizing divergence between internal realities, the quality of many kinds of realistic drug-induced experiences, and the effects of many neuropsychological conditions. The apparent fact that subjective reality is an active construction of the physical brain has even led to the proposal of model-dependent realism (Hawking and Mlodinow, 2011) as a philosophical paradigm in the search for a unified theory of physics. In any case, it is likely that beliefs, including those in free will, exist as information, and that their internal reality is a restatement of information’s frequently observer-dependent nature.
Empirical Outlook
Before concluding, it is worth reviewing a few notable experiments in greater detail. While considerable advances have been made in discovering how neurobiological systems operate according to principles of thermodynamic efficiency (Sterling and Laughlin, 2015), relationships between information and energy in the brain are only beginning to be understood. The following studies are examples of elegant and insightful experiments that should inspire future research.
Several recent brain imaging studies support the proposal (Annila, 2016) that thermodynamics is able to explain a number of mysteries involving consciousness. For example, Stender et al. (2016) used PET to measure global resting state energy consumption in 131 brain injury patients with impairments of consciousness as defined by the revised Coma Recovery Scale (CRS-R). The preservation of consciousness was found to require a minimal global metabolic rate of ≈ 40% of the average rate of controls; global energy consumption above this level was reported to predict the presence or recovery of consciousness with over 90% sensitivity. These results must be replicated and studied in closer detail before their specific theoretical implications are clear, but it is now established that levels of consciousness are correlated with energetic metrics of brain activity. To what extent there exists a well-defined “minimal energetic requirement for the presence of conscious awareness” (Stender et al., 2016) remains an open question. However, the empirical confirmation of a connection between consciousness and thermodynamics introduces the possibility of developing new experimental methods in consciousness research.
Neurobiological systems, and biological systems in general (Von Baeyer, 1999; Schneider and Sagan, 2005), can be considered thermodynamic demons in the sense that they are agents using information to decrease their thermodynamic entropy. Landauer’s principle requires that, in order not to violate any known laws of thermodynamics, such agents dissipate heat when erasing information from their memory storage devices. In an experimental test of this principle (reviewed along with similar experiments in Parrondo et al., 2015), Bérut et al. (2012) studied heat dissipation in a simple memory device created by placing a glass bead in an optical double-well potential. Intuitively, this memory stored a bit of information by retaining the bead on one side of the potential rather than on the alternative. By manipulating the height of the optical barrier between wells, researchers moved the bead to one side of the memory without determining its previous location in the potential. This process was therefore logically irreversible, requiring the erasure of prior information from the memory device. Landauer’s principle predicts that, since information is conserved, the entropy of the memory’s surroundings must increase when this occurs. Bérut et al. (2012) have verified that energy is emitted when a memory is cleared. As noted by the authors, “this limit is independent of the actual device, circuit or material used to implement the irreversible operation.” It would be interesting to study the erasure principle in the context of neuroscience.
Experimental applications of information theory in cell biology have already led to the discovery of general principles of brain organization related to thermodynamics (Sterling and Laughlin, 2015). In one particularly interesting study, Niven et al. (2007) measured the energetic efficiency of information coding in retinal neurons. Intracellular recordings of membrane potential and input resistance were used to calculate rates of ATP consumption in response to different background light intensities. These rates of energy consumption were then compared with rates of Shannon information transmission in order to determine metabolic performance. It was found that metabolic demands increase non-linearly with respect to increases in information processing rate: thermodynamics appears to impose a “law of diminishing returns” on systems of the brain. The authors interpret these results as evidence that nature has selected for neurons that minimize unnecessary information processing. Studying how thermodynamics has influenced cellular parameters over the course of evolution is likely to raise many new empirically addressable questions.
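The “law of diminishing returns” has a familiar information-theoretic analog (my illustration, not the authors’ analysis): for a Gaussian channel, Shannon capacity grows only logarithmically with signal power, so the energetic cost per bit climbs steeply as the information rate increases:

```python
import math

def capacity_bits(power, noise=1.0, bandwidth=1.0):
    """Shannon capacity of a Gaussian channel: C = B * log2(1 + P/N)."""
    return bandwidth * math.log2(1.0 + power / noise)

def energy_per_bit(power):
    """Signal power spent per bit of capacity (arbitrary units)."""
    return power / capacity_bits(power)

# Capacity grows only logarithmically with power, so the cost per bit
# rises steeply at high information rates
for p in (1, 10, 100):
    print(f"power={p:>3}  rate={capacity_bits(p):5.2f}  cost/bit={energy_per_bit(p):6.2f}")
```

A hundredfold increase in power buys well under a tenfold increase in rate, mirroring the nonlinear metabolic scaling Niven et al. (2007) measured in retinal neurons.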
Conclusion
This article has reviewed information-energy relationships in the hope that they may eventually provide a general framework for uniting theory and experiment in neuroscience. The physical nature of information and its status as a finite, measurable resource are emphasized to connect neurobiology and thermodynamics. As a scientific paradigm, the information movement currently underway in physics promises profound advances in our understanding of the relationship between energy, information, and the physical brain.
Statements
Author contributions
The author confirms being the sole contributor of this work and approved it for publication.
Acknowledgments
I am grateful to Baroness Susan Greenfield, Dr. Francesco Fermani, Dr. Karl Friston, Dr. Biswa Sengupta, Dr. Roy Frieden, Dr. Bernard Baars, Dr. Brett Clementz, Dr. Cristi Stoica, Dr. Satoru Suzuki, Dr. Paul King, Guillem Collell, Dr. Jordi Fauquet, and others who have helped me improve these ideas. I am also grateful to Dr. Shanta Dhar and her team for introducing me to academic research, to Jim Reid for introducing me to biology, to Alex Tisch for introducing me to physics, and to those affiliated with the Department of Neurosurgery at the University of Virginia Medical Center for introducing me to neuroscience.
Conflict of interest
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References
1
Alfonso-FausA. (2013). “Fundamental principle of information-to-energy conversion,” inProceedings of the 7th European Computing Conference, Dubrovnik.
2
AmitD. J. (1992). Modeling Brain Function.Cambridge: Cambridge University Press.
3
AnnilaA. (2016). On the character of consciousness.Front. Syst. Neurosci.10:27. 10.3389/fnsys.2016.00027
4
AppsM. A.TsakirisM. (2014). The free-energy self: a predictive coding account of self-recognition.Neurosci. Biobehav. Rev.4185–97. 10.1016/j.neubiorev.2013.01.029
5
AttwellD.IadecolaC. (2002). The neural basis of functional brain imaging signals.Trends Neurosci.25621–625. 10.1016/S0166-2236(02)02264-6
6
BeckC. (2009). Generalised information and entropy measures in physics.Contemp. Phys.50495–510. 10.1080/00107510902823517
7
BekensteinJ. D. (1981). Universal upper bound on the entropy-to-energy ratio for bounded systems.Phys. Rev. D23287. 10.1103/PhysRevD.23.287
8
BekensteinJ. D. (1984). Entropy content and information flow in systems with limited energy.Phys. Rev. D301669. 10.1103/PhysRevD.30.1669
9
BekensteinJ. D. (2001). The limits of information.Stud. Hist. Philos. Mod. Phys.32511–524. 10.1016/S1355-2198(01)00020-X
10
BekensteinJ. D. (2004). Black holes and information theory.Contemp. Phys.4531–43. 10.1080/00107510310001632523
11
BekensteinJ. D. (2007). Information in the holographic universe.Sci. Am.1766–73. 10.1038/scientificamerican0407-66sp
12
Ben-NaimA. (2015). Information, Entropy, Life and the Universe.Singapore: World Scientific, 4–5.
13
Bennett, C. H. (2003). Notes on Landauer's principle, reversible computation, and Maxwell's Demon. Stud. Hist. Philos. Sci. B 34, 501–510. doi: 10.1016/S1355-2198(03)00039-X
Bérut, A., Arakelyan, A., Petrosyan, A., Ciliberto, S., Dillenschneider, R., and Lutz, E. (2012). Experimental verification of Landauer's principle linking information and thermodynamics. Nature 483, 187–189. doi: 10.1038/nature10872
Blakemore, S. J., Wolpert, D., and Frith, C. (2000). Why can't you tickle yourself? Neuroreport 11, R11–R16. doi: 10.1097/00001756-200008030-00002
Blumenfeld, H., and Taylor, J. (2003). Why do seizures cause loss of consciousness? Neuroscientist 9, 301–310. doi: 10.1177/1073858403255624
Boltzmann, L. (1892). "On the methods of theoretical physics," in Theoretical Physics and Philosophical Problems, ed. B. McGuinness (Dordrecht: Springer Netherlands), 5–12.
Bousso, R. (2002). The holographic principle. Rev. Mod. Phys. 74, 825. doi: 10.1103/RevModPhys.74.825
Brukner, Č., and Zeilinger, A. (2003). "Information and fundamental elements of the structure of quantum theory," in Time, Quantum and Information, eds L. Castell and O. Ischebeck (Heidelberg: Springer), 323–354.
Bullmore, E., and Sporns, O. (2009). Complex brain networks: graph theoretical analysis of structural and functional systems. Nat. Rev. Neurosci. 10, 186–198. doi: 10.1038/nrn2575
Calabrò, R. S., Cacciola, A., Bramanti, P., and Milardi, D. (2015). Neural correlates of consciousness: what we know and what we have to learn! Neurol. Sci. 36, 505–513. doi: 10.1007/s10072-015-2072-x
Choe, Y. (2015). "Hebbian learning," in Encyclopedia of Computational Neuroscience, eds D. Jaeger and R. Jung (New York, NY: Springer), 1305–1309. doi: 10.1007/978-1-4614-7320-6_672-1
Chokron, S., Perez, C., and Peyrin, C. (2016). Behavioral consequences and cortical reorganization in homonymous hemianopia. Front. Syst. Neurosci. 10:57. doi: 10.3389/fnsys.2016.00057
Collell, G., and Fauquet, J. (2015). Brain activity and cognition: a connection from thermodynamics and information theory. Front. Psychol. 6:818. doi: 10.3389/fpsyg.2015.00818
Crick, F. C., and Koch, C. (2005). What is the function of the claustrum? Philos. Trans. R. Soc. Lond. B Biol. Sci. 360, 1271–1279. doi: 10.1098/rstb.2005.1661
Davies, P. (2010). "Universe from bit," in Information and the Nature of Reality, eds P. Davies and N. H. Gregersen (Cambridge: Cambridge University Press), 83–117.
Dehaene, S., and Changeux, J. P. (2011). Experimental and theoretical approaches to conscious processing. Neuron 70, 200–227. doi: 10.1016/j.neuron.2011.03.018
Destexhe, A., and Rudolph-Lilith, M. (2012). Neuronal Noise. New York, NY: Springer.
Dreier, J. P., Isele, T., Reiffurth, C., Offenhauser, N., Kirov, S. A., Dahlem, M. A., et al. (2013). Is spreading depolarization characterized by an abrupt, massive release of Gibbs free energy from the human brain cortex? Neuroscientist 19, 25–42. doi: 10.1177/1073858412453340
Duffau, H., Mandonnet, E., Gatignol, P., and Capelle, L. (2007). Functional compensation of the claustrum: lessons from low-grade glioma surgery. J. Neurooncol. 81, 327–329. doi: 10.1007/s11060-006-9236-8
Duncan, T. L., and Semura, J. S. (2004). The deep physics behind the second law: information and energy as independent forms of bookkeeping. Entropy 6, 21–29. doi: 10.3390/e6010021
Edelman, G. M., Gally, J. A., and Baars, B. J. (2011). Biology of consciousness. Front. Psychol. 2:4. doi: 10.3389/fpsyg.2011.00004
Einstein, A. (1949). "Autobiographical notes," in Albert Einstein, ed. P. A. Schilpp (La Salle: Open Court), 33.
Eling, C., Guedens, R., and Jacobson, T. (2006). Nonequilibrium thermodynamics of spacetime. Phys. Rev. Lett. 96, 121301. doi: 10.1103/PhysRevLett.96.121301
England, J. L. (2013). Statistical physics of self-replication. J. Chem. Phys. 139, 121923. doi: 10.1063/1.4818538
Fabbro, F., Aglioti, S. M., Bergamasco, M., Clarici, A., and Panksepp, J. (2015). Evolutionary aspects of self- and world consciousness in vertebrates. Front. Hum. Neurosci. 9:157. doi: 10.3389/fnhum.2015.00157
Faisal, A. A., Selen, L. P., and Wolpert, D. M. (2008). Noise in the nervous system. Nat. Rev. Neurosci. 9, 292–303. doi: 10.1038/nrn2258
Flack, J. C. (2014). Life's information hierarchy. Santa Fe Inst. Bull. 28, 13–24.
Fox, D. (2015). The limits of intelligence. Sci. Am. 305, 36–43.
Friston, K. J. (2003). Learning and inference in the brain. Neural Netw. 16, 1325–1352. doi: 10.1016/j.neunet.2003.06.005
Friston, K. J. (2010). The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138. doi: 10.1038/nrn2787
Friston, K. J. (2013). Life as we know it. J. R. Soc. Interface 10:20130475. doi: 10.1098/rsif.2013.0475
Gleick, J. (2011). The Information. New York, NY: Random House, 269–270.
Graziano, M. S. (2013). Consciousness and the Social Brain. Oxford: Oxford University Press.
Greenfield, S. A., and Collins, T. F. T. (2005). A neuroscientific approach to consciousness. Prog. Brain Res. 150, 11–23. doi: 10.1016/S0079-6123(05)50002-5
Guo, J. U., Ma, D. K., Mo, H., Ball, M. P., Jang, M. H., Bonaguidi, M. A., et al. (2011). Neuronal activity modifies the DNA methylation landscape in the adult brain. Nat. Neurosci. 14, 1345–1351. doi: 10.1038/nn.2900
Haggard, P. (2008). Human volition: towards a neuroscience of will. Nat. Rev. Neurosci. 9, 934–946. doi: 10.1038/nrn2497
Haider, B., Duque, A., Hasenstaub, A. R., and McCormick, D. A. (2006). Neocortical network activity in vivo is generated through a dynamic balance of excitation and inhibition. J. Neurosci. 26, 4535–4545. doi: 10.1523/JNEUROSCI.5297-05.2006
Hannawi, Y., Lindquist, M. A., Caffo, B. S., Sair, H. I., and Stevens, R. D. (2015). Resting brain activity in disorders of consciousness. Neurology 84, 1272–1280. doi: 10.1212/WNL.0000000000001404
Harris, J. J., Jolivet, R., and Attwell, D. (2012). Synaptic energy use and supply. Neuron 75, 762–777. doi: 10.1016/j.neuron.2012.08.019
Hawking, S. W., and Mlodinow, L. (2011). The Grand Design. New York, NY: Random House, 746.
Herculano-Houzel, S. (2011). Scaling of brain metabolism with a fixed energy budget per neuron: implications for neuronal activity, plasticity and evolution. PLoS ONE 6:e17514. doi: 10.1371/journal.pone.0017514
Herrup, K., Chen, J., and Li, J. (2013). Breaking news: thinking may be bad for DNA. Nat. Neurosci. 16, 518–519. doi: 10.1038/nn.3384
Houghton, G., and Tipper, S. P. (1996). Inhibitory mechanisms of neural and cognitive control: applications to selective attention and sequential action. Brain Cogn. 30, 20–43. doi: 10.1006/brcg.1996.0003
Howarth, C., Gleeson, P., and Attwell, D. (2012). Updated energy budgets for neural computation in the neocortex and cerebellum. J. Cereb. Blood Flow Metab. 32, 1222–1232. doi: 10.1038/jcbfm.2012.35
Isaacson, J. S., and Scanziani, M. (2011). How inhibition shapes cortical activity. Neuron 72, 231–243. doi: 10.1016/j.neuron.2011.09.027
Koubeissi, M. Z., Bartolomei, F., Beltagy, A., and Picard, F. (2014). Electrical stimulation of a small brain area reversibly disrupts consciousness. Epilepsy Behav. 37, 32–35. doi: 10.1016/j.yebeh.2014.05.027
Landauer, R. (1996). The physical nature of information. Phys. Lett. A 217, 188–193. doi: 10.1016/0375-9601(96)00453-7
Lee, J. W., Kim, H. C., and Lee, J. (2013). Gravity from quantum information. J. Korean Phys. Soc. 63, 1094–1098. doi: 10.3938/jkps.63.1094
Lehmann, K., Steinecke, A., and Bolz, J. (2012). GABA through the ages: regulation of cortical function and plasticity by inhibitory interneurons. Neural Plast. 2012:892784. doi: 10.1155/2012/892784
Linden, D. J. (2007). The Accidental Mind. Cambridge: Harvard University Press, 9–12.
Lloyd, S. (2006). Programming the Universe. New York, NY: Random House.
Lloyd, S. (2015). Interview in Closer to Truth: Is Information Fundamental? Available at: https://www.closertotruth.com/series/information-fundamental#video-2621
Lutz, P. L., Nilsson, G. E., and Prentice, H. M. (2003). The Brain Without Oxygen. New York, NY: Kluwer.
Magistretti, P. J., and Allaman, I. (2015). A cellular perspective on brain energy metabolism and functional imaging. Neuron 86, 883–901. doi: 10.1016/j.neuron.2015.03.035
Maruyama, K., Nori, F., and Vedral, V. (2009). Colloquium: the physics of Maxwell's demon and information. Rev. Mod. Phys. 81, 1. doi: 10.1103/RevModPhys.81.1
McDonnell, M. D., and Ward, L. M. (2011). The benefits of noise in neural systems: bridging theory and experiment. Nat. Rev. Neurosci. 12, 415–426. doi: 10.1038/nrn3061
Metzinger, T. (2004). Being No One. Cambridge: MIT Press.
Mlodinow, L., and Brun, T. A. (2014). Relation between the psychological and thermodynamic arrows of time. Phys. Rev. E 89:052102. doi: 10.1103/PhysRevE.89.052102
Morin, A. (2006). Levels of consciousness and self-awareness: a comparison and integration of various neurocognitive views. Conscious. Cogn. 15, 358–371. doi: 10.1016/j.concog.2005.09.006
Moskowitz, C. (2015). Stephen Hawking hasn't solved the black hole paradox just yet. Sci. Am. 27.
Niven, J. E., Anderson, J. C., and Laughlin, S. B. (2007). Fly photoreceptors demonstrate energy-information trade-offs in neural coding. PLoS Biol. 5:e116. doi: 10.1371/journal.pbio.0050116
Northoff, G., Heinzel, A., De Greck, M., Bermpohl, F., Dobrowolny, H., and Panksepp, J. (2006). Self-referential processing in our brain–a meta-analysis of imaging studies on the self. Neuroimage 31, 440–457. doi: 10.1016/j.neuroimage.2005.12.002
Nunez, P. L., and Srinivasan, R. (2006). Electric Fields of the Brain. New York, NY: Oxford University Press.
Orfei, M. D., Robinson, R. G., Prigatano, G. P., Starkstein, S., Rüsch, N., Bria, P., et al. (2007). Anosognosia for hemiplegia after stroke is a multifaceted phenomenon: a systematic review of the literature. Brain 130, 3075–3090. doi: 10.1093/brain/awm106
Overgaard, M. (2011). Visual experience and blindsight: a methodological review. Exp. Brain Res. 209, 473–479. doi: 10.1007/s00221-011-2578-2
Parrondo, J. M. R., Horowitz, J. M., and Sagawa, T. (2015). Thermodynamics of information. Nat. Phys. 11, 131–139. doi: 10.1038/nphys3230
Parton, A., Malhotra, P., and Husain, M. (2004). Hemispatial neglect. J. Neurol. Neurosurg. Psychiatry 75, 13–21.
Plenio, M. B., and Vitelli, V. (2001). The physics of forgetting: Landauer's erasure principle and information theory. Contemp. Phys. 42, 25–60. doi: 10.1080/00107510010018916
Poirier, B. (2014). A Conceptual Guide to Thermodynamics. Chichester: Wiley, 77–78.
Prigatano, G. P. (2009). Anosognosia: clinical and ethical considerations. Curr. Opin. Neurol. 22, 606–611. doi: 10.1097/WCO.0b013e328332a1e7
Prokopenko, M., and Lizier, J. T. (2014). Transfer entropy and transient limits of computation. Sci. Rep. 4:5394. doi: 10.1038/srep05394
Raichle, M. E., and Gusnard, D. A. (2002). Appraising the brain's energy budget. Proc. Natl. Acad. Sci. U.S.A. 99, 10237–10239. doi: 10.1073/pnas.172399499
Revonsuo, A. (1999). Binding and the phenomenal unity of consciousness. Conscious. Cogn. 8, 173–185. doi: 10.1006/ccog.1999.0384
Rolls, E. T. (2012). Neuroculture. Oxford: Oxford University Press.
Rolls, E. T., and Deco, G. (2010). The Noisy Brain. Oxford: Oxford University Press.
Rovelli, C. (2015). "Relative information at the foundation of physics," in It From Bit or Bit From It?, eds A. Aguirre, B. Foster, and Z. Merali (Cham: Springer). doi: 10.1007/978-3-319-12946-4_7
Schneider, E. D., and Sagan, D. (2005). Into the Cool. Chicago: University of Chicago Press.
Schrödinger, E. (1944). What is Life? Cambridge: Cambridge University Press, 3.
Sengupta, B., Faisal, A. A., Laughlin, S. B., and Niven, J. E. (2013a). The effect of cell size and channel density on neuronal information encoding and energy efficiency. J. Cereb. Blood Flow Metab. 33, 1465–1473. doi: 10.1038/jcbfm.2013.103
Sengupta, B., Laughlin, S. B., and Niven, J. E. (2013b). Balanced excitatory and inhibitory synaptic currents promote efficient coding and metabolic efficiency. PLoS Comput. Biol. 9:e1003263. doi: 10.1371/journal.pcbi.1003263
Sengupta, B., Stemmler, M. B., and Friston, K. J. (2013c). Information and efficiency in the nervous system–a synthesis. PLoS Comput. Biol. 9:e1003157. doi: 10.1371/journal.pcbi.1003157
Sengupta, B., Tozzi, A., Cooray, G. K., Douglas, P. K., and Friston, K. J. (2016). Towards a neuronal gauge theory. PLoS Biol. 14:e1002400. doi: 10.1371/journal.pbio.1002400
Shulman, R. G., Hyder, F., and Rothman, D. L. (2009). Baseline brain energy supports the state of consciousness. Proc. Natl. Acad. Sci. U.S.A. 106, 11096–11101. doi: 10.1073/pnas.0903941106
Smolin, L. (2001). Three Roads to Quantum Gravity. New York, NY: Basic Books, 103, 169–178.
Stender, J., Mortensen, K. N., Thibaut, A., Darkner, S., Laureys, S., Gjedde, A., et al. (2016). The minimal energetic requirement of sustained awareness after brain injury. Curr. Biol. 26, 1494–1499. doi: 10.1016/j.cub.2016.04.024
Sterling, P., and Laughlin, S. (2015). Principles of Neural Design. Cambridge: MIT Press.
Still, S., Sivak, D. A., Bell, A. J., and Crooks, G. E. (2012). Thermodynamics of prediction. Phys. Rev. Lett. 109:120604. doi: 10.1103/PhysRevLett.109.120604
Stoica, O. C. (2008). Flowing with a Frozen River. FQXi, The Nature of Time essay contest. Available at: http://fqxi.org/community/essay/winners/2008.1#Stoica
Stone, J. V. (2015). Information Theory. Sheffield: Sebtel Press, 171.
Susskind, L., and Hrabovsky, G. (2014). The Theoretical Minimum, Vol. 9. New York, NY: Basic Books, 170.
Takeuchi, T., Duszkiewicz, A. J., and Morris, R. G. (2014). The synaptic plasticity and memory hypothesis: encoding, storage and persistence. Philos. Trans. R. Soc. B 369:20130288. doi: 10.1098/rstb.2013.0288
Tognini, P., Napoli, D., and Pizzorusso, T. (2015). Dynamic DNA methylation in the brain: a new epigenetic mark for experience-dependent plasticity. Front. Cell. Neurosci. 9:331. doi: 10.3389/fncel.2015.00331
Torday, J. S., and Miller, W. B. Jr. (2016). On the evolution of the mammalian brain. Front. Syst. Neurosci. 10:31. doi: 10.3389/fnsys.2016.00031
Tsakiris, M. (2010). My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48, 703–712. doi: 10.1016/j.neuropsychologia.2009.09.034
Varela, F., Lachaux, J. P., Rodriguez, E., and Martinerie, J. (2001). The brainweb: phase synchronization and large-scale integration. Nat. Rev. Neurosci. 2, 229–239. doi: 10.1038/35067550
Vedral, V. (2010). Decoding Reality. Oxford: Oxford University Press.
Von Baeyer, H. C. (1999). Maxwell's Demon. New York, NY: Random House, 100–101.
Ward, L. M. (2011). The thalamic dynamic core theory of conscious experience. Conscious. Cogn. 20, 464–486. doi: 10.1016/j.concog.2011.01.007
Wheeler, J. (1986). "John Wheeler," in The Ghost in the Atom, eds P. Davies and J. Brown (Cambridge: Cambridge University Press), 62.
Wheeler, J. A. (1989). "Information, physics, quantum: the search for links," in Proceedings of the Third International Symposium on Foundations of Quantum Mechanics, Tokyo, 354–368.
Yuste, R. (2015). From the neuron doctrine to neural networks. Nat. Rev. Neurosci. 16, 487–497. doi: 10.1038/nrn3962
Keywords
information thermodynamics, Landauer limit, free energy principle, optimization, Bekenstein bound
Citation
Street S (2016) Neurobiology as Information Physics. Front. Syst. Neurosci. 10:90. doi: 10.3389/fnsys.2016.00090
Received
10 August 2016
Accepted
28 October 2016
Published
15 November 2016
Volume
10 - 2016
Edited by
Yan Mark Yufik, Virtual Structures Research, Inc., USA
Reviewed by
Guillem Collell, Katholieke Universiteit Leuven, Belgium; Arto Annila, University of Helsinki, Finland
Copyright
© 2016 Street.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Sterling Street, sterling.street@uga.edu
Disclaimer
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.