
Perspective Article

Front. Phys., 14 August 2014

Beyond Gibbs-Boltzmann-Shannon: general entropies—the Gibbs-Lorentzian example

  • 1International Space Science Institute, Bern, Switzerland
  • 2Geophysics Department, Munich University, Munich, Germany
  • 3Space Research Institute, Extraterrestrial Physics, Austrian Academy of Sciences, Graz, Austria

We propose a generalization of Gibbs' statistical mechanics into the domain of non-negligible phase space correlations. We derive the probability distribution and the entropy as a generalized ensemble average, replacing the Gibbs-Boltzmann-Shannon entropy definition and enabling the construction of new forms of statistical mechanics. The general entropy may also be of importance in information theory and data analysis. Application to generalized Lorentzian phase space elements yields the Gibbs-Lorentzian power law probability distribution and statistical mechanics. The corresponding Boltzmann, Fermi and Bose-Einstein distributions are found. They apply only to finite temperature states including correlations. As a by-product, negative absolute temperatures are categorically excluded, supporting a recent "no-negative T" claim.

1. Generalisation of Gibbs' Statistical Mechanics

In Gibbs' Statistical Mechanics [1, 2] the probability w_i(ϵ_i) ∝ exp(−ϵ_i/T) of finding a particle in energy state ϵ_i at constant temperature T (in energy units) is obtained from considering the infinitesimal change of the phase space volume Γ[S(ϵ)] determined by the (normalized) entropy S(ϵ), a function of energy ϵ, as dΓ/dϵ = G_G(S) ∝ exp[S(ϵ)]/Δϵ, holding under the ergodic assumption and valid for stochastic (Markovian) processes. The dependence of the phase space element on the entropy S(ϵ) can be generalized, defining an arbitrary function G(S) such that

dΓ/dϵ = G[S(ϵ) − S(E)]/Δϵ,    (1)

with S(E) the entropy at any given reference energy E. Application to the Gibbs function G_G = exp S just yields the arbitrary proportionality factor exp[−S(E)].

The function G(S) is subject to constraints which have to be determined from the requirement that the probability of finding a particle in energy state ϵ_i in phase space in the given fixed interval Δϵ around E is

w_i ∝ ∫ [dΓ(ϵ)/dϵ] δ(ϵ_i + ϵ − E) dϵ.    (2)

Considering the product dΓ_1(S_1) dΓ_2(S_2) of two phase space elements of different entropies S_1, S_2 yields

dΓ_1 dΓ_2 ∝ G(S_1) G(S_2).    (3)

It is easy to prove that the only function for which this expression becomes equal to dΓ_3 ∝ G(S_3), with G(S_3) = G(S_1 + S_2), is Gibbs' choice log G_G(S) ∝ S. [This can be seen by assuming S_2 = S_1(1 + Δ/S_1), which produces an additional quadratic term in G(S_1)G(S_2). For any G ≠ G_G this term is irreducible to a constant factor. An illustrative example is given below in Equation (9).] For any G(S) different from G_G the entropies of the two phase space elements are not independent but correlated, indicating the presence of phase-space correlations, entropic non-extensivity, and violation of the ergodic assumption [see, e.g., 3, 4, for a more extended discussion].
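This factorization argument is easy to check numerically (our illustration, not part of the original text): the exponential is multiplicative, G_G(S_1)G_G(S_2) = G_G(S_1 + S_2), while a generalized-Lorentzian choice of G leaves an irreducible remainder:

```python
import math

# Gibbs function G_G(S) = exp(S): exactly multiplicative.
def g_gibbs(s):
    return math.exp(s)

# Illustrative generalized Lorentzian with kappa = 3 (our choice of kappa).
def g_lorentz(s, kappa=3.0):
    return (1.0 + s / kappa) ** (-kappa)

s1, s2 = 0.4, 0.7
# exponential: the product equals the function of the sum (to rounding)
assert abs(g_gibbs(s1) * g_gibbs(s2) - g_gibbs(s1 + s2)) < 1e-12
# Lorentzian: a finite cross term survives, signalling correlations
assert abs(g_lorentz(s1) * g_lorentz(s2) - g_lorentz(s1 + s2)) > 1e-3
```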

Expanding the entropy around energy E

S(ϵ) ≈ S(E) + ∂S(ϵ)/∂ϵ|_{ϵ=E} (ϵ − E) = S(E) + (ϵ − E)/T,    (4)

with ∂S(ϵ)/∂ϵ|_{ϵ=E} = 1/T, and inserting into G(S), Equation (2) yields the wanted phase space probability distribution

w_i(ϵ_i) ∝ G(−ϵ_i/T).    (5)

For w_i(ϵ_i) to be a real physical probability, G(S) must be a real valued positive definite function of its argument, normalizable by summing over all accessible phase space energy states, ∑_i w_i(ϵ_i) = 1. This determines the constant of proportionality in Equation (5). Of the whole class of such functions G, only those may have physical relevance which satisfy a number of supplementary constraints.

The first physical constraint is that the probability should conserve energy. This is most easily checked for non-relativistic energies ϵ_i = p_i²/2m, where m is the particle mass. Under the simplifying ideal gas assumption [for non-ideal gases one just adds an external or an interaction potential field Φ(x) of spatial dependence] and for d momentum space dimensions, energy conservation implies that, for large p, the function G{S[ϵ(p)]} asymptotically converges faster than p^{−d−2}.

Any exponentially decaying asymptotic function of momentum p or energy ϵ would thus be appropriate. In contrast to algebraic functions, on which more restrictive conditions would have to be imposed, exponentials also satisfy the requirement of conservation of the higher moments of the probability distribution. One particular class of such functions is

G(S) = e^{S(ϵ)} G̃(S),    (6)

where G̃(S) is any algebraic function. It produces the modified Gibbsian probability distribution w_i ∝ G̃(−ϵ_i/T) exp(−ϵ_i/T).
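As an aside (our numerical illustration, not from the original), any positive algebraic prefactor G̃ leaves the distribution normalizable with finite moments, and G̃ = 1 recovers the pure Gibbs mean energy (3/2)T for a 3D ideal gas:

```python
import math

def mean_energy(gtilde, T=1.0, n=100_000, cutoff=60.0):
    """<eps> for w(eps) ∝ eps^(1/2) * gtilde(-eps/T) * exp(-eps/T) in 3D,
    computed by a simple Riemann sum on [0, cutoff]."""
    h = cutoff / n
    norm = mom = 0.0
    for i in range(1, n):
        eps = i * h
        w = math.sqrt(eps) * gtilde(-eps / T) * math.exp(-eps / T)
        norm += w
        mom += eps * w
    return mom / norm

# G~ = 1: pure Gibbs, <eps> = (3/2) T
assert abs(mean_energy(lambda x: 1.0) - 1.5) < 1e-3
# an algebraic, positive G~ (illustrative choice 1 + x^2) still yields a
# finite, normalizable distribution with a finite mean energy
e2 = mean_energy(lambda x: 1.0 + x * x)
assert 1.5 < e2 < 4.0
```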

A severe restriction imposed on G(S) is the demand that any stationary statistical mechanics based on Equation (1) be in accord with thermodynamics. It must reproduce the macroscopic relations between entropy S, energy E, pressure P, volume V and the derived thermodynamic potentials. In addition, the temperature T must maintain its thermodynamic meaning.

The obvious path to a statistical mechanics paralleling Gibbs' approach is via inverting G(S), finding the appropriate entropy S[w_i(G)] ∝ S[G^{−1}(w_i)] as a functional of the probability. This is suggested by Gibbs-Boltzmann's use of log w_GB, the inversion of the Gibbs-Boltzmann probability w_GB ∝ exp(−ϵ/T), in the definition of entropy S.

Thus, formally, for any arbitrarily chosen G(S) that satisfies the above two constraints, the entropy is determined as the ensemble average

S = −⟨ G^{−1}[w_i/A] ⟩ ∝ −∫ dp dx w(ϵ_p) G^{−1}[w(ϵ_p)/A],    (7)

with A the normalization constant. This requires the existence of the inverse function G^{−1} which, for arbitrary G(S), poses a hurdle to constructing a viable statistical mechanics. Since w(ϵ_p) is normalized, a constant −log A can be added to G^{−1} in order to adjust for the first law. Of interest is the local entropy density s = −w_i G^{−1}[w_i/A], rather than the global entropy S.

This completely general definition of entropy may be of relevance not only for statistical mechanics but also for several other fields: maximum entropy methods in data analysis; information theory, where it replaces Shannon's classical definition of information; and, outside physics, the economic and social sciences. Its extension to complex functions G(S), implying complex probabilities, is formally obvious. Since the meaning of a complex entropy is unclear, we require that the entropy S in Equation (7) be real, i.e., calculated as a real expectation value.

In the following we demonstrate that for a limited class of algebraic functions, the so-called generalized Lorentzians G(x) = (1 + x/κ)^{−κ}, with arbitrary argument x ∝ S independent of the parameter 0 < κ ∈ ℝ [cf., e.g., 5], construction of a viable statistical mechanics is nevertheless possible if the additional constraint is imposed on G(S) that, in some asymptotic limit, it reproduces the exponential dependence of Gibbs' phase space element on entropy.

2. Algebraic Example: Gibbs-Lorentzians

With G(S) an algebraic generalized Lorentzian, substitution into Equation (1) yields

dΓ(ϵ)/dϵ = (1/Δϵ) {1 + [S(E) − S(ϵ)]/κ}^{−κ−r}.    (8)

It is obvious that for κ → ∞ this expression reproduces the Gibbsian variation of the phase space volume. The negative sign in the braces is self-explanatory. The relevance of the entropy S(E) is seen in that, for κ → ∞, it merely generates a constant factor in dΓ. We also made use of the freedom of adding a number r ∈ ℝ to the exponent, as it has no effect when taking the large-κ limit.
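The κ → ∞ limit can be checked numerically (a sketch we add for illustration): the generalized-Lorentzian factor converges to the Gibbsian exponential at a rate O(1/κ), and the added exponent r becomes irrelevant in the limit:

```python
import math

# Phase-space factor {1 + x/kappa}^(-kappa-r), with x standing for
# S(E) - S(eps); r = 5/2 is used as an illustrative value.
def lorentzian_factor(x, kappa, r=2.5):
    return (1.0 + x / kappa) ** (-kappa - r)

x = 1.3
gibbs = math.exp(-x)   # Gibbsian limit exp[S(eps) - S(E)]
for kappa in (1e2, 1e4, 1e6):
    err = abs(lorentzian_factor(x, kappa) - gibbs)
    assert err < 10.0 / kappa   # converges like O(1/kappa)
assert abs(lorentzian_factor(x, 1e6) - gibbs) < 1e-4
```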

It is easy to prove explicitly that any finite κ < ∞ implies correlations in phase space by considering (even for constant κ)

dΓ_3 ∝ { 1 + [S_3(E) − S_3(ϵ)]/κ + [S_1(E) − S_1(ϵ)][S_2(E) − S_2(ϵ)]/κ² }^{−κ−r}.    (9)

The irreducible quadratic term indicates that the two phase space elements in the κ-generalized Gibbs-Lorentzian model are not independent.

Equation (5) yields the Gibbs-Lorentzian probability

w_{iκ}(ϵ_i, x) = A { 1 + [ϵ_i + Φ(x_i)]/κT }^{−κ−r},    (10)

generalized here to non-ideal gases by including a potential energy Φ(x_i). A is a constant of normalization of the probability when integrating over phase space dΓ. Equation (10) allows for a formulation of Gibbsian statistical mechanics consistent with fundamental thermodynamics.

To determine the value of the power r, we switch temporarily from the probability to the phase space distribution, w_{iκ}(ϵ_i) → f_κ(p), with p the particle momentum and ϵ(p) = p²/2m, and restrict to ideal gases. Normalizing (2πħ)^{−3} ∫ f_κ(p) dΓ determines A_{fκ} as a function of particle number N and volume V as

A_{fκ} = (N λ_T³/V) κ^{−3/2} Γ(κ + r) / [Γ(3/2) Γ(κ + r − 3/2)],    (11)

where λ_T = √(2πħ²/mT) is the thermal wavelength. The average energy is obtained from the kinetic energy moment of the normalized distribution function. This produces an additional factor (κT)^{5/2} Γ(5/2) Γ(κ + r − 5/2)/Γ(κ + r). Combination yields for the ideal gas kinetic energy density

⟨E⟩/V = (3/2) [κ/(κ + r − 5/2)] NT/V.    (12)

Three degrees of freedom require ⟨E⟩ = (3/2)NT, which immediately yields

r = 5/2,    (13)

proving that T is the physical temperature of the system, as was inferred earlier in the generalized Lorentzian thermodynamics [5], though lacking the determination of r. [Correspondingly, for d degrees of freedom the factor 3/2 and the denominator in Equation (12) become d/2 and κ + r − (2 + d)/2, respectively, which yields r = 1 + d/2 in order for the energy to be ⟨E⟩ = (d/2)NT.] Any κ gas embedded in a heat bath necessarily assumes the temperature of the bath. Also, two gases in contact at a boundary will adjust to a common temperature, independent of whether both are κ gases of equal or different κ, or one of them is a Boltzmann gas.
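This determination of r admits a numerical sanity check (our own sketch, with illustrative parameter values): integrating the κ-distribution for several trial values of r confirms the mean energy (3/2)κT/(κ + r − 5/2) of Equation (12), so only r = 5/2 reproduces the thermodynamic value (3/2)T per particle.

```python
import math

def avg_energy(kappa, r, T=1.0, n=200_000):
    """Mean kinetic energy per particle of the kappa distribution
    f ∝ eps^(1/2) (1 + eps/kappa T)^(-kappa-r) (3D ideal gas).
    The substitution eps = t/(1-t) maps [0, inf) onto [0, 1)."""
    norm = mom = 0.0
    for i in range(1, n):
        t = i / n
        eps = t / (1.0 - t)
        jac = 1.0 / (1.0 - t) ** 2
        w = math.sqrt(eps) * (1.0 + eps / (kappa * T)) ** (-kappa - r) * jac
        norm += w
        mom += eps * w
    return mom / norm

kappa, T = 4.0, 1.0
# closed form implied by Eqs. (11)-(12): <eps> = (3/2) kappa T/(kappa + r - 5/2)
for r in (2.0, 2.5, 3.5):
    exact = 1.5 * kappa * T / (kappa + r - 2.5)
    assert abs(avg_energy(kappa, r, T) - exact) < 1e-3 * exact
# r = 5/2 is the unique choice giving the thermodynamic value (3/2) T
assert abs(avg_energy(kappa, 2.5, T) - 1.5 * T) < 1e-3
```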

We note for completeness that, in the time-asymptotic limit, the correct r value was determined from a direct kinetic calculation of the particular case of interaction between electrons and a thermal Langmuir wave background [6]. It was also inferred in Livadiotis and McComas [7] arguing about the role of total energy E in κ distributions.

Having determined r, the constant particle number ideal gas canonical phase space κ-distribution is

f_κ(p) = A_{fκ} [1 + ϵ(p)/κT]^{−κ−5/2}.    (14)

Including a variable particle number N, the entropy S(E, N) becomes a function of N. Expanding S with respect to energy and particle number, and defining the derivative of the entropy at constant energy as (∂S/∂N)_E = −μ/T, the extended Gibbs distribution is given by

w^κ_{iN}(ϵ_{iN}) = A [1 + (ϵ_{iN} − μN)/κT]^{−κ−r}.    (15)

The index N identifies the Nth subsystem of fixed particle number N. (If the subsystems contain just one particle, it is the particle number.) This is the general form of the Gibbs-Lorentzian probability distribution with physical temperature T in state (iN).

These expressions also contain the non-Lorentzian power law distribution, w_{ir′} ∝ [1 + ϵ_i/T]^{−r′} with r′ = κ + r, encountered for instance in cosmic ray spectra. Equation (12) then has κ only in the denominator, requiring r′ = 7/2. Defining ℓ as the order of the highest conserved statistical moment yields, more generally, r′ = 1 + (2ℓ + 1)/2.
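The value r′ = 7/2 admits the same kind of numerical check (our illustration, with T = 1): for w ∝ (1 + ϵ/T)^{−r′} and a 3D density of states the mean energy is (3/2)T/(r′ − 5/2), which equals (3/2)T exactly at r′ = 7/2.

```python
import math

def mean_energy_powerlaw(rp, T=1.0, n=200_000):
    """<eps> for the non-Lorentzian power law w ∝ (1 + eps/T)^(-r')
    with 3D density of states eps^(1/2); eps = t/(1-t) maps [0, inf)."""
    norm = mom = 0.0
    for i in range(1, n):
        t = i / n
        eps = t / (1.0 - t)
        w = math.sqrt(eps) * (1.0 + eps / T) ** (-rp) / (1.0 - t) ** 2
        norm += w
        mom += eps * w
    return mom / norm

# closed form: <eps> = (3/2) T/(r' - 5/2); r' = 7/2 gives exactly (3/2) T
assert abs(mean_energy_powerlaw(3.5) - 1.5) < 5e-3
assert abs(mean_energy_powerlaw(4.5) - 0.75) < 5e-3
```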

3. Gibbs-Lorentzian Statistical Mechanics

The form of the generalized Gibbsian probability distribution inhibits the use of the Gibbsian definition of entropy S = −⟨log w⟩ as the phase space average ⟨…⟩ ≡ ∫ … dΓ of the logarithm of w(ϵ). Instead, another form of entropy has to be found enabling construction of a generalized thermodynamics in agreement with the fundamental thermodynamic definitions. Forms have been proposed in Tsallis' q statistical mechanics [8, 9] and in the Generalized Lorentzian thermodynamics [5]. Adopting the latter version we define the functional

g[w] = exp{ −κ [ (A/w_κ)^{1/(κ+r)} − 1 ] + log A },    (16)

whose logarithmic expectation value leads to the entropy S = −⟨log g[w]⟩. This particular version is chosen in agreement with Equation (7), reconciling with thermodynamics by adding the additional normalization constant A. Clearly, log g is related to the inverse function G^{−1}[w_κ/A] in this case. Substituting w_{iκ}(ϵ_i) and g[w_{iκ}(ϵ_i)] into the ensemble average Equation (7) yields

S = −log A + ⟨E⟩/T.    (17)

The thermodynamic relation ⟨E⟩ = TS + F identifies F = T log A as the free energy. The generalized canonical Gibbs κ-probability distribution then reads

w_i^κ = exp(F/T) (1 + ϵ_i/κT)^{−κ−r}.    (18)

Since A is the normalization of w_i^κ, one also has ∑_i w_i^κ = 1 and, hence, for the free energy

F = −T log ∫ dΓ [1 + ϵ(p, x)/κT]^{−κ−r} = −T log Z_κ,    (19)

with dΓ = d³p dV/(2πħ)³ the phase space volume element. From the last expression we immediately read off the generalized Gibbsian version of the classical canonical partition function

Z_κ ≡ ∫ dΓ [1 + ϵ(p, x)/κT]^{−κ−r}.    (20)

In the quantum case the integral becomes a sum over all quantum states i:

Z_κ = ∑_i (1 + ϵ_i/κT)^{−κ−r}.    (21)

This completes the discussion for a system with fixed particle number, since all statistical mechanical information is contained in the partition function Z_κ.
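As a minimal numerical sketch (ours; the discrete energy levels are hypothetical), the canonical κ-probabilities of Equation (18), built with F = −T log Z_κ from Equation (19), are automatically normalized:

```python
import math

kappa, r, T = 4.0, 2.5, 1.0
levels = [0.1 * i for i in range(200)]   # illustrative spectrum eps_i

# canonical kappa partition function, Eq. (21)
Z = sum((1.0 + e / (kappa * T)) ** (-kappa - r) for e in levels)
F = -T * math.log(Z)                     # free energy, Eq. (19)

# probabilities of Eq. (18); exp(F/T) = 1/Z_kappa
w = [math.exp(F / T) * (1.0 + e / (kappa * T)) ** (-kappa - r)
     for e in levels]
assert abs(sum(w) - 1.0) < 1e-12         # normalization is automatic
```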

Allowing for a variable particle number N and again making use of the chemical potential μ = −T(∂S/∂N)_{E,V}, the normalization condition becomes

T log A = F − μ⟨N⟩ ≡ Ω,    (22)

with ⟨N⟩ the average particle number and Ω the thermodynamic potential. With F = μN + Ω for the Nth subsystem of particle number N, the generalized Gibbs distribution reads

w_{κ,N} = exp(Ω_κ/T) [1 + (ϵ_{iN} − μN)/κT]^{−κ−r},    (23)

repeating that N is the index of the subsystems of different particle numbers. Normalization requires summing over these N subsystems, a procedure not affecting Ω and thus yielding

Ω_κ = −T log ∑_N ∑_i [1 + (ϵ_{iN} − μN)/κT]^{−κ−r} = −T log 𝒵_κ.    (24)

The argument of the logarithm is the grand partition function 𝒵_κ = ∑_N Z_{κ,N}, the sum over the partition functions of the separate subsystems of different particle number N. For classical systems the inner sum over states i again becomes the Nth phase space integral. All thermodynamic information is contained in it.

4. Indistinguishable Particles

The most interesting case is that of N indistinguishable subsystems (particles) in states i in an ideal gas (again, for non-ideal gases one simply adds the external and interaction potentials). Then the sum over N in the logarithm is understood as an exponentiation, yielding for the free energy

F = −NT log ∫ [d³p dV exp(1)/N(2πħ)³] [1 + (ϵ_p − μ)/κT]^{−κ−r},    (25)

an expression to be used for the determination of the common chemical potential μ from μ = ∂F/∂N, yielding, with r = 5/2,

μ/T = −log[ (V exp(1)/N λ_κ³) B(3/2, κ + 1) (1 − μ/κT)^{−(κ+1)} ].    (26)

The logarithmic dependence on μ is weak, producing a small correction term of order O(κ²T²/|μ|²). Hence the chemical potential of the ideal classical κ-gas of indistinguishable particles is essentially the same as that of the ordinary ideal classical gas.

4.1. The Boltzmann Case

Following the Gibbsian philosophy we consider particles in a given state i and write for the index N = n_i, with n_i the occupation number of state i, and for the energy ϵ_{iN} = n_i ϵ_i. Then

Ω_{κi} = −T log ∑_{n_i} [1 + n_i(ϵ_i − μ)/κT]^{−κ−5/2},    (27)

and the Gibbsian probability distribution for the occupation numbers reads

w_{κn_i} = exp(Ω_{κi}/T) [1 + n_i(ϵ_i − μ)/κT]^{−(κ+5/2)}.    (28)

The probability for a state to be empty is obtained for n_i = 0. Thus w_{κ0} = exp(Ω_{κi}/T) is identical to the non-κ case of zero occupation. At high temperatures the occupation numbers of states are generally small. Hence the average occupation ⟨n_i⟩ is obtained for n_i = O(1) and Ω_{κi}/T ≪ 1, yielding

⟨n_i⟩ = ∑_{n_i} n_i w_{κn_i} ≈ w_{κ1} ∝ [1 + (ϵ_i − μ)/κT]^{−κ−5/2}.    (29)

This is the κ-equivalent of the Boltzmann distribution of occupation numbers of states in an ideal gas (no external interaction potential) of variable particle number, with μ < 0 given in Equation (26). Inspection suggests that this distribution of occupations is approximately flat for ϵ_i < |μ|. For constant particle number it instead becomes Equation (14), the ordinary (canonical) one-particle κ distribution [10]. In going from the occupation of states to the distribution function, normalization is to the particle density N/V. Extracting 1 + |μ|/κT from the integral, the contribution of μ can, as in the Boltzmann case, be absorbed into the normalization constant, however with energy in units of κT + |μ|. Since |μ| is large, this distribution is flat for κT < |μ|, being of interest only at large T.
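A quick numerical check (our sketch; the parameter values are illustrative) confirms the step from Equation (28) to Equation (29), ⟨n_i⟩ ≈ w_{κ1}, when occupations are small:

```python
import math

kappa, r = 4.0, 2.5
y = 10.0   # stands for (eps_i - mu)/kappa T, assumed large (small occupation)

# unnormalized weights of Eq. (28) for occupation numbers n_i = 0, 1, 2, ...
terms = [(1.0 + n * y) ** (-kappa - r) for n in range(0, 50)]
Z = sum(terms)               # plays the role of exp(-Omega_i/T)
w = [t / Z for t in terms]   # probability of occupation number n_i

n_avg = sum(n * w[n] for n in range(len(w)))
# mean occupation agrees with w_1 to a few per cent in this regime
assert abs(n_avg - w[1]) < 0.05 * w[1]
```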

4.2. The Fermi Case

From Equation (27), the Fermi assumption that any energy state can host at most one particle restricts the occupation numbers to n_i = 0, 1, yielding

Ω_{κi}^F = −T log { 1 + [1 + (ϵ_i − μ)/κT]^{−(κ+5/2)} }.    (30)

The average occupation number of states follows as

⟨n_i⟩_{κF} = −∂Ω_{κi}^F/∂μ = { 1 + [1 + (ϵ_i − μ)/κT]^{κ+5/2} }^{−1}.    (31)

Normalization yields the total particle number N. For ϵ_i > μ both Ω_{κi}^F and ⟨n_i⟩_{κF} vanish when T → 0. On the other hand, any energy level below μ cannot be occupied when the temperature vanishes, for the reason that the distribution must be real.

One concludes that T = 0 is not accessible to the ideal-gas κ-Fermi distribution. This is in accord with the idea that correlations in phase space imply complicated dynamics and, hence, finite temperature. The Fermi-κ distribution does not define any Fermi energy, since no degenerate states exist. At T > 0 any positive chemical potential would be bound by μ < κT, not providing any new information. Extension to the non-ideal case is straightforward by adding an interaction potential.
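Numerically (our illustration), the κ-Fermi occupation of Equation (31) reduces to the ordinary Fermi-Dirac distribution in the limit κ → ∞:

```python
import math

def n_fermi_kappa(x, kappa, r=2.5):
    """kappa-Fermi occupation, Eq. (31); x = (eps - mu)/T, valid for x > -kappa."""
    return 1.0 / (1.0 + (1.0 + x / kappa) ** (kappa + r))

def n_fermi(x):
    """ordinary Fermi-Dirac occupation 1/(1 + exp[(eps - mu)/T])."""
    return 1.0 / (1.0 + math.exp(x))

for x in (0.1, 1.0, 3.0):
    assert abs(n_fermi_kappa(x, 1e6) - n_fermi(x)) < 1e-4
```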

4.3. Bose-Einstein Case

When the occupation number of states is arbitrary, summation over all occupations from n_i = 0 to n_i = ∞ is in place. Again, the chemical potential μ < 0 is negative. The Bose-Einstein-κ distribution becomes

⟨n_i⟩_{κBE} = (1 + 5/2κ) ∑_{n_i=1}^∞ n_i [1 + n_i(ϵ_i − μ)/κT]^{−κ−7/2} / { 1 + ∑_{n_i=1}^∞ [1 + n_i(ϵ_i − μ)/κT]^{−κ−5/2} }.    (32)

Its low temperature behavior is again determined by the chemical potential μ. It is readily shown that in the limit T → 0 the distribution vanishes for all states ϵ_i ≠ 0 and for all μ < 0. There is no Bose-Einstein condensation on the lowest energy level ϵ_0 = 0, as seen by taking the limits μ → 0 and T → 0. The distribution applies to finite T only.
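The vanishing of the κ-Bose occupation as T → 0 can be verified directly from the sums in Equation (32); the following sketch (ours, with illustrative values κ = 4, μ = −0.5, ϵ_i = 1) truncates the sums at a large n:

```python
import math

def n_bose_kappa(eps, mu, T, kappa=4.0, r=2.5, nmax=2000):
    """kappa-Bose occupation of Eq. (32), sums truncated at nmax
    (terms decay like n^(-kappa-r), so truncation error is negligible)."""
    num = sum(n * (1.0 + n * (eps - mu) / (kappa * T)) ** (-kappa - r - 1)
              for n in range(1, nmax))
    den = 1.0 + sum((1.0 + n * (eps - mu) / (kappa * T)) ** (-kappa - r)
                    for n in range(1, nmax))
    return (1.0 + r / kappa) * num / den   # prefactor (1 + 5/2 kappa) for r = 5/2

occ = [n_bose_kappa(1.0, -0.5, T) for T in (1.0, 0.1, 0.01)]
assert occ[0] > occ[1] > occ[2]   # occupation decreases monotonically with T
assert occ[2] < 1e-10             # and vanishes as T -> 0
```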

Temporarily replacing all expressions of the kind (1 + x)→ exp x, all sums become geometric progressions and can be performed. Resolving all exponentials, one ultimately obtains the appropriate approximation for the Bose-Einstein-κ distribution

⟨n_i⟩_{κBE} ≈ { 1 − [1 + (ϵ_i − μ)/κT]^{−κ−5/2} } × (1 + 5/2κ) [1 + (ϵ_i − μ)/κT]^{−κ−7/2} / { 1 − [1 + (ϵ_i − μ)/κT]^{−κ−7/2} }².    (33)

5. Conclusions

A recipe is provided for constructing equilibrium statistical mechanics from arbitrary functionals G(S) with the universal "Gibbsian" phase space probability distribution Equation (5). Provided the inverse functional G^{−1} exists, the generalized entropy follows as its ensemble average (expectation value), allowing for the formulation of an equilibrium statistical mechanics. This form of entropy extends and generalizes the Gibbs-Boltzmann-Shannon definition. It can be extended to complex functions G(S) and complex probabilities under the requirement that the entropy obtained from Equation (7) is real. This version of entropy might be applicable not only in physics, but also in information theory, maximum entropy methods in data analysis, and possibly even in the economic and social sciences.

As for an example we revisited the Gibbs-Lorentzian statistical mechanics [11] which leads to κ-distributions of energetic particles. Such distributions result from wave particle interaction in plasmas [6, 12] and have been observed in the heliosphere [13, 14]. They also apply to observed cosmic ray spectra.

Gibbs-Lorentzian statistical mechanics is restricted to finite, nonzero temperatures, excluding vanishing absolute temperature. It thus categorically forbids negative absolute temperatures T < 0, as they would require cooling across the non-existing state T = 0. Since κ → ∞ reproduces classical and quantum statistical mechanics for all T, this conclusion provides another proof of the nonexistence of negative absolute temperatures following from Gibbsian theory, supporting a recent proof [15] of this fact.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Acknowledgments

Rudolf A. Treumann acknowledges the hospitality of the ISSI staff and support by the ISSI librarians Andrea Fischer and Irmela Schweizer. We are grateful for the friendly critical comments of the two referees.


References

1. Landau LD, Lifschitz EM. Statistical Physics, Vol. 5, Chapter 3. Oxford: Pergamon Press (1994).

2. Huang K. Statistical Mechanics, 2nd Edn., Chapter 6. New York, NY: John Wiley & Sons (1987).

3. Treumann RA, Baumjohann W. Gibbsian approach to statistical mechanics yielding power law distributions. (2014) arXiv:1406.6639 [cond-mat.stat-mech].

4. Treumann RA, Baumjohann W. Superdiffusion revisited in view of collisionless reconnection. (2014) arXiv:1401.2519 [physics].

5. Treumann RA. Generalized-Lorentzian thermodynamics. Phys Scripta (1999) 59:204. doi: 10.1238/Physica.Regular.059a00204

6. Yoon PH, Ziebell LF, Gaelzer R, Lin RP, Wang L. Langmuir turbulence and suprathermal electrons. Space Sci Rev. (2012) 173:459. doi: 10.1007/s11214-012-9867-3

7. Livadiotis G, McComas DJ. Beyond kappa distributions: Exploiting Tsallis statistical mechanics in space plasmas. J Geophys Res. (2009) 114:11105. doi: 10.1029/2009JA014352

8. Tsallis C. Possible generalization of Boltzmann-Gibbs statistics. J Stat Phys. (1988) 52:479. doi: 10.1007/BF01016429

9. Gell-Mann M, Tsallis C, editors. Nonextensive Entropy: Interdisciplinary Applications. Oxford, UK: Oxford University Press (2004).

10. Livadiotis G, McComas DJ. Understanding kappa distributions: a toolbox for space science and astrophysics. Space Sci Rev. (2013) 175:183. doi: 10.1007/s11214-013-9982-9

11. Treumann RA, Jaroschek CH. Gibbsian theory of power-law distributions. Phys Rev Lett. (2008) 100:155005. doi: 10.1103/PhysRevLett.100.155005

12. Hasegawa A, Mima K, Duong-van M. Plasma distribution function in a superthermal radiation field. Phys Rev Lett. (1985) 54:2608. doi: 10.1103/PhysRevLett.54.2608

13. Christon S, Williams DJ, Mitchell DG, Huang CY, Frank LA. Spectral characteristics of plasma sheet ion and electron populations during disturbed geomagnetic conditions. J Geophys Res. (1991) 96:1. doi: 10.1029/90JA01633

14. Gloeckler G, Fisk L. Anisotropic beams of energetic particles upstream from the termination shock of the solar wind. Astrophys J. (2006) 648:L63. doi: 10.1086/507841

15. Dunkel J, Hilbert S. Consistent thermostatistics forbids negative absolute temperatures. Nature Phys. (2014) 10:67. doi: 10.1038/nphys2815

Keywords: statistical mechanics, entropy, generalized-Lorentzian distributions, cosmic ray spectra, information theory, maximum entropy

Citation: Treumann RA and Baumjohann W (2014) Beyond Gibbs-Boltzmann-Shannon: general entropies—the Gibbs-Lorentzian example. Front. Phys. 2:49. doi: 10.3389/fphy.2014.00049

Received: 08 July 2014; Paper pending published: 21 July 2014;
Accepted: 27 July 2014; Published online: 14 August 2014.

Edited by:

Vladislav Izmodenov, Space Research Institute (IKI) Russian Academy of Sciences, Russia

Reviewed by:

Michael S. Ruderman, University of Sheffield, UK
David R. Shklyar, Space Research Institute (IKI) Russian Academy of Sciences, Russia

Copyright © 2014 Treumann and Baumjohann. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Rudolf A. Treumann, International Space Science Institute, Hallerstrasse 6, CH-3012 Bern, Switzerland e-mail: