October 4, 2004
Many physicists think that the maximum entropy formalism is a straightforward application of Bayesian statistical ideas to statistical mechanics. Some even say that statistical mechanics is just the general Bayesian logic of inductive inference applied to large mechanical systems. This approach identifies thermodynamic entropy with the information-theoretic uncertainty of an (ideal) observer's subjective distribution over a system's microstates. In this brief note, I show that this postulate, plus the standard Bayesian procedure for updating probabilities, implies that the entropy of a classical system is monotonically non-increasing on the average -- the Bayesian statistical mechanic's arrow of time points backwards. Avoiding this unphysical conclusion requires rejecting the ordinary equations of motion, or practicing an incoherent form of statistical inference, or rejecting the identification of uncertainty and thermodynamic entropy.
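In sketch, the argument runs as follows (notation mine, compressing the abstract's claim): Liouville's theorem keeps the Gibbs-Shannon entropy $S[\rho_t] = -\int \rho_t \ln \rho_t \, d\Gamma$ of the observer's phase-space density constant under Hamiltonian flow, $S[\rho_t] = S[\rho_0]$, while Bayesian conditioning on any measurement outcome $Y$ obeys $\mathbb{E}_Y\, S[\rho \mid Y] = S[\rho] - I(X;Y) \le S[\rho]$, since mutual information is non-negative. Alternating evolution with updates therefore yields an expected entropy that can only stay flat or fall -- the backwards arrow of the title.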
Similar papers
May 13, 2019
What is the physical origin of the arrow of time? It is a commonly held belief in the physics community that it relates to the increase of entropy as it appears in the statistical interpretation of the second law of thermodynamics. At the same time, the subjective information-theoretical interpretation of probability, and hence entropy, is a standard viewpoint in the foundations of statistical mechanics. In this article, it is argued that the subjective interpretation is inco...
December 4, 2015
We examine the fundamental aspects of statistical mechanics, dividing the problem into a discussion purely about probability, which we analyse from a Bayesian standpoint. We argue that the existence of a unique maximising probability distribution $\{p(j\vert K)\}$ for states labelled by $j$ given data $K$ implies that the corresponding maximal value of the information entropy $\sigma(\{p(j\vert K)\}) = -\sum_j p(j\vert K)\ln p(j\vert K)$ depends explicitly on the data at...
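As a concrete illustration of that data-dependence, here is a minimal numerical sketch (the states, energies, and constraint are invented for illustration): discrete states $j$ with energies $E_j$, the datum $K$ taken to be a prescribed mean energy, and the MaxEnt distribution obtained by solving for the Lagrange multiplier.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical setup: four states j with energies E_j; the datum K is a mean energy.
E = np.array([0.0, 1.0, 2.0, 3.0])

def maxent(mean_E):
    """MaxEnt distribution p(j|K) under the single constraint <E> = mean_E."""
    def gap(beta):
        w = np.exp(-beta * E)
        return w @ E / w.sum() - mean_E
    beta = brentq(gap, -50.0, 50.0)   # Lagrange multiplier enforcing the datum
    p = np.exp(-beta * E)
    return p / p.sum()

for mean_E in (0.5, 1.5, 2.5):
    p = maxent(mean_E)
    sigma = -np.sum(p * np.log(p))    # information entropy sigma({p(j|K)})
    print(f"K: <E> = {mean_E} -> sigma = {sigma:.4f}")
# The maximal entropy changes with the data K, as the abstract argues.
```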
May 27, 2016
In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real w...
October 9, 2023
Statistical physics aims to describe properties of macroscale systems in terms of distributions of their microscale agents. Its central tool is the maximization of entropy, a variational principle. We review the history of this principle, first considered as a law of nature, more recently as a procedure for inference in model-making. And while equilibria (EQ) have long been grounded in the principle of Maximum Entropy (MaxEnt), until recently no equally foundational generativ...
July 31, 2008
These lectures deal with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty? Or, at least, are there rules that could in principle be followed by an ideally rational mind when discussing scientific matters? What makes one statement more plausible than another? How much more plausible? And then, when new information is acquired how do we change our minds? Or, to pu...
August 10, 2015
We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically-evolving sys...
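One way to see the tension the abstract describes, as a toy Bayes-filter sketch (the two-state model, matrices, and measurement schedule are invented for illustration): between measurements the observer's entropy drifts upward as knowledge of the stochastically evolving state degrades, and each Bayesian update pulls it back down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hidden Markov model: a two-state system seen through a noisy sensor.
T = np.array([[0.9, 0.1],
              [0.1, 0.9]])          # T[i, j] = P(next state i | current state j)
L = np.array([[0.8, 0.2],
              [0.2, 0.8]])          # L[y, x] = P(observation y | state x)

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

x = 0                               # true (hidden) state
p = np.array([0.5, 0.5])            # observer's belief
for t in range(12):
    x = rng.choice(2, p=T[:, x])    # system evolves stochastically
    p = T @ p                       # prediction: entropy non-decreasing (T is doubly stochastic)
    if t % 3 == 2:                  # occasional measurement
        y = rng.choice(2, p=L[:, x])
        p = L[y] * p / (L[y] @ p)   # Bayes update: entropy drops on average
    print(f"t = {t}: H(belief) = {H(p):.3f}")
```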
June 2, 2021
The second law of thermodynamics is asymmetric with respect to time as it says that the entropy of the universe must have been lower in the past and will be higher in the future. How this time-asymmetric law arises from the time-symmetric equations of motion has been the subject of extensive discussion in the scientific literature. The currently accepted resolution of the problem is to assume that the universe began in a low entropy state for an unknown reason. But the probab...
April 8, 2020
I will argue, pace a great many of my contemporaries, that there's something right about Boltzmann's attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what's right about (what was at least at one time) Boltzmann's explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Bolt...
September 26, 2017
Despite a century of studies and debates, the statistical origin of the second law of thermodynamics remains elusive. One essential obstacle is the lack of a proper theoretical formalism for non-equilibrium entropy. Here I revisit the seminal ideas about non-equilibrium statistical entropy due to Boltzmann and due to Gibbs, and synthesize them into a coherent and precise framework. Using this framework, I clarify the anthropomorphic principle of entropy, and an...
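For reference, in standard notation these are the Boltzmann entropy $S_B(x) = k_B \ln |\Gamma_{M(x)}|$, the log-volume of the macrostate region to which the microstate $x$ belongs, and the Gibbs entropy $S_G[\rho] = -k_B \int \rho \ln \rho \, d\Gamma$ of an ensemble density $\rho$; the abstract proposes a synthesis of the two.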
September 26, 2005
In statistical thermodynamics the second law is properly spelled out in terms of conditional probabilities. As such, it makes the statement that `entropy increases with time' without preferring a time direction. In this paper I try to explain this statement--which has been well known since the time of the Ehrenfests--in some detail within a systematic Bayesian approach.
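The Ehrenfest urn model makes that time-symmetric reading concrete. A minimal simulation (parameters chosen only for illustration): run the urn at equilibrium, condition on moments when the occupation fluctuates far from $N/2$, and compare the state a few steps before and after those moments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ehrenfest urn: N balls; each step a uniformly chosen ball switches urns.
N, steps, lag = 50, 200_000, 20
n = N // 2                          # start at equilibrium
traj = np.empty(steps, dtype=int)
for t in range(steps):
    traj[t] = n
    n += 1 if rng.random() >= n / N else -1   # a ball leaves urn A with prob n/N

# Condition on rare low-entropy moments (occupation far from N/2), then look
# the same number of steps forwards and backwards in time.
idx = np.where(traj[lag:-lag] >= 35)[0] + lag

def dev(i):
    return np.abs(traj[i] - N / 2).mean()

print(f"mean |n - N/2|: now = {dev(idx):.2f}, "
      f"{lag} steps later = {dev(idx + lag):.2f}, "
      f"{lag} steps earlier = {dev(idx - lag):.2f}")
# Both temporal neighbours sit closer to equilibrium: conditioned on a
# fluctuation, entropy increases in either time direction -- no preferred arrow.
```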