ID: cond-mat/0410063

The Backwards Arrow of Time of the Coherently Bayesian Statistical Mechanic

October 4, 2004

Cosma Rohilla Shalizi
Condensed Matter
Statistical Mechanics

Many physicists think that the maximum entropy formalism is a straightforward application of Bayesian statistical ideas to statistical mechanics. Some even say that statistical mechanics is just the general Bayesian logic of inductive inference applied to large mechanical systems. This approach identifies thermodynamic entropy with the information-theoretic uncertainty of an (ideal) observer's subjective distribution over a system's microstates. In this brief note, I show that this postulate, plus the standard Bayesian procedure for updating probabilities, implies that the entropy of a classical system is monotonically non-increasing on the average -- the Bayesian statistical mechanic's arrow of time points backwards. Avoiding this unphysical conclusion requires rejecting the ordinary equations of motion, or practicing an incoherent form of statistical inference, or rejecting the identification of uncertainty and thermodynamic entropy.
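The core tension is easy to demonstrate numerically: if thermodynamic entropy is identified with the Shannon entropy of the observer's distribution, then standard Bayesian conditioning on a measurement outcome can never increase that entropy on average, since H(X|Y) ≤ H(X). A minimal sketch of this inequality (not from the paper; the prior and measurement channel here are invented for illustration):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)

# Observer's subjective prior over 8 microstates (arbitrary example).
prior = rng.dirichlet(np.ones(8))

# A noisy 3-outcome measurement: likelihood[k, j] = P(outcome k | state j).
likelihood = rng.dirichlet(np.ones(3), size=8).T  # shape (3, 8); columns sum to 1

# Bayesian update for each possible outcome.
evidence = likelihood @ prior                        # P(outcome k)
posteriors = likelihood * prior / evidence[:, None]  # P(state j | outcome k)

H_prior = shannon_entropy(prior)
H_post_avg = sum(evidence[k] * shannon_entropy(posteriors[k]) for k in range(3))

# Conditioning never increases entropy on average: the "entropy" of the
# Bayesian statistical mechanic drifts downward with each measurement.
assert H_post_avg <= H_prior + 1e-12
```

Since Liouvillean (volume-preserving) evolution leaves the Shannon entropy unchanged between measurements, repeated updates make the expected entropy monotonically non-increasing, which is the backwards arrow the abstract describes.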

Similar papers

On the statistical arrow of time

May 13, 2019

91% Match
Andreas Henriksson
Statistical Mechanics

What is the physical origin of the arrow of time? It is a commonly held belief in the physics community that it relates to the increase of entropy as it appears in the statistical interpretation of the second law of thermodynamics. At the same time, the subjective information-theoretical interpretation of probability, and hence entropy, is a standard viewpoint in the foundations of statistical mechanics. In this article, it is argued that the subjective interpretation is inco...


Probabilistic Foundations of Statistical Mechanics: A Bayesian Approach

December 4, 2015

90% Match
B. Buck, A. C. Merchant
Statistical Mechanics

We examine the fundamental aspects of statistical mechanics, dividing the problem into a discussion purely about probability, which we analyse from a Bayesian standpoint. We argue that the existence of a unique maximising probability distribution $\{p(j\vert K)\}$ for states labelled by $j$ given data $K$ implies that the corresponding maximal value of the information entropy $\sigma(\{p(j\vert K)\}) = -\sum_j p(j\vert K)\ln p(j\vert K)$ depends explicitly on the data at...
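The dependence of the maximal entropy on the data $K$ can be seen in a toy calculation: maximizing $\sigma$ subject to a mean-value constraint yields a Gibbs-form distribution, and different data give different maximal entropies. A hedged sketch, with invented energy levels and constraint values (not from the paper):

```python
import math

def maxent_distribution(energies, mean_energy, lo=-50.0, hi=50.0, tol=1e-12):
    """MaxEnt under a mean constraint: p_j ∝ exp(-beta*E_j).
    Bisect on beta so that <E> matches the datum (mean_at is decreasing in beta)."""
    def mean_at(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, energies)) / z
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_energy:
            lo = mid  # mean too high: need larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

def sigma(p):
    """Information entropy of the maximising distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

E = [0.0, 1.0, 2.0, 3.0]           # invented energy levels
p1 = maxent_distribution(E, 0.8)   # datum K1: <E> = 0.8
p2 = maxent_distribution(E, 1.5)   # datum K2: <E> = 1.5 (the unconstrained mean)
```

Here sigma(p2) recovers the uniform-distribution maximum ln 4, while the more informative datum K1 pins the distribution down further and yields a strictly smaller sigma(p1), illustrating the abstract's point that the maximal entropy depends explicitly on the data.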


Maximum information entropy principle and the interpretation of probabilities in statistical mechanics - a short review

May 27, 2016

90% Match
Domagoj Kuic
Statistical Mechanics

In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real w...


The foundations of statistical physics: entropy, irreversibility, and inference

October 9, 2023

89% Match
Jonathan Asher Pachter, Ying-Jen Yang, Ken A. Dill
Statistical Mechanics

Statistical physics aims to describe properties of macroscale systems in terms of distributions of their microscale agents. Its central tool is the maximization of entropy, a variational principle. We review the history of this principle, first considered as a law of nature, more recently as a procedure for inference in model-making. And while equilibria (EQ) have long been grounded in the principle of Maximum Entropy (MaxEnt), until recently no equally foundational generativ...


Lectures on Probability, Entropy, and Statistical Physics

July 31, 2008

89% Match
Ariel Caticha
physics.data-an
cond-mat.stat-mech
cs.IT
math.IT
math.ST
physics.gen-ph
stat.TH

These lectures deal with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty? Or, at least, are there rules that could in principle be followed by an ideally rational mind when discussing scientific matters? What makes one statement more plausible than another? How much more plausible? And then, when new information is acquired how do we change our minds? Or, to pu...


The Bayesian Second Law of Thermodynamics

August 10, 2015

89% Match
Anthony Bartolotta, Sean M. Carroll, ... , Jason Pollack
Statistical Mechanics

We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically-evolving sys...


The arrow of time and a-priori probabilities

June 2, 2021

89% Match
Sivapalan Chelvaniththilan
Statistical Mechanics

The second law of thermodynamics is asymmetric with respect to time as it says that the entropy of the universe must have been lower in the past and will be higher in the future. How this time-asymmetric law arises from the time-symmetric equations of motion has been the subject of extensive discussion in the scientific literature. The currently accepted resolution of the problem is to assume that the universe began in a low entropy state for an unknown reason. But the probab...


In Praise of Clausius Entropy: Reassessing the Foundations of Boltzmannian Statistical Mechanics

April 8, 2020

88% Match
Christopher Gregory Weaver
History and Philosophy of Physics
Classical Physics

I will argue, pace a great many of my contemporaries, that there's something right about Boltzmann's attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what's right about (what was at least at one time) Boltzmann's explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Bolt...


Definitions and Evolutions of Statistical Entropy for Hamiltonian Systems

September 26, 2017

88% Match
Xiangjun Xing (School of Physics and Astronomy, Shanghai Jiao Tong University)
Statistical Mechanics

Despite a century of studies and debates, the statistical origin of the second law of thermodynamics remains elusive. One essential obstacle is the lack of a proper theoretical formalism for non-equilibrium entropy. Here I revisit the seminal ideas about non-equilibrium statistical entropy due to Boltzmann and due to Gibbs, and synthesize them into a coherent and precise framework. Using this framework, I clarify the anthropomorphic principle of entropy, and an...


On the Statistical Viewpoint Concerning the 2nd Law of Thermodynamics: A Reminder on the Ehrenfests' Urn Model

September 26, 2005

88% Match
Domenico Giulini
General Physics

In statistical thermodynamics the 2nd law is properly spelled out in terms of conditional probabilities. As such it makes the statement that `entropy increases with time' without preferring a time direction. In this paper I try to explain this statement--which has been well known since the time of the Ehrenfests--in some detail within a systematic Bayesian approach.
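The Ehrenfests' urn model makes the conditional statement concrete: conditioned on a low-entropy macrostate, the Boltzmann entropy increases on average under the dynamics, and by the chain's statistical time-symmetry the same holds running the movie backwards. A minimal simulation sketch (the parameters are invented for illustration):

```python
import math
import random

def ehrenfest_step(n, N, rng):
    """One step: a ball chosen uniformly among all N moves to the other urn.
    n is the number of balls currently in urn A."""
    return n - 1 if rng.random() < n / N else n + 1

def boltzmann_entropy(n, N):
    """Log of the number of microstates compatible with macrostate n."""
    return math.log(math.comb(N, n))

rng = random.Random(42)
N, steps, runs = 100, 200, 400

S0 = boltzmann_entropy(N, N)   # all balls in urn A: log C(N, N) = 0

final_S = []
for _ in range(runs):
    n = N                      # condition on the low-entropy macrostate
    for _ in range(steps):
        n = ehrenfest_step(n, N, rng)
    final_S.append(boltzmann_entropy(n, N))

# Averaged over runs, the entropy conditioned on the low-entropy start
# has grown toward its maximum near log C(N, N/2).
S_avg = sum(final_S) / runs
```

Because the stationary chain is reversible, conditioning on the same atypical macrostate and running the dynamics in the reversed time direction gives the same average entropy growth, which is the sense in which the 2nd law here prefers no time direction.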
