January 17, 2007
We develop the argument that the Gibbs-von Neumann entropy is the appropriate statistical mechanical generalisation of the thermodynamic entropy, for macroscopic and microscopic systems, whether in thermal equilibrium or not, as a consequence of Hamiltonian dynamics. The mathematical treatment utilises well-known results [Gib02, Tol38, Weh78, Par89], but most importantly, incorporates a variety of arguments on the phenomenological properties of thermal states [Szi25, TQ63, HK65, GB91] and of statistical distributions [HG76, PW78, Len78]. This enables the identification of the canonical distribution as the unique representation of thermal states without approximation or presupposing the existence of an entropy function. The Gibbs-von Neumann entropy is then derived, from arguments based solely on the addition of probabilities to Hamiltonian dynamics.
Similar papers
March 28, 2019
The Gibbs entropy of a macroscopic classical system is a function of a probability distribution over phase space, i.e., of an ensemble. In contrast, the Boltzmann entropy is a function on phase space, and is thus defined for an individual system. Our aim is to discuss and compare these two notions of entropy, along with the associated ensemblist and individualist views of thermal equilibrium. Using the Gibbsian ensembles for the computation of the Gibbs entropy, the two notio...
September 26, 2017
Despite a century of studies and debates, the statistical origin of the second law of thermodynamics still remains elusive. One essential obstacle is the lack of a proper theoretical formalism for non-equilibrium entropy. Here I revisit the seminal ideas about non-equilibrium statistical entropy due to Boltzmann and due to Gibbs, and synthesize them into a coherent and precise framework. Using this framework, I clarify the anthropomorphic principle of entropy, and an...
April 4, 2005
The paper develops a new approach to the justification of the Gibbs canonical distribution for Hamiltonian systems with a finite number of degrees of freedom. It uses the condition of nonintegrability of an ensemble of weakly interacting Hamiltonian systems.
March 27, 2023
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann (1955) and his argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fa...
June 25, 2012
A new axiomatic characterization with a minimum of conditions for entropy as a function on the set of states in quantum mechanics is presented. Traditionally unspoken assumptions are unveiled and replaced by proven consequences of the axioms. First the Boltzmann-Planck formula is derived. Building on this formula, using the Law of Large Numbers - a basic theorem of probability theory - the von Neumann formula is deduced. Axioms used in older theories on the foundations are no...
November 4, 2022
Introducing the Boltzmann distribution very early in a statistical thermodynamics course (in the spirit of Feynman) has many didactic advantages, in particular that of easily deriving the Gibbs entropy formula. In this note, a short derivation is proposed from the fundamental postulate of statistical mechanics and basic calculations accessible to undergraduate students.
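As a minimal numerical sketch of the two quantities this abstract connects (the energy levels and temperature below are illustrative, not taken from the paper), the Boltzmann distribution over a finite level set and the Gibbs entropy it yields can be computed as:

```python
import math

def boltzmann_distribution(energies, kT):
    """Boltzmann weights p_i = exp(-E_i / kT) / Z for a finite set of levels."""
    weights = [math.exp(-e / kT) for e in energies]
    Z = sum(weights)  # canonical partition function
    return [w / Z for w in weights]

def gibbs_entropy(p):
    """Dimensionless Gibbs entropy S/k = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Three illustrative energy levels (arbitrary units) at kT = 1
p = boltzmann_distribution([0.0, 1.0, 2.0], kT=1.0)
print(p, gibbs_entropy(p))
```

Lower-energy levels receive larger weights, and the resulting entropy lies between 0 (a single occupied level) and ln 3 (equal occupation of all three levels).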
December 29, 2005
Entropy is the distinguishing and most important concept of our efforts to understand and regularize our observations of a very large class of natural phenomena, and yet, it is one of the most contentious concepts of physics. In this article, we review two expositions of thermodynamics, one without reference to quantum theory, and the other quantum mechanical without probabilities of statistical mechanics. In the first, we show that entropy is an inherent property of any syst...
May 27, 2016
In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real w...
December 4, 2015
We examine the fundamental aspects of statistical mechanics, dividing the problem into a discussion purely about probability, which we analyse from a Bayesian standpoint. We argue that the existence of a unique maximising probability distribution $\{p(j\vert K)\}$ for states labelled by $j$ given data $K$ implies that the corresponding maximal value of the information entropy $\sigma(\{p(j\vert K)\}) = -\sum_j p(j\vert K)\ln p(j\vert K)$ depends explicitly on the data at...
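To illustrate the information entropy $\sigma$ appearing in this abstract (a generic sketch, not code from the paper): with no data beyond normalisation, the uniform distribution over $n$ states maximises $\sigma$, attaining $\ln n$; any constraint from data $K$ can only lower this maximum.

```python
import math
import random

def info_entropy(p):
    """Information entropy sigma = -sum_j p_j ln p_j."""
    return -sum(pj * math.log(pj) for pj in p if pj > 0)

n = 4
sigma_max = info_entropy([1.0 / n] * n)  # uniform distribution attains ln(n)

# Spot-check: random normalised distributions never exceed the uniform value
random.seed(0)
for _ in range(1000):
    raw = [random.random() for _ in range(n)]
    total = sum(raw)
    p = [r / total for r in raw]
    assert info_entropy(p) <= sigma_max + 1e-12

print(round(sigma_max, 4))
```

The printed value equals $\ln 4 \approx 1.3863$; constrained maximisation (e.g. a fixed mean energy) singles out non-uniform distributions with strictly smaller $\sigma$.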
April 4, 2020
In this work we generalize and combine Gibbs and von Neumann approaches to build, for the first time, a rigorous definition of entropy for hybrid quantum-classical systems. The resulting function coincides with the two cases above when the suitable limits are considered. Then, we apply the MaxEnt principle for this hybrid entropy function and obtain the natural candidate for the Hybrid Canonical Ensemble (HCE). We prove that the suitable classical and quantum limits of the HC...