January 17, 2007
Similar papers
March 30, 2000
Traditionally, the exponential canonical distributions of Gibbsian statistical mechanics are given theoretical justification in at least four different ways: the steepest-descent method, the counting method, Khinchin's method based on the central limit theorem, and Jaynes's maximum entropy principle. The equally ubiquitous power-law canonical distributions are shown to admit similar justification by appropriately adapting these formulations.
November 10, 2014
A question that is currently highly debated is whether the microcanonical entropy should be expressed as the logarithm of the phase volume (volume entropy, also known as the Gibbs entropy) or as the logarithm of the density of states (surface entropy, also known as the Boltzmann entropy). Rather than postulating these definitions and investigating the consequences of each, as is customary, here we adopt a bottom-up approach and construct the entropy expression within the microca...
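The volume/surface distinction can be made concrete with a toy calculation (an illustrative sketch, not taken from the paper above): for $N$ classical harmonic oscillators the phase volume grows as $\Omega(E) \propto E^N$, so the density of states is $\omega(E) = d\Omega/dE \propto N E^{N-1}$, and the two entropies differ only by terms that vanish per degree of freedom as $N \to \infty$.

```python
import math

def volume_entropy(E, N):
    # Gibbs (volume) entropy: S = ln Omega(E), with Omega(E) ∝ E^N
    # for N classical harmonic oscillators (E-independent constants dropped).
    return N * math.log(E)

def surface_entropy(E, N):
    # Boltzmann (surface) entropy: S = ln omega(E),
    # omega(E) = dOmega/dE ∝ N * E^(N-1).
    return (N - 1) * math.log(E) + math.log(N)

E, N = 2.0, 10**6
# The difference per degree of freedom, |ln(E/N)| / N, is negligible:
diff_per_dof = abs(volume_entropy(E, N) - surface_entropy(E, N)) / N
print(diff_per_dof)
```

This is why the debate is subtle: for macroscopic systems the two definitions are numerically indistinguishable, and only small or unusual systems can discriminate between them.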
June 26, 2012
The canonical ensemble describes an open system in equilibrium with a heat bath of fixed temperature. The probability distribution of such a system, the Boltzmann distribution, is derived from the uniform probability distribution of the closed universe consisting of the open system and the heat bath, by taking the limit where the heat bath is much larger than the system of interest. Alternatively, the Boltzmann distribution can be derived from the Maximum Entropy Principle, w...
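The bath-limit derivation sketched above can be checked numerically. The following is a minimal illustration (assuming, for concreteness, an Einstein-solid bath, which is not specified in the abstract): counting bath microstates for each system energy reproduces the Boltzmann weights $e^{-\beta \epsilon}$ once the bath is large.

```python
import math

def log_multiplicity(q, M):
    # Ways to share q energy quanta among M oscillators (Einstein solid):
    # W = C(q + M - 1, M - 1) = Gamma(q+M) / (Gamma(q+1) * Gamma(M)).
    return math.lgamma(q + M) - math.lgamma(q + 1) - math.lgamma(M)

# Closed "universe": a small system (levels n = 0..4) plus a large bath.
M_bath, q_total = 10**5, 10**5
levels = range(5)

# Uniform distribution over the universe => P(n) ∝ W_bath(q_total - n).
logw = [log_multiplicity(q_total - n, M_bath) for n in levels]
Z = sum(math.exp(lw - logw[0]) for lw in logw)
p = [math.exp(lw - logw[0]) / Z for lw in logw]

# Compare with the Boltzmann distribution at the bath's temperature,
# beta ≈ d(ln W_bath)/dq = ln((q + M) / q):
beta = math.log((q_total + M_bath) / q_total)
weights = [math.exp(-beta * n) for n in levels]
Zb = sum(weights)
p_boltz = [w / Zb for w in weights]
```

The exact counting and the exponential form agree to high accuracy here, and the agreement improves as the bath grows, which is the content of the limit described in the abstract.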
May 11, 2001
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations -- such as the identification of the state of a physi...
March 24, 2000
This paper is a non-technical, informal presentation of our theory of the second law of thermodynamics as a law that is independent of statistical mechanics and that is derivable solely from certain simple assumptions about adiabatic processes for macroscopic systems. It is not necessary to assume a-priori concepts such as "heat", "hot and cold", "temperature". These are derivable from entropy, whose existence we derive from the basic assumptions. See cond-mat/9708200 and mat...
November 8, 2018
The Gibbs paradox, in the context of statistical mechanics, addresses the additivity of the entropy of mixing of gases. The usual discussion attributes the paradoxical situation to the classical distinguishability of identical particles and credits quantum theory with enabling the indistinguishability of identical particles that resolves the problem. We argue that the indistinguishability of identical particles is already a feature of classical mechanics, and this is clearly brought out when the pr...
January 8, 2022
Statistical thermodynamics is valuable as a conceptual structure that shapes our thinking about equilibrium thermodynamic states. However, a cloud of unresolved questions surrounding the foundations of the theory could lead an impartial observer to conclude that statistical thermodynamics is in a state of crisis. Indeed, the discussion about the microscopic origins of irreversibility has continued in the scientific community for more than a hundred years. This paper considers t...
August 10, 2020
In the past several years, observational entropy has been developed as both a (time-dependent) quantum generalization of Boltzmann entropy, and as a rather general framework to encompass classical and quantum equilibrium and non-equilibrium coarse-grained entropy. In this paper we review the construction, interpretation, most important properties, and some applications of this framework. The treatment is self-contained and relatively pedagogical, aimed at a broad class of res...
August 31, 2004
We give a detailed analysis of the Gibbs-type entropy notion and its dynamical behavior for time-dependent continuous probability distributions of varied origins, related to both classical and quantum systems. The purpose-dependent usage of the conditional Kullback-Leibler and Gibbs (Shannon) entropies is explained for non-equilibrium Smoluchowski processes. The very different temporal behaviors of the Gibbs and Kullback entropies are contrasted. A specific conceptual niche is a...
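The contrast between the two entropies can be illustrated with a simple example (a sketch assuming Ornstein-Uhlenbeck relaxation of a Gaussian packet, a standard Smoluchowski process; the specific parameters are chosen for illustration). The Gibbs (differential) entropy tracks the width of the distribution and may rise or fall, while the Kullback-Leibler divergence to the equilibrium state decays monotonically, playing the role of a Lyapunov functional.

```python
import math

def gibbs_entropy(var):
    # Differential (Gibbs-Shannon) entropy of a 1D Gaussian: (1/2) ln(2*pi*e*var).
    return 0.5 * math.log(2 * math.pi * math.e * var)

def kl_gaussian(mu, var, mu_eq, var_eq):
    # Kullback-Leibler divergence D(p || p_eq) between two 1D Gaussians.
    return (0.5 * math.log(var_eq / var)
            + (var + (mu - mu_eq) ** 2) / (2 * var_eq) - 0.5)

# Ornstein-Uhlenbeck relaxation: mean and variance decay toward equilibrium.
mu0, var0, var_eq = 2.0, 4.0, 1.0
times = [0.1 * k for k in range(20)]
mus = [mu0 * math.exp(-t) for t in times]
variances = [var_eq + (var0 - var_eq) * math.exp(-2 * t) for t in times]

S = [gibbs_entropy(v) for v in variances]                 # here: decreases (packet narrows)
D = [kl_gaussian(m, v, 0.0, var_eq)                       # monotone decay to 0
     for m, v in zip(mus, variances)]
```

With an initially broad packet the Gibbs entropy decreases during relaxation, while the relative entropy decreases regardless of the initial condition, which is the kind of divergent temporal behavior the abstract refers to.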
August 14, 2007
The combinatorial basis of entropy, given by Boltzmann, can be written $H = N^{-1} \ln \mathbb{W}$, where $H$ is the dimensionless entropy, $N$ is the number of entities and $\mathbb{W}$ is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: $H=\kappa (\phi(\mathbb{W}) +C)$ and $D=-\kappa (\phi(\mathbb{P}) +C)$, where $\m...
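The definition $H = N^{-1} \ln \mathbb{W}$ can be checked directly (an illustrative sketch with arbitrarily chosen occupation numbers): for a multinomial weight $\mathbb{W} = N! / \prod_i n_i!$, Stirling's approximation makes $H$ converge to the Shannon form $-\sum_i p_i \ln p_i$ with $p_i = n_i/N$ as $N$ grows.

```python
import math

def combinatorial_entropy(counts):
    # H = N^{-1} ln W, with W = N! / prod(n_i!) (Boltzmann's counting),
    # evaluated via log-gamma to avoid huge factorials.
    N = sum(counts)
    logW = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)
    return logW / N

def shannon_entropy(counts):
    # Shannon form -sum p_i ln p_i with p_i = n_i / N.
    N = sum(counts)
    return -sum((n / N) * math.log(n / N) for n in counts if n > 0)

# For large N the two expressions agree to O(ln N / N):
counts = [500_000, 300_000, 200_000]
print(combinatorial_entropy(counts), shannon_entropy(counts))
```

The residual difference comes from the subleading Stirling terms, which is exactly the correction that the generalized $\phi(\mathbb{W})$ framework in the abstract is set up to keep track of.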