ID: quant-ph/0701127

The Physical Basis of the Gibbs-von Neumann entropy

January 17, 2007

Similar papers

Statistical Mechanical Foundations for Systems with Nonexponential Distributions

March 30, 2000

87% Match
A. K. Rajagopal, Sumiyoshi Abe
Statistical Mechanics

Traditionally the exponential canonical distributions of Gibbsian statistical mechanics are given theoretical justification in at least four different ways: steepest descent method, counting method, Khinchin's method based on the central limit theorem, and maximum entropy principle of Jaynes. Equally ubiquitous power-law canonical distributions are shown to be given similar justification by appropriately adopting these formulations.
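The maximum entropy route mentioned in the abstract can be checked numerically: maximizing the Shannon entropy subject to a fixed mean energy yields probabilities of the exponential form p_i ∝ exp(-β E_i). A minimal sketch (the energy levels and target mean are made-up illustration values, not from the paper):

```python
import math

# Jaynes' maximum-entropy principle: with a fixed mean energy constraint,
# the entropy-maximizing distribution is p_i = exp(-beta * E_i) / Z.
# We solve for beta by bisection and verify the exponential signature:
# successive probability ratios for equally spaced levels are all equal.

def canonical(energies, beta):
    """Canonical distribution p_i proportional to exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(canonical(energies, beta), energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    # mean_energy decreases monotonically in beta, so bisection converges
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

energies = [0.0, 1.0, 2.0, 3.0]        # made-up equally spaced levels
beta = solve_beta(energies, target=1.0)
p = canonical(energies, beta)
# for equally spaced levels, p[i+1]/p[i] = exp(-beta) for every i
ratios = [p[i + 1] / p[i] for i in range(3)]
```

The constant successive ratio is exactly the exponential (Gibbsian) form the abstract refers to; a power-law ansatz would fail this check.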

Construction of microcanonical entropy on thermodynamic pillars

November 10, 2014

87% Match
Michele Campisi
Statistical Mechanics

A question that is currently highly debated is whether the microcanonical entropy should be expressed as the logarithm of the phase volume (volume entropy, also known as the Gibbs entropy) or as the logarithm of the density of states (surface entropy, also known as the Boltzmann entropy). Rather than postulating them and investigating the consequence of each definition, as is customary, here we adopt a bottom-up approach and construct the entropy expression within the microca...
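The volume/surface distinction the abstract describes can be illustrated with a toy model (an assumption-laden sketch, not taken from the paper): for a system whose phase volume grows as Φ(E) = E^f, the Gibbs entropy ln Φ(E) and the Boltzmann entropy ln(dΦ/dE) imply slightly different temperatures that merge as the number of degrees of freedom f grows.

```python
# For Phi(E) = E**f: 1/T_G = d/dE ln(E**f) = f/E, while the density of
# states omega(E) = f * E**(f-1) gives 1/T_B = (f-1)/E. The two entropy
# definitions therefore disagree at order 1/f and coincide for large systems.

def gibbs_temperature(E, f):
    """Temperature from the volume (Gibbs) entropy S_G = ln Phi(E)."""
    return E / f

def boltzmann_temperature(E, f):
    """Temperature from the surface (Boltzmann) entropy S_B = ln omega(E)."""
    return E / (f - 1)

E = 1000.0
for f in (2, 10, 1_000_000):
    print(f, gibbs_temperature(E, f), boltzmann_temperature(E, f))
```

For f = 2 the two temperatures differ by a factor of two; for f = 10^6 they agree to a part in a million, which is why the disagreement only matters for small systems.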

Microcanonical Origin of the Maximum Entropy Principle for Open Systems

June 26, 2012

87% Match
Julian Lee
Statistical Mechanics

The canonical ensemble describes an open system in equilibrium with a heat bath of fixed temperature. The probability distribution of such a system, the Boltzmann distribution, is derived from the uniform probability distribution of the closed universe consisting of the open system and the heat bath, by taking the limit where the heat bath is much larger than the system of interest. Alternatively, the Boltzmann distribution can be derived from the Maximum Entropy Principle, w...
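The "large bath" limit described in the abstract can be checked in a toy model (an illustrative sketch under made-up assumptions, not from the paper): couple a two-level system to a bath whose microstate count grows as Ω(E) = E^f. In the microcanonical closed-universe picture, p(E_i) ∝ Ω(E_total − E_i), and as f → ∞ at fixed kT = E_total/f the probability ratio tends to the Boltzmann factor.

```python
import math

# Microcanonical counting for the bath: the probability of the system
# holding energy e_i is proportional to the number of bath microstates
# Omega(E_total - e_i) = (E_total - e_i)**f. The ratio below should
# approach exp(-(e1 - e0)/kT) as the bath grows.

def microcanonical_ratio(e0, e1, e_total, f):
    """p(e1)/p(e0) from counting bath microstates with Omega(E) = E**f."""
    return ((e_total - e1) / (e_total - e0)) ** f

kT = 1.0
for f in (10, 100, 10_000):
    e_total = f * kT                 # holds the bath "temperature" fixed
    ratio = microcanonical_ratio(0.0, 1.0, e_total, f)
    print(f, ratio, math.exp(-1.0 / kT))
```

With e0 = 0 and e1 = 1 the ratio is (1 − 1/f)^f, which converges to e^{−1}, i.e. the Boltzmann factor at kT = 1.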

Boltzmann's Approach to Statistical Mechanics

May 11, 2001

86% Match
Sheldon Goldstein
Statistical Mechanics
Mathematical Physics
History and Philosophy of Physics

In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations -- such as the identification of the state of a physi...

A Fresh Look at Entropy and the Second Law of Thermodynamics

March 24, 2000

86% Match
Elliott H. Lieb, Jakob Yngvason
Mathematical Physics

This paper is a non-technical, informal presentation of our theory of the second law of thermodynamics as a law that is independent of statistical mechanics and that is derivable solely from certain simple assumptions about adiabatic processes for macroscopic systems. It is not necessary to assume a-priori concepts such as "heat", "hot and cold", "temperature". These are derivable from entropy, whose existence we derive from the basic assumptions. See cond-mat/9708200 and mat...

The Gibbs Paradox and the Physical Criteria for the Indistinguishability of Identical Particles

November 8, 2018

86% Match
C. S. Unnikrishnan
Statistical Mechanics

Gibbs paradox in the context of statistical mechanics addresses the issue of additivity of entropy of mixing gases. The usual discussion attributes the paradoxical situation to classical distinguishability of identical particles and credits quantum theory for enabling indistinguishability of identical particles to solve the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics and this is clearly brought out when the pr...

Foundations of a Finite Non-Equilibrium Statistical Thermodynamics: Extrinsic Quantities

January 8, 2022

86% Match
O. B. Ericok, J. K. Mason
Statistical Mechanics
Mathematical Physics
Classical Physics

Statistical thermodynamics is valuable as a conceptual structure that shapes our thinking about equilibrium thermodynamic states. A cloud of unresolved questions surrounding the foundations of the theory could lead an impartial observer to conclude that statistical thermodynamics is in a state of crisis, though. Indeed, the discussion about the microscopic origins of irreversibility has continued in the scientific community for more than a hundred years. This paper considers t...

A brief introduction to observational entropy

August 10, 2020

86% Match
Dominik Šafránek, Anthony Aguirre, ... , J. M. Deutsch
Quantum Gases
Statistical Mechanics

In the past several years, observational entropy has been developed as both a (time-dependent) quantum generalization of Boltzmann entropy, and as a rather general framework to encompass classical and quantum equilibrium and non-equilibrium coarse-grained entropy. In this paper we review the construction, interpretation, most important properties, and some applications of this framework. The treatment is self-contained and relatively pedagogical, aimed at a broad class of res...
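For the classical case, observational entropy assigns each coarse-grained macrostate i its probability p_i and volume V_i (here, a count of microstates), giving S_obs = Σ_i p_i ln(V_i / p_i). A minimal sketch (the microstate probabilities and the grouping into cells are made-up illustration values):

```python
import math

# Observational entropy of a coarse-graining: group microstates into cells,
# sum the probability p of each cell, and weight by the cell volume V.
# With one microstate per cell this reduces to the Shannon entropy; coarser
# groupings can only raise it, up to ln(total number of microstates).

def observational_entropy(micro_probs, cells):
    """S_obs = sum over cells of p * ln(V / p), V = number of microstates."""
    s = 0.0
    for cell in cells:                   # cell = list of microstate indices
        p = sum(micro_probs[j] for j in cell)
        if p > 0:
            s += p * math.log(len(cell) / p)
    return s

probs = [0.4, 0.3, 0.2, 0.1]
fine = [[0], [1], [2], [3]]              # trivial coarse-graining: Shannon entropy
coarse = [[0, 1], [2, 3]]                # a made-up two-cell coarse-graining
shannon = observational_entropy(probs, fine)
s_obs = observational_entropy(probs, coarse)
```

The coarse value lies between the fine-grained Shannon entropy and ln 4, reflecting the information lost to the observer's limited resolution.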

Differential entropy and time

August 31, 2004

86% Match
Piotr Garbaczewski
Statistical Mechanics
General Physics

We give a detailed analysis of the Gibbs-type entropy notion and its dynamical behavior in case of time-dependent continuous probability distributions of varied origins: related to classical and quantum systems. The purpose-dependent usage of conditional Kullback-Leibler and Gibbs (Shannon) entropies is explained in case of non-equilibrium Smoluchowski processes. A very different temporal behavior of Gibbs and Kullback entropies is confronted. A specific conceptual niche is a...

Origins of the Combinatorial Basis of Entropy

August 14, 2007

86% Match
Robert K. Niven
Classical Physics

The combinatorial basis of entropy, given by Boltzmann, can be written $H = N^{-1} \ln \mathbb{W}$, where $H$ is the dimensionless entropy, $N$ is the number of entities and $\mathbb{W}$ is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: $H=\kappa (\phi(\mathbb{W}) +C)$ and $D=-\kappa (\phi(\mathbb{P}) +C)$, where $\m...
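The combinatorial formula $H = N^{-1}\ln\mathbb{W}$ can be checked directly: taking $\mathbb{W}$ as the multinomial weight $N!/(n_1!\cdots n_k!)$, Stirling's approximation shows $H$ converges to the Shannon entropy $-\sum_i p_i \ln p_i$ as $N$ grows. A minimal sketch (the probabilities are made-up illustration values):

```python
import math

# Boltzmann's combinatorial entropy H = ln(W)/N with W the multinomial
# statistical weight, computed via log-gamma to avoid huge factorials.
# For counts n_i = p_i * N it approaches -sum p_i ln p_i as N -> infinity.

def combinatorial_entropy(counts):
    """H = ln(N! / prod(n_i!)) / N for occupation counts n_i."""
    n = sum(counts)
    ln_w = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return ln_w / n

def shannon(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.3, 0.2]
for n in (10, 100, 100_000):
    counts = [int(p * n) for p in probs]
    print(n, combinatorial_entropy(counts), shannon(probs))
```

The finite-$N$ value sits slightly below the Shannon limit (the Stirling correction is of order $\ln N / N$), which is the sense in which the combinatorial definition is the more primitive one.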
