April 29, 2005
An introductory review of Classical Statistical Mechanics
March 10, 2015
In this paper we discuss the validity of the Shannon entropy functional in connection with the correct Gibbs-Hertz probability distribution function. We show that there is no contradiction in using the Shannon-Gibbs functional and restate the validity of information theory applied to equilibrium statistical mechanics. We show that under these assumptions, entropy is always a monotone function of energy, irrespective of the shape of the density of states, leading always ...
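A quick numerical illustration of the monotonicity claim (my own sketch, not taken from the paper): the Gibbs-Hertz "volume" entropy S(E) = k ln Φ(E), with Φ(E) the number of states of energy below E, is non-decreasing even when the density of states itself is non-monotone, simply because Φ is an integral of a non-negative function. The density of states below is hypothetical.

```python
import numpy as np

# Hypothetical non-monotonic density of states g(E): rises, peaks,
# then falls, as for a system with a bounded spectrum.
E = np.linspace(0.0, 10.0, 1001)
g = E**2 * np.exp(-E)            # peaks near E = 2, then decreases

# Gibbs-Hertz "volume" entropy uses the integrated number of states
# Phi(E) = integral_0^E g(E') dE', non-decreasing by construction.
Phi = np.cumsum(g) * (E[1] - E[0])
S = np.log(Phi[1:])              # k_B = 1; skip Phi = 0 at E = 0

# S(E) is monotone in E even though g(E) is not.
print(np.all(np.diff(S) >= 0))   # True
```

The point of the sketch is only that monotonicity follows from the cumulative construction, not from any property of g(E).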
March 31, 2021
An essential role of information in microscopic thermodynamics (e.g. Maxwell's demon) raises the challenging question of whether there exists a formulation of the second law of thermodynamics based only on purely informational ideas. Here, such a formulation is suggested for unitary processes by introducing information as a full-fledged physical quantity and defining an (objective) microscopic information entropy as 'information about the microstate'. We show that various forms of entropy (Boltzma...
December 14, 2016
The laws of thermodynamics, despite their wide range of applicability, are known to break down when systems are correlated with their environments. Here, we generalize thermodynamics to physical scenarios in which correlations, including strong correlations, are present. We exploit the connection between information and physics, and introduce a consistent redefinition of heat dissipation by systematically accounting for the information flow from syst...
February 8, 2013
Boltzmann's Principle S = k ln W was repeatedly criticized by Einstein because it lacked a proper dynamical foundation in view of the thermal motion of the particles of which a physical system consists. This suggests, in particular, that the statistical mechanics of a system in thermal equilibrium should be based on dynamics. As an example, a dynamical derivation of the density expansions of the two-particle distribution function, as well as of the thermodynamic properties...
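To make S = k ln W concrete (a toy illustration of my own, not from the paper), one can count microstates for N independent two-state spins with n spins "up", where W is a binomial coefficient, and check that Stirling's approximation recovers the familiar mixing-entropy form.

```python
from math import comb, log

# Boltzmann's principle S = k ln W for a toy spin system (k_B = 1).
# W = C(N, n) counts microstates with n of N spins "up".
N, n = 1000, 300
W = comb(N, n)
S = log(W)

# Stirling's approximation gives the mixing-entropy form
# S/N ~ -x ln x - (1 - x) ln(1 - x), with x = n/N.
x = n / N
S_stirling = N * (-x * log(x) - (1 - x) * log(1 - x))

print(S, S_stirling)   # the two agree to within about 1% for N = 1000
```

The residual discrepancy is the subleading ln N correction to Stirling's formula, which vanishes relative to S as N grows.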
January 3, 2007
A unification of thermodynamics and information theory is proposed. It is argued that, similarly to the randomness due to collisions in thermal systems, the quenched randomness that exists in data files in informatics systems contributes to entropy. Therefore, it is possible to define equilibrium and to calculate temperature for informatics systems. The obtained temperature correctly yields the Shannon information balance in informatics systems and is consistent with the Claus...
May 27, 2011
The statistical mechanics of Gibbs is a juxtaposition of subjective, probabilistic ideas on the one hand and objective, mechanical ideas on the other. In this paper, we follow the path set out by Jaynes, including elements added subsequently to that original work, to explore the consequences of the purely statistical point of view. We show how standard methods in the equilibrium theory could have been derived simply from a description of the available problem information. In ...
March 10, 2015
A kinetic approach to the notion of information is proposed, based on Liouville kinetic theory. The general kinetic equation for the evolution of the N-particle information $\mathcal{I}_N$ in a Hamiltonian system of large particle number $N\gg 1$ is obtained. It is shown that the $N$-particle information is strictly conserved. By defining reduced particle-number information densities in phase space, it should be possible to obtain a kinetic equation for the ordinary one-particle inf...
March 6, 2019
We show that the generalized Boltzmann distribution is the only distribution for which the Gibbs-Shannon entropy equals the thermodynamic entropy. This result means that the thermodynamic entropy and the Gibbs-Shannon entropy are not generally equal; rather, the equality holds only in the special case where a system is in equilibrium with a reservoir.
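The equilibrium case of this equality is easy to verify numerically (a sketch of my own, with arbitrarily chosen energy levels): for the Boltzmann distribution $p_i = e^{-\beta E_i}/Z$, the Gibbs-Shannon entropy $-\sum_i p_i \ln p_i$ reduces identically to the thermodynamic form $\beta(U - F)$ with $F = -\beta^{-1}\ln Z$.

```python
import numpy as np

# For p_i = exp(-beta*E_i)/Z, the Gibbs-Shannon entropy
# S = -sum_i p_i ln p_i equals beta*(U - F), where U is the mean
# energy and F = -(1/beta) ln Z.  Energies are hypothetical; k_B = 1.
rng = np.random.default_rng(0)
E = rng.uniform(0.0, 5.0, size=50)   # arbitrary energy levels
beta = 1.3

Z = np.sum(np.exp(-beta * E))
p = np.exp(-beta * E) / Z
U = np.sum(p * E)                    # mean energy
F = -np.log(Z) / beta                # Helmholtz free energy

S_shannon = -np.sum(p * np.log(p))
S_thermo = beta * (U - F)

print(np.isclose(S_shannon, S_thermo))   # True
```

The identity follows by substituting $\ln p_i = -\beta E_i - \ln Z$ into the entropy sum; for any other distribution over the same levels, the two expressions differ.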
January 25, 2006
Ludwig Boltzmann had a hunch that the irreversibility exhibited by a macroscopic system arises from the reversible dynamics of its microscopic constituents. He derived a nonlinear integro-differential equation - now called the Boltzmann equation - for the phase space density of the molecules of a dilute fluid. He showed that the Second law of thermodynamics emerges from Newton's equations of motion. However, Boltzmann realized that the Stosszahlansatz, employed in the derivation, smug...