ID: cond-mat/0203036

Test of Information Theory on the Boltzmann Equation

March 2, 2002


Similar papers

The Einstein-Boltzmann Relation for Thermodynamic and Hydrodynamic Fluctuations

October 30, 2007

83% Match
A. J. McKane, F. Vazquez, M. A. Olivares-Robles
Statistical Mechanics
Mesoscale and Nanoscale Phys...

When making the connection between the thermodynamics of irreversible processes and the theory of stochastic processes through the fluctuation-dissipation theorem, it is necessary to invoke a postulate of the Einstein-Boltzmann type. For convective processes, hydrodynamic fluctuations must be included: the velocity is a dynamical variable, and although the entropy cannot depend directly on the velocity, $\delta^{2} S$ will depend on velocity variations. Some authors do not incl...


Contributions of steady heat conduction to the rate of chemical reaction

January 8, 2003

83% Match
Kim Hyeon-Deuk, Hisao Hayakawa
Statistical Mechanics

We have derived the effect of steady heat flux on the rate of chemical reaction based on the line-of-centers model, using the explicit velocity distribution function of the steady-state Boltzmann equation for hard-sphere molecules to second order. It is found that the second-order velocity distribution function plays an essential role in this calculation. We have also compared our result with those from the steady-state Bhatnagar-Gross-Krook (BGK) equation and information...


Thermodynamics of Information

June 20, 2023

83% Match
Juan M. R. Parrondo
Statistical Mechanics
History and Philosophy of Ph...

As early as 1867, two years after the introduction of the concept of entropy by Clausius, Maxwell showed that the limitations imposed by the second law of thermodynamics depend on the information that one possesses about the state of a physical system. A "very observant and neat-fingered being", later on named Maxwell demon by Kelvin, could arrange the molecules of a gas and induce a temperature or pressure gradient without performing work, in apparent contradiction to the se...


Derivation of Boltzmann Principle

November 11, 2009

83% Match
Michele Campisi, Donald H. Kobe
Statistical Mechanics

We present a derivation of Boltzmann principle $S_{B}=k_{B}\ln \mathcal{W}$ based on classical mechanical models of thermodynamics. The argument is based on the heat theorem and can be traced back to the second half of the nineteenth century with the works of Helmholtz and Boltzmann. Despite its simplicity, this argument has remained almost unknown. We present it in a modern, self-contained and accessible form. The approach constitutes an important link between classical mech...
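The Boltzmann principle $S_{B}=k_{B}\ln \mathcal{W}$ quoted in this abstract is easy to illustrate numerically. The sketch below uses a system of N independent two-level spins — a standard textbook model, not an example from the paper — and checks that $\ln \mathcal{W}$ already agrees with its Stirling-approximation (Gibbs-Shannon) form at modest N, with $k_B$ set to 1:

```python
import math

def boltzmann_entropy(W, k_B=1.0):
    """Boltzmann principle: S_B = k_B * ln(W)."""
    return k_B * math.log(W)

def spin_multiplicity(N, n_up):
    """Number of microstates of N two-level spins with exactly n_up up."""
    return math.comb(N, n_up)

N, n_up = 1000, 500
S_exact = boltzmann_entropy(spin_multiplicity(N, n_up))

# Stirling's approximation turns ln W into the Gibbs-Shannon form
# S ≈ -N [p ln p + (1 - p) ln(1 - p)] with p = n_up / N.
p = n_up / N
S_stirling = -N * (p * math.log(p) + (1 - p) * math.log(1 - p))

print(S_exact, S_stirling)
```

The two values differ only by the logarithmic corrections Stirling's formula drops, already under 1% at N = 1000.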


Boltzmann and Gibbs: An Attempted Reconciliation

January 6, 2004

83% Match
D. A. Lavis
Statistical Mechanics

There are three levels of description in classical statistical mechanics: the microscopic/dynamic, the macroscopic/statistical, and the thermodynamic. At one end there is a well-used concept of equilibrium in thermodynamics; at the other, dynamic equilibrium does not exist in measure-preserving reversible dynamic systems. Statistical mechanics attempts to situate equilibrium at the macroscopic level in the Boltzmann approach and at the statistical level in the Gibbs approach...


Applications of Information Theory: Statistics and Statistical Mechanics

March 5, 2016

83% Match
Khizar Qureshi
Statistics Theory

The method of optimizing entropy is used to (i) conduct asymptotic hypothesis testing and (ii) determine the particle distribution for which entropy is maximized. This paper focuses on two related applications of information theory: statistics and statistical mechanics.
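The entropy-optimization idea described in this abstract can be checked numerically: among all distributions over a fixed set of energy levels with a given mean energy, the Boltzmann distribution $p_i \propto e^{-\beta E_i}$ has the largest Shannon entropy. The energy values and the alternative distribution below are made up for illustration; this is a generic sketch, not code from the paper:

```python
import math

def boltzmann(energies, beta):
    """Boltzmann distribution p_i proportional to exp(-beta * E_i)."""
    w = [math.exp(-beta * e) for e in energies]
    Z = sum(w)
    return [x / Z for x in w]

def entropy(p):
    """Shannon entropy -sum p ln p (0 ln 0 taken as 0)."""
    return -sum(x * math.log(x) for x in p if x > 0.0)

def mean_energy(energies, p):
    return sum(e * x for e, x in zip(energies, p))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    """Bisect for beta: mean energy decreases monotonically with beta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, boltzmann(energies, mid)) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

energies = [0.0, 1.0, 2.0, 3.0]
beta = solve_beta(energies, target=1.0)
p_max = boltzmann(energies, beta)

# Any other distribution with the same mean energy has lower entropy:
p_alt = [0.5, 0.1, 0.3, 0.1]  # mean energy = 0.1 + 0.6 + 0.3 = 1.0
print(entropy(p_max), entropy(p_alt))
```

Perturbing `p_max` in any direction that preserves the mean-energy constraint can only lower the entropy, which is the content of the maximum-entropy characterization.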


Stochastic Thermodynamics and Dynamics: A Tail of Unexpected

March 6, 2007

83% Match
Maria K. Koleva
General Physics

The insensitivity of the macroscopic behavior of any thermodynamical system to partitioning generates a bias between the reproducibility of its macroscopic behavior, viewed as the simplest form of causality, and its long-term stability. Overcoming this controversy requires a certain modification of the dynamics that involves self-assembling of the boundary conditions. Subsequently, the proposed approach justifies parity between the increase and the decrea...


Maximum information entropy principle and the interpretation of probabilities in statistical mechanics - a short review

May 27, 2016

83% Match
Domagoj Kuic
Statistical Mechanics

In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real w...
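The MaxEnt construction this abstract builds on can be stated compactly. The following is the standard textbook (Jaynes-style) form — a generic sketch, not the paper's own derivation:

```latex
\max_{\{p_i\}} \; S = -k_B \sum_i p_i \ln p_i
\quad \text{subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i E_i = U
\;\Longrightarrow\;
p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \quad
Z(\beta) = \sum_i e^{-\beta E_i}, \quad
U = -\frac{\partial \ln Z}{\partial \beta}
```

Introducing Lagrange multipliers for the normalization and mean-energy constraints and setting the variation to zero recovers the canonical Gibbs ensemble, which is the sense in which MaxEnt extends the Gibbs formalism.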


Information and the second law of thermodynamics

September 3, 2018

83% Match
B. Ahmadi, S. Salimi, A. S. Khorashad
Statistical Mechanics

The second law of classical thermodynamics, based on the positivity of the entropy production, only holds for deterministic processes. Therefore the Second Law in stochastic quantum thermodynamics may not hold. By making a fundamental connection between thermodynamics and information theory we will introduce a new way of defining the Second Law which holds for both deterministic classical and stochastic quantum thermodynamics. Our work incorporates information well into the S...


Entropy: Mystery and Controversy. Plethora of Informational-Entropies and Unconventional Statistics

July 14, 2003

83% Match
Roberto Luzzi, Áurea R. Vasconcellos, J. Galvão Ramos
Statistical Mechanics

Some general considerations on the notion of entropy in physics are presented. An attempt is made to clarify the question of the differentiation between physical entropy (the Clausius-Boltzmann one) and quantities called entropies associated to Information Theory, which are in fact generating functionals for the derivation of probability distributions and not thermodynamic functions of state. The role of them in the construction of the so-called Unconventional Statistical Mec...
