October 4, 2004
Similar papers 3
July 9, 2021
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes the pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is sing...
July 9, 2014
I show that the maximum entropy principle can be replaced by a more natural assumption: that there exists a phenomenological entropy function consistent with the microscopic model. The requirement of existence then provides a unique construction of the associated probability density. I conclude the letter with an axiomatic formulation of the notion of entropy, suitable for the exploration of non-equilibrium phenomena.
July 10, 2003
The problem of assigning probability distributions which objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successfu...
December 16, 2007
After justifying the maximum entropy approach for equilibrium thermodynamic systems, and a maximum path entropy algorithm for nonequilibrium thermodynamic systems, by virtue of the principle of virtual work, we present in this paper another application of the principle to thermodynamic systems out of equilibrium. Unlike the justification of maximum path entropy for motion trajectories over a period of time, this work concerns the maximum of the entropy defined a...
May 2, 2017
We start by reviewing the origin of the idea that entropy and the Second Law are associated with the Arrow of Time. We then introduce a new definition of entropy based on Shannon's Measure of Information, SMI. The SMI may be defined on any probability distribution, and therefore it is a very general concept. On the other hand, entropy is defined on a very special set of probability distributions. More specifically, the entropy of a thermodynamic system is related to the probabil...
March 5, 2016
The method of optimizing entropy is used to (i) conduct asymptotic hypothesis testing and (ii) determine the particle distribution for which entropy is maximized. This paper focuses on two related applications of information theory: statistics and statistical mechanics.
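As a minimal illustration of the second application (not drawn from the paper itself), the distribution that maximizes Shannon entropy over discrete energy levels, subject to a fixed mean energy, takes the Boltzmann form p_i ∝ exp(-βE_i); the sketch below solves for β by bisection. The energy levels and target mean are hypothetical values chosen only for the example.

```python
import math

E = [0.0, 1.0, 2.0, 3.0]   # hypothetical discrete energy levels
TARGET_MEAN = 1.2          # constraint on the mean energy: <E> = 1.2

def boltzmann(beta):
    # Maximum-entropy distribution under a mean-energy constraint.
    weights = [math.exp(-beta * e) for e in E]
    z = sum(weights)
    return [w / z for w in weights]

def mean_energy(beta):
    return sum(p * e for p, e in zip(boltzmann(beta), E))

# mean_energy decreases monotonically in beta, so bisect for the
# multiplier that satisfies the constraint.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_energy(mid) > TARGET_MEAN:
        lo = mid
    else:
        hi = mid
beta = (lo + hi) / 2
p = boltzmann(beta)

entropy = -sum(q * math.log(q) for q in p)
print(beta, entropy)
```

Any other distribution on the same levels with the same mean energy has strictly lower Shannon entropy, which is what singles the Boltzmann form out.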
June 8, 2015
Predictive statistical mechanics is a form of inference from available data, without additional assumptions, for predicting reproducible phenomena. By applying it to systems with Hamiltonian dynamics, the problem of predicting the macroscopic time evolution of the system under incomplete information about the microscopic dynamics was considered. In the model of a closed Hamiltonian system (i.e. a system that can exchange energy but not particles with the environment) tha...
April 2, 2008
This is an extensive review of recent work on the foundations of statistical mechanics. Subject matters discussed include: interpretation of probability, typicality, recurrence, reversibility, ergodicity, mixing, coarse graining, past hypothesis, reductionism, phase average, thermodynamic limit, interventionism, entropy.
January 7, 2013
Prompted by the realisation that the statistical entropy of an ideal gas in the micro-canonical ensemble should not fluctuate or change over time, the meaning of the H-theorem is re-interpreted from the perspective of information theory in which entropy is a measure of uncertainty. We propose that the Maxwellian velocity distribution should more properly be regarded as a limiting distribution which is identical with the distribution across particles in the asymptotic limit of...
May 11, 2001
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations -- such as the identification of the state of a physi...