December 17, 2004
The exact Maxwell-Boltzmann (MB), Bose-Einstein (BE) and Fermi-Dirac (FD) entropies and probabilistic distributions are derived by the combinatorial method of Boltzmann, without Stirling's approximation. The new entropy measures are explicit functions of the probability and degeneracy of each state, and the total number of entities, N. By analysis of the cost of a "binary decision", exact BE and FD statistics are shown to have profound consequences for the behaviour of quantum mechanical systems.
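For reference, the standard combinatorial statistical weights that underlie these exact entropies (written here in the usual textbook notation, which may differ from the paper's own symbols) are, for $n_i$ entities distributed over levels $i$ of degeneracy $g_i$,

$$
\mathbb{W}_{\rm MB} = N! \prod_i \frac{g_i^{n_i}}{n_i!}, \qquad
\mathbb{W}_{\rm BE} = \prod_i \frac{(n_i + g_i - 1)!}{n_i! \, (g_i - 1)!}, \qquad
\mathbb{W}_{\rm FD} = \prod_i \frac{g_i!}{n_i! \, (g_i - n_i)!},
$$

with the exact per-entity entropy taken as $H = N^{-1} \ln \mathbb{W}$, the factorials being kept intact rather than replaced by Stirling's approximation.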
Similar papers
October 6, 2005
The exact forms of the degenerate Maxwell-Boltzmann (MB), Bose-Einstein (BE) and Fermi-Dirac (FD) entropy functions, derived by Boltzmann's principle without the Stirling approximation (Niven, Physics Letters A, 342(4) (2005) 286), are further examined. Firstly, an apparent paradox in quantisation effects is resolved using the Laplace-Jaynes interpretation of probability. The energy cost of learning that a system, distributed over s equiprobable states, is in one such state (...
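As a minimal numerical illustration (not taken from the paper) of why the exact, non-Stirling forms matter for small systems, the error of the leading Stirling approximation $\ln n! \approx n \ln n - n$ is far from negligible at small occupation numbers:

```python
import math

# Compare exact ln(n!) with the leading Stirling approximation n*ln(n) - n.
# The gap is what the exact MB/BE/FD entropy forms retain and the
# Stirling-approximated (large-N) forms discard.
for n in (1, 2, 5, 10, 100, 1000):
    exact = math.lgamma(n + 1)          # ln(n!)
    stirling = n * math.log(n) - n      # leading Stirling term
    print(f"n={n:5d}  ln(n!)={exact:10.3f}  Stirling={stirling:10.3f}  "
          f"error={exact - stirling:7.3f}")
```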
April 29, 2016
We derive Bose-Einstein statistics and Fermi-Dirac statistics by the principle of maximum entropy applied to two families of entropy functions different from the Boltzmann-Gibbs-Shannon entropy. These entropy functions are identified with special cases of modified Naudts' $\phi$-entropy.
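For orientation, a standard special case (not necessarily the authors' $\phi$-entropy formulation): maximizing the familiar mean-occupation entropies

$$
S_{\pm} = \sum_i \Big[ \pm (1 \pm \bar{n}_i) \ln (1 \pm \bar{n}_i) - \bar{n}_i \ln \bar{n}_i \Big]
$$

subject to fixed $\sum_i \bar{n}_i$ and $\sum_i \bar{n}_i \epsilon_i$ yields $\bar{n}_i = \big( e^{\beta(\epsilon_i - \mu)} \mp 1 \big)^{-1}$, i.e. Bose-Einstein (upper signs) and Fermi-Dirac (lower signs) statistics.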
April 7, 2020
Combining intuitive probabilistic assumptions with the basic laws of classical thermodynamics, and using the latter to express the probabilistic parameters in terms of thermodynamic quantities, we obtain a simple unified derivation of the fundamental ensembles of statistical physics that avoids any limiting procedures, quantum hypotheses and even statistical entropy maximization. This point of view also leads to some related classes of correlated particle statistics.
September 20, 2007
The combinatorial basis of entropy due to Boltzmann can be written $H = N^{-1} \ln \mathbb{W}$, where $H$ is the dimensionless entropy of a system, per unit entity, $N$ is the number of entities and $\mathbb{W}$ is the number of ways in which a given realization of the system can occur, known as its statistical weight. Maximizing the entropy (``MaxEnt'') of a system, subject to its constraints, is then equivalent to choosing its most probable (``MaxProb'') realization. For a sys...
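As an illustration of the asymptotic connection in the simplest (non-degenerate multinomial) case, with $\mathbb{W} = N! / \prod_i n_i!$ and $p_i = n_i / N$, Stirling's approximation gives

$$
H = \frac{1}{N} \ln \frac{N!}{\prod_i n_i!} \;\longrightarrow\; -\sum_i p_i \ln p_i \qquad (N \to \infty),
$$

so that for large $N$ the MaxProb and MaxEnt prescriptions select the same realization.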
August 15, 2008
Generalized probability distributions for Maxwell-Boltzmann, Bose-Einstein and Fermi-Dirac statistics, with unequal source probabilities $q_i$ for each level $i$, are obtained by combinatorial reasoning. For equiprobable degenerate sublevels, these reduce to those given by Brillouin in 1930, more commonly given as a statistical weight for each statistic. These distributions and corresponding cross-entropy (divergence) functions are shown to be special cases of the P\'olya urn...
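A minimal sketch of the urn correspondence (with hypothetical level weights, and the Pólya reinforcement parameter denoted c; the paper's own parametrisation may differ): drawing balls and changing the drawn colour's count by c interpolates between the three statistics.

```python
import random

def polya_urn_counts(weights, c, draws):
    """Sample `draws` balls from a Polya-type urn.

    weights: initial ball counts per colour (the source weights q_i, unnormalised).
    c:       after each draw, the drawn colour's count changes by c.
             c =  0 -> with replacement        (Maxwell-Boltzmann-like)
             c = +1 -> reinforcement           (Bose-Einstein-like)
             c = -1 -> without replacement     (Fermi-Dirac-like)
    """
    w = list(map(float, weights))
    counts = [0] * len(w)
    for _ in range(draws):
        i = random.choices(range(len(w)), weights=w)[0]
        counts[i] += 1
        w[i] = max(w[i] + c, 0.0)   # a colour can be exhausted when c = -1
    return counts

random.seed(0)
for c in (0, 1, -1):                # hypothetical weights, not from the paper
    print(c, polya_urn_counts([3, 2, 1], c, draws=5))
```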
January 11, 2016
In works on statistical mechanics and statistical physics, the distribution of particles of ideal gases is derived using the method of Lagrange multipliers in a formal way. In this paper we treat this problem rigorously for the Bose--Einstein, Fermi--Dirac and Maxwell--Boltzmann entropies and present a complete study in the case of the Maxwell--Boltzmann entropy. Our approach is based on recent results on series of convex functions.
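As a small numerical companion to the constrained maximization (a sketch with assumed degeneracies and energies, not the authors' rigorous treatment): maximizing the Maxwell-Boltzmann entropy $-\sum_i p_i \ln (p_i / g_i)$ subject to normalization and a fixed mean energy reproduces the analytic form $p_i \propto g_i e^{-\beta \epsilon_i}$.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical degeneracies, level energies and mean energy (illustrative only).
g   = np.array([1.0, 2.0, 3.0])
eps = np.array([0.0, 1.0, 2.0])
E_mean = 0.8

def neg_entropy(p):                 # minimize -H_MB  <=>  maximize H_MB
    return np.sum(p * np.log(p / g))

cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0},
        {"type": "eq", "fun": lambda p: p @ eps - E_mean}]
res = minimize(neg_entropy, x0=np.full(3, 1.0 / 3.0), constraints=cons,
               bounds=[(1e-12, 1.0)] * 3, method="SLSQP")
p = res.x

# Read the implied beta off the numerical solution; p_i = g_i exp(-beta*eps_i)/Z.
beta = np.log((p[0] * g[1]) / (p[1] * g[0])) / (eps[1] - eps[0])
print("numerical p:", np.round(p, 4), " implied beta:", round(beta, 3))
```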
November 6, 2023
Expressions for the entropy and equations for the quantum distribution functions are obtained for systems of non-interacting fermions and bosons with an arbitrary, including small, number of particles.
October 6, 2021
This article presents a study of the grand canonical Bose-Einstein (BE) statistics for a finite number of particles in an arbitrary quantum system. The thermodynamical quantities that identify BE condensation -- namely, the fraction of particles in the ground state and the specific heat -- are calculated here exactly in terms of temperature and fugacity. These calculations are complemented by a numerical calculation of fugacity in terms of the number of particles, without tak...
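A minimal sketch of the kind of fugacity inversion referred to (with an assumed, equally spaced single-particle spectrum, not the system treated in the article): in the grand canonical ensemble the mean particle number is $N(z, \beta) = \sum_i \big( z^{-1} e^{\beta \epsilon_i} - 1 \big)^{-1}$, and the fugacity $z$ follows by solving this relation numerically for a given $N$.

```python
import numpy as np
from scipy.optimize import brentq

eps = np.arange(0.0, 50.0, 1.0)     # assumed single-particle levels (illustrative)

def mean_N(z, beta):
    """Grand canonical mean particle number for Bose-Einstein statistics."""
    return np.sum(1.0 / (np.exp(beta * eps) / z - 1.0))

def fugacity(N, beta):
    """Solve mean_N(z) = N for z in (0, exp(beta * eps[0])) by bracketed root finding."""
    return brentq(lambda z: mean_N(z, beta) - N,
                  1e-12, np.exp(beta * eps[0]) * (1.0 - 1e-12))

beta, N = 0.5, 10
z = fugacity(N, beta)
n0 = 1.0 / (np.exp(beta * eps[0]) / z - 1.0)    # ground-state occupation
print(f"z = {z:.6f}, ground-state fraction = {n0 / N:.3f}")
```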
December 9, 2004
When dealing with certain kinds of complex phenomena, the theoretician may face difficulties -- typically a lack of access to the information needed to properly characterize the system -- in applying the full power of the standard approach of the well-established, physically and logically sound Boltzmann-Gibbs statistics. To circumvent such difficulties, in order to make predictions on properties of the system and looking for an understanding of the physics involved (for e...
February 18, 2009
We examine the combinatorial or probabilistic definition ("Boltzmann's principle") of the entropy or cross-entropy function, $H \propto \ln \mathbb{W}$ or $D \propto - \ln \mathbb{P}$, where $\mathbb{W}$ is the statistical weight and $\mathbb{P}$ the probability of a given realization of a system. Extremisation of $H$ or $D$, subject to any constraints, thus selects the "most probable" (MaxProb) realization. If the system is multinomial, $D$ converges asymptotically (for n...
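In the multinomial case, the asymptotic limit being referred to is the standard one (a sketch of the textbook result, not the paper's general statement): with $\mathbb{P} = (N! / \prod_i n_i!) \prod_i q_i^{n_i}$, source probabilities $q_i$ and observed frequencies $p_i = n_i / N$,

$$
D = -\frac{1}{N} \ln \mathbb{P} \;\longrightarrow\; \sum_i p_i \ln \frac{p_i}{q_i} \qquad (N \to \infty),
$$

i.e. the Kullback-Leibler divergence, whose constrained minimization selects the asymptotically most probable realization.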