October 10, 2000
An updated review [1] of nonextensive statistical mechanics and thermodynamics is presented in a colloquial style. Quite naturally, the possibility emerges of using the value of $q-1$ (the entropic nonextensivity) as a simple and efficient way to characterize, at least for some classes of systems, the degree of what is currently referred to as complexity [2]. A few historical digressions are included as well.
Similar papers
September 4, 2003
In this lecture we briefly review the definition, consequences, and applications of an entropy, $S_q$, which generalizes the usual Boltzmann-Gibbs entropy $S_{BG}$ ($S_1=S_{BG}$), the basis of standard statistical mechanics, well known to be applicable whenever ergodicity is satisfied at the microscopic dynamical level. The entropy $S_q$ is based on the notion of the $q$-exponential and exhibits properties not shared by other available generalizations of $S_{BG}$. The th...
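For reference, the $q$-exponential connection mentioned above can be made explicit through the standard $q$-deformed logarithm of the nonextensive literature (a background note, not part of this lecture's abstract): with $\ln_q x \equiv (x^{1-q}-1)/(1-q)$ and its inverse $e_q^x \equiv [1+(1-q)x]^{1/(1-q)}$, one has

$$S_q \;=\; k\sum_i p_i \ln_q \frac{1}{p_i} \;=\; k\,\frac{1-\sum_i p_i^{\,q}}{q-1}, \qquad \lim_{q\to 1}\ln_q x = \ln x \;\Rightarrow\; S_1 = S_{BG}.$$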
May 27, 2002
We briefly review the present status of nonextensive statistical mechanics. We focus on (i) the central equations of the formalism, (ii) the most recent applications in physics and other sciences, and (iii) the {\it a priori} determination (from microscopic dynamics) of the entropic index $q$ for two important classes of physical systems, namely low-dimensional maps (both dissipative and conservative) and long-range-interacting many-body classical Hamiltonian systems.
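As an illustrative sketch (not the computation performed in the paper) of what determining $q$ from microscopic dynamics involves for low-dimensional maps: at the chaos threshold of the logistic map, the sensitivity to initial conditions is expected to grow as a $q$-exponential rather than exponentially. The value of $a_c$ below is the known Feigenbaum point of $x\mapsto 1-ax^2$; $q_{sen}\approx 0.2445$ is the value quoted in the nonextensive literature, assumed here for the check.

```python
import numpy as np

# Sketch: sensitivity to initial conditions of the logistic map
# x_{n+1} = 1 - a*x_n^2 at the chaos threshold a_c. There, xi(t) is
# expected to grow as a q-exponential (power-law-like) rather than
# exponentially as in the fully chaotic regime.

A_C = 1.40115518909  # Feigenbaum (period-doubling accumulation) point

def sensitivity(a, x0, dx0=1e-9, steps=80):
    """Return xi(t) = |delta x(t)| / |delta x(0)| for two nearby orbits."""
    x, y = x0, x0 + dx0
    xi = []
    for _ in range(steps):
        x = 1.0 - a * x * x
        y = 1.0 - a * y * y
        xi.append(abs(y - x) / dx0)
    return np.array(xi)

xi = sensitivity(A_C, x0=0.0)
# If growth is q-exponential, ln_q(xi) = (xi**(1-q) - 1)/(1-q) should be
# roughly linear in t at the proper q (q_sen ~ 0.2445, assumed value).
q = 0.2445
print((xi**(1.0 - q) - 1.0) / (1.0 - q))
```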
November 6, 2009
We briefly review central concepts concerning nonextensive statistical mechanics, based on the nonadditive entropy $S_q=k\,\frac{1-\sum_{i}p_i^q}{q-1}$ ($q \in \mathbb{R}$; $S_1=-k\sum_{i}p_i \ln p_i$). Among other topics, we focus on possible realizations of the $q$-generalized Central Limit Theorem, including at the edge of chaos of the logistic map, and for quasi-stationary states of many-body long-range-interacting Hamiltonian systems.
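The $S_1$ statement in the formula above follows from a one-line expansion (a standard check, added here for completeness): writing $p_i^q = p_i\,e^{(q-1)\ln p_i}$, expanding to first order in $q-1$, and using $\sum_i p_i = 1$,

$$S_q \;=\; k\,\frac{1-\sum_i p_i\left[1+(q-1)\ln p_i + O\!\big((q-1)^2\big)\right]}{q-1} \;\xrightarrow[q\to 1]{}\; -k\sum_i p_i \ln p_i \;=\; S_{BG}.$$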
May 26, 2003
Boltzmann-Gibbs statistical mechanics is based on the entropy $S_{BG}=-k \sum_{i=1}^W p_i \ln p_i$. It enables a successful thermostatistical treatment of ubiquitous systems, such as those involving short-range interactions and Markovian processes and, generally speaking, systems whose dynamical occupancy of phase space tends to be ergodic. For systems whose microscopic dynamics is more complex, it is natural to expect that the dynamical occupancy of phase space will have a less tri...
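A minimal numerical sketch (illustrative helper names, not from the paper) of the two entropies side by side, showing $S_q \to S_{BG}$ as $q \to 1$:

```python
import numpy as np

def s_bg(p, k=1.0):
    """Boltzmann-Gibbs entropy S_BG = -k * sum p_i ln p_i."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 ln 0 = 0 convention
    return -k * np.sum(p * np.log(p))

def s_q(p, q, k=1.0):
    """Tsallis entropy S_q = k * (1 - sum p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if q == 1.0:
        return s_bg(p, k)
    return k * (1.0 - np.sum(p**q)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.5, 0.9, 0.99, 1.01, 1.5):
    print(f"q = {q:5.2f}  S_q = {s_q(p, q):.6f}")
print(f"S_BG = {s_bg(p):.6f}  (the q -> 1 limit)")
```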
March 23, 2001
The problem of defining and studying the complexity of a time series has attracted interest for years. In the context of dynamical systems, Grassberger suggested that a slow approach of the entropy to its extensive asymptotic limit is a sign of complexity. We investigate this idea further using information-theoretic and statistical-mechanics techniques, and show that these arguments can be made precise and that they generalize many previous approaches to complexity, in particular...
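A hedged sketch of the diagnostic Grassberger's idea implies: compute block entropies $H_n$ of a symbolic sequence and watch how the per-symbol entropy $H_n/n$ approaches its asymptotic (extensive) limit; a slow approach is the proposed signature of complexity. The sequence and names below are illustrative only.

```python
import numpy as np
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (bits) of the empirical distribution of n-blocks."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
seq = rng.integers(0, 2, size=100_000)  # i.i.d. fair coin: H_n/n -> 1 quickly

# A complex (long-memory) source would show a much slower approach of
# H_n/n to its limit than this memoryless benchmark.
for n in range(1, 8):
    print(n, block_entropy(seq, n) / n)
```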
August 11, 2016
In the 1870s, Boltzmann introduced a logarithmic measure connecting the thermodynamical entropy to the probabilities of the microscopic configurations of the system. His entropic functional for classical systems was extended by Gibbs to the entire phase space of a many-body system, and by von Neumann to cover quantum systems as well. Finally, it was used by Shannon within the theory of information. The simplest expression of this functional correspon...
April 8, 2004
Information entropy is applied to the analysis of time series generated by dynamical systems. The complexity of a temporal or spatio-temporal signal is defined as the difference between the sum of the entropies of the local linear regions of the trajectory manifold and the entropy of the globally linearized manifold. When the entropies are Tsallis entropies, the complexity is characterized by the value of $q$.
December 31, 2023
Nonextensive Statistical Mechanics (NSM) has developed into an important framework for modeling the thermodynamics of complex systems and the information content of complex signals. On the occasion of the 80th birthday of the field's founder, Constantino Tsallis, a review of open problems that can stimulate future research is provided. Over the thirty-year development of NSM, a variety of criticisms have been published, ranging from questions about the justification for generalizing the entropy function t...
October 24, 2002
During the past dozen years there have been numerous articles on a relation between entropy and probability that is non-additive and has a parameter $q$ depending on the nature of the thermodynamic system under consideration. For $q=1$ this relation reduces to the Boltzmann-Gibbs entropy, but for other values of $q$ it is claimed to lead to a formalism consistent with the laws of thermodynamics. However, it is shown here that the joint entropy for syste...
December 23, 2008
The entropic form $S_q$ is, for any $q \neq 1$, {\it nonadditive}. Indeed, for two probabilistically independent subsystems, it satisfies $S_q(A+B)/k=[S_q(A)/k]+[S_q(B)/k]+(1-q)[S_q(A)/k][S_q(B)/k] \ne S_q(A)/k+S_q(B)/k$. This form nevertheless turns out to be {\it extensive} for an important class of nonlocal correlations, provided $q$ is set equal to a special value different from unity, denoted $q_{ent}$ (where $ent$ stands for {\it entropy}). In other words, for such systems, we verify that $...
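The quoted pseudo-additivity is straightforward to verify numerically for independent subsystems; a minimal check (illustrative names):

```python
import numpy as np

def s_q_over_k(p, q):
    """Dimensionless Tsallis entropy S_q/k = (1 - sum p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

q = 1.7
pa = np.array([0.6, 0.3, 0.1])
pb = np.array([0.7, 0.3])
p_joint = np.outer(pa, pb).ravel()  # independence: p_ij = p_i * p_j

lhs = s_q_over_k(p_joint, q)
sa, sb = s_q_over_k(pa, q), s_q_over_k(pb, q)
rhs = sa + sb + (1.0 - q) * sa * sb
print(lhs, rhs)  # agree to machine precision
```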