ID: 1907.06486

The Poincaré-Boltzmann Machine: from Statistical Physics to Machine Learning and back

July 6, 2019

Pierre Baudot
Quantitative Biology
Neurons and Cognition

This paper presents the computational methods of information cohomology applied to genetic expression in previous work and in the companion paper, and proposes their interpretation in terms of statistical physics and machine learning. In order to further underline the Hochschild cohomological nature of the information functions and chain rules, following earlier work, the computation of the cohomology in low degrees is detailed to show more directly that the $k$-multivariate mutual informations ($I_k$) are $k$-coboundaries. The $k$-cocycle condition corresponds to $I_k=0$, generalizing statistical independence; the cohomology hence quantifies the statistical dependences and the obstruction to factorization. The topological approach makes it possible to investigate information in the multivariate case without the assumption of independent identically distributed variables and without mean-field approximations. We develop the computationally tractable subcase of simplicial information cohomology, represented by entropy ($H_k$) and information ($I_k$) landscapes and their respective paths. The $I_1$ component defines a self-internal energy $U_k$, and the $I_k$ components for $k>1$ define the contribution of the $k$-body interactions to a free energy $G_k$ (the total correlation). The set of information paths in simplicial structures is in bijection with the symmetric group and with random processes, and provides a trivial topological expression of the second law of thermodynamics. The local minima of the free energy, related to conditional information negativity and conditional independence, characterize a minimum free energy complex. This complex formalizes the minimum free-energy principle in topology, provides a definition of a complex system, and exhibits a multiplicity of local minima that quantifies the diversity observed in biology. I give an interpretation of this complex in terms of frustration in glasses and of van der Waals $k$-body interactions for data points.
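Since the abstract's central objects are alternating sums of subset entropies, a minimal sketch may help fix ideas. The function names and the discrete-distribution setup below are illustrative assumptions, not the paper's code; the XOR example reproduces the information negativity ($I_3 < 0$) that the abstract links to frustration and free-energy minima.

    import itertools
    import numpy as np

    def entropy(joint):
        """Shannon entropy (in nats) of a joint distribution given as an ndarray."""
        p = joint[joint > 0]
        return -np.sum(p * np.log(p))

    def marginal(joint, subset):
        """Marginalize the joint distribution onto the variables (axes) in `subset`."""
        axes = tuple(i for i in range(joint.ndim) if i not in subset)
        return joint.sum(axis=axes)

    def I_k(joint, variables):
        """k-multivariate mutual information via the alternating sum
        I_k = sum over nonempty subsets S of (-1)^(|S|+1) H(X_S)."""
        total = 0.0
        for r in range(1, len(variables) + 1):
            for S in itertools.combinations(variables, r):
                total += (-1) ** (r + 1) * entropy(marginal(joint, set(S)))
        return total

    # Example: binary X, Y uniform and Z = X XOR Y. The pairwise informations
    # vanish while I_3 = -log 2, the classic case of information negativity.
    joint = np.zeros((2, 2, 2))
    for x in (0, 1):
        for y in (0, 1):
            joint[x, y, x ^ y] = 0.25
    print(I_k(joint, [0, 1]))     # ~0.0
    print(I_k(joint, [0, 1, 2]))  # ~-0.693 = -log 2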

Similar papers 1

Topological Information Data Analysis

July 6, 2019

89% Match
Pierre Baudot, Monica Tapia, ..., Jean-Marc Goaillard
Other Statistics
Information Theory
Neurons and Cognition

This paper presents methods that quantify the structure of statistical interactions within a given data set, and was first used in \cite{Tapia2018}. It establishes new results on the $k$-multivariate mutual informations ($I_k$), inspired by the topological formulation of information introduced in earlier work. In particular, we show that the vanishing of all $I_k$ for $2 \leq k \leq n$ of $n$ random variables is equivalent to their statistical independence. Pursuing the work of Hu Kuo Ting and Te Sun ...
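The independence statement can be written compactly. The display below uses the standard alternating-sum definition of $I_k$; it is a paraphrase of the abstract's claim, not a quotation of the paper's theorem.

    \[
    I_k(X_1;\dots;X_k) \;=\; \sum_{\emptyset \neq S \subseteq \{1,\dots,k\}} (-1)^{|S|+1} H(X_S),
    \]
    \[
    X_1,\dots,X_n \text{ mutually independent} \iff I_k(X_{i_1};\dots;X_{i_k}) = 0
    \quad \text{for all } 2 \leq k \leq n \text{ and } i_1 < \dots < i_k.
    \]

One direction is immediate, since independence makes subset entropies additive and the alternating sum telescopes to zero; the converse is the new result announced here.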


Statistical physics of complex information dynamics

October 8, 2020

87% Match
Arsham Ghavasieh, Carlo Nicolini, Manlio De Domenico
Physics and Society
Disordered Systems and Neural Networks
Statistical Mechanics

The constituents of a complex system exchange information to function properly. Their signalling dynamics often leads to the appearance of emergent phenomena, such as phase transitions and collective behaviors. While information exchange has been widely modeled by means of distinct spreading processes -- such as continuous-time diffusion, random walks, synchronization and consensus -- on top of complex networks, a unified and physically-grounded framework to study information...
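For orientation, the density-matrix formalism commonly used in this line of work can be sketched in a few lines. The choice of continuous-time diffusion as the process, and the function names, are assumptions for illustration; the paper's framework admits other dynamics.

    import numpy as np
    from scipy.linalg import expm

    def network_density_matrix(adjacency, tau=1.0):
        """rho = exp(-tau L) / Tr exp(-tau L), built from the combinatorial
        graph Laplacian L; diffusion is one admissible signalling process."""
        degrees = np.diag(adjacency.sum(axis=1))
        laplacian = degrees - adjacency
        propagator = expm(-tau * laplacian)
        return propagator / np.trace(propagator)

    def von_neumann_entropy(rho):
        """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
        eigenvalues = np.linalg.eigvalsh(rho)
        eigenvalues = eigenvalues[eigenvalues > 1e-12]
        return -np.sum(eigenvalues * np.log(eigenvalues))

    # Example: a 4-node ring; entropy falls as diffusion homogenizes the state.
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    print(von_neumann_entropy(network_density_matrix(A, tau=0.5)))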


From statistical mechanics to information theory: understanding biophysical information-processing systems

June 22, 2010

87% Match
Gašper Tkačik
Molecular Networks

These are notes for a set of 7 two-hour lectures given at the 2010 Summer School on Quantitative Evolutionary and Comparative Genomics at OIST, Okinawa, Japan. The emphasis is on understanding how biological systems process information. We take a physicist's approach of looking for simple phenomenological descriptions that can address the questions of biological function without necessarily modeling all (mostly unknown) microscopic details; the example that is developed throu...


Information physics: From energy to codes

November 26, 1996

86% Match
P. Fraundorf
Physics Education
Statistical Mechanics
Biological Physics
Quantitative Methods

We illustrate in terms familiar to modern day science students that: (i) an uncertainty slope mechanism underlies the usefulness of temperature via its reciprocal, which is incidentally around 42 [nats/eV] at the freezing point of water; (ii) energy over kT and differential heat capacity are "multiplicity exponents", i.e. the bits of state information lost to the environment outside a system per 2-fold increase in energy and temperature respectively; (iii) even awaiting des...
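The quoted figure in (i) is a one-line computation: the reciprocal temperature 1/kT expressed in nats per electron-volt. A quick check, using the standard value of Boltzmann's constant:

    # 1/(kT) at the freezing point of water, in nats per eV
    k_B = 8.617333e-5        # Boltzmann constant, eV / K
    T_freeze = 273.15        # K
    print(1.0 / (k_B * T_freeze))   # ~42.5 nats/eV, matching the abstract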


Information theory and learning: a physical approach

September 9, 2000

86% Match
Ilya Nemenman
Data Analysis, Statistics and Probability
Disordered Systems and Neural Networks
Machine Learning
Adaptation and Self-Organizing Systems

We try to establish a unified information theoretic approach to learning and to explore some of its applications. First, we define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory - as well as in dynamical systems and statistical mechanics - emerge from this universally defina...
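For a stationary first-order Markov chain, the predictive information collapses to the one-step mutual information I(X_t; X_{t+1}), which makes a small worked example possible. The setup below is a toy illustration of the definition, not the paper's estimator.

    import numpy as np

    def predictive_information_markov(P):
        """I(past; future) for a stationary first-order Markov chain with
        transition matrix P, which equals I(X_t; X_{t+1}) by the Markov
        property (past and future are independent given the present)."""
        # Stationary distribution: left eigenvector of P with eigenvalue 1.
        eigvals, eigvecs = np.linalg.eig(P.T)
        pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
        pi = pi / pi.sum()
        joint = pi[:, None] * P                       # p(x_t, x_{t+1})
        px, py = joint.sum(1), joint.sum(0)
        mask = joint > 0
        ratio = joint[mask] / (px[:, None] * py[None, :])[mask]
        return np.sum(joint[mask] * np.log(ratio))

    # Example: a sticky two-state chain; stickier chains are more predictable.
    P = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
    print(predictive_information_markov(P))  # ~0.368 nats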


Bits from Biology for Computational Intelligence

November 30, 2014

86% Match
Michael Wibral, Joseph T. Lizier, Viola Priesemann
Neurons and Cognition
Information Theory
Data Analysis, Statistics and Probability

Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material cov...


Methods of Information Theory and Algorithmic Complexity for Network Biology

January 15, 2014

86% Match
Hector Zenil, Narsis A. Kiani, Jesper Tegnér
Molecular Networks
Quantitative Methods

We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdos-Renyi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We...
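The Erdos-Renyi example mentioned in the abstract is easy to reproduce. A small sketch, assuming networkx is available; the threshold at mean degree 1 is the classical result, not a claim specific to this survey:

    import networkx as nx

    def giant_fraction(n, mean_degree, seed=0):
        """Fraction of nodes in the largest connected component of an
        Erdos-Renyi graph G(n, p) with p = mean_degree / n."""
        G = nx.fast_gnp_random_graph(n, mean_degree / n, seed=seed)
        return max(len(c) for c in nx.connected_components(G)) / n

    # The giant component emerges around mean degree 1 (i.e. p = 1/n):
    for c in (0.5, 1.0, 1.5, 2.0):
        print(c, giant_fraction(5000, c))
    # expected: a vanishing fraction below c = 1, a growing finite fraction above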


Spectral Simplicial Theory for Feature Selection and Applications to Genomics

November 8, 2018

85% Match
Kiya W. Govek, Venkata S. Yamajala, Pablo G. Camara
Machine Learning
Machine Learning
Quantitative Methods

The scale and complexity of modern data sets and the limitations associated with testing large numbers of hypotheses underline the need for feature selection methods. Spectral techniques rank features according to their degree of consistency with an underlying metric structure, but their current graph-based formulation restricts their applicability to point features. We extend spectral methods for feature selection to abstract simplicial complexes and present a general framew...
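As a point of reference, the graph-based formulation that the paper generalizes can be sketched with the classical Laplacian score, where lower scores mark features that vary smoothly over a similarity graph. This is the baseline method under illustrative assumptions; the paper's simplicial extension is not reproduced here.

    import numpy as np

    def laplacian_scores(X, W):
        """Laplacian score per feature (column of X) on a similarity graph W:
        lower = smoother on the graph = more consistent with the metric
        structure. Classical graph formulation, generalized by the paper
        from graphs to abstract simplicial complexes."""
        d = W.sum(axis=1)                 # node degrees
        L = np.diag(d) - W                # combinatorial Laplacian
        scores = []
        for f in X.T:
            f_c = f - (f @ d) / d.sum()   # project out the constant mode
            scores.append((f_c @ L @ f_c) / (f_c @ (d * f_c)))
        return np.array(scores)

    # Example: 3 points on a path graph; a monotone feature scores lower
    # (smoother) than an alternating one.
    W = np.array([[0., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 0.]])
    X = np.array([[0., 1.],
                  [1., -1.],
                  [2., 1.]])
    print(laplacian_scores(X, W))  # [1.0, 2.0]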


An Information-Theoretic Formalism for Multiscale Structure in Complex Systems

September 16, 2014

85% Match
Benjamin Allen, Blake C. Stacey, Yaneer Bar-Yam
Statistical Mechanics

We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system's components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of...


Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy

December 1, 2005

85% Match
Robert K. Niven
Statistical Mechanics
Information Theory
Mathematical Physics
Data Analysis, Statistics and Probability

This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation for the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribution subject to the Stirling approximation; (ii) an explanation for the need to max...
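The combinatorial claim in (i) is a short Stirling computation. A standard sketch (the paper's own derivation may differ in presentation): for occupancy counts $n_i = N p_i$ drawn from source probabilities $q_i$,

    \[
    \mathbb{P}(\mathbf{n}) = \frac{N!}{\prod_i n_i!}\prod_i q_i^{\,n_i},
    \qquad
    \frac{1}{N}\ln \mathbb{P}(\mathbf{n}) \;\approx\; -\sum_i p_i \ln\frac{p_i}{q_i}
    = -D_{\mathrm{KL}}(p\,\|\,q),
    \]
    % using Stirling's approximation ln n! ~ n ln n - n with n_i = N p_i

so maximizing the multinomial multiplicity is, to leading order, minimizing the Kullback-Leibler cross-entropy; with uniform $q_i$ this reduces to maximizing the Shannon entropy.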
