ID: 1907.06486

The Poincaré-Boltzmann Machine: from Statistical Physics to Machine Learning and back

July 6, 2019


Similar papers 5

Information Theory for Complex Systems Scientists

April 24, 2023

84% Match
Thomas F. Varley
Information Theory
Data Analysis, Statistics an...
Quantitative Methods
Other Statistics

In the 21st century, many of the crucial scientific and technical issues facing humanity can be understood as problems associated with understanding, modelling, and ultimately controlling complex systems: systems comprised of a large number of non-trivially interacting components whose collective behaviour can be difficult to predict. Information theory, a branch of mathematics historically associated with questions about encoding and decoding messages, has emerged as somethi...


Feature extraction from complex networks: A case of study in genomic sequences classification

December 17, 2014

84% Match
Bruno Mendes Moro Conque, André Yoshiaki Kashiwabara, Fabrício Martins Lopes
Computational Engineering, F...
Machine Learning
Quantitative Methods

This work presents a new approach for the classification of genomic sequences based on measurements from complex networks and information theory. The nucleotides, dinucleotides and trinucleotides of a genomic sequence are considered, and for each of them the entropy, sum entropy and maximum entropy values are calculated. For each of them a network is also generated, in which the nodes are the nucleotides, dinucleotides or trinucleotides and the edges are estimated by observing ...
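As a minimal sketch of the k-mer entropy features this abstract describes (an illustration, not the paper's code; the sequence and function names are invented):

```python
from collections import Counter
from math import log2

def kmer_entropy(seq, k):
    """Shannon entropy (bits) of the k-mer frequency distribution of seq."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Entropies for nucleotides (k=1), dinucleotides (k=2), trinucleotides (k=3)
features = [kmer_entropy("ATGCGATACGCTTGA", k) for k in (1, 2, 3)]
```

Such per-k entropy values, alongside network-derived measurements, would then form the feature vector fed to a classifier.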


Learning Physics from Data: a Thermodynamic Interpretation

September 3, 2019

84% Match
Francisco Chinesta, Elias Cueto, Miroslav Grmela, Beatriz Moya, ... , Martin Sipka
Data Analysis, Statistics an...
Statistical Mechanics
Adaptation and Self-Organizi...

Experimental databases are typically very large and high-dimensional. Learning from them requires recognizing important features (a pattern), often present at scales different from that of the recorded data. Following the experience collected in statistical mechanics and thermodynamics, the process of recognizing the pattern (the learning process) can be seen as a dissipative time evolution, driven by entropy, from a detailed level of description to a less detailed one. This is the w...


Spectral entropies as information-theoretic tools for complex network comparison

September 5, 2016

84% Match
Manlio De Domenico, Jacob Biamonte
Physics and Society
Disordered Systems and Neura...

Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q-entropy, generalized Kull...
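A hedged sketch of the spectral entropies the abstract refers to, using the density matrix ρ = L/Tr(L) built from the graph Laplacian (a common construction in this literature; the exact definitions used by the paper may differ):

```python
import numpy as np

def von_neumann_entropy(adj):
    """Von Neumann entropy S = -Tr(rho log2 rho) of a graph,
    with rho = L / Tr(L) the Laplacian-based density matrix."""
    A = np.asarray(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eig = np.linalg.eigvalsh(L / np.trace(L))
    eig = eig[eig > 1e-12]  # drop zeros: 0 * log 0 = 0
    return float(-np.sum(eig * np.log2(eig)))

def renyi_q_entropy(adj, q):
    """Rényi q-entropy of the same density matrix (q != 1);
    it recovers the von Neumann entropy in the limit q -> 1."""
    A = np.asarray(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eig = np.linalg.eigvalsh(L / np.trace(L))
    eig = eig[eig > 1e-12]
    return float(np.log2(np.sum(eig ** q)) / (1 - q))
```

For the triangle graph the Laplacian spectrum is {0, 3, 3}, so ρ has eigenvalues {0, 1/2, 1/2} and both entropies evaluate to exactly 1 bit.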


On Lattices and the Dualities of Information Measures

July 31, 2013

84% Match
David J. Galas, Nikita A. Sakhanenko, Benjamin Keller
Information Theory
Quantitative Methods

Measures of dependence among variables, and measures of information content and shared information, have become valuable tools of multi-variable data analysis. Information measures, like marginal entropies, mutual and multi-information, have a number of significant advantages over more standard statistical methods, such as their reduced sensitivity to sampling limitations compared with statistical estimates of probability densities. There are also interesting applications of these measure...
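As an illustration of one such measure, a plug-in estimate of the multi-information (total correlation) of discrete samples: the sum of marginal entropies minus the joint entropy (a sketch; the function names are invented):

```python
from collections import Counter
from math import log2

def entropy_of(samples):
    """Plug-in Shannon entropy (bits) of a list of hashable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def multi_information(columns):
    """Total correlation: sum of marginal entropies minus the joint entropy.
    For two variables this reduces to the mutual information."""
    joint = list(zip(*columns))
    return sum(entropy_of(col) for col in columns) - entropy_of(joint)
```

Identical columns yield a multi-information equal to their common entropy, while independent columns yield zero.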


A Mathematical Walkthrough and Discussion of the Free Energy Principle

August 30, 2021

84% Match
Beren Millidge, Anil Seth, Christopher L Buckley
Artificial Intelligence

The Free-Energy-Principle (FEP) is an influential and controversial theory which postulates a deep and powerful connection between the stochastic thermodynamics of self-organization and learning through variational inference. Specifically, it claims that any self-organizing system which can be statistically separated from its environment, and which maintains itself at a non-equilibrium steady state, can be construed as minimizing an information-theoretic functional -- the var...
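For a discrete latent variable, the variational free energy the abstract mentions can be written down in a few lines (a sketch under the standard variational-inference definitions, not the FEP literature's own notation):

```python
from math import log

def variational_free_energy(q, p_joint):
    """F = E_q[log q(s) - log p(o, s)] = KL(q || p(s | o)) - log p(o).
    q: approximate posterior over latent states s (sums to 1);
    p_joint: the joint p(o, s) at the observed o, indexed by the same states."""
    return sum(qs * (log(qs) - log(ps))
               for qs, ps in zip(q, p_joint) if qs > 0)
```

F attains its minimum, -log p(o), exactly when q matches the true posterior p(s | o), which is why minimizing F implements approximate Bayesian inference.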


The Global Dynamical Complexity of the Human Brain Network

December 21, 2016

84% Match
Xerxes D. Arsiwalla, Paul Verschure
Neurons and Cognition
Information Theory
Dynamical Systems
Biological Physics

How much information do large brain networks integrate as a whole over the sum of their parts? Can the dynamical complexity of such networks be globally quantified in an information-theoretic way and be meaningfully coupled to brain function? Recently, measures of dynamical complexity such as integrated information have been proposed. However, problems related to the normalization and the Bell number of partitions associated with these measures make these approaches computationally...


Information Geometry of the Probability Simplex: A Short Course

November 5, 2019

84% Match
Giovanni Pistone
Statistics Theory

This set of notes is intended for a short course aiming to provide an (almost) self-contained and (almost) elementary introduction to the topic of Information Geometry (IG) of the probability simplex. Such a course can be considered an introduction to the original monograph by Amari and Nagaoka (2000), and to the recent monographs by Amari (2016) and by Ay et al. (2017). The focus is on a non-parametric approach, that is, I consider the geometry of the full probability simple...


Interaction Networks from Discrete Event Data by Poisson Multivariate Mutual Information Estimation and Information Flow with Applications from Gene Expression Data

February 6, 2020

84% Match
Jeremie Fish, Jie Sun, Erik Bollt
Methodology
Data Analysis, Statistics an...

In this work, we introduce a new methodology for inferring the interaction structure of discrete-valued time series which are Poisson distributed. While most related methods are premised on continuous-state stochastic processes, discrete and counting-event-oriented stochastic processes are in fact natural and common, the so-called time-point processes (TPP). An important application that we focus on here is gene expression. Nonparametric methods such as the popular k-nearest neig...


What is Information?

January 22, 2016

84% Match
Christoph Adami
Adaptation and Self-Organizi...
Information Theory
Biological Physics
Quantitative Methods

Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a p...
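The distinction the abstract draws can be made concrete: entropy measures uncertainty about a single variable, while (mutual) information measures how much one variable's entropy is reduced by knowing another. A minimal sketch:

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) from a joint probability table."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# Two independent fair coins share no information:
mutual_information([[0.25, 0.25], [0.25, 0.25]])  # → 0.0
```

Perfectly correlated coins, by contrast, share one full bit: each variable's entire entropy is accounted for by the other.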
