July 27, 2016
Algorithmic statistics has two different (and almost orthogonal) motivations. From the philosophical point of view, it tries to formalize how statistics works and why some statistical models are better than others. After this notion of a "good model" is introduced, a natural question arises: is it possible that for some piece of data there is no good model? If so, how often do such bad ("non-stochastic") data appear "in real life"? Another, more technical motivation come...
May 20, 2014
Suppose that we are given an infinite binary sequence which is random for a Bernoulli measure of parameter $p$. By the law of large numbers, the frequency of zeros in the sequence tends to~$p$, and thus we can get better and better approximations of $p$ as we read the sequence. We study in this paper a similar question, but from the viewpoint of inductive inference. We suppose now that $p$ is a computable real, but one asks for more: as we are reading more and more bits of ou...
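The convergence the abstract invokes is easy to see empirically. Below is a minimal sketch (not from the paper) that draws an i.i.d. Bernoulli($p$) bit sequence and tracks the running frequency of ones, which by the law of large numbers tends to $p$; the function name and parameters are illustrative.

```python
import random

def frequency_estimates(p, n, seed=0):
    """Yield the running frequency of ones in an i.i.d. Bernoulli(p)
    bit sequence of length n (a stand-in for a p-random sequence)."""
    rng = random.Random(seed)
    ones = 0
    for i in range(1, n + 1):
        ones += rng.random() < p  # draw one biased bit
        yield ones / i            # current frequency estimate

# The running estimates approach p, as the law of large numbers predicts.
estimates = list(frequency_estimates(p=0.3, n=100_000))
print(estimates[-1])
```

The paper's question is stronger than this simulation suggests: it asks for inductive inference of the exact computable real $p$ from the bits, not merely a sequence of numerical approximations.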
December 10, 2008
We obtain an index of the complexity of a random sequence by allowing the role of the measure in classical probability theory to be played by a function we call the generating mechanism. Typically, this generating mechanism will be a finite automaton. We generate a set of biased sequences by applying a finite-state automaton with a specified number, $m$, of states to the set of all binary sequences. Thus we can index the complexity of our random sequence by the number of states...
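The idea of a finite-state "generating mechanism" can be illustrated with a small transducer that maps binary input sequences to biased output sequences; this is a hypothetical sketch of the general mechanism, not the paper's specific construction.

```python
def run_transducer(bits, transitions, outputs, start=0):
    """Run a finite-state transducer over a binary sequence.

    transitions[state][bit] -> next state
    outputs[state][bit]     -> output bit
    """
    state = start
    out = []
    for b in bits:
        out.append(outputs[state][b])
        state = transitions[state][b]
    return out

# An m = 2 state machine: state 1 means "the previous bit was 1".
# It emits 1 only after seeing two 1s in a row, so even an unbiased
# input sequence is mapped to an output biased toward 0.
transitions = [[0, 1], [0, 1]]
outputs     = [[0, 0], [0, 1]]
print(run_transducer([1, 1, 0, 1, 1, 1], transitions, outputs))
```

Counting states in such a machine gives the complexity index the abstract describes: more states permit more intricate biasing of the input sequences.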
May 29, 2022
Diverse applications of Kolmogorov complexity to learning [CIKK16], circuit complexity [OPS19], cryptography [LP20], average-case complexity [Hir21], and proof search [Kra22] have been discovered in recent years. Since the running time of algorithms is a key resource in these fields, it is crucial in the corresponding arguments to consider time-bounded variants of Kolmogorov complexity. While fruitful interactions between time-bounded Kolmogorov complexity and different areas...
May 13, 2010
This is a short introduction to Kolmogorov complexity. The interested reader is referred to the textbooks by Cover & Thomas as well as Li & Vit\'anyi, which cover the fields of information theory and Kolmogorov complexity in depth and with all the necessary rigor.
July 6, 2022
Developing new ways to estimate probabilities can be valuable for science, statistics, and engineering. By considering the information content of different output patterns, recent work invoking algorithmic information theory has shown that a priori probability predictions based on pattern complexities can be made in a broad class of input-output maps. These algorithmic probability predictions do not depend on a detailed knowledge of how output patterns were produced, or histo...
February 25, 2022
The capacity of a channel can usually be characterized as a maximization of certain entropic quantities. From a practical point of view it is of primary interest to not only compute the capacity value, but also to find the corresponding optimizer, i.e., the capacity-achieving input distribution. This paper addresses the general question of whether or not it is possible to find algorithms that can compute the optimal input distribution depending on the channel. For this purpos...
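The abstract does not name an algorithm, but for a finite discrete memoryless channel the classical way to compute the capacity-achieving input distribution is the Blahut-Arimoto iteration; the sketch below is that standard method, not the paper's construction (the paper is concerned with whether such computation is possible in general, depending on the channel).

```python
import math

def blahut_arimoto(W, iters=200):
    """Blahut-Arimoto iteration for a discrete memoryless channel
    W[x][y] = P(y | x). Returns the (approximately) capacity-achieving
    input distribution and the capacity in bits."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                      # start from the uniform input
    for _ in range(iters):
        # Output distribution induced by the current input distribution.
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # D[x] = 2 ** KL( W[x] || q ), the per-input "information density".
        D = [2 ** sum(W[x][y] * math.log2(W[x][y] / q[y])
                      for y in range(ny) if W[x][y] > 0)
             for x in range(nx)]
        Z = sum(p[x] * D[x] for x in range(nx))
        p = [p[x] * D[x] / Z for x in range(nx)]  # reweight and normalize
    return p, math.log2(Z)

# Binary symmetric channel with crossover probability 0.1:
W = [[0.9, 0.1], [0.1, 0.9]]
p_opt, cap = blahut_arimoto(W)
print(p_opt, cap)  # uniform input; capacity = 1 - H(0.1), about 0.531 bits
```

For this symmetric channel the uniform input is already optimal, so the iteration converges immediately; the paper's point is that for general channel families no single algorithm need succeed.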
April 20, 2015
The mission of statistics is to provide adequate statistical hypotheses (models) for observed data. But what is an "adequate" model? To answer this question, one needs to use the notions of algorithmic information theory. It turns out that for every data string $x$ one can naturally define "stochasticity profile", a curve that represents a trade-off between complexity of a model and its adequacy. This curve has four different equivalent definitions in terms of (1)~randomness ...
September 5, 2013
This is a review of aspects of the theory of algorithmic information that may contribute to a framework for formulating questions related to complex, highly unpredictable systems. We start by contrasting Shannon entropy and Kolmogorov-Chaitin complexity, epitomizing the difference between correlation and causation, and then move on to survey classical results from algorithmic complexity and algorithmic probability, highlighting their deep connection to the study of automata fre...
January 5, 2007
Presents a history of the evolution of the author's ideas on program-size complexity and its applications to metamathematics over the course of more than four decades. Includes suggestions for further work.