Similar papers
July 4, 2008
People solve many different problems and know that some of them are simple, some are complex, and some are insoluble. The main goal of this work is to develop a mathematical theory of the algorithmic complexity of problems. This theory aims to determine what computers are able to do in solving different problems and to estimate the resources they need to do so. Here we build the part of this theory related to static measures of algorithms. At first, we consider problems for fini...
September 5, 2013
This is a review of aspects of the theory of algorithmic information that may contribute to a framework for formulating questions related to complex, highly unpredictable systems. We start by contrasting Shannon entropy and Kolmogorov-Chaitin complexity, which epitomize the difference between correlation and causation, and then move on to survey classical results from algorithmic complexity and algorithmic probability, highlighting their deep connection to the study of automata fre...
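To make the contrast concrete, the two measures can be stated side by side in standard notation (the fixed universal prefix machine U below is the usual convention, not something spelled out in the excerpt):

    H(X) = -\sum_x p(x) \log_2 p(x), \qquad K_U(s) = \min \{ |p| : U(p) = s \}

Entropy is a property of a probability distribution, a purely statistical (correlational) notion, while K is a property of an individual string, a generative (causal) one: the binary expansion of \pi looks maximally random to any entropy estimate, yet has small K_U because a short program prints it.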
November 23, 2006
Some Gödel centenary reflections on whether incompleteness is really serious, and whether mathematics should be done somewhat differently, based on using algorithmic complexity measured in bits of information. [Enriques lecture given Monday, October 30, 2006, at the University of Milan.]
February 26, 2011
The notion of Kolmogorov complexity (the minimal length of a program that generates a given object) is often useful as a kind of language that allows us to reformulate some notions and thereby provide new intuition. In this survey we provide (with minimal comments) many different examples in which notions and statements that involve Kolmogorov complexity are compared with their counterparts not involving complexity.
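A one-line example of this "language" at work is the counting argument for incompressible strings (stated for a fixed universal machine, the standard convention): there are at most 2^n - 1 programs shorter than n bits but 2^n strings of length n, so

    \#\{ p : |p| < n \} \le 2^n - 1 < 2^n = \#\{ x : |x| = n \} \;\Rightarrow\; \forall n\ \exists x,\ |x| = n,\ K(x) \ge n.

The complexity-free counterpart of this statement is the familiar fact that no lossless compressor can shrink every input.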
December 22, 2021
Motivated by algorithmic information theory, the problem of program discovery can help find candidate generative mechanisms underlying natural and artificial phenomena. The uncomputability of this inverse problem, however, significantly restricts the wider application of exhaustive methods. Here we present a proof of concept of an approach based on IMP, a high-level imperative programming language. Its main advantage is that conceptually complex computational routines ar...
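The excerpt only names IMP; as a toy stand-in for the program-discovery idea, here is a minimal sketch over a hypothetical loop-free instruction set (the instruction names and semantics below are illustrative assumptions, not the paper's language). Because the toy language has no loops, every program halts; in a Turing-complete language such as IMP, enumeration additionally needs a step budget, which is the practical face of the uncomputability the abstract mentions.

    from itertools import product

    # Hypothetical one-character instructions acting on an integer accumulator
    # that starts at 0: i adds 1, d subtracts 1, o doubles.
    OPS = "ido"

    def run(program):
        """Execute a toy program and return its final accumulator value."""
        acc = 0
        for op in program:
            if op == "i":
                acc += 1
            elif op == "d":
                acc -= 1
            elif op == "o":
                acc *= 2
        return acc

    def shortest_generator(target, max_len=8):
        """Enumerate programs in order of length; the first program found is a
        shortest generator of `target`, i.e. an upper bound on its complexity
        relative to this toy instruction set."""
        for length in range(1, max_len + 1):
            for prog in map("".join, product(OPS, repeat=length)):
                if run(prog) == target:
                    return prog
        return None  # search bound exhausted; not a proof that none exists

    print(shortest_generator(6))  # -> "iiio": (0+1+1+1)*2 = 6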
September 1, 2016
We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM) that approximates local estimations of algorithmic complexity based upon Solomonoff-Levin's theory of algorithmic probability, providing a closer connection to algorithmic complexity than previous attempts based on statistical regularities, e.g. those spotted by some popular lossless compression schemes. The strategy behind BDM is to find small computer pro...
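The decomposition the abstract alludes to is usually written as follows (reconstructed here from the general description in standard BDM notation, so take the exact form as a hedged paraphrase rather than a quotation): split the object X into blocks, collect the distinct blocks r_i with their multiplicities n_i, and set

    \mathrm{BDM}(X) = \sum_{(r_i,\, n_i)} \bigl( \mathrm{CTM}(r_i) + \log_2 n_i \bigr)

so that small blocks are charged their CTM-estimated algorithmic complexity while repetitions are charged only the logarithmic cost of counting them.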
January 15, 2014
Not only did Turing help found one of the most exciting areas of modern science (computer science), but it may be that his contribution to our understanding of physical reality is greater than we had hitherto supposed. Here I explore the path that Alan Turing would certainly have liked to follow, that of complexity science, which was launched in the wake of his seminal work on computability and structure formation. In particular, I will explain how the theory of algorithm...
June 19, 2009
We propose a test based on the theory of algorithmic complexity and an experimental evaluation of Levin's universal distribution to identify evidence in support of, or in contravention of, the claim that the world is algorithmic in nature. To this end we have undertaken a statistical comparison of the frequency distributions of data from physical sources on the one hand, namely repositories of information such as images, data stored on a hard drive, computer programs and DNA sequences...
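The distribution under evaluation is Levin's universal (a priori) distribution; in standard notation, for a universal prefix machine U (the choice of machine is the usual convention, left implicit in the excerpt),

    m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}, \qquad -\log_2 m(x) = K(x) + O(1).

The second identity, the coding theorem, is what licenses the test: if the world's data streams were generated by programs, then empirically frequent patterns should be precisely the algorithmically simple ones, so frequency distributions from physical sources can be compared against m.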
June 27, 2005
In 1686, in his Discours de Métaphysique, Leibniz points out that if an arbitrarily complex theory is permitted, then the notion of "theory" becomes vacuous because there is always a theory. This idea is developed in the modern theory of algorithmic information, which deals with the size of computer programs and provides a new view of Gödel's work on incompleteness and Turing's work on uncomputability. Of particular interest is the halting probability Omega, whose bits are irre...
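For reference, the halting probability is defined, for a prefix-free universal machine U, by

    \Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},

a real number strictly between 0 and 1. Its bits are irreducible in the precise sense that the first n bits of \Omega_U settle the halting problem for all programs of length at most n, so no axiom system of complexity much below n can determine them.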
January 2, 2001
Numerous definitions of complexity have been proposed over the last half century, with little consensus achieved on how to use the term. A definition of complexity is supplied here that is closely related to the Kolmogorov complexity and Shannon entropy measures widely used as complexity measures, yet it addresses a number of concerns raised against those measures. The price of doing this, however, is to introduce context dependence into the definition of complexity. It is argue...