February 26, 2003
Two philosophical applications of the concept of program-size complexity are discussed. First, we consider the light program-size complexity sheds on whether mathematics is invented or discovered, i.e., is empirical or is a priori. Second, we propose that the notion of algorithmic independence sheds light on the question of being and how the world of our experience can be partitioned into separate entities.
Similar papers
March 27, 2003
Most work on computational complexity is concerned with time. However, this course will try to show that program-size complexity, which measures algorithmic information, is of much greater philosophical significance. I'll discuss how one can use this complexity measure to study what can and cannot be achieved by formal axiomatic mathematical theories. In particular, I'll show (a) that there are natural information-theoretic constraints on formal axiomatic theories, and that pr...
January 5, 2007
Presents a history of the evolution of the author's ideas on program-size complexity and its applications to metamathematics over the course of more than four decades. Includes suggestions for further work.
June 24, 2009
The use of algorithmic information theory (Kolmogorov complexity theory) to explain the relation between mathematical probability theory and the `real world' is discussed.
October 2, 2002
We discuss views about whether the universe can be rationally comprehended, starting with Plato, then Leibniz, and then the views of some distinguished scientists of the previous century. Based on this, we defend the thesis that comprehension is compression, i.e., explaining many facts using few theoretical assumptions, and that a theory may be viewed as a computer program for calculating observations. This provides motivation for defining the complexity of something to be th...
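To make the "theory as a computer program" idea concrete, here is a minimal Python sketch (not from the paper; the sequence and names are illustrative assumptions) in which a short generating rule reproduces many observations that would otherwise have to be listed verbatim, so the length of the rule, rather than of the raw data, measures how much has been explained:

    # A toy illustration of "comprehension is compression": a short program
    # (the "theory") reproduces many observations, so its length, not the
    # length of the raw data, measures the complexity of what was explained.

    observations = [n * n for n in range(1, 1001)]   # 1000 "observed facts"

    # Description A: list every fact verbatim (no compression, no comprehension).
    verbatim = repr(observations)

    # Description B: a short generating rule that computes the same facts.
    theory = "[n * n for n in range(1, 1001)]"
    assert eval(theory) == observations              # the theory reproduces the data

    print(f"verbatim listing:  {len(verbatim)} characters")
    print(f"generating theory: {len(theory)} characters")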
September 16, 2008
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between `structural' (meaningful) and `random' information as meas...
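Kolmogorov complexity itself is uncomputable, but a common way to get a feel for the structural-versus-random distinction is to use a lossless compressor as a computable upper bound on description length. The following Python sketch (an illustration under that assumption, not code from the paper) shows that regular data compresses far better than pseudorandom data:

    import os
    import zlib

    # Kolmogorov complexity is uncomputable, but the length of a losslessly
    # compressed encoding is a computable upper bound on description length.
    # Structured data compresses well; (pseudo)random data hardly at all.

    structured  = b"ab" * 50_000        # highly regular: a short rule explains it
    random_like = os.urandom(100_000)   # incompressible with overwhelming probability

    for label, data in (("structured", structured), ("random", random_like)):
        ratio = len(zlib.compress(data, 9)) / len(data)
        print(f"{label:>10}: compressed to {ratio:.1%} of original size")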
March 1, 2002
This article discusses what can be proved about the foundations of mathematics using the notions of algorithm and information. The first part is retrospective, and presents a beautiful antique, Gödel's proof; the first modern incompleteness theorem, Turing's halting problem; and a piece of postmodern metamathematics, the halting probability Omega. The second part looks forward to the new century and discusses the convergence of theoretical physics and theoretical computer sci...
November 14, 2002
In this paper the author presents some non-conventional thoughts on the complexity of the Universe and the algorithmic reproducibility of the human brain, essentially sparked off by the notion of algorithmic complexity. We must warn that, though these thoughts evoke suggestive scenarios, they remain quite speculative.
August 23, 2017
Would it be possible to explain the emergence of new computational ideas using computation itself? Would it be feasible to describe the discovery process of new algorithmic solutions using only mathematics? This study is a first effort to analyze the nature of such inquiry from the viewpoint of the effort required to find a new algorithmic solution to a given problem. We define program reachability as a probability function whose argument is a form of the energetic cost (algorithmic...
March 24, 2020
Some established and also some novel techniques for applying algorithmic (Kolmogorov) complexity currently co-exist for the first time and are reviewed here, ranging from dominant ones, such as statistical lossless compression, to newer approaches that advance and complement them but also pose new challenges and may exhibit their own limitations. Evidence suggesting that these different methods complement each other in different regimes is presented, and despite their man...
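One widely used compression-based technique in this family is the normalized compression distance (NCD) of Cilibrasi and Vitányi. The sketch below is illustrative only; the inputs and the choice of zlib as the reference compressor are assumptions, not taken from the review:

    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance, approximated with zlib.

        Values near 0 suggest the two objects share most of their information;
        values near 1 suggest they are algorithmically unrelated.
        """
        cx, cy = len(zlib.compress(x, 9)), len(zlib.compress(y, 9))
        cxy = len(zlib.compress(x + y, 9))
        return (cxy - min(cx, cy)) / max(cx, cy)

    text_a = b"the quick brown fox jumps over the lazy dog " * 200
    text_b = b"the quick brown fox jumps over the lazy cat " * 200
    noise  = bytes(range(256)) * 40   # an unrelated byte pattern

    print("ncd(text_a, text_b) =", round(ncd(text_a, text_b), 3))  # similar texts
    print("ncd(text_a, noise)  =", round(ncd(text_a, noise), 3))   # unrelated data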
September 2, 2007
How best to quantify the information of an object, whether natural or artifact, is a problem of wide interest. A related problem is the computability of an object. We present practical examples of a new way to address this problem. By giving an appropriate representation to our objects, based on a hierarchical coding of information, we exemplify how it is remarkably easy to compute complex objects. Our algorithmic complexity is related to the length of the class of objects, r...