April 23, 2015
Similar papers
September 5, 2000
Given a reference computer, Kolmogorov complexity is a well-defined function on all binary strings. In the standard approach, however, only the asymptotic properties of such functions are considered, because these do not depend on the reference computer. We argue that this approach can be more useful if it is refined to include the practically important case of simple binary strings. A Kolmogorov complexity calculus may be developed for this case if we restrict the class of availabl...
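For context, the function in question is plain Kolmogorov complexity with respect to a fixed reference machine $U$, and the invariance theorem below is the standard reason why only asymptotic properties are machine-independent (textbook notation, not necessarily this paper's):

```latex
C_U(x) = \min\{\, |p| : U(p) = x \,\}, \qquad
C_U(x) \le C_V(x) + c_{U,V} \quad \text{for all } x,
```

where the constant $c_{U,V}$ depends only on the two machines $U$ and $V$, not on $x$; for short ("simple") strings this constant can dominate, which is the regime the abstract targets.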
July 4, 2008
People solve different problems and know that some of them are simple, some are complex, and some are insoluble. The main goal of this work is to develop a mathematical theory of algorithmic complexity for problems. This theory aims to determine what computers are able to do when solving different problems and to estimate the resources they need to do so. Here we build the part of this theory related to static measures of algorithms. First, we consider problems for fini...
December 20, 2014
In this paper, we study the problem of determining a minimum-state probabilistic finite state machine capable of generating symbol sequences statistically identical to the samples provided. This problem is qualitatively similar to the classical Hidden Markov Model problem and has been studied from a practical point of view in several works, beginning with: Shalizi, C.R., Shalizi, K.L., Crutchfield, J.P. (2002) \textit{An algorithm for pattern discovery in tim...
October 12, 2020
In this article we explore the limiting behavior of the universal prior distribution applied over a multi-level meta-hierarchy of program and output data of a Turing machine. We were motivated to reduce the effect of Solomonoff's assumption that all computable functions/hypotheses of the same length are equally likely, by weighting each program in turn by the algorithmic probability of its description-number encoding. In the limiting case we converge the set of all possible...
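For reference, the universal prior this abstract builds on is usually defined over a prefix-free universal machine $U$ (a standard formulation, not necessarily the paper's exact notation):

```latex
m(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|}
```

Every program of length $\ell$ contributes the same weight $2^{-\ell}$, which is precisely the equal-likelihood assumption the authors seek to soften by re-weighting each program by the algorithmic probability of its own encoding.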
July 17, 1994
This is an alternative version of the course notes in chao-dyn/9407003. The previous version is based on measuring the size of lisp s-expressions. This version is based on measuring the size of what I call lisp m-expressions, which are lisp s-expressions with most parentheses omitted. This formulation of algorithmic information theory is harder to understand than the one that was presented in chao-dyn/9407003, but the constants obtained in all theorems are now less than half ...
May 23, 2005
We present a complexity measure for any finite time series. This measure is invariant under any monotonic transformation of the time series, has a degree of robustness against noise, and is adaptable enough to satisfy almost all of the widely accepted but conflicting criteria for complexity measurements. Surprisingly, the measure is developed from Kolmogorov complexity, which is traditionally believed to represent only randomness and to satisfy one criterion to the exclusi...
December 30, 2012
We propose a measure, based upon a fundamental concept of algorithmic information theory, that provides a natural approach to the problem of evaluating $n$-dimensional complexity using an $n$-dimensional deterministic Turing machine. The technique is interesting because it provides a natural algorithmic process for symmetry breaking, generating complex $n$-dimensional structures from perfectly symmetric and fully deterministic computational rules, producing a dis...
February 4, 2019
The purpose of this paper is to answer two questions left open in [B. Durand, A. Shen, and N. Vereshchagin, Descriptive Complexity of Computable Sequences, Theoretical Computer Science 171 (2001), pp. 47--58]. Namely, we consider the following two complexities of an infinite computable 0-1-sequence $\alpha$: $C^{0'}(\alpha )$, defined as the minimal length of a program with oracle $0'$ that prints $\alpha$, and $M_{\infty}(\alpha)$, defined as $\liminf C(\alpha_{1:n}|n)$, whe...
June 24, 2009
The use of algorithmic information theory (Kolmogorov complexity theory) to explain the relation between mathematical probability theory and the `real world' is discussed.
November 3, 2013
Random sequences attain the highest entropy rate. The entropy rate of an ergodic source can be estimated using the Lempel-Ziv complexity measure, yet the exact entropy rate is reached only in the infinite-length limit. We prove that typical random sequences of finite length fall short of the maximum Lempel-Ziv complexity, contrary to common belief. We discuss that, for a finite length, maximum Lempel-Ziv sequences can be built from a well-defined generating algorithm...
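The Lempel-Ziv complexity discussed here counts the phrases in the exhaustive (LZ76) parsing of a sequence; a minimal sketch, using a plain substring search rather than the optimized Kaspar-Schuster recursion:

```python
def lz76_complexity(s: str) -> int:
    """Count phrases in the LZ76 exhaustive parsing of s.

    Each phrase is the shortest block starting at position i that has
    not yet occurred as a substring of the preceding text (overlap with
    the phrase itself, minus its last symbol, is allowed).
    """
    phrases, i, n = 0, 0, len(s)
    while i < n:
        length = 1
        # Extend the candidate phrase while it is still "old",
        # i.e. while it occurs somewhere in s[:i+length-1].
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1  # one new phrase (the final one may be a repeat)
        i += length
    return phrases

# Classic example from Lempel and Ziv (1976):
# "0001101001000101" parses as 0|001|10|100|1000|101 -> 6 phrases.
print(lz76_complexity("0001101001000101"))  # -> 6
```

Comparing this count against the maximum achievable for the same length illustrates the abstract's point: a typical finite random string parses into slightly fewer phrases than a sequence built to maximize the count.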