ID: 1504.06240

A Computable Measure of Algorithmic Probability by Finite Approximations with an Application to Integer Sequences

April 23, 2015


Similar papers (page 3)

Complexity analysis for algorithmically simple strings

September 5, 2000

85% Match
Andrei N. Soklakov (Royal Holloway, University of London)
Machine Learning

Given a reference computer, Kolmogorov complexity is a well-defined function on all binary strings. In the standard approach, however, only the asymptotic properties of such functions are considered, because these do not depend on the reference computer. We argue that this approach can be made more useful if it is refined to include the important practical case of simple binary strings. Kolmogorov complexity calculus may be developed for this case if we restrict the class of availabl...


Algorithmic Problem Complexity

July 4, 2008

85% Match
Mark Burgin
Computational Complexity
Information Theory

People solve different problems and know that some of them are simple, some complex, and some insoluble. The main goal of this work is to develop a mathematical theory of algorithmic complexity for problems. This theory aims to determine the abilities of computers in solving different problems and to estimate the resources computers need to do so. Here we build the part of this theory related to static measures of algorithms. At first, we consider problems for fini...


Minimum Probabilistic Finite State Learning Problem on Finite Data Sets: Complexity, Solution and Approximations

December 20, 2014

85% Match
Elisabeth Paulson, Christopher Griffin
Formal Languages and Automata Theory
Data Structures and Algorithms
Optimization and Control

In this paper, we study the problem of determining a minimum-state probabilistic finite state machine capable of generating symbol sequences statistically identical to the samples provided. This problem is qualitatively similar to the classical Hidden Markov Model problem and has been studied from a practical point of view in several works, beginning with the work presented in: Shalizi, C.R., Shalizi, K.L., Crutchfield, J.P. (2002), "An algorithm for pattern discovery in tim...
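For illustration, a probabilistic finite state machine of the kind studied here can be sketched as a transition table mapping each state to (symbol, next state, probability) triples. The representation and the helper name `sample_sequence` are our own, not taken from the paper:

```python
import random

# Hypothetical representation: pfsa[state] = [(symbol, next_state, prob), ...]
def sample_sequence(pfsa, start, n, rng=None):
    """Sample a length-n symbol sequence from a probabilistic finite state machine."""
    rng = rng or random.Random(0)
    state, out = start, []
    for _ in range(n):
        r, acc = rng.random(), 0.0
        for sym, nxt, p in pfsa[state]:
            acc += p
            if r < acc:          # pick this transition with probability p
                out.append(sym)
                state = nxt
                break
    return "".join(out)

# A two-state machine that deterministically alternates 0 and 1:
alternator = {"A": [("0", "B", 1.0)], "B": [("1", "A", 1.0)]}
```

Fitting a minimum-state machine to data, the problem the paper addresses, would search over such tables; the sketch only shows the generative side.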


Quines are the fittest programs: Nesting algorithmic probability converges to constructors

October 12, 2020

85% Match
Aritra Sarkar
Logic in Computer Science
Formal Languages and Automata Theory
Information Theory

In this article we explore the limiting behavior of the universal prior distribution applied over a multi-level hierarchy of programs and output data of a Turing machine. We were motivated to reduce the effect of Solomonoff's assumption that all computable functions/hypotheses of the same length are equally likely, by weighting each program in turn by the algorithmic probability of its description-number encoding. In the limiting case we converge the set of all possible...
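Since quines are central to this abstract, the classic Python quine (standard programming folklore, not taken from the paper) makes the fixed-point idea concrete: the program's output is exactly its own source.

```python
# A classic quine: the printed text is exactly this program's source.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

The `%r` inserts the string's own repr, reproducing the first line; the `%%` escapes to a single `%`, reproducing the second.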


The Limits of Mathematics---Alternative Version

July 17, 1994

85% Match
G. J. Chaitin (IBM Research Division)
Chaotic Dynamics

This is an alternative version of the course notes in chao-dyn/9407003. The previous version is based on measuring the size of lisp s-expressions. This version is based on measuring the size of what I call lisp m-expressions, which are lisp s-expressions with most parentheses omitted. This formulation of algorithmic information theory is harder to understand than the one that was presented in chao-dyn/9407003, but the constants obtained in all theorems are now less than half ...


Easily Adaptable Complexity Measure for Finite Time Series

May 23, 2005

85% Match
Da-Guan Ke, Qin-Ye Tong
Chaotic Dynamics

We present a complexity measure for any finite time series. This measure is invariant under any monotonic transformation of the time series, has a degree of robustness against noise, and is adaptable enough to satisfy almost all of the widely accepted but conflicting criteria for complexity measures. Surprisingly, the measure is developed from Kolmogorov complexity, which is traditionally believed to represent only randomness and to satisfy one criterion to the exclusi...


Two-Dimensional Kolmogorov Complexity and Validation of the Coding Theorem Method by Compressibility

December 30, 2012

85% Match
Hector Zenil, Fernando Soler-Toscano, ... , Nicolas Gauvrit
Computational Complexity
Information Theory

We propose a measure based upon the fundamental theoretical concept in algorithmic information theory that provides a natural approach to the problem of evaluating $n$-dimensional complexity, using an $n$-dimensional deterministic Turing machine. The technique is interesting because it provides a natural algorithmic process for symmetry breaking, generating complex $n$-dimensional structures from perfectly symmetric and fully deterministic computational rules, producing a dis...
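The coding-theorem step this line of work relies on converts an empirical output-frequency distribution $D(s)$, obtained by running a large set of small Turing machines, into a complexity estimate $K(s) \approx -\log_2 D(s)$. A minimal sketch of that conversion only (the machine enumeration itself is omitted, and the function name is ours):

```python
import math
from collections import Counter

def ctm_estimates(outputs):
    """Map each produced string s to -log2 D(s), where D is the
    empirical output-frequency distribution of the enumeration."""
    counts = Counter(outputs)
    total = sum(counts.values())
    return {s: -math.log2(c / total) for s, c in counts.items()}

# Toy stand-in for an enumeration's outputs: "00" is produced twice as often,
# so it is assigned a lower complexity estimate than "01" or "10".
est = ctm_estimates(["00", "00", "01", "10"])
# est["00"] == 1.0, est["01"] == est["10"] == 2.0
```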


Descriptive Complexity of Computable Sequences Revisited

February 4, 2019

85% Match
Nikolay Vereshchagin
Logic

The purpose of this paper is to answer two questions left open in [B. Durand, A. Shen, and N. Vereshchagin, Descriptive Complexity of Computable Sequences, Theoretical Computer Science 171 (2001), pp. 47--58]. Namely, we consider the following two complexities of an infinite computable 0-1-sequence $\alpha$: $C^{0'}(\alpha )$, defined as the minimal length of a program with oracle $0'$ that prints $\alpha$, and $M_{\infty}(\alpha)$, defined as $\liminf C(\alpha_{1:n}|n)$, whe...


Algorithmic Information Theory and Foundations of Probability

June 24, 2009

84% Match
Alexander Shen
History and Overview

The use of algorithmic information theory (Kolmogorov complexity theory) to explain the relation between mathematical probability theory and the 'real world' is discussed.


On the non-randomness of maximum Lempel Ziv complexity sequences of finite size

November 3, 2013

84% Match
E. Estevez-Rams, R. Lora Serrano, ... , I. Brito Reyes
Chaotic Dynamics
Information Theory

Random sequences attain the highest entropy rate. The entropy rate of an ergodic source can be estimated using the Lempel-Ziv complexity measure, yet the exact entropy rate is only reached in the infinite limit. We prove that typical random sequences of finite length fall short of the maximum Lempel-Ziv complexity, contrary to common belief. We discuss that, for a finite length, maximum Lempel-Ziv sequences can be built from a well-defined generating algorithm...
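The Lempel-Ziv (1976) production complexity used here counts the phrases in a string's exhaustive history, where each new phrase is the shortest extension not copyable from the preceding text. A minimal sketch, allowing the usual self-overlapping copies (the helper name is ours):

```python
def lz76_complexity(s: str) -> int:
    """Count the phrases in the Lempel-Ziv (1976) exhaustive history of s."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # grow the phrase while its copyable part occurs earlier (overlap allowed)
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1       # one more phrase in the history
        i += l
    return c

# "0001" parses as 0 | 001 -> 2 phrases; "0101010101" as 0 | 1 | 01010101 -> 3,
# illustrating how regular strings stay far below the maximum phrase count.
```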
