ID: 1504.06240

A Computable Measure of Algorithmic Probability by Finite Approximations with an Application to Integer Sequences

April 23, 2015

Similar papers (page 5)

Efficient Induction of Finite State Automata

February 6, 2013

84% Match
Matthew S. Collins, Jonathan Oliver
Artificial Intelligence
Formal Languages and Automat...

This paper introduces a new algorithm for the induction of complex finite state automata from samples of behavior. Based on information-theoretic principles, the algorithm reduces the search space by many orders of magnitude over what was previously thought possible. We compare it with some existing induction techniques for finite state automata and show that it is much superior in both run time and quality of inductions.

Foundations of Probability Theory for AI - The Application of Algorithmic Probability to Problems in Artificial Intelligence

March 27, 2013

84% Match
Ray Solomonoff
Artificial Intelligence

This paper covers two topics: first an introduction to Algorithmic Complexity Theory: how it defines probability, some of its characteristic properties and past successful applications. Second, we apply it to problems in A.I. - where it promises to give near optimum search procedures for two very broad classes of problems.

Probabilistic Recursion Theory and Implicit Computational Complexity (Long Version)

June 12, 2014

84% Match
Ugo Dal Lago, Sara Zuppiroli
Logic in Computer Science

We show that probabilistic computable functions, i.e., those functions outputting distributions and computed by probabilistic Turing machines, can be characterized by a natural generalization of Church and Kleene's partial recursive functions. The obtained algebra, following Leivant, can be restricted so as to capture the notion of polytime sampleable distributions, a key concept in average-case complexity and cryptography.
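The central notion in the abstract above can be sketched concretely (this is an illustration, not the paper's construction): a probabilistic Turing machine can be modeled as an ordinary program with access to fair coin flips, so that each input induces a distribution over outputs rather than a single value.

```python
import random

def geometric_sample(coin=None):
    """Flip a fair coin until heads; return the number of tails seen.

    The output follows the geometric distribution with parameter 1/2,
    a simple example of a sampleable distribution computed by a
    probabilistic machine.
    """
    if coin is None:
        coin = lambda: random.random() < 0.5  # True = heads
    tails = 0
    while not coin():
        tails += 1
    return tails

# With a scripted coin the run is deterministic: tails, tails, heads -> 2.
flips = iter([False, False, True])
print(geometric_sample(lambda: next(flips)))  # 2
```

Restricting which control structures such programs may use (following Leivant, as the abstract notes) is what carves out the polytime sampleable distributions.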

Kolmogorov complexity and generalized length functions

November 17, 2016

84% Match
Cameron Fraize, Christopher P. Porter
Computational Complexity
Logic

Kolmogorov complexity measures the algorithmic complexity of a finite binary string $\sigma$ in terms of the length of the shortest description $\sigma^*$ of $\sigma$. Traditionally, the length of a string is taken to measure the amount of information contained in the string. However, we may also view the length of $\sigma$ as a measure of the cost of producing $\sigma$, which permits one to generalize the notion of length, wherein the cost of producing a 0 or a 1 can vary in...
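The generalized-length idea in the abstract can be sketched in a few lines (the specific cost values here are hypothetical, chosen only for illustration):

```python
def generalized_length(s: str, cost0: float = 1.0, cost1: float = 2.0) -> float:
    """Length of a binary string when producing a 0 costs `cost0`
    and producing a 1 costs `cost1` (classical length is the special
    case cost0 = cost1 = 1)."""
    return sum(cost0 if bit == "0" else cost1 for bit in s)

# Under classical length both strings below measure 4; under the
# weighted length function they differ.
print(generalized_length("0000"))  # 4.0
print(generalized_length("1111"))  # 8.0
```

Replacing bit-counting with such a cost function is what lets the notion of "shortest description" vary with the production cost of each symbol.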

Very Simple Chaitin Machines for Concrete AIT

August 11, 2005

84% Match
Michael Stay
Information Theory

In 1975, Chaitin introduced his celebrated Omega number, the halting probability of a universal Chaitin machine, a universal Turing machine with a prefix-free domain. The Omega number's bits are {\em algorithmically random}--there is no reason the bits should be the way they are, if we define ``reason'' to be a computable explanation smaller than the data itself. Since that time, only {\em two} explicit universal Chaitin machines have been proposed, both by Chaitin himself. ...
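The halting probability in the abstract can be illustrated with a toy sketch (this is not one of Chaitin's machines: the prefix-free domain and the halting behaviour below are stipulated purely for illustration). A finite approximation of Omega sums 2^{-|p|} over the programs found to halt so far.

```python
from fractions import Fraction

# A tiny prefix-free domain with stipulated (hypothetical) halting behaviour.
HALTING = {"0": True, "10": True, "110": False, "111": True}

def omega_approximation(halting):
    """Sum 2^(-|p|) over the programs known to halt."""
    return sum(Fraction(1, 2 ** len(p)) for p, halts in halting.items() if halts)

print(omega_approximation(HALTING))  # 1/2 + 1/4 + 1/8 = 7/8
```

For a genuine universal Chaitin machine no such finite table exists, which is exactly why Omega's bits are algorithmically random.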

Kolmogorov Complexity in perspective. Part I: Information Theory and Randomness

October 15, 2010

84% Match
Marie Ferbus-Zanda (LIAFA), Serge Grigorieff (LIAFA)
Logic in Computer Science
Computational Complexity
Information Theory

We survey diverse approaches to the notion of information: from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts in the same volume. Part I is dedicated to information theory and the mathematical formalization of randomness based on Kolmogorov complexity. This last application goes back to the 60's and 70's with the work of Martin-L\"of, Schnorr, Cha...

A Programming Language Oriented Approach to Computability

February 10, 2014

84% Match
Aaron Karper
Programming Languages
Logic in Computer Science

Computer science sprang from the field of computability and complexity. Turing, Church, and Kleene all developed formalisms that captured what they held to be "intuitively computable". Times have changed, however: today's (aspiring) computer scientists are less proficient in building automata or composing functions and are much more native to the world of programming languages. This article will try to introduce typical concepts of computability theory and complexity...

Computable Model Discovery and High-Level-Programming Approximations to Algorithmic Complexity

December 22, 2021

84% Match
Vladimir Lemus, Eduardo Acuña, Víctor Zamora, ... , Hector Zenil
Information Theory
Logic in Computer Science

Motivated by algorithmic information theory, the problem of program discovery can help find candidates for the underlying generative mechanisms of natural and artificial phenomena. The uncomputability of this inverse problem, however, significantly restricts a wider application of exhaustive methods. Here we present a proof of concept of an approach based on IMP, a high-level imperative programming language. Its main advantage is that conceptually complex computational routines ar...

Towards a Definitive Compressibility Measure for Repetitive Sequences

October 4, 2019

84% Match
Tomasz Kociumaka, Gonzalo Navarro, Nicola Prezza
Data Structures and Algorith...

Unlike in statistical compression, where Shannon's entropy is a definitive lower bound, no such clear measure exists for the compressibility of repetitive sequences. Since statistical entropy does not capture repetitiveness, ad-hoc measures like the size $z$ of the Lempel--Ziv parse are frequently used to estimate it. The size $b \le z$ of the smallest bidirectional macro scheme captures better what can be achieved via copy-paste processes, though it is NP-complete to compute...
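The measure $z$ mentioned in the abstract can be computed greedily; the sketch below implements one common convention (an LZ77-style parse with overlapping sources allowed), and other conventions in the literature may yield slightly different values:

```python
def lz_parse_size(s: str) -> int:
    """Size z of a greedy left-to-right LZ parse: each phrase copies
    the longest substring that also occurs starting at an earlier
    position (overlaps allowed), then appends one fresh character."""
    i, z = 0, 0
    while i < len(s):
        length = 0
        # extend the copy while the candidate phrase also occurs earlier
        while i + length < len(s) and s.find(s[i : i + length + 1]) < i:
            length += 1
        i += length + 1  # copied part plus one new character
        z += 1
    return z

# Repetitive strings get short parses; incompressible ones do not.
print(lz_parse_size("abababab"))  # 3 (phrases: a, b, ababab)
print(lz_parse_size("abcd"))      # 4
```

The quadratic scan via `str.find` keeps the sketch short; practical implementations use suffix structures to compute the parse in linear time.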

Random numbers as probabilities of machine behaviour

May 19, 2016

84% Match
George Barmpalias, Douglas Cenzer, Christopher P. Porter
Computational Complexity

A fruitful way of obtaining meaningful, possibly concrete, algorithmically random numbers is to consider a potential behaviour of a Turing machine and its probability with respect to a measure (or semi-measure) on the input space of binary codes. For example, Chaitin's Omega is a well-known Martin-Löf random number that is obtained by considering the halting probability of a universal prefix-free machine. In the last decade, similar examples have been obtained for higher for...
