Algorithmic Randomness and Complexity


Book Description

Computability and complexity theory are two central areas of research in theoretical computer science. This book provides a systematic, technical development of "algorithmic randomness" and complexity for scientists from diverse fields.




Algorithmic Randomness


Book Description

The last two decades have seen a wave of exciting new developments in the theory of algorithmic randomness and its applications to other areas of mathematics. This volume surveys much of the recent work that has not been included in published volumes until now. It contains a range of articles on algorithmic randomness and its interactions with closely related topics such as computability theory and computational complexity, as well as wider applications in areas of mathematics including analysis, probability, and ergodic theory. In addition to being an indispensable reference for researchers in algorithmic randomness, the unified view of the theory presented here makes this an excellent entry point for graduate students and other newcomers to the field.




Algorithmic Learning in a Random World


Book Description

Algorithmic Learning in a Random World describes recent theoretical and experimental developments in building computable approximations to Kolmogorov's algorithmic notion of randomness. Based on these approximations, a new family of machine learning algorithms has been developed that can make predictions and estimate their confidence and credibility in high-dimensional spaces, under the usual assumption that the data are independent and identically distributed (the randomness assumption). Another aim of this unique monograph is to outline the limits of prediction: the approach based on the algorithmic theory of randomness makes it possible to prove that prediction is impossible in certain situations. The book describes how several important machine learning problems, such as density estimation in high-dimensional spaces, cannot be solved if the only assumption is randomness.
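The kind of confidence-qualified prediction described above can be illustrated with a conformal-style p-value. The sketch below is a minimal illustration under assumed choices, not the book's own construction: the nonconformity score (distance to the nearest example with the same label) and the function name conformal_p_value are both invented for this example.

    import numpy as np

    def conformal_p_value(X, y, x_new, y_candidate):
        # p-value measuring how well (x_new, y_candidate) conforms to the sample
        # (X, y).  Nonconformity score: distance to the nearest point with the
        # same label (an illustrative choice; any measurable score would do).
        X_aug = np.vstack([X, x_new])
        y_aug = np.append(y, y_candidate)
        scores = []
        for i in range(len(y_aug)):
            same = (y_aug == y_aug[i])
            same[i] = False
            if not same.any():
                scores.append(np.inf)
                continue
            dists = np.linalg.norm(X_aug[same] - X_aug[i], axis=1)
            scores.append(dists.min())
        scores = np.asarray(scores)
        # Fraction of examples that are at least as nonconforming as the new one.
        return np.mean(scores >= scores[-1])

    X = np.array([[0.0], [0.1], [1.0], [1.1]])
    y = np.array([0, 0, 1, 1])
    print(conformal_p_value(X, y, np.array([0.05]), 0))  # 1.0: label 0 conforms well
    print(conformal_p_value(X, y, np.array([0.05]), 1))  # 0.2: label 1 conforms poorly

Predicting, for a new object, the set of labels whose p-value exceeds a chosen significance level yields valid coverage whenever the data really are exchangeable, which is exactly the randomness assumption mentioned above.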




Information and Randomness


Book Description

"Algorithmic information theory (AIT) is the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously", says G.J. Chaitin, one of the fathers of this theory of complexity and randomness, which is also known as Kolmogorov complexity. It is relevant for logic (new light is shed on Gödel's incompleteness results), physics (chaotic motion), biology (how likely is life to appear and evolve?), and metaphysics (how ordered is the universe?). This book, benefiting from the author's research and teaching experience in Algorithmic Information Theory (AIT), should help to make the detailed mathematical techniques of AIT accessible to a much wider audience.




Kolmogorov Complexity and Algorithmic Randomness


Book Description

Looking at a sequence of zeros and ones, we often feel that it is not random, that is, it is not plausible as an outcome of fair coin tossing. Why? The answer is provided by algorithmic information theory: because the sequence is compressible, that is, it has small complexity or, equivalently, can be produced by a short program. This idea, going back to Solomonoff, Kolmogorov, Chaitin, Levin, and others, is now the starting point of algorithmic information theory. The first part of this book is a textbook-style exposition of the basic notions of complexity and randomness; the second part covers some recent work done by participants of the “Kolmogorov seminar” in Moscow (started by Kolmogorov himself in the 1980s) and their colleagues. This book contains numerous exercises (embedded in the text) that will help readers to grasp the material.
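Since Kolmogorov complexity itself is uncomputable, a common way to make the compressibility intuition tangible is to use an ordinary compressor as a crude upper bound; a compressor may miss regularities but cannot invent them. The sketch below is an illustration of this idea only, not material from the book.

    import os
    import zlib

    def compressed_size(bits: str) -> int:
        # Length in bytes of the zlib-compressed ASCII encoding of a bit string.
        # This is a computable upper bound on the (uncomputable) Kolmogorov
        # complexity, up to an additive constant for the decompressor.
        return len(zlib.compress(bits.encode(), 9))

    periodic = "01" * 5000                                         # output of a very short program
    typical = "".join(format(b, "08b") for b in os.urandom(1250))  # 10000 bits from the OS entropy source

    print(compressed_size(periodic))  # tiny: the repetition is detected
    print(compressed_size(typical))   # around 1250 bytes: no description much shorter than the data itself

Nothing in this experiment proves the second string incompressible, of course; it merely reports that one particular short description was not found, which is all a computable method can ever certify.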




Computability and Randomness


Book Description

The interplay between computability and randomness has been an active area of research in recent years, reflected by ample funding in the USA, numerous workshops, and publications on the subject. The complexity and the randomness aspects of a set of natural numbers are closely related. Traditionally, computability theory is concerned with the complexity aspect. However, computability-theoretic tools can also be used to introduce mathematical counterparts for the intuitive notion of randomness of a set. Recent research shows that, conversely, concepts and methods originating from randomness enrich computability theory. The book covers topics such as lowness and highness properties, Kolmogorov complexity, betting strategies and higher computability. Both the basics and recent research results are described, providing a very readable introduction to the exciting interface of computability and randomness for graduates and researchers in computability theory, theoretical computer science, and measure theory.
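As a taste of the betting-strategy approach mentioned in the topic list: one standard way to define randomness of a sequence is to require that no effective betting strategy (martingale) can make its capital grow without bound along the sequence. The toy strategy below is a sketch of my own devising, not one from the book; it simply bets that each bit repeats its predecessor, so it wins spectacularly on very regular sequences and almost surely goes broke on unbiased coin tosses.

    import random

    def run_martingale(bits, fraction=0.5):
        # Bet a fixed fraction of the current capital that the next bit equals
        # the previous one, with fair payoff: a correct bet wins the stake, an
        # incorrect bet loses it.  Starting capital is 1.
        capital = 1.0
        for prev, cur in zip(bits, bits[1:]):
            stake = fraction * capital
            capital += stake if cur == prev else -stake
        return capital

    print(run_martingale("0" * 200))  # capital explodes on an obviously non-random sequence

    coin = "".join(random.choice("01") for _ in range(200))
    print(run_martingale(coin))       # almost surely dwindles on a typical coin-tossing sequence

Different effectiveness requirements on the admissible betting strategies give rise to the different randomness notions studied in this line of work.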




Exploring RANDOMNESS


Book Description

This essential companion to Chaitin's successful books The Unknowable and The Limits of Mathematics presents the technical core of his theory of program-size complexity. The two previous volumes are more concerned with applications to metamathematics. LISP is used to present the key algorithms and to enable computer users to interact with the author's proofs and discover for themselves how they work. The LISP code for this book is available at the author's Web site together with a Java applet LISP interpreter. "No one has looked deeper and farther into the abyss of randomness and its role in mathematics than Greg Chaitin. This book tells you everything he's seen. Don't miss it." John Casti, Santa Fe Institute, author of Gödel: A Life of Logic.




An Introduction to Kolmogorov Complexity and Its Applications


Book Description

Briefly, we review the basic elements of computability theory and probability theory that are required. Finally, in order to place the subject in the appropriate historical and conceptual context we trace the main roots of Kolmogorov complexity. This way the stage is set for Chapters 2 and 3, where we introduce the notion of optimal effective descriptions of objects. The length of such a description (or the number of bits of information in it) is its Kolmogorov complexity. We treat all aspects of the elementary mathematical theory of Kolmogorov complexity. This body of knowledge may be called algorithmic complexity theory. The theory of Martin-Löf tests for randomness of finite objects and infinite sequences is inextricably intertwined with the theory of Kolmogorov complexity and is completely treated. We also investigate the statistical properties of finite strings with high Kolmogorov complexity. Both of these topics are eminently useful in the applications part of the book. We also investigate the recursion-theoretic properties of Kolmogorov complexity (relations with Gödel's incompleteness result), and the Kolmogorov complexity version of information theory, which we may call "algorithmic information theory" or "absolute information theory". The treatment of algorithmic probability theory in Chapter 4 presupposes Sections 1.6, 1.11.2, and Chapter 3 (at least Sections 3.1 through 3.4).
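Behind the usefulness of strings with high Kolmogorov complexity lies a standard counting argument showing that incompressible strings are plentiful. The derivation below is the usual folklore argument, included for orientation rather than quoted from this book; C(x) denotes the Kolmogorov complexity of x and c is any positive integer.

    \[
      \#\{\, x \in \{0,1\}^{n} : C(x) < n - c \,\}
      \;\le\; \#\{\, p : |p| < n - c \,\}
      \;\le\; \sum_{i=0}^{n-c-1} 2^{i}
      \;=\; 2^{\,n-c} - 1 .
    \]

Hence fewer than a \(2^{-c}\) fraction of the \(2^{n}\) strings of length \(n\) can be compressed by more than \(c\) bits, and in particular every length \(n\) admits at least one string with \(C(x) \ge n\).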




Randomness And Undecidability In Physics


Book Description

Recent findings in the computer sciences, discrete mathematics, formal logic and metamathematics have opened up a royal road for the investigation of undecidability and randomness in physics. A translation of these formal concepts yields a fresh look into diverse features of physical modelling such as quantum complementarity and the measurement problem, but also raises questions related to the necessity of the assumption of continua.

Conversely, any computer may be perceived as a physical system: not only in the immediate sense of the physical properties of its hardware. Computers are a medium to virtual realities. The foreseeable importance of such virtual realities stimulates the investigation of an "inner description", a "virtual physics" of these universes of computation. Indeed, one may consider our own universe as just one particular realisation of an enormous number of virtual realities, most of them awaiting discovery.

One motive of this book is the recognition that what is often referred to as "randomness" in physics might actually be a signature of undecidability for systems whose evolution is computable on a step-by-step basis. To give a flavour of the type of questions envisaged: consider an arbitrary algorithmic system which is computable on a step-by-step basis. Then it is in general impossible to specify a second algorithmic procedure, including itself, which, by experimental input-output analysis, is capable of finding the deterministic law of the first system. But even if such a law is specified beforehand, it is in general impossible to predict the system's behaviour in the "distant future". In other words, no "speedup" or "computational shortcut" is available. In this approach, classical paradoxes can be formally translated into no-go theorems concerning intrinsic physical perception.

It is suggested that complementarity can be modelled by experiments on finite automata, where measurement of one observable of the automaton destroys the possibility of measuring another observable of the same automaton, and vice versa. Besides undecidability, a great part of the book is dedicated to a formal definition of randomness and entropy measures based on algorithmic information theory.