Church's Thesis After 70 Years


Book Description

Church's Thesis (CT) was first published by Alonzo Church in 1935. CT is a proposition that identifies two notions: the intuitive notion of an effectively computable function defined on the natural numbers and the precise notion of a recursive function. Despite the many efforts of prominent scientists, Church's Thesis has never been falsified, and a vast literature concerning the thesis has grown up around it. The aim of the book is to provide a one-volume summary of the state of research on Church's Thesis. The topics covered include: different formulations of CT, CT and intuitionism, CT and intensional mathematics, CT and physics, the epistemic status of CT, CT and the philosophy of mind, the provability of CT, and CT and functional programming.
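For orientation, the identification described above is often rendered schematically as follows (a paraphrase in standard notation, not a quotation from the book):

\[
  f\colon \mathbb{N}^{k} \to \mathbb{N} \ \text{is effectively computable} \iff f \ \text{is (general) recursive},
\]

where "effectively computable" is the informal, pre-theoretic notion and "recursive" (equivalently, Turing-computable or lambda-definable) is the precise mathematical one.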




Ways of Proof Theory


Book Description

On the occasion of the retirement of Wolfram Pohlers, the Institut für Mathematische Logik und Grundlagenforschung of the University of Münster organized a colloquium and a workshop which took place on July 17–19, 2008. This event brought together proof theorists from many parts of the world who have acted as teachers, students and collaborators of Wolfram Pohlers and who have shaped the field of proof theory over the years. The present volume collects papers by the speakers of the colloquium and workshop; together they document the state of the art of contemporary proof theory.




Computing and Philosophy


Book Description

This volume offers a selection of papers from the 2014 conference of the “International Association for Computing and Philosophy” (IACAP), a conference tradition of 28 years. The common theme of the papers is the two-way relation between computing technologies and philosophical questions: computing technologies both raise new philosophical questions and shed light on traditional philosophical problems. The chapters cover: 1) philosophy of computing, 2) philosophy of computer science & discovery, 3) philosophy of cognition & intelligence, 4) computing & society, and 5) ethics of computation.




Computability


Book Description

Computer scientists, mathematicians, and philosophers discuss the conceptual foundations of the notion of computability as well as recent theoretical developments. In the 1930s a series of seminal works published by Alan Turing, Kurt Gödel, Alonzo Church, and others established the theoretical basis for computability. This work, advancing precise characterizations of effective, algorithmic computability, was the culmination of intensive investigations into the foundations of mathematics. In the decades since, the theory of computability has moved to the center of discussions in philosophy, computer science, and cognitive science. In this volume, distinguished computer scientists, mathematicians, logicians, and philosophers consider the conceptual foundations of computability in light of our modern understanding. Some chapters focus on the pioneering work by Turing, Gödel, and Church, including the Church-Turing thesis and Gödel's response to Church's and Turing's proposals. Other chapters cover more recent technical developments, including computability over the reals, Gödel's influence on mathematical logic and on recursion theory, and the impact of work by Turing and Emil Post on our theoretical understanding of online and interactive computing; still others relate computability and complexity to issues in the philosophy of mind, the philosophy of science, and the philosophy of mathematics.

Contributors: Scott Aaronson, Dorit Aharonov, B. Jack Copeland, Martin Davis, Solomon Feferman, Saul Kripke, Carl J. Posy, Hilary Putnam, Oron Shagrir, Stewart Shapiro, Wilfried Sieg, Robert I. Soare, Umesh V. Vazirani




The Foundations of Computability Theory


Book Description

This book offers an original and informative view of the development of the fundamental concepts of computability theory. The treatment is put into historical context, emphasizing the motivation for ideas as well as their logical and formal development. In Part I the author introduces computability theory, with chapters on the foundational crisis of mathematics in the early twentieth century and on formalism. In Part II he explains classical computability theory, with chapters on the quest for formalization, the Turing machine, and early successes such as defining incomputable problems, c.e. (computably enumerable) sets, and developing methods for proving incomputability. In Part III he explains relative computability, with chapters on computation with external help, degrees of unsolvability, the Turing hierarchy of unsolvability, the class of degrees of unsolvability, c.e. degrees and the priority method, and the arithmetical hierarchy. Finally, in the new Part IV the author revisits the computability (Church-Turing) thesis in greater detail: he offers a systematic and detailed account of its origins, evolution, and meaning, describes more powerful, modern versions of the thesis, and discusses recent speculative proposals for new computing paradigms such as hypercomputing. This is a gentle introduction from the origins of computability theory up to current research, and it will be of value as a textbook and guide for advanced undergraduate and graduate students and researchers in the domains of computability theory and theoretical computer science. This new edition is completely revised, with almost one hundred pages of new material. In particular, the author has applied more up-to-date and consistent terminology, addressed some notational redundancies and minor errors, developed a glossary relating to computability theory, expanded the bibliographic references with new entries, and added the new part described above as well as other new sections.
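As a flavour of the "methods for proving incomputability" mentioned above, here is a minimal sketch (in Python, not taken from the book) of the classical diagonalization argument for the halting problem; the function halts below is a hypothetical placeholder, and the point of the argument is precisely that no correct, total implementation of it can exist:

    def halts(func, arg):
        """Hypothetical decider: return True iff func(arg) eventually halts."""
        raise NotImplementedError("the halting problem: no such total decider exists")

    def paradox(func):
        """Diagonal construction: do the opposite of what halts() predicts
        about running func on itself."""
        if halts(func, func):
            while True:      # predicted to halt -> loop forever
                pass
        else:
            return           # predicted to loop -> halt immediately

    # Applying the construction to itself, paradox(paradox), contradicts any
    # answer halts(paradox, paradox) could give, so no such decider can exist.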




Introduction to Mathematical Logic


Book Description

This new edition of the classic textbook, Introduction to Mathematical Logic, Sixth Edition, explores the principal topics of mathematical logic. It covers propositional logic, first-order logic, first-order number theory, axiomatic set theory, and the theory of computability. The text also discusses the major results of Gödel, Church, Kleene, and Rosser.




SOFSEM 2012: Theory and Practice of Computer Science


Book Description

This book constitutes the refereed proceedings of the 38th Conference on Current Trends in Theory and Practice of Computer Science, SOFSEM 2012, held in Špindlerův Mlýn, Czech Republic, in January 2012. The 43 revised papers presented in this volume were carefully reviewed and selected from 121 submissions. The book also contains 11 invited talks, 10 of which are of full-paper length. The contributions are organized in topical sections named: foundations of computer science; software and Web engineering; cryptography, security, and verification; and artificial intelligence.




Gödel's Disjunction


Book Description

In 1951 the logician Kurt Gödel established a disjunctive thesis about the scope and limits of mathematical knowledge: either the mathematical mind is not equivalent to a Turing machine (i.e., a computer), or there are absolutely undecidable mathematical problems. In the second half of the twentieth century, attempts were made to arrive at a stronger conclusion. In particular, arguments were produced by the philosopher J.R. Lucas and by the physicist and mathematician Roger Penrose that were intended to show that the mathematical mind is more powerful than any computer. These arguments, and the counterarguments to them, have not convinced the logical and philosophical community, largely because of an insufficiency of rigour in the debate. The contributions in this volume, written by world-leading experts in the field, move the debate forward by formulating rigorous frameworks and by formally spelling out and evaluating arguments that bear on Gödel's disjunction within these frameworks.
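Stated schematically (a paraphrase for orientation, not a formula taken from the volume), the disjunction has the form

\[
  \neg\,(\text{the mathematical mind is equivalent to some Turing machine}) \;\lor\; \exists\,\text{absolutely undecidable mathematical propositions},
\]

read as an inclusive "or": Gödel's incompleteness results support the disjunction as a whole but do not, by themselves, settle which disjunct holds.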




Language and the Rise of the Algorithm


Book Description

A wide-ranging history of the algorithm. Bringing together the histories of mathematics, computer science, and linguistic thought, Language and the Rise of the Algorithm reveals how recent developments in artificial intelligence are reopening an issue that troubled mathematicians well before the computer age: How do you draw the line between computational rules and the complexities of making systems comprehensible to people? By attending to this question, we come to see that the modern idea of the algorithm is implicated in a long history of attempts to maintain a disciplinary boundary separating technical knowledge from the languages people speak day to day. Here Jeffrey M. Binder offers a compelling tour of four visions of universal computation that addressed this issue in very different ways: G. W. Leibniz’s calculus ratiocinator; a universal algebra scheme Nicolas de Condorcet designed during the French Revolution; George Boole’s nineteenth-century logic system; and the early programming language ALGOL, short for algorithmic language. These episodes show that symbolic computation has repeatedly become entangled in debates about the nature of communication. Machine learning, in its increasing dependence on words, erodes the line between technical and everyday language, revealing the urgent stakes underlying this boundary. The idea of the algorithm is a levee holding back the social complexity of language, and it is about to break. This book is about the flood that inspired its construction.




Emergent Computation


Book Description

This book is dedicated to Professor Selim G. Akl, honouring his groundbreaking research achievements in computer science over four decades. The book is an intellectually stimulating excursion into emergent computing paradigms, architectures and implementations. World-leading experts in computer science, engineering and mathematics survey exciting and intriguing topics: algorithms for generating musical rhythms; the computational power of random walks; dispelling a myth of computational universality; computability and complexity at the microscopic level of synchronous computation; the descriptional complexity of error detection; quantum cryptography; context-free parallel communicating grammar systems; fault tolerance of hypercubes; a finite-automata theory of bulk-synchronous parallel computing; dealing with silent data corruptions in high-performance computing; parallel sorting on graphics processing units; mining for functional dependencies in relational databases; cellular-automata optimisation of wireless sensor networks; connectivity-preserving network transformers; constrained resource networks; vague computing; parallel evolutionary optimisation; emergent behaviour in multi-agent systems; vehicular clouds; epigenetic drug discovery; dimensionality reduction for intrusion detection systems; physical maze solvers; computer chess; parallel algorithms for string alignment; and the detection of community structure. The book is a unique combination of vibrant essays that will inspire scientists and engineers to exploit natural phenomena in designing the computing architectures of the future.