On Monotonicity Testing and the 2-to-2 Games Conjecture


Book Description

This book discusses two questions in Complexity Theory: the Monotonicity Testing problem and the 2-to-2 Games Conjecture. Monotonicity testing is a problem from the field of property testing, first considered by Goldreich et al. in 2000. The input of the algorithm is a function, and the goal is to design a tester that makes as few queries to the function as possible, accepts monotone functions, and rejects functions that are far from monotone with probability close to 1. The first result of this book is an essentially optimal algorithm for this problem. The analysis of the algorithm relies heavily on a novel, directed, and robust analogue of a Boolean isoperimetric inequality of Talagrand from 1993. The probabilistically checkable proofs (PCP) theorem is one of the cornerstones of modern theoretical computer science. One area in which PCPs are essential is hardness of approximation, where the goal is to prove that some optimization problems are hard to solve, even approximately. Many hardness of approximation results were proved using the PCP theorem; however, for some problems optimal results have not been obtained. This book touches on some of these problems, in particular the 2-to-2 games problem and the vertex cover problem. The second result of this book is a proof of the 2-to-2 games conjecture (with imperfect completeness), which implies new hardness of approximation results for problems such as vertex cover and independent set. It also serves as strong evidence towards the unique games conjecture, a notorious related open problem in theoretical computer science. At the core of the proof is a characterization of small sets of vertices in Grassmann graphs whose edge expansion is bounded away from 1.
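For readers unfamiliar with the setting, the following is a minimal sketch (not taken from the book) of the classical edge tester for monotonicity of a Boolean function on the hypercube {0,1}^n: sample a random edge, i.e., a point together with one of its 0-coordinates flipped to 1, and reject if the function value decreases along it. The query budget and the example functions are our own illustrative choices; the book's contribution is a far more query-efficient tester than this one.

import random

def edge_tester(f, n, num_queries=200):
    # Classical edge tester (illustrative, not the book's algorithm):
    # sample random hypercube edges (x, y), where y is x with one 0-bit
    # flipped to 1, and reject on any monotonicity violation f(x) > f(y).
    for _ in range(num_queries):
        x = [random.randint(0, 1) for _ in range(n)]
        zero_bits = [i for i in range(n) if x[i] == 0]
        if not zero_bits:          # x is the all-ones point: no upward edge
            continue
        i = random.choice(zero_bits)
        y = x[:]
        y[i] = 1                   # y dominates x coordinate-wise
        if f(x) > f(y):            # f(x) = 1 but f(y) = 0: not monotone
            return "reject"
    return "accept"                # monotone functions are always accepted

# Hypothetical example functions for the demonstration.
n = 10
majority = lambda x: int(sum(x) > n // 2)   # monotone: always accepted
anti_dictator = lambda x: 1 - x[0]          # far from monotone: usually rejected

print(edge_tester(majority, n))
print(edge_tester(anti_dictator, n))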




Prophets of Computing


Book Description

When electronic digital computers first appeared after World War II, they were seen as a revolutionary force. Business management, the world of work, administrative life, the nation state, and soon enough everyday life were expected to change dramatically with these machines’ use. Ever since, diverse prophecies of computing have continued to emerge, through to the present day. As computing spread beyond the US and UK, such prophecies emerged from strikingly different economic, political, and cultural conditions. This volume explores how these expectations differed, assesses unexpected commonalities, and suggests ways to understand the divergences and convergences. The book examines thirteen countries, based on source material in ten different languages, the effort of an international team of scholars. In addition to analyses of debates, political changes, and popular speculations, we also show a wide range of pictorial representations of "the future with computers."







Analysis of Boolean Functions


Book Description

This graduate-level text gives a thorough overview of the analysis of Boolean functions, beginning with the most basic definitions and proceeding to advanced topics.




Art Gallery Theorems and Algorithms


Book Description

Art gallery theorems and algorithms are so called because they relate to problems involving the visibility of geometrical shapes and their internal surfaces. This book explores generalizations and specializations in these areas. Among the presentations are recently discovered theorems on orthogonal polygons, polygons with holes, exterior visibility, visibility graphs, and visibility in three dimensions. The author formulates many open problems and offers several conjectures, providing arguments which may be followed by anyone familiar with basic graph theory and algorithms. This work may be applied to robotics and artificial intelligence as well as other fields, and will be especially useful to computer scientists working with computational and combinatorial geometry.




Paradigms of Combinatorial Optimization


Book Description

Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science, and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area, dealing with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts: on the complexity of combinatorial optimization problems, presenting basics about worst-case and randomized complexity; classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems, namely Branch-and-Bound and Dynamic Programming; and elements from mathematical programming, presenting fundamentals of the mathematical-programming-based methods that have been at the heart of Operations Research since the origins of this field.
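As a small illustration of one of the classical methods named above, here is a minimal dynamic programming sketch for the 0/1 knapsack problem; the choice of problem, the function name knapsack, and the sample instance are our own illustrative assumptions rather than material from the book.

def knapsack(values, weights, capacity):
    # Dynamic programming sketch: best[c] is the maximum value achievable
    # with total weight at most c, using the items processed so far.
    best = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Scan capacities downward so that each item is used at most once.
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

# Illustrative instance: the optimum picks the second and third items (value 220).
print(knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))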




The Probabilistic Method


Book Description

Praise for the Third Edition: “Researchers of any kind of extremal combinatorics or theoretical computer science will welcome the new edition of this book.” - MAA Reviews

Maintaining a standard of excellence that establishes The Probabilistic Method as the leading reference on probabilistic methods in combinatorics, the Fourth Edition continues to feature a clear writing style, illustrative examples, and illuminating exercises. The new edition includes numerous updates reflecting the most recent developments and advances in discrete mathematics and the connections to other areas of mathematics, theoretical computer science, and statistical physics. Emphasizing the methodology and techniques that enable problem-solving, the book begins with a description of tools applied in probabilistic arguments, including basic techniques that use expectation and variance as well as more advanced applications of martingales and correlation inequalities. The authors explore where probabilistic techniques have been applied successfully and also examine topics such as discrepancy and random graphs, circuit complexity, computational geometry, and derandomization of randomized algorithms.

Written by two well-known authorities in the field, the Fourth Edition features:
- Additional exercises throughout, with hints and solutions to select problems in an appendix, to help readers obtain a deeper understanding of the best methods and techniques;
- New coverage of topics such as the Local Lemma, the Six Standard Deviations result in Discrepancy Theory, Property B, and graph limits;
- Updated sections reflecting major developments on the newest topics, discussions of the hypergraph container method, and many new references and improved results.

The Probabilistic Method, Fourth Edition is an ideal textbook for upper-undergraduate and graduate-level students majoring in mathematics, computer science, operations research, and statistics. It is also an excellent reference for researchers and combinatorialists who use probabilistic methods, discrete mathematics, and number theory.

Noga Alon, PhD, is Baumritter Professor of Mathematics and Computer Science at Tel Aviv University. He is a member of the Israel National Academy of Sciences and Academia Europaea. A coeditor of the journal Random Structures and Algorithms, Dr. Alon is the recipient of the Polya Prize, the Gödel Prize, the Israel Prize, and the EMET Prize. Joel H. Spencer, PhD, is Professor of Mathematics and Computer Science at the Courant Institute of New York University. He is the cofounder and coeditor of the journal Random Structures and Algorithms and is a Sloan Foundation Fellow. Dr. Spencer has written more than 200 published articles and is the coauthor of Ramsey Theory, Second Edition, also published by Wiley.
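To give a flavor of the "basic techniques that use expectation" mentioned in the description, here is a minimal sketch (our own example, not drawn from the book) of the classical first-moment argument that every graph with m edges has a cut with at least m/2 edges: a uniformly random bipartition cuts each edge with probability 1/2, so the expected cut size is m/2, and some bipartition must meet or exceed that expectation.

import random

def random_cut_size(n, edges):
    # Assign each vertex to a random side; each edge crosses the cut with
    # probability 1/2, so the expected number of crossing edges is len(edges) / 2.
    side = [random.randint(0, 1) for _ in range(n)]
    return sum(1 for u, v in edges if side[u] != side[v])

# Illustrative instance: the complete graph K5, which has 10 edges.
n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
average = sum(random_cut_size(n, edges) for _ in range(10000)) / 10000
print(average)   # empirically close to 5.0 = len(edges) / 2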




Beyond the Worst-Case Analysis of Algorithms


Book Description

Introduces exciting new methods for assessing algorithms for problems ranging from clustering to linear programming to neural networks.




Mathematical Reviews


Book Description



