The Stanford GraphBase


Book Description

The Stanford GraphBase: A Platform for Combinatorial Computing represents the first efforts of Donald E. Knuth's preparation for Volume Four of The Art of Computer Programming. The book's first goal is to use examples to demonstrate the art of literate programming. Each example provides a programmatic essay that can be read and enjoyed as readily as it can be interpreted by machines. In these essays/programs, Knuth makes new contributions to several important algorithms and data structures, so the programs are of special interest for their content as well as for their style.

The book's second goal is to provide a useful means for comparing combinatorial algorithms and for evaluating methods of combinatorial computing. To this end, Knuth's programs offer standard, freely available sets of data - the Stanford GraphBase - that may be used as benchmarks to test competing methods. The data sets are both interesting in themselves and applicable to a wide variety of problem domains. With objective tests, Knuth hopes to bridge the gap between theoretical computer scientists and programmers who have real problems to solve.

As with all of Knuth's writings, this book is appreciated not only for the author's unmatched insight, but also for the fun and the challenge of his work. He illustrates many of the most significant and most beautiful combinatorial algorithms that are presently known and provides sample programs that can lead to hours of amusement. In showing how the Stanford GraphBase can generate an almost inexhaustible supply of challenging problems, some of which may lead to the discovery of new and improved algorithms, Knuth proposes friendly competitions. His own initial entries into such competitions are included in the book, and readers are challenged to do better.

Features:
• Includes new contributions to our understanding of important algorithms and data structures
• Provides a standard tool for evaluating combinatorial algorithms
• Demonstrates a more readable, more practical style of programming
• Challenges readers to surpass his own efficient algorithms
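
To make the benchmark idea concrete, here is a minimal C++ sketch of how one of the GraphBase data sets might be pulled into a program. It assumes the SGB sources are installed and built into a library (e.g. libgb), that the C headers gb_graph.h and gb_miles.h are on the include path, and that the all-zero arguments to miles() select the library's defaults (all 128 cities, no degree or distance restriction); treat those details as assumptions about the installed version, not a definitive recipe.

    // Hedged sketch: list each city in the SGB highway-mileage graph
    // together with the number of roads recorded for it.
    // Build roughly as: g++ sgb_miles_demo.cpp -lgb
    #include <cstdio>

    extern "C" {              // the GraphBase headers are plain C
    #include "gb_graph.h"     // Graph, Vertex, Arc types; gb_recycle()
    #include "gb_miles.h"     // miles(): the 128-city mileage data
    }

    int main() {
      // miles(n, north_weight, west_weight, pop_weight,
      //       max_degree, max_distance, seed); zeros = defaults (assumed).
      Graph *g = miles(0, 0, 0, 0, 0, 0, 0);
      if (!g) {
        std::fprintf(stderr, "miles() could not build the graph\n");
        return 1;
      }
      for (long i = 0; i < g->n; i++) {
        Vertex *v = g->vertices + i;                      // vertices live in one array
        long degree = 0;
        for (Arc *a = v->arcs; a; a = a->next) degree++;  // walk the arc list
        std::printf("%s: %ld roads\n", v->name, degree);
      }
      gb_recycle(g);                                      // release the graph's memory
      return 0;
    }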




The Boost Graph Library


Book Description

The Boost Graph Library (BGL) is the first C++ library to apply the principles of generic programming to the construction of the advanced data structures and algorithms used in graph computations. Problems in such diverse areas as Internet packet routing, molecular biology, scientific computing, and telephone network design can be solved by using graph theory. This book presents an in-depth description of the BGL and provides working examples designed to illustrate the application of the BGL to these real-world problems. Written by the BGL developers, The Boost Graph Library: User Guide and Reference Manual gives you all the information you need to take advantage of this powerful new library.

Part I is a complete user guide that begins by introducing graph concepts, terminology, and generic graph algorithms. The guide then takes the reader on a tour through the major features of the BGL, all motivated by example problems. Part II is a comprehensive reference manual that provides complete documentation of all BGL concepts, algorithms, and classes.

Readers will find coverage of:
• Graph terminology and concepts
• Generic programming techniques in C++
• Shortest-path algorithms for Internet routing
• Network planning problems using minimum-spanning-tree algorithms
• BGL algorithms with implicitly defined graphs
• BGL interfaces to other graph libraries
• BGL concepts and algorithms
• BGL classes: graph, auxiliary, and adaptor

Groundbreaking in its scope, this book offers the key to unlocking the power of the BGL for the C++ programmer looking to extend the reach of generic programming beyond the Standard Template Library.
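
As a small illustration of the generic interface the book documents, the following sketch builds a toy weighted digraph and runs the BGL's Dijkstra shortest-path algorithm over it. The adjacency_list graph type and dijkstra_shortest_paths() call follow the library's documented interfaces; the five-node graph and its edge costs are made up purely for illustration.

    // Minimal BGL example: shortest paths from vertex 0 in a small
    // weighted digraph, using adjacency_list and the generic
    // dijkstra_shortest_paths() algorithm.
    #include <boost/graph/adjacency_list.hpp>
    #include <boost/graph/dijkstra_shortest_paths.hpp>
    #include <iostream>
    #include <utility>
    #include <vector>

    int main() {
      using namespace boost;
      typedef adjacency_list<vecS, vecS, directedS,
                             no_property,
                             property<edge_weight_t, int> > Graph;
      typedef graph_traits<Graph>::vertex_descriptor Vertex;
      typedef std::pair<int, int> Edge;

      // A made-up five-node "network" with link costs.
      Edge edge_array[] = { Edge(0,1), Edge(0,3), Edge(1,2),
                            Edge(1,3), Edge(2,4), Edge(3,4) };
      int weights[] = { 7, 2, 3, 1, 5, 4 };
      Graph g(edge_array, edge_array + 6, weights, 5);

      std::vector<Vertex> parent(num_vertices(g));
      std::vector<int> dist(num_vertices(g));
      Vertex src = vertex(0, g);

      // Named-parameter interface: record the shortest-path tree and distances.
      dijkstra_shortest_paths(g, src,
          predecessor_map(&parent[0]).distance_map(&dist[0]));

      for (std::size_t i = 0; i < num_vertices(g); ++i)
        std::cout << "distance(0 -> " << i << ") = " << dist[i] << '\n';
      return 0;
    }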




Literate Programming


Book Description

Literate programming is a programming methodology that combines a programming language with a documentation language, making programs more easily maintained than programs written only in a high-level language. A literate programmer is an essayist who writes programs for humans to understand. When programs are written in the recommended style they can be transformed into documents by a document compiler and into efficient code by an algebraic compiler. This anthology of essays includes Knuth's early papers on related topics such as structured programming as well as the Computer Journal article that launched literate programming. Many examples are given, including excerpts from the programs for TeX and METAFONT. The final essay is an example of CWEB, a system for literate programming in C and related languages. Index included.
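
For readers who have not seen the style, here is a short, hypothetical CWEB fragment whose code parts are C++. Running cweave on such a file produces a typeset TeX document, while ctangle extracts the compilable program. The example is only an illustrative sketch and is not drawn from the book.

    @* Greatest common divisors.  A tiny literate program: the prose and
    the code live in the same source file, organized into named sections.

    @c
    #include <cstdio>
    @<Define the |gcd| routine@>@;
    int main()
    {
      std::printf("gcd(12,18) = %d\n", gcd(12, 18));
      return 0;
    }

    @ Euclid's algorithm replaces the pair $(m,n)$ by $(n,m\bmod n)$
    until the second component vanishes.

    @<Define the |gcd| routine@>=
    static int gcd(int m, int n)
    {
      while (n != 0) {
        int r = m % n;
        m = n;
        n = r;
      }
      return m;
    }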




The Algorithm Design Manual


Book Description

This newly expanded and updated second edition of the best-selling classic continues to take the "mystery" out of designing algorithms, and analyzing their efficacy and efficiency. Expanding on the first edition, the book now serves as the primary textbook of choice for algorithm design courses while maintaining its status as the premier practical reference guide to algorithms for programmers, researchers, and students.

The reader-friendly Algorithm Design Manual provides straightforward access to combinatorial algorithms technology, stressing design over analysis. The first part, Techniques, provides accessible instruction on methods for designing and analyzing computer algorithms. The second part, Resources, is intended for browsing and reference, and comprises the catalog of algorithmic resources, implementations and an extensive bibliography.

NEW to the second edition:
• Doubles the tutorial material and exercises over the first edition
• Provides full online support for lecturers, and a completely updated and improved website component with lecture slides, audio and video
• Contains a unique catalog identifying the 75 algorithmic problems that arise most often in practice, leading the reader down the right path to solve them
• Includes several NEW "war stories" relating experiences from real-world applications
• Provides up-to-date links leading to the very best algorithm implementations available in C, C++, and Java







MMIXware


Book Description

MMIX is a RISC computer designed by Don Knuth to illustrate machine-level aspects of programming. In the author's book series "The Art of Computer Programming", MMIX replaces the 1960s-style machine MIX. A particular goal in the design of MMIX was to keep its machine language simple, elegant, and easy to learn. At the same time, all of the complexities needed to achieve high performance in practice are taken into account. This book constitutes a collection of programs written in CWEB that make MMIX a virtual reality. Among other utilities, it provides an assembler that converts MMIX symbolic files into MMIX object files, and two simulators that execute the programs in given object files. The latest version of all programs can be downloaded from MMIX's home page. The book provides complete documentation of the MMIX computer and its assembly language. It also presents mini-indexes, which make the programs much easier to understand. A corrected reprint of the book was published in August 2014, replacing the 1999 version.




Logic Programming


Book Description

ICLP (13-16 June 1995, Tokyo, Japan), which is sponsored by the Association for Logic Programming, is one of two major annual international conferences reporting recent research results in logic programming. Logic programming originates from the discovery that a subset of predicate logic could be given a procedural interpretation, which was first embodied in the programming language Prolog. The unique features of logic programming make it appealing for numerous applications in artificial intelligence, computer-aided design and verification, databases, and operations research, and for exploring parallel and concurrent computing. The last two decades have witnessed substantial developments in this field, from its foundation to implementation, applications, and the exploration of new language designs.

Topics covered: Theoretical Foundations. Higher-Order Logics. Non-Monotonic Reasoning. Programming Methodology. Programming Environments. Extensions to Logic Programming. Constraint Satisfaction. Meta-Programming. Language Design and Constructs. Implementation of Logic Programming Languages. Compilation Techniques. Architectures. Parallelism. Reasoning about Programs. Deductive Databases. Applications.

Logic Programming series, Research Reports and Notes.




Advances in Network Clustering and Blockmodeling


Book Description

Provides an overview of the developments and advances in the field of network clustering and blockmodeling over the last 10 years.

This book offers an integrated treatment of network clustering and blockmodeling, covering all of the newest approaches and methods that have been developed over the last decade. Presented in a comprehensive manner, it offers the foundations for understanding network structures and processes, and features a wide variety of new techniques addressing issues that occur during the partitioning of networks across multiple disciplines, such as community detection, blockmodeling of valued networks, role assignment, and stochastic blockmodeling.

Written by a team of international experts in the field, Advances in Network Clustering and Blockmodeling offers a plethora of diverse perspectives covering topics such as: bibliometric analyses of the network clustering literature; clustering approaches to networks; label propagation for clustering; and treating missing network data before partitioning. It also examines the partitioning of signed networks, multimode networks, and linked networks. A chapter on structured networks and coarse-grained descriptions is presented, along with another on scientific coauthorship networks. The book finishes with a section covering conclusions and directions for future work. In addition, the editors provide numerous tables, figures, case studies, examples, datasets, and more.

• Offers a clear and insightful look at the state of the art in network clustering and blockmodeling
• Provides an excellent mix of mathematical rigor and practical application in a comprehensive manner
• Presents a suite of new methods, procedures, and algorithms for partitioning networks, as well as new techniques for visualizing matrix arrays
• Features numerous examples throughout, enabling readers to gain a better understanding of research methods and to conduct their own research effectively
• Written by leading contributors in the field of spatial networks analysis

Advances in Network Clustering and Blockmodeling is an ideal book for graduate and undergraduate students taking courses on network analysis or working with networks using real data. It will also benefit researchers and practitioners interested in network analysis.




Graph Data Management


Book Description

This book presents a comprehensive overview of fundamental issues and recent advances in graph data management. Its aim is to provide beginning researchers in the area of graph data management, or in fields that require graph data management, an overview of the latest developments in this area, both in applied and in fundamental subdomains. The topics covered range from a general introduction to graph data management, to more specialized topics like graph visualization, flexible queries of graph data, parallel processing, and benchmarking. The book will help researchers put their work in perspective and show them which types of tools, techniques and technologies are available, which ones could best suit their needs, and where there are still open issues and future research directions. The chapters are contributed by leading experts in the relevant areas, presenting a coherent overview of the state of the art in the field. Readers should have a basic knowledge of data management techniques as they are taught in computer science MSc programs.