Document and Image Compression


Book Description

Although it's true that image compression research is a mature field, continued improvements in computing power and image representation tools keep the field spry. Faster processors enable previously intractable compression algorithms and schemes, and certainly the demand for highly portable high-quality images will not abate. Document and Image Compression highlights the current state of the field along with the most probable and promising future research directions for image coding. Organized into three broad sections, the book examines the currently available techniques, future directions, and techniques for specific classes of images. It begins with an introduction to multiresolution image representation, advanced coding and modeling techniques, and the basics of perceptual image coding. This leads to discussions of the JPEG 2000 and JPEG-LS standards, lossless coding, and fractal image compression. New directions are highlighted that involve image coding and representation paradigms beyond the wavelet-based framework, the use of redundant dictionaries, the distributed source coding paradigm, and novel data-hiding techniques. The book concludes with techniques developed for classes of images where the general-purpose algorithms fail, such as for binary images and shapes, compound documents, remote sensing images, medical images, and VLSI layout image data. Contributed by international experts, Document and Image Compression gathers the latest and most important developments in image coding into a single, convenient, and authoritative source.




Digital Image Compression Techniques


Book Description

In order to utilize digital images effectively, specific techniques are needed to reduce the number of bits required for their representation. This Tutorial Text provides the groundwork for understanding these image compression techniques and presents a number of different schemes that have proven useful. The algorithms discussed in this book are concerned mainly with the compression of still-frame, continuous-tone, monochrome and color images, but some of the techniques, such as arithmetic coding, have found widespread use in the compression of bilevel images. Both lossless (bit-preserving) and lossy techniques are considered. A detailed description of the compression algorithm proposed as the world standard (the JPEG baseline algorithm) is provided. The book contains approximately 30 pages of reconstructed and error images illustrating the effect of each compression technique on a consistent image set, thus allowing for a direct comparison of bit rates and reconstructed image quality. For each algorithm, issues such as quality vs. bit rate, implementation complexity, and susceptibility to channel errors are considered.







The Data Compression Book


Book Description

Described by Jeff Prosise of PC Magazine as "one of my favorite books on applied computer technology," this updated second edition brings you fully up-to-date on the latest developments in the data compression field. It thoroughly covers the various data compression techniques, including compression of binary programs, data, sound, and graphics. Each technique is illustrated with a completely functional C program that demonstrates how data compression works and how it can be readily incorporated into your own compression programs. The accompanying disk contains the code files that demonstrate the various techniques of data compression found in the book.




Compressed Image File Formats


Book Description

Since not all graphic formats are of equal complexity, author John Miano does not simply choose a number of file formats and devote a chapter to each one. Instead, he offers additional coverage for the more complex image file formats like PNG (a new standard) and JPEG, while providing all information necessary to use the simpler file formats. While including the well-documented BMP, XBM, and GIF formats for completeness, along with some of their less-covered features, this book gives the most space to the more intricate PNG and JPEG, from basic concepts to creating and reading actual files. Among its highlights, this book covers:

- JPEG Huffman coding, including decoding sequential-mode JPEG images and creating sequential JPEG files
- Optimizing the DCT
- Portable Network Graphics format (PNG), including decompressing PNG image data and creating PNG files
- Windows BMP, XBM, and GIF
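The DCT mentioned above is the mathematical core of baseline JPEG: each 8x8 block of pixels is transformed into frequency coefficients before quantization and Huffman coding. As a rough illustration (not the book's code, and deliberately unoptimized), a direct 2-D DCT-II over an 8x8 block can be sketched as:

```python
import math

def dct2_8x8(block):
    """Naive 2-D DCT-II on an 8x8 block, as used in baseline JPEG.

    block: 8x8 list of lists of pixel values.
    Returns an 8x8 list of lists of frequency coefficients;
    out[0][0] is the DC coefficient.
    """
    N = 8
    def c(k):  # normalization factor
        return math.sqrt(0.5) if k == 0 else 1.0
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = 0.25 * c(u) * c(v) * s
    return out
```

A uniform block produces a single nonzero DC coefficient, which is why smooth image regions compress so well; the "optimizing the DCT" chapter concerns replacing this O(N^4)-per-block loop with fast factorizations.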




Selected Papers on Design of Algorithms


Book Description

Donald Knuth's influence in computer science ranges from the invention of methods for translating and defining programming languages to the creation of the TEX and METAFONT systems for desktop publishing. His award-winning textbooks have become classics that are often given credit for shaping the field; his scientific papers are widely referenced and stand as milestones of development over a wide variety of topics. The present volume, which is the seventh in a series of his collected papers, is devoted to his work on the design of new algorithms. It covers methods for numerous discrete problems such as sorting, searching, data compression, optimization, theorem-proving, and cryptography, as well as methods for controlling errors in numerical computations and for Brownian motion. Nearly thirty of Knuth's classic papers on the subject are collected in this book, brought up to date with extensive revisions and notes on subsequent developments. Many of these algorithms have seen wide use--for example, Knuth's algorithm for optimum search trees, the Faller-Gallagher-Knuth algorithm for adaptive Huffman coding, the Knuth-Morris-Pratt algorithm for pattern matching, the Dijkstra-Knuth algorithm for optimum expressions, and the Knuth-Bendix algorithm for deducing the consequences of axioms. Others are pedagogically important, helping students to learn how to design new algorithms for new tasks. One or two are significant historically, as they show how things were done in computing's early days. All are found here, together with more than forty newly created illustrations.




Understanding Compression


Book Description

If you want to attract and retain users in the booming mobile services market, you need a quick-loading app that won’t churn through their data plans. The key is to compress multimedia and other data into smaller files, but finding the right method is tricky. This witty book helps you understand how data compression algorithms work—in theory and practice—so you can choose the best solution among all the available compression tools. With tables, diagrams, games, and as little math as possible, authors Colt McAnlis and Aleks Haecky neatly explain the fundamentals. Learn how compressed files are better, cheaper, and faster to distribute and consume, and how they’ll give you a competitive edge.

- Learn why compression has become crucial as data production continues to skyrocket
- Know your data, circumstances, and algorithm options when choosing compression tools
- Explore variable-length codes, statistical compression, arithmetic numerical coding, dictionary encodings, and context modeling
- Examine tradeoffs between file size and quality when choosing image compressors
- Learn ways to compress client- and server-generated data objects
- Meet the inventors and visionaries who created data compression algorithms
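The variable-length codes listed above are the foundation of statistical compression: frequent symbols get short bit strings, rare symbols get long ones. As a minimal sketch (not taken from the book), the classic Huffman construction can be written with a priority queue:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) from symbol frequencies."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: a lone symbol still needs one bit
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tie-breaker, tree). A tree is either a
    # leaf symbol or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent trees.
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):  # internal node
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:  # leaf symbol
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

For input `"aaaabbc"`, the frequent symbol `a` receives a one-bit code while `b` and `c` receive two-bit codes, and no code is a prefix of another, so the bit stream decodes unambiguously. The adaptive variant credited to Faller, Gallager, and Knuth (mentioned in the Knuth volume above) updates such a tree on the fly as symbols arrive.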







Geostatistics for the Next Century


Book Description

To honour the remarkable contribution of Michel David in the inception, establishment and development of Geostatistics, and to promote the essence of his work, an international Forum entitled Geostatistics for the Next Century was convened in Montreal in June 1993. In order to enhance communication and stimulate geostatistical innovation, research and development, the Forum brought together world-leading researchers and practitioners from five continents, who discussed and debated current problems, new technologies and futuristic ideas. This volume contains selected peer-reviewed papers from the Forum, together with comments by participants and replies by authors. Although it is difficult to capture the spontaneity and range of a debate, the comments and replies should further assist in the promotion of ideas, dialogue and criticism, and are consistent with the spirit of the Forum. The contents of this volume are organized following the Forum's thematic sessions. The role of the theme sessions was not only to stress important topics of today but, in addition, to emphasize common ground held among diverse areas of geostatistical work and the need to strengthen communication between these areas. For this reason, any given section of this book may include papers ranging from theory to applications, in mining, petroleum, environment, geohydrology, and image processing.