Data Quality Requirements Analysis and Modeling (Classic Reprint)


Book Description

Excerpt from Data Quality Requirements Analysis and Modeling:

Just like the well-publicized concept of zero-defect products in the manufacturing literature, [...] high for the vital few projects, whereas the trivial many defects are not worth tackling because the cure costs more than the disease (Juran & Gryna). In sum, when the cost is prohibitively high, it is not feasible to attain zero-defect data. Given that zero-defect data may be neither necessary nor attainable, it would be useful to be able to judge the quality of data. This suggests that we tag data with quality indicators, which are characteristics of the data and of its manufacturing process. From these quality indicators, the user can make a judgment about the quality of the data for the specific application at hand. In making a financial decision to purchase stocks, for example, it would be useful to know the quality of the data through quality indicators such as who originated the data, when the data was collected, and how the data was collected.

The excerpt cites Computerworld, September 28, 1992, pp. 80-84, and The Wall Street Journal, May 26, 1992, p. B6.

About the Publisher: Forgotten Books publishes hundreds of thousands of rare and classic books. Find more at www.forgottenbooks.com. This book is a reproduction of an important historical work. Forgotten Books uses state-of-the-art technology to digitally reconstruct the work, preserving the original format while repairing imperfections present in the aged copy. In rare cases, an imperfection in the original, such as a blemish or missing page, may be replicated in our edition. We do, however, repair the vast majority of imperfections successfully; any imperfections that remain are intentionally left to preserve the state of such historical works.
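To make the excerpt's quality-indicator idea concrete, here is a minimal sketch in Python, assuming illustrative names and a freshness rule that are not from the paper: each datum carries tags describing who originated it, when it was collected, and how, and the consumer judges fitness for use from those tags.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch of the excerpt's "quality indicator" idea: each datum
# carries metadata about its origin and manufacturing process, and the
# consumer judges fitness for the application at hand from those indicators.

@dataclass
class TaggedDatum:
    value: float
    source: str               # who originated the data
    collected_at: datetime    # when the data was collected
    method: str               # how the data was collected

def fresh_enough(datum: TaggedDatum, max_age_days: int = 1) -> bool:
    """Application-specific judgment: a stock-purchase decision might
    require data collected within the last day."""
    age = datetime.now(timezone.utc) - datum.collected_at
    return age.days < max_age_days

quote = TaggedDatum(
    value=101.25,
    source="exchange feed",
    collected_at=datetime(2024, 5, 1, 14, 30, tzinfo=timezone.utc),
    method="automated tick capture",
)
print(fresh_enough(quote))  # False once the quote is stale: re-source it
```

The same tags could support other judgments, such as distrusting manually keyed data for automated trading; the point is that the judgment belongs to the data consumer, not the producer.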







Data Quality


Book Description

Data Quality provides an exposition of research and practice in the data quality field for technically oriented readers. It is based on research conducted at the MIT Total Data Quality Management (TDQM) program and on work from other leading research institutions. The book is intended primarily for researchers, practitioners, educators, and graduate students in Computer Science, Information Technology, and related interdisciplinary areas. It forms a theoretical foundation that is both rigorous and relevant for dealing with advanced issues in data quality. Written to provide an overview of the cumulative research results of the MIT TDQM program as they relate to database research, this book is an excellent introduction for Ph.D. students who wish to pursue research in the data quality area. It is also an excellent theoretical introduction for IT professionals who wish to gain insight into the results of technically oriented data quality research and to apply some of the key concepts to their practice.




The Practitioner's Guide to Data Quality Improvement


Book Description

The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers.

- Offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology
- Shows how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics
- Includes an in-depth look at the use of data quality tools, including business case templates and tools for analysis, reporting, and strategic planning




Data Quality


Book Description

Poor data quality can seriously hinder or damage the efficiency and effectiveness of organizations and businesses. The growing awareness of such repercussions has led to major public initiatives like the "Data Quality Act" in the USA and the European Parliament's Directive 2003/98. Batini and Scannapieco present a comprehensive and systematic introduction to the wide set of issues related to data quality. They start with a detailed description of different data quality dimensions, like accuracy, completeness, and consistency, and their importance in different types of data, like federated data, web data, or time-dependent data, and in different data categories classified according to frequency of change, like stable, long-term, and frequently changing data. The book's extensive description of techniques and methodologies from core data quality research as well as from related fields like data mining, probability theory, statistical data analysis, and machine learning gives an excellent overview of the current state of the art. The presentation is completed by a short description and critical comparison of tools and practical methodologies, which will help readers resolve their own quality problems. This book is an ideal combination of sound theoretical foundations and applicable practical approaches. It is ideally suited for everyone interested in a comprehensive overview of data quality issues, whether researchers, students, or professionals. In addition, it will serve as the basis for an introductory course or for self-study on this topic.
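As a small illustration of the dimension-level scoring the authors describe, here is a sketch in Python; the field names, the reference year, and the consistency rule are assumptions for the example, not definitions from the book.

```python
# Score two of the classic dimensions over a handful of records:
# completeness (are values present?) and consistency (do related
# values agree with each other?).

records = [
    {"name": "Ada", "age": 36,   "birth_year": 1988},
    {"name": "Bo",  "age": None, "birth_year": 1990},   # incomplete
    {"name": "Cy",  "age": 20,   "birth_year": 1950},   # inconsistent
]

def completeness(recs, field):
    """Fraction of records with a non-null value for `field`."""
    return sum(r[field] is not None for r in recs) / len(recs)

def consistency(recs, year=2024):
    """Fraction of checkable records whose age agrees with birth_year."""
    checkable = [r for r in recs if r["age"] is not None]
    agreeing = [r for r in checkable
                if abs((year - r["birth_year"]) - r["age"]) <= 1]
    return len(agreeing) / len(checkable) if checkable else 1.0

print(f"completeness(age) = {completeness(records, 'age'):.2f}")  # 0.67
print(f"consistency       = {consistency(records):.2f}")          # 0.50
```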




Metrics and Models for Evaluating the Quality and Effectiveness of ERP Software


Book Description

Enterprise resource planning (ERP) is a class of integrated software that uses software technologies to implement real-time management of business processes in an organization. ERP systems normally cut across the entire organization, which makes them large and complex. Software researchers have long established that complexity affects software quality negatively, and that it must therefore be controlled with novel metrics and models of evaluation that can determine when the software is at an acceptable level of quality and when it is not. Metrics and Models for Evaluating the Quality and Effectiveness of ERP Software is a critical scholarly publication that examines ERP development, performance, and challenges in business settings to help organizations that have embraced ERPs improve their decision making, the efficiency and effectiveness of their activities, and their return on investment (ROI). Highlighting a wide range of topics such as data mining, higher education, and security, this book is essential for software developers, researchers, academicians, and security professionals.
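To give a flavor of the complexity metrics the description alludes to, here is a minimal sketch in Python of a crude cyclomatic-complexity estimate (one plus the number of decision points); it illustrates the general technique and is not one of the book's models.

```python
import ast

# Crude cyclomatic-complexity estimate for Python source: one plus the
# number of decision points found in the abstract syntax tree.
DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

# A made-up ERP-style posting routine (parsed, never executed).
sample = """
def post_invoice(inv):
    if inv.total <= 0:
        raise ValueError("invoice total must be positive")
    for line in inv.lines:
        if line.qty and line.price:
            ledger.post(line)
"""
print(cyclomatic_complexity(sample))  # 5: two ifs, one loop, one `and`
```

Modules scoring above a chosen threshold would be flagged for review, which is the "controlled with metrics" idea in miniature.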




Data Quality and High-dimensional Data Analysis


Book Description

Poor data quality is known to compromise the credibility and efficiency of commercial and public endeavours. Moreover, the importance of managing data quality has increased manifold as the diversity of sources and formats and the volume of data have grown. This volume examines data quality in the light of collaborative information systems, where data creation and ownership are increasingly difficult to establish.




Data Engineering


Book Description

DATA ENGINEERING: Mining, Information, and Intelligence describes applied research aimed at collecting data and distilling useful information from it. Most of the work presented emanates from research completed through collaborations between Acxiom Corporation and its academic research partners under the aegis of the Acxiom Laboratory for Applied Research (ALAR). Chapters are roughly ordered to follow the logical sequence of the transformation of data from raw input streams to refined information. Four discrete sections cover Data Integration and Information Quality; Grid Computing; Data Mining; and Visualization. Additionally, there are exercises at the end of each chapter. The primary audience for this book is anyone interested in data engineering, whether in academia, market research firms, or business-intelligence companies. The volume is ideally suited for researchers, practitioners, and postgraduate students alike. With its focus on problems arising from industry rather than from a basic research perspective, combined with its intelligent organization, extensive references, and subject and author indices, it can serve academic, research, and industrial audiences.




Measuring Data Quality for Ongoing Improvement


Book Description

The Data Quality Assessment Framework (DQAF) shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and report effectively on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Also included are common conceptual models for defining and storing data quality results for trend analysis, along with generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make the measurements meaningful, help you understand trends, and detect anomalies.

- Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
- Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
- Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
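To illustrate the ongoing-measurement idea in code, here is a minimal sketch in Python; the field name, batch shape, and z-score threshold are assumptions for the example rather than DQAF specifics. The same measurement is taken on every batch, results accumulate as history, and a batch that deviates sharply from the established trend is flagged as an anomaly.

```python
import statistics

history: list[float] = []  # one completeness score per measured batch

def completeness(batch, field):
    """Fraction of rows in the batch with a non-null value for `field`."""
    return sum(row.get(field) is not None for row in batch) / len(batch)

def measure_and_check(batch, field, z_threshold=3.0):
    """Take the measurement, compare it to the trend, then store it."""
    score = completeness(batch, field)
    if len(history) >= 5:  # need some history before judging a trend
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1e-9  # avoid divide-by-zero
        if abs(score - mean) / stdev > z_threshold:
            print(f"anomaly: completeness({field}) = {score:.2f}, "
                  f"historical mean = {mean:.2f}")
    history.append(score)
    return score

# Ten normal daily batches: completeness holds steady at 0.90.
for day in range(10):
    batch = [{"id": i, "email": "x" if i % 10 else None} for i in range(100)]
    measure_and_check(batch, "email")

# A degraded batch is flagged against the established trend.
measure_and_check([{"id": i, "email": None} for i in range(100)], "email")
```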