Measuring Data Quality for Ongoing Improvement


Book Description

The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Also included are common conceptual models for defining and storing data quality results for trend analysis, along with generic business requirements for ongoing measurement and monitoring, including the calculations and comparisons that make measurements meaningful and help you understand trends and detect anomalies.

- Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
- Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
- Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
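A minimal sketch of the kind of ongoing measurement the book describes: compute a column-level completeness percentage for each batch of records, then flag the result as anomalous when it drifts more than three standard deviations from its historical mean. The record layout, field name, and threshold here are illustrative assumptions, not the book's own notation.

    import statistics

    def completeness(records, field):
        """Percentage of records with a non-null value for `field`."""
        if not records:
            return 0.0
        filled = sum(1 for r in records if r.get(field) is not None)
        return 100.0 * filled / len(records)

    def is_anomalous(current, history, sigma=3.0):
        """Flag a score more than `sigma` standard deviations from the
        mean of prior scores (requires at least two prior scores)."""
        if len(history) < 2:
            return False
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        return stdev > 0 and abs(current - mean) > sigma * stdev

    # Illustrative batch: two of four records are missing `email`.
    batch = [
        {"id": 1, "email": "a@example.com"},
        {"id": 2, "email": None},
        {"id": 3, "email": "c@example.com"},
        {"id": 4, "email": None},
    ]
    history = [98.5, 99.1, 98.8, 99.0]    # completeness % of prior batches
    score = completeness(batch, "email")  # 50.0
    print(score, is_anomalous(score, history))  # 50.0 True

Storing each batch's score over time, as the book's conceptual models for measurement results suggest, is what makes the comparison against history possible.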




Data quality assurance. Module 3. Site assessment of data quality


Book Description

This publication is one module of a three-module toolkit that provides technical guidance and tools to support work on strengthening data quality in countries. It is part of the Division of Data, Analytics and Delivery for Impact's scope of work providing normative guidance for health information system strengthening.




Assessing the quality of agricultural market information systems: A self-assessment guide


Book Description

Over roughly the past 40 years, many developing countries have invested in establishing agricultural market information systems or services (MIS). These systems were initially run by government agencies, but since the turn of the millennium private organizations have shown interest in providing data on a commercial basis. To date, however, these private services, while usually more efficient than government-run ones, have also largely depended on donor support for their continued operation. It has proved difficult to develop a profitable business model, as many of the clients are small farmers and traders. MIS can cover staples, horticultural crops, livestock, and export commodities. They are generally designed to collect, process, and disseminate data of relevance to farmers, traders, and other buyers such as processors, but the data they generate can also be used for a variety of purposes by governments, donors, international organizations, and others.




Consolidated guidance on tuberculosis data generation and use. Module 1. Tuberculosis surveillance


Book Description

Since 1995, WHO has ensured a consistent approach to national, regional, and global TB surveillance by providing standardized definitions, forms, and registers for the recording and reporting of individual-level and aggregated data about people diagnosed with and treated for TB, which are used worldwide. This standardization has facilitated the regular reporting of TB data to WHO from 215 countries and areas in annual rounds of global TB data collection, with findings published in an annual WHO global TB report since 1997 and data made publicly available via the online WHO global TB database. The goal of this 2024 edition of WHO guidance on TB surveillance (following the last major update, published in 2013) is to ensure the continued worldwide standardization of TB surveillance, in the context of the WHO End TB Strategy, the latest WHO guidelines on TB screening, prevention, diagnosis, and treatment, and commitments made at the 2023 UN high-level meeting on TB, while also promoting the establishment or strengthening of digital, case-based TB surveillance that is integrated within the overall public health architecture. This 2024 edition provides a comprehensive and consolidated package, bringing together updated guidance and, within web annexes, closely related WHO products, tools, and documentation on TB surveillance. The package was informed by (and includes a summary of) lessons learned about TB surveillance during more than 100 national TB epidemiological reviews conducted since 2013.




Data Quality


Book Description

Poor data quality can seriously hinder or damage the efficiency and effectiveness of organizations and businesses. The growing awareness of such repercussions has led to major public initiatives like the Data Quality Act in the USA and the European Parliament's Directive 2003/98/EC. Batini and Scannapieco present a comprehensive and systematic introduction to the wide set of issues related to data quality. They start with a detailed description of different data quality dimensions, like accuracy, completeness, and consistency, and their importance in different types of data, like federated data, web data, or time-dependent data, and in different data categories classified according to frequency of change, like stable, long-term, and frequently changing data. The book's extensive description of techniques and methodologies from core data quality research, as well as from related fields like data mining, probability theory, statistical data analysis, and machine learning, gives an excellent overview of the current state of the art. The presentation is completed by a short description and critical comparison of tools and practical methodologies, which will help readers resolve their own quality problems. This book combines the soundness of theoretical foundations with the applicability of practical approaches. It is ideally suited for everyone (researchers, students, or professionals) interested in a comprehensive overview of data quality issues, and it will also serve as the basis for an introductory course or for self-study on this topic.
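As a small illustration of the dimension vocabulary the book formalizes, the sketch below checks two hypothetical rules against order records: a validity rule (a value must conform to its domain) and a consistency rule (values must satisfy a cross-field constraint). The record fields and rules are assumptions chosen for illustration, not examples from the book.

    from datetime import date

    def valid_quantity(record):
        """Validity: quantity must be a positive integer."""
        q = record.get("quantity")
        return isinstance(q, int) and q > 0

    def consistent_dates(record):
        """Consistency: an order cannot ship before it is placed."""
        ordered, shipped = record.get("ordered"), record.get("shipped")
        return ordered is None or shipped is None or shipped >= ordered

    records = [
        {"quantity": 3, "ordered": date(2024, 1, 5), "shipped": date(2024, 1, 7)},
        {"quantity": -1, "ordered": date(2024, 1, 9), "shipped": date(2024, 1, 2)},
    ]
    for i, r in enumerate(records):
        print(i, valid_quantity(r), consistent_dates(r))
    # 0 True True
    # 1 False False

Accuracy, by contrast, typically cannot be checked by a standalone rule; it requires comparison against a trusted reference source.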




Non-Invasive Data Governance


Book Description

Data governance programs focus on authority and accountability for the management of data as a valued organizational asset. Data governance should not be about command and control, yet at times it can become invasive or threatening to the work, people, and culture of an organization. Non-Invasive Data Governance™ focuses on formalizing existing accountability for the management of data and improving formal communications, protection, and quality efforts through effective stewarding of data resources. Non-Invasive Data Governance will provide you with a complete set of tools to help you deliver a successful data governance program. Learn how:

• Steward responsibilities can be identified, recognized, formalized, and engaged according to people's existing responsibilities, rather than assigned or handed to them as more work.
• Governance of information can be applied to existing policies, standard operating procedures, practices, and methodologies, rather than introduced or emphasized as new processes or methods.
• Governance of information can support all data integration, risk management, business intelligence, and master data management activities, rather than imposing inconsistent rigor on these initiatives.
• A practical and non-threatening approach can be applied to governing information and promoting stewardship of data as a cross-organization asset.
• Best practices and key concepts of this non-threatening approach can be communicated effectively to leverage strengths and address opportunities for improvement.




Data quality assurance. Module 2. Discrete desk review of data quality


Book Description

This publication is one module of a three-module toolkit that provides technical guidance and tools to support work on strengthening data quality in countries. It is part of the Division of Data, Analytics and Delivery for Impact's scope of work providing normative guidance for health information system strengthening.




Assuring Data Quality at U.S. Geological Survey Laboratories


Book Description

The mission of the U.S. Geological Survey (USGS) is to provide reliable and impartial scientific information to understand Earth, minimize loss of life and property from natural disasters, and manage water, biological, energy, and mineral resources. Data collection, analysis, interpretation, and dissemination are central to everything the USGS does. Among other activities, the USGS operates some 250 laboratories across the country to analyze physical and biological samples, including water, sediment, rock, plants, invertebrates, fish, and wildlife. The data generated in these laboratories help answer pressing scientific and societal questions or support regulation, resource management, or commercial applications. At the request of the USGS, this study reviews a representative sample of USGS laboratories to examine quality management systems and other approaches for assuring the quality of laboratory results, and it recommends best practices and procedures for USGS laboratories.