The Practitioner's Guide to Data Quality Improvement


Book Description

The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers. - Offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. - Shows how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. - Includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning.




A Process View of Data Quality


Book Description

This paper addresses the definition of data quality from a process perspective. A formal process model of an information system offers precise process constructs for characterizing data quality. With these constructs, we rigorously define the key dimensions of data quality.




Executing Data Quality Projects


Book Description

Executing Data Quality Projects, Second Edition presents a structured yet flexible approach for creating, improving, sustaining and managing the quality of data and information within any organization. Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Steps approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work – with the end result of high-quality trusted data and information, so critical to today's data-dependent organizations. The Ten Steps approach applies to all types of data and all types of organizations – for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action. This book uses projects as the vehicle for data quality work and uses the word broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management, 2) data quality activities in other projects such as building new applications and migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups, and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization's standard SDLC (whether sequential or Agile) and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all. The new Second Edition highlights topics such as artificial intelligence and machine learning, Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before. - Includes concrete instructions, numerous templates, and practical advice for executing every step of the Ten Steps approach - Contains real examples from around the world, gleaned from the author's consulting practice and from those who implemented the approach based on her training courses and the earlier edition of the book - Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices - A companion Web site includes links to numerous data quality resources, including many of the templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information that are available online




A Process View of Data Quality (Classic Reprint)


Book Description

Excerpt from A Process View of Data Quality: We posit that the term data quality, though used in a variety of research and practitioner contexts, has been inadequately conceptualized and defined. To improve data quality, we must bound and define the concept of data quality. In the past, researchers have tended to take a product-oriented view of data quality. Though necessary, this view is insufficient for three reasons. First, data quality defects, in general, are difficult to detect by simple inspection of the data product. Second, definitions of data quality dimensions and defects, while useful intuitively, tend to be ambiguous and interdependent. Third, in line with a cornerstone of TQM philosophy, emphasis should be placed on process management to improve product quality. The objective of this paper is to characterize the concept of data quality from a process perspective. A formal process model of an information system (IS) is developed which offers precise process constructs for characterizing data quality. With these constructs, we rigorously define the key dimensions of data quality. The analysis also provides a framework for examining common data quality problems. Finally, facilitated by the exactness of the model, an analysis is presented of the interdependencies among the various data quality dimensions. About the Publisher: Forgotten Books publishes hundreds of thousands of rare and classic books. Find more at www.forgottenbooks.com. This book is a reproduction of an important historical work. Forgotten Books uses state-of-the-art technology to digitally reconstruct the work, preserving the original format whilst repairing imperfections present in the aged copy. In rare cases, an imperfection in the original, such as a blemish or missing page, may be replicated in our edition. We do, however, repair the vast majority of imperfections successfully; any imperfections that remain are intentionally left to preserve the state of such historical works.
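
The excerpt refers to "key dimensions of data quality" but, being an abstract, does not reproduce the paper's formal process model here. Purely as a rough illustration of what measuring a few commonly cited dimensions (completeness, validity, uniqueness) can look like in practice, and not as the paper's definitions, the following minimal Python sketch uses hypothetical records, fields, and rules:

import re

# Illustrative only: naive metrics for three commonly cited data quality
# dimensions (completeness, validity, uniqueness). This is not the paper's
# formal process model; the records, fields, and rule are hypothetical.

EMAIL_RE = r"[^@\s]+@[^@\s]+\.[^@\s]+"  # crude syntactic validity rule

records = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": "", "country": "US"},
    {"id": 2, "email": "not-an-email", "country": "DE"},
]

def completeness(rows, field):
    """Share of rows whose field is present and non-empty."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def validity(rows, field, pattern):
    """Share of rows whose field satisfies a syntactic rule."""
    return sum(1 for r in rows if re.fullmatch(pattern, r.get(field) or "")) / len(rows)

def uniqueness(rows, field):
    """Share of distinct values in a column that should be a unique key."""
    return len({r.get(field) for r in rows}) / len(rows)

print(f"completeness(email): {completeness(records, 'email'):.2f}")   # 0.67
print(f"validity(email):     {validity(records, 'email', EMAIL_RE):.2f}")   # 0.33
print(f"uniqueness(id):      {uniqueness(records, 'id'):.2f}")   # 0.67

The excerpt's point is precisely that such intuitive measures tend to be ambiguous and interdependent, which is what the paper's process constructs are intended to resolve.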




Executing Data Quality Projects


Book Description

Information is currency. Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. In this important and timely new book, Danette McGilvray presents her "Ten Steps" approach to information quality, a proven method for both understanding and creating information quality in the enterprise. Her trademarked approach – in which she has trained Fortune 500 clients and hundreds of workshop attendees – applies to all types of data and to all types of organizations. - Includes numerous templates, detailed examples, and practical advice for executing every step of the "Ten Steps" approach. - Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. - A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information available online.




Competing with High Quality Data


Book Description

Create a competitive advantage with data quality. Data is rapidly becoming the powerhouse of industry, but low-quality data can actually put a company at a disadvantage. To be used effectively, data must accurately reflect the real-world scenario it represents, and it must be in a form that is usable and accessible. Quality data involves asking the right questions, targeting the correct parameters, and having an effective internal management, organization, and access system. It must be relevant, complete, and correct, while falling in line with pervasive regulatory oversight programs. Competing with High Quality Data: Concepts, Tools and Techniques for Building a Successful Approach to Data Quality takes a holistic approach to improving data quality, from collection to usage. Author Rajesh Jugulum is globally recognized as a major voice in the data quality arena, with a high-level background in international corporate finance. In the book, Jugulum provides a roadmap to data quality innovation, covering topics such as: the four-phase approach to data quality control; a methodology that produces data sets for different aspects of a business; streamlined data quality assessment and issue resolution; and a structured, systematic, disciplined approach to effective data gathering. The book also contains real-world case studies to illustrate how companies across a broad range of sectors have employed data quality systems, whether or not they succeeded, and what lessons were learned. High-quality data increases value throughout the information supply chain, and the benefits extend to the client, employee, and shareholder. Competing with High Quality Data: Concepts, Tools and Techniques for Building a Successful Approach to Data Quality provides the information and guidance necessary to formulate and activate an effective data quality plan today.
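
The topic list above mentions streamlined data quality assessment and issue resolution but, being a back-cover summary, shows no mechanism, and the book's four-phase approach is not reproduced here. As a loose, hedged sketch only (hypothetical rules and field names, not Jugulum's method), a rule-based assessment can record one issue per failed check so that resolution can be tracked downstream:

from dataclasses import dataclass
from typing import Callable

# A minimal rule-based assessment sketch: each failed check becomes an issue
# entry for later resolution. The rules and field names are hypothetical and
# are not the four-phase approach described in the book.

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when a record passes

RULES = [
    Rule("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    Rule("currency_in_allowed_set", lambda r: r.get("currency") in {"USD", "EUR", "GBP"}),
    Rule("trade_date_present", lambda r: bool(r.get("trade_date"))),
]

def assess(records):
    """Return one issue entry per failed rule, keyed by record index."""
    issues = []
    for i, rec in enumerate(records):
        for rule in RULES:
            if not rule.check(rec):
                issues.append({"record": i, "rule": rule.name})
    return issues

sample = [
    {"amount": 100.0, "currency": "USD", "trade_date": "2024-01-31"},
    {"amount": -5.0, "currency": "XYZ", "trade_date": ""},
]
print(assess(sample))
# [{'record': 1, 'rule': 'amount_non_negative'},
#  {'record': 1, 'rule': 'currency_in_allowed_set'},
#  {'record': 1, 'rule': 'trade_date_present'}]

Keeping each rule as data (a name plus a predicate) leaves the assessment loop unchanged as checks are added, which is one simple way to make assessment repeatable.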




Data Science Strategy For Dummies


Book Description

All the answers to your data science questions. Over half of all businesses are using data science to generate insights and value from big data. How are they doing it? Data Science Strategy For Dummies answers all your questions about how to build a data science capability from scratch, starting with the “what” and the “why” of data science and covering what it takes to lead and nurture a top-notch team of data scientists. With this book, you’ll learn how to incorporate data science as a strategic function into any business, large or small. Find solutions to your real-life challenges as you uncover the stories and value hidden within data. - Learn exactly what data science is and why it’s important - Adopt a data-driven mindset as the foundation to success - Understand the processes and common roadblocks behind data science - Keep your data science program focused on generating business value - Nurture a top-quality data science team. In non-technical language, Data Science Strategy For Dummies outlines new perspectives and strategies to effectively lead analytics and data science functions to create real value.




Master Data Management


Book Description

The key to a successful MDM initiative isn't technology or methods, it's people: the stakeholders in the organization and their complex ownership of the data that the initiative will affect. Master Data Management equips you with a deeply practical, business-focused way of thinking about MDM – an understanding that will greatly enhance your ability to communicate with stakeholders and win their support. Moreover, it will help you deserve their support: you'll master all the details involved in planning and executing an MDM project that leads to measurable improvements in business productivity and effectiveness. - Presents a comprehensive roadmap that you can adapt to any MDM project - Emphasizes the critical goal of maintaining and improving data quality - Provides guidelines for determining which data to "master" - Examines special issues relating to master data metadata - Considers a range of MDM architectural styles - Covers the synchronization of master data across the application infrastructure
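
The last bullet mentions synchronizing master data across the application infrastructure, but the blurb understandably gives no mechanics, and the book's own techniques are not reproduced here. Purely as an illustrative sketch under assumed, simplified survivorship rules (hypothetical field names; most recently updated non-empty value wins), consolidating duplicate records into a single "golden" record might look like this:

from collections import defaultdict

# Illustrative sketch of consolidating duplicate records into a "golden"
# record with a simple survivorship rule (most recently updated non-empty
# value wins). Field names and rules are hypothetical, not the book's method.

def consolidate(records, key):
    """Group records by a matching key and merge each group field by field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)

    golden = {}
    for k, recs in groups.items():
        recs.sort(key=lambda r: r.get("updated_at", ""))  # oldest first
        merged = {}
        for rec in recs:  # newer records overwrite older non-empty values
            for field, value in rec.items():
                if value not in (None, ""):
                    merged[field] = value
        golden[k] = merged
    return golden

crm = {"customer_id": "C1", "email": "old@example.com", "phone": "", "updated_at": "2023-01-01"}
erp = {"customer_id": "C1", "email": "new@example.com", "phone": "555-0100", "updated_at": "2024-06-01"}
print(consolidate([crm, erp], key="customer_id"))
# {'C1': {'customer_id': 'C1', 'email': 'new@example.com',
#         'phone': '555-0100', 'updated_at': '2024-06-01'}}

In practice, consolidation also has to match records whose keys do not agree exactly and push merged values back to the contributing systems; that propagation is the synchronization the bullet refers to, and it is out of scope for this sketch.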




Business Intelligence Guidebook


Book Description

Between the high-level concepts of business intelligence and the nitty-gritty instructions for using vendors' tools lies the essential, yet poorly understood layer of architecture, design and process. Without this knowledge, Big Data is belittled – projects flounder, are late and go over budget. Business Intelligence Guidebook: From Data Integration to Analytics shines a bright light on an often neglected topic, arming you with the knowledge you need to design rock-solid business intelligence and data integration processes. Practicing consultant and adjunct BI professor Rick Sherman takes the guesswork out of creating systems that are cost-effective, reusable and essential for transforming raw data into valuable information for business decision-makers. After reading this book, you will be able to design the overall architecture for functioning business intelligence systems with the supporting data warehousing and data-integration applications. You will have the information you need to get a project launched, developed, managed and delivered on time and on budget – turning the deluge of data into actionable information that fuels business knowledge. Finally, you'll give your career a boost by demonstrating essential knowledge that puts corporate BI projects on a fast track to success. - Provides practical guidelines for building successful BI, DW and data integration solutions. - Explains underlying BI, DW and data integration design, architecture and processes in clear, accessible language. - Includes the complete project development lifecycle that can be applied at large enterprises as well as at small to medium-sized businesses. - Describes best practices and pragmatic approaches so readers can put them into action. - Companion website includes templates and examples, further discussion of key topics, instructor materials, and references to trusted industry sources.




Corporate Data Quality


Book Description

Data is the foundation of the digital economy. Industry 4.0 and digital services are producing previously unknown quantities of data and making new business models possible. Under these circumstances, data quality has become the critical success factor. This book presents a holistic approach to data quality management, illustrated by ten case studies. It is intended for practitioners dealing with data quality management and data governance as well as for researchers. The book was written at the Competence Center Corporate Data Quality (CC CDQ) in close cooperation between researchers from the University of St. Gallen and Fraunhofer IML and representatives from more than 20 major corporations. Chapter 1 introduces the role of data in the digitization of business and society and describes the most important business drivers for data quality. It presents the Framework for Corporate Data Quality Management and introduces essential terms and concepts. Chapter 2 presents practical, successful examples of managing master data quality, based on ten case studies conducted by the CC CDQ. The case studies cover every aspect of the Framework for Corporate Data Quality Management. Chapter 3 describes selected tools for master data quality management. The three tools stand out for their broad applicability (a method for DQM strategy development and a DQM maturity assessment) and their high level of innovation (the Corporate Data League). Chapter 4 summarizes the essential factors for successfully managing master data quality and provides a checklist of measures that should be addressed immediately after the start of a data quality management project. This enables a quick start on the topic and provides initial recommendations for actions to be taken by project and line managers. Please also check out the book's homepage at cdq-book.org/