Data Quality Management: What It Means and Why It Matters


Data is the lifeblood of an organization and forms the basis for many critical business decisions. Organizations therefore need an extensive process in place to ensure the quality of that data, because only accurate data delivers valuable results. To capitalize on the explosive growth of big data, businesses need to put a data quality framework in place before they can start extracting actionable insights from their information.

What is Data Quality Management?

Data quality management (DQM) refers to the set of business practices that involve employing the right people, processes, and technologies to derive actionable insights from the available information. A well-established DQM framework ensures that data quality is maintained throughout the data lifecycle.

As part of a DQM plan, users specify certain data quality checks throughout the data journey to eliminate any inconsistencies or errors and to ensure reliable data for analytics and business intelligence processes.

Common Causes of Bad Data

Research shows that 40 percent of business initiatives fail to achieve their targets due to poor data quality. Hence, it is critical for data stewards to identify the root causes of bad data quality and build a robust data profiling and validation plan to improve the accuracy of information used for decision making.

According to 451 Research,1 the top three reasons for poor data quality are:

   1. Manual Data Entry

Many organizations rely on their employees to manually feed data into business systems, which can lead to errors caused by lack of expertise, simple human mistakes, or the monotonous nature of the work. Other common consequences of manual data keying include duplicate records and missing information.

   2. Data Migration and Conversion Projects

Data migration projects involve the transfer of data between different file formats, databases, and storage systems, which can often lead to duplicated or missing records. Moreover, migrating from a legacy information system to a new one often involves converting data into a compatible format which, if not done correctly, can result in poor data quality.

   3. Entries by Multiple Users

In many departments, multiple employees are involved in the process of handling and modifying data. This can cause discrepancies, such as different names for the same supplier. For instance, some employees might enter the supplier’s name as ‘Dell’, while others might use ‘Dell Inc.’ for the same vendor.
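
To illustrate, here is a minimal Python sketch (using pandas and hypothetical vendor records) of how normalizing supplier names can reveal that several spellings refer to the same vendor; the column names and suffix list are assumptions chosen for the example, not a prescribed implementation.

```python
import pandas as pd

# Hypothetical vendor records entered by different employees.
vendors = pd.DataFrame({
    "supplier_name": ["Dell", "Dell Inc.", "dell inc", "HP", "H.P."],
    "invoice_total": [1200, 950, 430, 310, 780],
})

def normalize_name(name: str) -> str:
    """Strip punctuation, casing differences, and common legal suffixes."""
    cleaned = name.lower().replace(".", "").replace(",", "").strip()
    for suffix in (" inc", " ltd", " llc"):
        cleaned = cleaned.removesuffix(suffix)
    return cleaned.strip()

vendors["normalized_name"] = vendors["supplier_name"].map(normalize_name)

# Grouping on the normalized key reveals entries that refer to the same vendor.
print(vendors.groupby("normalized_name")["supplier_name"].unique())
```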

Benefits of Using High-Quality Data for Businesses

High-quality data has the potential to improve business operations and make them more efficient and profitable. A few benefits of ensuring high data quality at every step of the business process are:

  • Data Helps Identify New Opportunities and Improve Business Outcomes

Business decisions based on quality data are more likely to result in positive outcomes, as managers have an accurate, up-to-date, and complete picture of critical data assets. Moreover, high-quality data helps managers identify and leverage new opportunities, enabling the business to grow and stay competitive.

For example, incorrect financial information, such as overstated profits, may result in misleading financial ratios, which are often used to evaluate a company’s past performance. This analysis should be based on accurate and trusted data, as it lays the foundation for many important decisions, such as choosing target markets and changing prices. Similarly, up-to-date financials help the company determine which market segments are more profitable so that managers can explore new growth opportunities in those areas.

  • Data Quality Aids Successful Data Migrations

Poor data quality is one of the reasons why data migration projects fail, as these projects involve the movement of high volumes of data in disparate formats. To ensure a high success rate, data quality rules should be used to identify and correct any errors before the migration can take place. This helps in carrying out data migration projects faster and with greater accuracy.

For instance, to create a unified repository for customer data, a company plans to move from a decentralized information storage system to a centralized one such as a data warehouse. Previously, data was manually entered by employees and had errors including duplicate records and missing information. An effective data quality tool can help the business identify those errors and correct them before migrating data into a data warehouse.

  • Ensuring Data Quality Reduces Data Processing Time and Costs

According to Gartner, poor data quality has an average financial impact of $9.7 million per year. In addition, bad data means incorrect information is being processed, which often requires rework. However, if companies make data governance a part of their overall business process, the time and cost spent on rework can be minimized.

Measuring Data Quality – Assessment Metrics

Having a well-defined set of assessment metrics in place is vital for assessing the performance of an enterprise’s data quality management initiatives. It helps determine whether the data quality management strategy is bearing fruit to meet organizational goals.

Figure 1: Dimensions of Data Quality

Some key dimensions of data quality include:

  • Completeness indicates whether the data gathered is sufficient to draw conclusions. This can be assessed by ensuring that there is no missing information in any data set.
  • Consistency ensures that data across all the systems in an organization is synchronized and reflects the same information. For example, the shipment date recorded in an order system should use the same date format as the customer information spreadsheet.
  • Accuracy indicates whether the collected data correctly represents what it is supposed to describe. This can be measured against source data and validated against user-defined business rules.
  • Timeliness means that the data is available as and when expected to facilitate data-driven decision making. Many businesses are leveraging tools that support real-time data integration to gain up-to-date business knowledge.
  • Uniqueness involves making sure that there are no duplicates present in the data. For example, the lack of unique data can result in multiple emails being sent to a single customer due to duplicate records.
  • Validity measures whether the data meets the standards or criteria set by the business user. For instance, a business can place a data quality check on the order quantity field, such as ‘Order Quantity >= 0’, since a negative order quantity implies invalid information. A few of these dimensions are illustrated in the sketch after this list.
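
The following is a minimal Python sketch, using pandas and a small hypothetical orders table, of how a few of these dimensions (completeness, uniqueness, and validity) might be measured in practice; the column names and the ‘Order Quantity >= 0’ rule are assumptions carried over from the examples above.

```python
import pandas as pd

# Hypothetical order data used to illustrate a few of the dimensions above.
orders = pd.DataFrame({
    "order_id": [101, 102, 102, 104],
    "customer": ["Acme", "Globex", "Globex", None],
    "quantity": [5, -2, 3, 10],
})

# Completeness: share of rows with no missing values.
completeness = orders.notna().all(axis=1).mean()

# Uniqueness: duplicate order IDs indicate duplicate records.
duplicate_ids = orders["order_id"].duplicated().sum()

# Validity: apply the business rule 'Order Quantity >= 0'.
invalid_quantity = (orders["quantity"] < 0).sum()

print(f"Complete rows: {completeness:.0%}")
print(f"Duplicate order IDs: {duplicate_ids}")
print(f"Rows violating the quantity rule: {invalid_quantity}")
```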

Selecting the Right Data Quality Management Tool

Data drives decision-making, and therefore, managing data quality has become a top priority for businesses. However, managing data quality manually can be error-prone and time-consuming due to increased data volumes and disparity. This is where DQM tools come into play.

Here are some important factors that businesses should consider when selecting the right data quality tool:

  • Data Profiling and Cleansing Functionality

An effective data quality tool should include data profiling features, which automate the discovery of metadata and provide clear visibility into the source data so that discrepancies can be spotted early.

Moreover, data cleansing capabilities in a data quality tool help prevent errors and resolve them before data is loaded into a destination.
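
As a rough illustration of what column-level profiling typically reports, here is a small Python sketch (pandas, with hypothetical source data) that computes data types, null ratios, and distinct-value counts; it is a simplified stand-in for what a dedicated profiling tool would do, not a description of any specific product.

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return a simple column-level profile: type, null ratio, distinct count."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_ratio": df.isna().mean(),
        "distinct_values": df.nunique(),
    })

# Hypothetical source data with a duplicate ID and a malformed email.
source = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@example.com", None, "b@example", "c@example.com"],
})

print(profile(source))
```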

  • Data Quality Checks

Data quality checks are objects or rules that can be integrated into the information flow to monitor and report any errors that occur while data is being processed. They validate the data against defined business rules, preserving data integrity.
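
Below is a hedged Python sketch of how such rule objects might be expressed and applied to records flowing through a pipeline; the rule names, predicates, and data are hypothetical and do not reflect any particular tool’s API.

```python
import pandas as pd

# Hypothetical rule set: each rule has a name and a row-level predicate.
rules = {
    "non_negative_quantity": lambda row: row["quantity"] >= 0,
    "customer_present": lambda row: bool(row["customer"]),
}

records = pd.DataFrame({
    "customer": ["Acme", "", "Globex"],
    "quantity": [3, 7, -1],
})

def failed_rules(row):
    """Return the names of every rule the record violates."""
    return [name for name, check in rules.items() if not check(row)]

# Evaluate every rule against every record and report the failures.
records["errors"] = records.apply(failed_rules, axis=1)
print(records[records["errors"].str.len() > 0])
```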

  • Data Lineage Management

Data lineage management helps control and analyze the flow of information by describing the data origin and its journey, such as the steps at which the data was transformed or written to the destination.

  • Connectivity to Multiple Data Sources

With the increasing variety and number of data sources, it has become crucial to assess and validate internal and external data sets. Businesses should select DQM tools that offer support for data in any format and complexity, whether structured or unstructured, flat or hierarchical, legacy or modern.

Creating a Centralized DQM Strategy

Ensuring data quality is an ongoing process that evolves with the changing needs of the organization. This means organizations must have a centralized DQM strategy with a robust framework to address data quality challenges and reap the benefits of high-quality data.

The steps for creating a centralized data quality management strategy include:

  • Define the key success objectives for the data quality program

This involves defining the data quality metrics, such as the data-to-errors ratio and the percentage of blank records. Doing so gives users a clear understanding of the data being analyzed and of the dimensions, including completeness, uniqueness, and accuracy, that will be used to assess data integrity.
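
As an illustration, the following Python sketch computes the two metrics mentioned above on a tiny hypothetical data set; the exact definitions (a blank record as any row with a missing field, an erroneous record as one failing a simple email check) are assumptions chosen for the example.

```python
import pandas as pd

records = pd.DataFrame({
    "customer": ["Acme", None, "Globex", "Initech"],
    "email": ["a@example.com", "b@example.com", None, "not-an-email"],
})

total_records = len(records)

# Percentage of blank records: rows where at least one field is missing.
blank_pct = records.isna().any(axis=1).mean() * 100

# Data-to-errors ratio: total records divided by records flagged as erroneous
# (here, a record is 'erroneous' if its email fails a simple format check).
valid_email = records["email"].str.contains("@", na=False)
error_count = int((~valid_email).sum())
data_to_errors = total_records / error_count if error_count else float("inf")

print(f"Blank records: {blank_pct:.0f}%")
print(f"Data-to-errors ratio: {data_to_errors:.1f}")
```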

  • Communicate the DQM plan organization-wide

Ensuring data quality is the responsibility of all information stakeholders, including data architects, business analysts, and IT. Hence, employees should know the expected data quality levels, business benefits of the set data quality standards, and assessment metrics for smooth implementation of the DQM strategy.

  • Assess incoming business data against the set data quality metrics

Ensuring data quality is easier with an advanced data quality tool as it enables users to define data quality rules and assess incoming data based on the predefined criteria.

  • Analyze data quality results and identify the root causes of bad data

Once the data has been processed, users can evaluate its quality and identify the reasons for flagged records. For instance, the screenshot below shows that one of the records was flagged because of an incorrect email address.

Figure 2: Flagged Records
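
A simple way to implement such a check is to validate the email field against a pattern. The Python sketch below uses a deliberately simple regular expression and hypothetical customer records, and is only meant to show the kind of rule that produces flagged records like the one above.

```python
import re
import pandas as pd

# A deliberately simple pattern; production-grade checks are usually stricter.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

customers = pd.DataFrame({
    "name": ["Jane Roe", "John Doe"],
    "email": ["jane.roe@example.com", "john.doe@example"],  # second one is malformed
})

customers["email_valid"] = customers["email"].map(
    lambda e: bool(EMAIL_PATTERN.match(e))
)

# Records that fail the check are flagged for review instead of being loaded.
print(customers[~customers["email_valid"]])
```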

  • Monitor and adjust the data quality workflows based on changing data needs

Users must verify the data quality workflows at periodic intervals to ensure that the data quality rules are in sync with the overall business goals. This also includes taking necessary actions to improve data quality standards based on prior results.

Ensuring Data Quality with Astera Centerprise 

Astera Centerprise is an end-to-end data management solution that enables businesses to accomplish complex data integration tasks while ensuring data quality. The advanced data profiling and data quality capabilities allow users to measure the integrity of critical business data, speeding up data integration projects in an agile, code-free environment.

Want to find out how Centerprise can aid successful data quality management? Download the free trial version and experience it for yourself!

1 https://siliconangle.com/files/2016/01/Blazent_State_of_Data_Quality_Management_2016.pdf