Climb the Steps Toward Data Quality Success
Melissa Data's Data Quality Analyst, Joseph Vertido, explains the critical data quality steps Gartner has defined to prevent bad data from entering your systems in the first place, and to keep your data clean over time.
November 3, 2015
Sponsored Post
Full spectrum solutions work across the entire data quality lifecycle: they prevent bad data from entering your systems at the point of entry, and they continuously monitor and update your data afterward.
IT research and advisory firm Gartner Inc. has identified a series of critical steps to achieve full spectrum data quality success:
1. Profiling: Analysis of data to capture statistics (metadata) that provide insight into the quality of the data and aid in the identification of data quality issues.
2. Monitoring: Deployment of controls to ensure ongoing conformance of data to business rules that define data quality for the organization.
3. Parsing and Standardization: The decomposition of text fields into component parts and formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules, and knowledge bases of values and patterns.
4. Generalized “Cleansing”: Modification of data values to meet domain restrictions, integrity constraints, or other business rules that define sufficient data quality for the organization.
5. Enrichment: Enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors).
6. Matching: Identifying, linking, or merging related entries within or across sets of data.
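To make these steps concrete, here is a minimal, hypothetical Python sketch of three of them: profiling, standardization, and matching, applied to a small contact list. The field names, rules, and match key are illustrative assumptions, not Melissa Data's API or Gartner's specification.

```python
# Hypothetical sketch of three data quality steps on a small contact list.
# All field names and rules are illustrative, not a vendor API.

records = [
    {"name": "Ann Lee", "state": "ca", "zip": "92688"},
    {"name": "ann lee", "state": "CA", "zip": "92688"},
    {"name": "Bob Kim", "state": "NY", "zip": ""},
]

# Step 1, Profiling: capture simple statistics (metadata) about a field
# to surface quality issues such as missing values.
def profile(rows, field):
    values = [r[field] for r in rows]
    return {
        "count": len(values),
        "missing": sum(1 for v in values if not v),
        "distinct": len(set(values)),
    }

# Step 3, Parsing and standardization: format values into consistent layouts.
def standardize(row):
    return {
        "name": row["name"].title().strip(),
        "state": row["state"].upper().strip(),
        "zip": row["zip"].strip(),
    }

# Step 6, Matching: link related entries via a simple match key and
# keep one representative of each duplicate group.
def dedupe(rows):
    seen, unique = set(), []
    for r in rows:
        key = (r["name"], r["zip"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

print(profile(records, "zip"))   # {'count': 3, 'missing': 1, 'distinct': 2}
clean = [standardize(r) for r in records]
print(dedupe(clean))             # the two "Ann Lee" rows collapse into one
```

Note that standardization runs before matching: the two "Ann Lee" records only link up once their name and state formatting agree, which is why the steps are ordered the way they are.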
Melissa Data Solutions
Melissa Data is a leading data quality vendor, providing hosted data quality solutions along with tools that organizations can deploy within their own IT infrastructure.
The company's tools are implemented in support of general data quality improvement initiatives, as well as within critical applications, such as ERP, CRM and BI. As data quality becomes increasingly pervasive, many data integration tools now include data quality management functionality, and Melissa Data is positioned to successfully partner with companies big and small.
Melissa Data's full spectrum solutions incorporate the Gartner-defined steps to empower full lifecycle data management. Our developer APIs and Data Quality Components for SQL Server Integration Services (SSIS) profile, cleanse, enrich, and dedupe your data so you can cut waste, drive revenue, improve business decisions, and stay in touch with your best customers.
Data Quality Is a Process
It's important to keep in mind that data quality is not a one-time execution. It is a continuous process of making sure that your data is clean and stays clean.
As new data arrives, new data quality problems arise. Data changes all the time, and so do the rules that govern it. This is why the ongoing maintenance and governance of data are essential to the success of any data quality implementation. To keep the best and most accurate information in your data warehouse, continuous data quality is a must.
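As a sketch of what this ongoing monitoring might look like, the hypothetical Python check below re-runs business rules against each new batch of records and reports every violation. The rule names and formats are assumptions for illustration only.

```python
# Hypothetical monitoring sketch: re-run business-rule checks as new
# data arrives and report violations. The rules shown are illustrative.
import re

RULES = {
    "zip_is_5_digits": lambda r: bool(re.fullmatch(r"\d{5}", r.get("zip", ""))),
    "state_is_2_letters": lambda r: bool(re.fullmatch(r"[A-Z]{2}", r.get("state", ""))),
}

def monitor(rows):
    """Return (row_index, rule_name) pairs for every rule violation."""
    violations = []
    for i, row in enumerate(rows):
        for name, check in RULES.items():
            if not check(row):
                violations.append((i, name))
    return violations

batch = [
    {"zip": "92688", "state": "CA"},
    {"zip": "ABCDE", "state": "California"},  # fails both rules
]
print(monitor(batch))  # [(1, 'zip_is_5_digits'), (1, 'state_is_2_letters')]
```

Because the rules live in one place, they can evolve alongside the data: when a business rule changes, updating the rule set and re-running the monitor brings existing records back into view for remediation.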
As a Data Quality Analyst at Melissa Data, Joseph Vertido is an expert in the field of data quality. He has worked with numerous clients to understand their business needs for data quality, analyze their architecture and environment, and recommend strategic solutions for successfully integrating data quality within their infrastructure. He has also written several articles on implementing data quality solutions and techniques. Joseph holds a degree in Computer Science from the University of California, Irvine.
Guest blogs such as this one are published monthly and are part of SQL Server Pro's annual platinum sponsorship.