Harness the power of Data Quality Audits and Assessments to evaluate data integrity and ensure compliance with industry standards and regulations
Introduction
Faced with the daunting challenge of a dead 14-ton whale decomposing on their beach, a town council once devised what they believed to be a brilliant plan.
Their solution? Load the carcass with dynamite and blow it up, hoping that would eliminate the problem. Little did they know that this ill-conceived plan would scatter rotting whale meat over a vast area, creating an overwhelming cleanup task that persisted for years.

The Pitfall of Overwhelming Metrics
When it comes to establishing metrics for data governance organizations, it may initially appear advantageous to apply every possible metric to all data.
After all, the adage “if you can’t measure it, you can’t manage it” holds some truth.
However, if metrics are not carefully considered and applied to meaningful data sets, you run the risk of metaphorically "blowing up the whale" for your data governance initiative: reporting an extensive array of largely irrelevant exceptions.
The Importance of Meaningful Metrics
For instance, imagine a business that requires the date of birth field to be populated in compliance with local legislation.
Running a completeness test on the "Date of Birth" field across the ten million records in your Client database may yield three million exceptions, each of which would need to be addressed by the Client Relations team.
However, upon delivering the report, you realize that the business only has one million active clients, rendering two million exceptions irrelevant.
Furthermore, the one million records requiring action become obscured amidst the overwhelming amount of data, likely resulting in them being overlooked. This can be demoralizing for the business and give the impression that the problem is either too vast or inadequately defined to be resolved.
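The fix described above is to scope the completeness check to the population the rule actually applies to (active clients) before counting exceptions. A minimal sketch of that idea follows; the record layout and field names ("client_id", "status", "date_of_birth") are hypothetical, not taken from any particular system:

```python
# Tiny stand-in for a client database; in practice this would be a query result.
records = [
    {"client_id": 1, "status": "active",   "date_of_birth": "1980-05-01"},
    {"client_id": 2, "status": "active",   "date_of_birth": None},
    {"client_id": 3, "status": "inactive", "date_of_birth": None},
    {"client_id": 4, "status": "inactive", "date_of_birth": "1975-11-23"},
]

def completeness_exceptions(rows, field, scope=lambda r: True):
    """Return the rows where `field` is missing, limited to rows in scope."""
    return [r for r in rows if scope(r) and not r.get(field)]

# Unscoped check: counts every record, active or not.
all_exceptions = completeness_exceptions(records, "date_of_birth")

# Scoped check: only active clients, the population the legislation covers.
active_exceptions = completeness_exceptions(
    records, "date_of_birth", scope=lambda r: r["status"] == "active"
)

print(len(all_exceptions), len(active_exceptions))  # 2 1
```

Here the unscoped check reports two exceptions but only one of them is actionable; at the scale in the example above, that is the difference between three million reported exceptions and the one million that actually matter.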
Crafting Effective Data Quality Metrics
It is crucial to comprehend that data quality metrics should serve to manage performance and measure the improvement of data quality. To ensure the efficacy of your metrics, it is important to seek answers to key data governance questions such as:
- Who will be responsible for owning this metric?
- How will we define and measure success?
- What approach will we adopt to track improvement?
- Is this metric truly relevant to our goals?
- Can the identified issues be addressed effectively?
Seeking Professional Support
Setting appropriate metrics represents a critical success factor for any data governance or data quality initiative. To ensure optimal outcomes, consider enlisting the support of experienced consultants who specialize in this domain.
Conclusion
Data quality metrics are essential tools for managing and improving data integrity. However, it is crucial to avoid overwhelming stakeholders with excessive and irrelevant metrics that can hinder progress. By carefully crafting meaningful metrics and seeking professional guidance, you can avoid the pitfalls of “blowing up the whale” and drive successful data governance initiatives.
Further reading:
- Why data quality metrics and expertise are crucial to big data
- Why data quality audits are critical for reference data sets
