The data quality cook book


“If you don’t use good ingredients, the outcome is never going to be excellent. But if you get the freshest ingredients that are in season, at their peak, and you cook with them, you can’t really go wrong.”

Those are the words of famed culinary expert Gail Simmons, and if you’ve ever spent a couple of hours slaving away over a hot stove, you will surely agree: good ingredients are even more important than a good recipe.

Yet, in the world of information technology, this is a lesson we seem to forget. We spend hundreds of millions on more infrastructure, better systems, new functionality, more controls and better business processes, and still we struggle to meet the needs of the business.

Could this be because the underlying ingredients, the information and data captured, manipulated and exploited within these systems, are of poor quality?

Poor data quality is a symptom of broken business processes. For example, if the customer billing address is frequently missing or incorrect, this indicates that the process for capturing required customer information is flawed.
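
This kind of symptom is easy to measure with a quick profiling pass over captured records. The snippet below is a minimal sketch; the field names and sample records are hypothetical:

```python
# A quick profiling sketch: measure how often the billing address is
# missing or unusable in captured customer records.
# The field names and sample records are hypothetical.

customers = [
    {"name": "A. Jones",  "billing_address": "12 Main Street"},
    {"name": "B. Khan",   "billing_address": ""},
    {"name": "C. Ndlovu", "billing_address": "UNKNOWN"},
]

unusable = sum(
    1 for c in customers
    if not c["billing_address"].strip() or c["billing_address"] == "UNKNOWN"
)

print(f"{unusable / len(customers):.0%} of records lack a usable billing address")
```

A persistently high rate points at the capture process itself, not at the individual operators or records.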

In many cases, data capture tasks are performed by relatively junior staff who have no understanding of the many purposes for which their data will be used. If these staff are measured, as is typically the case, on throughput, then quality will suffer.

Automated validations can help, but in many cases they are bypassed in order to meet processing targets. For example, a drop-down list of street names may take many minutes to populate, causing unacceptable delays, or may not contain every possible street name, forcing the user to capture invalid data.

Automated validations need to support existing processes, provide flexibility to support exceptions to the rule, and happen quickly and with minimal impact. Enterprise Data Quality platforms are specifically designed to enable this kind of solution, across multiple systems and data elements.
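
To make this concrete, here is a minimal sketch, in Python, of a “soft” validation of this kind: unknown values are accepted but flagged as exceptions for later review, so the operator is never blocked or forced to capture a wrong value. The reference list, field names and quality_flag convention are all hypothetical:

```python
# A minimal sketch of a "soft" street-name validation. Unknown values
# are accepted but flagged as exceptions for later review, rather than
# blocking capture. The reference list and field names are hypothetical.

KNOWN_STREETS = {"Main Street", "High Street", "Church Road"}

def validate_street(record: dict) -> dict:
    street = record.get("street", "").strip()
    if not street:
        record["quality_flag"] = "missing"    # nothing captured at all
    elif street in KNOWN_STREETS:
        record["quality_flag"] = "ok"         # matches the reference list
    else:
        record["quality_flag"] = "exception"  # accepted, queued for review
    return record

# A set lookup is instantaneous, unlike populating a huge drop-down list,
# so the capture screen stays responsive.
captured = validate_street({"name": "J. Smith", "street": "Chruch Road"})  # deliberate typo
print(captured["quality_flag"])  # -> "exception"
```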

Automation should be supported by a data governance approach that incorporates a clear understanding of how data will be used and communicates this to data capture staff. This approach gives staff the flexibility to handle exceptions with an understanding of the implications, and not be bound purely by habit or the constraints of systems.

Data quality metrics should be used to ensure that exceptions fall within acceptable limits; they can also provide a more useful measure of staff performance than pure throughput.
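
As a sketch of what such a metric might look like, the snippet below computes an exception rate per operator from the quality_flag produced by the validation sketch above; the record layout and the 5% threshold are illustrative assumptions, not standards:

```python
from collections import Counter

# A sketch of an exception-rate metric per operator. The quality_flag
# values follow the validation sketch above; the 5% threshold is an
# illustrative assumption, not a standard.
ACCEPTABLE_EXCEPTION_RATE = 0.05

def exception_rates(records: list[dict]) -> dict[str, float]:
    totals, exceptions = Counter(), Counter()
    for r in records:
        totals[r["operator"]] += 1
        if r["quality_flag"] != "ok":
            exceptions[r["operator"]] += 1
    return {op: exceptions[op] / totals[op] for op in totals}

records = [
    {"operator": "alice", "quality_flag": "ok"},
    {"operator": "alice", "quality_flag": "exception"},
    {"operator": "bob",   "quality_flag": "ok"},
]

for op, rate in exception_rates(records).items():
    status = "within limits" if rate <= ACCEPTABLE_EXCEPTION_RATE else "review"
    print(f"{op}: {rate:.0%} exceptions ({status})")
```

Measured this way, an operator who slows down to handle a genuine exception properly is rewarded rather than penalised.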

Ultimately, the best meals come from a combination of good ingredients, a good recipe and a little ingenuity from the cook. The best customer service, the most efficient operations and the best decisions come from a similar combination – good data quality, good systems and motivated, involved staff.
