A series of posts by Henrik Liliendahl Sørensen, most recently this one, has questioned the common definition of quality data as data “that is fit for purpose.”
Henrik’s concern is that this definition may allow data that is of acceptable quality for one application but cannot easily be adapted to meet additional purposes at a later date. At some point, he suggests, it becomes more cost effective to simply model the real-world object rather than constantly adapt the data to address new uses.
“Fitness for purpose” is a legal term prescribing that something is good enough to do the job it is intended to do. The principle of “good enough” is critical to our goal of managing data by value – it is not cost effective to over-engineer a solution.
If we are building a house, it is reasonable to agree that it should be fit for purpose: have a bathroom, a kitchen and bedrooms, be watertight, be heated, have running water, and so on. Depending on family size, budget and specification, the final building will vary from person to person.
But in very few cases would the modern family home include a manufacturing centre or a surgery. It would not be pragmatic to plan for every possible use of a building, nor would it be cost effective.
Similarly, when working with data we need to balance what is cost effective against what is necessary.
Where I do agree with Henrik is that poorly thought-out tactical solutions may not scale to meet enterprise needs. At some point, as he suggests, a break-even point may be reached where an enterprise view of data requires a redesign of tactical data solutions – this does not conflict with the “fit for purpose” definition.
A value driven approach to data management seeks to leverage multiple uses of data across the enterprise to minimize costs and maximize reuse. Data governance principles can be used to plan projects to meet tactical goals cost effectively without compromising on the ability to address additional goals later.
But ultimately, the quality of data is in the eye of the beholder. There is no global benchmark to measure against – if it’s good enough for your requirement, then it is of good quality!
This post was first published on the dataqualitymatters blog!