The most dangerous disconnects in IT projects are those we don’t realise we have. They are caused by the assumption that, because we use the same terms, we mean the same thing.
In many cases, the terms we use are, at best, loosely understood and, at worst, have very different meanings for different business users. For example, at one of our clients the term Recourse Business Unit is used by almost the entire business to identify the business unit that performed KYC (know your customer) checks on the client. For the credit risk team, however, the same term describes the business unit that carries the credit risk – in many cases a different BU.
This becomes a real problem when delivering enterprise-wide projects, such as master data management or ERP, or when trying to reuse tactical solutions in other areas to reduce overall IT spend.
This problem is the driver for the various metadata initiatives being kicked off in some organisations. As discussed in my earlier post, I don’t like the word metadata. What is metadata anyway? The term itself is ambiguous and can lead to its own levels of confusion.
Some examples of critical business metadata may include:
A business terms dictionary – what do we mean when we talk about Contracts, or Profit? Is there a standard enterprise understanding or calculation, or do we need to cater for departmental uses? These are key questions that should be addressed by the data governance team (if you have one) or by the project team if you don’t.
A data dictionary of key attributes holding information such as their use, their lineage and their owner, as well as technical data quality rules such as whether they are required or should be unique. Of particular interest, in my opinion, is the source of the data. Do all users recognise the same source, or do you have multiple versions of the truth creating inconsistency in key reports or metrics?
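As a rough illustration, such a dictionary entry can be recorded as structured data rather than a static document, so the technical rules it describes can actually be checked. The attribute name, owner and field names below are hypothetical examples, not a prescribed schema.

```python
# Illustrative data dictionary entry; all names here are assumptions for the sketch.
data_dictionary = {
    "customer_id": {
        "definition": "Unique identifier assigned when a customer is onboarded",
        "owner": "Customer Operations",
        "source_system": "CRM",  # the agreed single source of this attribute
        "required": True,        # technical data quality rule
        "unique": True,          # technical data quality rule
    },
}

def check_attribute(name: str, values: list) -> dict:
    """Apply the technical rules recorded in the dictionary to actual values."""
    entry = data_dictionary[name]
    results = {}
    if entry["required"]:
        # Every value must be populated.
        results["required"] = all(v not in (None, "") for v in values)
    if entry["unique"]:
        # No duplicates among the populated values.
        non_null = [v for v in values if v not in (None, "")]
        results["unique"] = len(non_null) == len(set(non_null))
    return results

print(check_attribute("customer_id", ["C1", "C2", "C2", None]))
# {'required': False, 'unique': False}
```

Keeping the rules next to the definition means the documentation and the checks cannot drift apart.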
A set of data governance or data quality rules such as “A supplier record must include valid banking details to facilitate payment.” These are needed to understand whether data is fit for purpose, and they are also key indicators of broken business processes and inputs for root cause analysis.
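The supplier rule above can be expressed directly as a check. This is a minimal sketch: the record fields (`bank_account`, `bank_sort_code`) and the six-digit sort code format are assumptions for illustration, not a real validation standard.

```python
# Sketch of the stated rule: "A supplier record must include valid
# banking details to facilitate payment." Field names are assumed.

def has_valid_banking_details(supplier: dict) -> bool:
    account = supplier.get("bank_account")
    sort_code = supplier.get("bank_sort_code")
    # "Valid" here is deliberately simple: both fields populated,
    # sort code six characters long.
    return bool(account) and bool(sort_code) and len(str(sort_code)) == 6

suppliers = [
    {"supplier_id": "S001", "bank_account": "12345678", "bank_sort_code": "123456"},
    {"supplier_id": "S002", "bank_account": "", "bank_sort_code": None},
]

# Records that fail the rule point at a broken capture process upstream.
failures = [s["supplier_id"] for s in suppliers if not has_valid_banking_details(s)]
print(failures)  # ['S002']
```

A failing record is not just bad data; it is a payment that cannot be made, which is exactly the fitness-for-purpose question the rule exists to answer.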
Technical metadata may include objects such as data flows and data models.
Organisations should focus on key business processes to identify which data elements require attention and documentation. Any organisation will be overwhelmed by the volume and complexity if it tries to address everything at once. Focus on the key processes and you will limit the problem to a small number of key indicators and a manageable number of data attributes and systems.
Finally, be cautious of significant investments in passive metadata – documentation that cannot be reused and will quickly get out of date. Where possible, metadata should be used actively. For example, data quality rules implemented in a data quality platform can provide dashboards and trending on the fitness for purpose of real data, can be adapted as requirements change, and can be acted upon to improve data quality.
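To make the passive/active distinction concrete, here is a small sketch of the active side: rules stored as data, executed against real records, producing a pass rate per rule that a dashboard could trend over time. The rule names and record structure are illustrative assumptions, not a particular platform’s API.

```python
# Active metadata sketch: rules live as data and are executed, so the
# documentation of the rule and its measurement are the same thing.
# Rule names and record fields are assumptions for this example.

rules = {
    "banking_details_present": lambda r: bool(r.get("bank_account")),
    "supplier_name_present": lambda r: bool(r.get("name")),
}

records = [
    {"name": "Acme Ltd", "bank_account": "12345678"},
    {"name": "", "bank_account": "87654321"},
    {"name": "Widget Co", "bank_account": None},
]

# Pass rate per rule -- the figure a dashboard would plot over time.
scores = {
    rule: sum(check(r) for r in records) / len(records)
    for rule, check in rules.items()
}
print(scores)  # fraction of records passing each rule
```

Because the rules are executable, adding a new rule or rerunning against tomorrow’s data costs nothing, which is precisely what a static document cannot do.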
So what is brown and sticky? Well I don’t know what you were thinking but the answer is a stick.
This post was originally published on the dataqualitymatters blog