One thing that is constant in IT is change. Companies continuously adapt or replace existing IT systems and applications to meet the changing requirements of the business.
Yet one thing that remains constant is the underlying data, which typically moves from application to application largely unchanged.
Two key issues that arise from this are:
1.) Persistence of “stale” data, which must be filtered out before the new system can be used effectively.
2.) Non-compliant data. For example, the new system applies a validation routine to newly captured data, but the old data (brought across as part of the migration from the previous system) does not comply with the new validation rule and therefore does not support the underlying business process.
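The second issue can be made concrete with a small sketch. The rule below is hypothetical (a phone-number format check invented for illustration), but the pattern is common: the new system validates data at capture time, while migrated records were never subjected to that rule and silently fail it.

```python
import re

# Hypothetical validation rule introduced by the new system:
# phone numbers must be in international format, e.g. "+441234567890".
PHONE_RULE = re.compile(r"^\+\d{7,15}$")

def is_compliant(record):
    """Return True if a record passes the new system's validation rule."""
    return bool(PHONE_RULE.match(record.get("phone", "")))

# Records brought across from the legacy system were never validated
# against this rule, so some of them fail it.
migrated = [
    {"id": 1, "phone": "+441234567890"},  # compliant
    {"id": 2, "phone": "01234 567890"},   # legacy local format: non-compliant
    {"id": 3, "phone": ""},               # missing value: non-compliant
]

non_compliant = [r["id"] for r in migrated if not is_compliant(r)]
print(non_compliant)  # → [2, 3]
```

New records are rejected at the point of capture, but the migrated records sit in the database in exactly this non-compliant state, which is why cleansing has to be an explicit part of the migration rather than something the new system handles automatically.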
If data persists across the life of the architecture, why is so much money spent on replacing systems but so little on managing data? Data has historically been an afterthought; the business assumption has been that the systems put in place would automatically ensure valid data.
All the evidence shows that this is simply not true. In most organisations, the part of the company responsible for collecting, storing and extracting data is separate from the part responsible for using it. As a result, data requirements have historically not been well understood, and systems have not catered for them.
Lean data governance, coupled with enterprise data quality, can extend the life of applications and drive value. If data is the one persistent element of your IT infrastructure, surely it makes sense to manage it with a long-term view?