One thing that is rarely considered when creating a new application is how to store data. Sure, the team may define some fields and data structures (if they aren’t going to rely on Excel), but few think through exactly how the information will be captured in practice.
It’s amazing how fast new data accumulates. Servers are filled, modified, or copied without any sense of what may be required to get that information out in a usable way. Add a lack of consistent data entry, or inconsistent naming conventions, and you could find yourself with data that is nearly unusable.
Now try to consolidate information from many systems, each with its own quirks and inconsistencies, and the problem grows exponentially. People want to sort through these massive amounts of data to create an accurate master record for client or product data, or to generate reports for spend analysis or regulatory compliance.
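To make the consolidation problem concrete, here is a minimal sketch of matching client records captured inconsistently across systems. The normalization rules and the similarity threshold are illustrative assumptions, not a production matching engine:

```python
import difflib

def normalize(name: str) -> str:
    """Apply simple, consistent rules before comparing records."""
    name = name.lower().strip()
    for noise in (",", ".", " inc", " ltd", " llc"):  # assumed noise terms
        name = name.replace(noise, "")
    return " ".join(name.split())

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy-compare two normalized names; the threshold is a tunable assumption."""
    return difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

# The "same" client captured differently in two systems:
print(is_match("ACME, Inc.", "Acme Inc"))    # True - candidates for one master record
print(is_match("ACME, Inc.", "Zenith Ltd"))  # False - clearly different clients
```

Even a toy example like this shows why the work gets hard: every normalization rule is a business decision, and every threshold trades false matches against missed ones.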
But when you look at the data, nothing makes sense! Data management is often considered a technical task, because the mess created by years of neglect can be technically challenging to resolve.
Think about the importance of proper data capture standards from the beginning. As part of your data governance function, you should identify the critical data attributes that support key business processes and ensure that data is captured with appropriate levels of quality, possibly by investing in real-time data cleansing and matching capabilities.
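What capture-time quality checks might look like can be sketched in a few lines. The attribute names and rules below are hypothetical examples of "critical data attributes", chosen only to illustrate the idea of validating at the point of entry:

```python
import re

# Hypothetical critical attributes for a client record, each with a capture-time rule.
RULES = {
    "name": lambda v: bool(v and v.strip()),
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "country": lambda v: v in {"US", "GB", "ZA"},  # illustrative reference list
}

def validate(record: dict) -> list[str]:
    """Return the critical attributes that fail their capture rule."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

print(validate({"name": "Acme", "email": "info@acme.com", "country": "ZA"}))  # []
print(validate({"name": "", "email": "not-an-email", "country": "XX"}))
# ['name', 'email', 'country'] - rejected at capture, not discovered years later
```

Rejecting or flagging a bad record at entry is far cheaper than untangling it later during a consolidation or compliance exercise.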
Data standards may be the last thing on your mind when you are designing a system, but you’ll be kicking yourself in a few months when you have to go back and do what you should have done in the beginning. Set up some data standards from the start.
This post was originally published at the dataqualitymatters blog.