Poor data quality costs companies billions annually. It hampers accurate customer service, creates unnecessary costs in procurement and the supply chain, and is the cause of unreliable reports and analytics.
Poor data quality is also recognised as a significant contributor to IT project failure. Master Data Management (MDM) projects fail through a lack of data governance and poor data quality: most successful MDM projects show that at least 30% of the time and budget went to resolving data quality issues. ERP and CRM projects hit significant hurdles at the point of data take-on, frequently requiring extensive rework to address data issues and causing projects to run badly late and over budget.
The problem can be addressed, but it requires an enterprise approach combining methodology and automated tools. The recent Bloor Research whitepaper, The Importance of Data Quality (and why it shouldn't be just a tick box item), discusses why a proper investigation of data quality options is important: not all tools are created equal, and simply adopting a tool because it is bundled with your ETL or MDM solution is fraught with risk. I recommend that you perform an on-site proof of concept, with your own data, where you can evaluate just how easily the selected tools handle your unique business requirements. Over a day or two, stronger tools should show significant results.
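As a rough illustration of the kind of baseline measures a proof of concept might capture before and after running a candidate tool, here is a minimal profiling sketch. The record layout, field names, and sample data are hypothetical; a real evaluation would use your own data and far richer rules.

```python
# Minimal data-quality profiling sketch (illustrative only).
# Measures two common baseline metrics: field completeness and
# duplicate rate on a key field. Field names and records are made up.

def profile(records, key_field, required_fields):
    """Return completeness per required field and the duplicate rate."""
    total = len(records)
    completeness = {
        field: sum(1 for r in records if r.get(field) not in (None, "")) / total
        for field in required_fields
    }
    keys = [r.get(key_field) for r in records]
    duplicate_rate = (len(keys) - len(set(keys))) / total
    return {"completeness": completeness, "duplicate_rate": duplicate_rate}

# Hypothetical customer records with typical defects:
# a missing name, a missing postcode, and a duplicated key.
customers = [
    {"id": "C1", "name": "Acme Ltd", "postcode": "AB1 2CD"},
    {"id": "C2", "name": "", "postcode": "EF3 4GH"},
    {"id": "C1", "name": "Acme Limited", "postcode": None},
]

report = profile(customers, "id", ["name", "postcode"])
print(report)
```

Running the same measures on the tool's cleansed output gives a simple before-and-after comparison for each candidate during the evaluation.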
Your partner should also have a genuine focus on data governance and data quality. In many cases, data quality solutions are provided as add-ons by vendors focussing on ETL or other technical aspects. These vendors may not have developed the specialist methodologies and subject matter expertise required to manage data at a content level. Data quality demands a combined business and technical approach, which places it at a different level to pure data integration projects.