The 2013 Information Difference Data Quality survey of 210 respondents showed that very little has changed since 2009, when the survey was first run.
In fact, the results paint quite a grim picture!
Data Quality needs attention
This year, for the first time, the survey asked about Big Data. The result: 55% of respondents believe that data quality is “very relevant” for big data projects.
Yet, as discussed in my post Governance trumps Big Data, companies are still struggling to resolve small data issues.
In fact, the report shows that perceived quality levels have declined over the last four years, in spite of increased spend in this area.
Although the 2009 survey showed a slightly higher perception of overall data quality, the report also shows an increased use of metrics to measure data quality. It is therefore reasonable to conclude that the 2013 results reflect actual measurement rather than the guesswork implied in 2009. There is clearly work to be done.
- Data quality must be measured if it is to be realistically quantified and managed.
Another reason for this could be the increased commoditisation of data quality as both a technology and a service. As data quality has hit the mainstream, suddenly everybody is a “data quality expert.” Is it possible that some of these so-called experts cannot deliver the desired results?
- The report suggests that proven data quality experience is important when choosing a data quality vendor.
Yet, people still struggle to build the business case for data quality – this remains the single biggest obstacle to solving data quality problems.
In most cases, this is due to our difficulty in communicating the link between data issues and business problems, as discussed in my post “Improving address accuracy” is not a business driver!