Poor data quality is widely cited as a significant contributor to IT project failures – and it is also recognised to create inefficiencies and associated hidden costs across all business functions.
Yet, for most organisations, it is difficult to articulate how improvements to data quality have saved money and improved business process efficiency. For these reasons, data quality projects may struggle to compete for budget with the countless other projects, such as Data Warehousing, ERP and CRM, that IT must deliver.
In many cases these projects are delivered with no focus on data, leading to project failures or a failure to achieve the expected business returns.
A data quality dashboard can be defined as a visual report that defines, measures and trends the consistency and quality of data over time, against business-defined key performance indicators.
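To make the idea concrete, here is a minimal sketch of how such business-defined indicators might be scored. The sample customer table, the field names and the two KPIs (record completeness and email validity) are purely illustrative assumptions, not any particular product's metrics:

```python
import csv
import io
import re

# Hypothetical sample of a customer table (illustrative data only).
SAMPLE = """id,email,country
1,alice@example.com,AU
2,,AU
3,bob@example,NZ
4,carol@example.com,
"""

# A deliberately simple email pattern for demonstration purposes.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_metrics(rows):
    """Score two illustrative KPIs: record completeness and email validity."""
    total = len(rows)
    complete = sum(1 for r in rows if all(r.values()))
    valid_email = sum(1 for r in rows if EMAIL_RE.match(r["email"] or ""))
    return {
        "completeness_pct": round(100 * complete / total, 1),
        "email_validity_pct": round(100 * valid_email / total, 1),
    }

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
print(quality_metrics(rows))  # → {'completeness_pct': 50.0, 'email_validity_pct': 50.0}
```

In practice each KPI would be agreed with the business owners of the data, and the scores fed to the reporting platform on a schedule so they can be trended over time.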
So, who needs a data quality dashboard?
While banks and listed companies are required by legislation (e.g. Basel II, SOX) to measure data quality, other organisations can also benefit from a clear understanding of how data is impacting key metrics.
Any organisation undertaking a large IT implementation should have an understanding of the current data state, and manage the project risk by taking appropriate steps to standardise, clean and de-duplicate the existing data before migrating to the new application.
For example, in a Gartner report entitled “Nine Fatal Flaws in BI Implementations” data quality issues were specifically mentioned – people will not use BI applications that are founded on irrelevant, incomplete or questionable data. Gartner recommends that automated controls be established that identify and block low quality data from entering the BI platform.
A data quality dashboard, combined with a business-user-oriented data profiling tool to easily and accurately measure data quality, provides a low-risk solution to achieve this goal: it allows rapid deployment and supports an integrated business and IT approach to creating the critical metrics.
In addition to supporting operational data quality improvements, data quality plays a key role in creating trust in insights derived from data. Decision makers can use data quality scorecards to gauge the level of confidence that they should have in any report, machine learning model or data-driven recommendation.
Driving a return on your data quality dashboard
Like any report, the Return on Investment of a data quality dashboard can be linked to the insights that it delivers and the actions that are taken, against the cost of creating and maintaining the dashboard.
Ultimately, the dashboard should help us to correct underlying data quality issues that have real business costs, and measure the success of preventive measures taken.
One key recommendation is to track data quality baselines on the dashboard – measuring trends that show how data quality is improving, or declining, over time, and linking these to business goals and outcomes that have a quantifiable cost or saving.
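A baseline trend can be as simple as comparing periodic readings of a KPI. The weekly completeness figures below are invented for illustration; the point is that a declining trend becomes visible and reportable rather than anecdotal:

```python
from datetime import date

# Hypothetical weekly baseline readings of a completeness KPI (percent).
baseline = [
    (date(2024, 1, 1), 91.0),
    (date(2024, 1, 8), 92.5),
    (date(2024, 1, 15), 89.0),
    (date(2024, 1, 22), 87.5),
]

def trend(readings):
    """Return the change between the first and the latest reading."""
    first, latest = readings[0][1], readings[-1][1]
    return latest - first

delta = trend(baseline)
status = "improving" if delta > 0 else "declining"
print(f"Completeness has moved {delta:+.1f} points ({status}).")
```

Linking a movement like this to a quantified business cost (for example, the rework hours described below) is what turns the dashboard from a report into a business case.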
In many companies data quality efforts are absorbed into standard operational costs with no real understanding of the amount of wasted effort and rework required.
For example, how many of the standard daily, weekly or monthly reports require significant manual data cleansing effort to complete? Expensive IT staff may spend days monthly tweaking data, often in inconsistent ways, to provide some kind of meaningful result.
How often are problems discovered at the last minute, requiring reports to be recalculated and reconciled? Is manual reconciliation accurate, or even possible?
In many cases, data management issues such as these take up significant amounts of time for both IT and business staff, but are neither recognised nor measured.
Nor should one fall into the trap of reinventing the wheel.
Measuring data quality statistics and delivering dashboards can become complex if approached as a business intelligence problem.
Data issues may not easily be uncovered using query based approaches – as the analyst has to know what to look for.
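The contrast is easy to illustrate: a query answers a question the analyst already thought to ask, whereas a profile of value frequencies surfaces anomalies nobody was looking for. The country codes below are an invented example:

```python
from collections import Counter

# Hypothetical country-code values pulled from a customer table (illustrative).
values = ["AU", "AU", "NZ", "Australia", "AU", "", "nz", "AU"]

# Profiling every distinct value exposes casing issues, spelled-out names
# and blanks that a targeted query (e.g. WHERE country = 'AU') would miss.
profile = Counter(v.strip().upper() or "<blank>" for v in values)
for value, count in profile.most_common():
    print(f"{value:12} {count}")
```

Here the profile immediately reveals a mixed-case "nz", a spelled-out "Australia" and a blank entry – exactly the kind of inconsistency a query-based approach only finds if the analyst already suspects it.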
Data profiling tools such as Trillium Discovery help your business to rapidly uncover the hidden data quality insights in your data. By simplifying the data discovery process your staff can quickly define custom metrics and deliver these into the reporting platform of your choice.
The ad hoc reports driven by Discovery provide actionable insights for tactical improvements and, as a hidden benefit, the insights uncovered by Discovery can assist your business stakeholders to define the formal data quality metrics that you need to achieve your goals.
Ultimately, data quality metrics can help to baseline and drive your data governance program and ensure practical outcomes.
Contact us to set up a free discussion about your data quality program, and how we can help.