Is poor data quality like an oil spill?


Business leaders and technologists alike have latched on to information, and in particular Big Data, as the next big thing, with the phrase “Data is the new oil” popping up all over the place.

These comments are based on the simple fact that information volumes are growing exponentially, particularly in the realm of social media and the Internet. “Surely,” the pundits claim, “with all this insight out there it must be possible to drive massive commercial benefits – for marketing, for sales, for research and development.”

In practice, poor data quality already has a significant impact on businesses’ ability to function. A simple example: many corporations still function as groups of business units – each with its own systems, data stores and processes. Customer master and behavioral records can be split across business units, product systems or accounts.

This kind of siloed approach makes it difficult, if not impossible, to gain an accurate, enterprise-level view of customer spending habits or total value – or, in many cases, even a consistent view of standard information such as a telephone number or address, as clients may update one system with more recent details but neglect to update others.

Small wonder, then, that companies struggle to gain an accurate understanding of customer behavior, in spite of millions spent on business analytics and reporting tools. Adding large volumes of poor-quality information to this mess only exacerbates the problem.

In order to drive value from Big Data, companies must achieve two key data quality goals. Firstly, companies must identify relationships between client data held internally, in traditional data stores such as CRM or billing systems, and client data held in Big Data sources such as social media. For example, short-term insurance companies can use driving patterns drawn from vehicle tracking systems, or mobile telephones, to better assess the risk profiles of individual drivers – but only if the link between the driver and the vehicle is clearly understood.
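That linkage step amounts to joining an external feed to internal records on a shared, normalised key. A minimal sketch in Python, assuming hypothetical field names (`policy_id`, `reg_plate`, `avg_harsh_braking`) and a vehicle registration plate as the matching key:

```python
# Hypothetical sketch: linking internal policy records to an external
# telematics feed via a normalised registration-plate key.
# All field names and values here are illustrative assumptions.

def normalise_plate(plate: str) -> str:
    """Strip spaces/hyphens and upper-case a registration plate."""
    return "".join(ch for ch in plate if ch.isalnum()).upper()

internal_policies = [
    {"policy_id": "P-001", "reg_plate": "ABC 123 GP"},
    {"policy_id": "P-002", "reg_plate": "xyz-789-gp"},
]

telematics_feed = [
    {"reg_plate": "abc123gp", "avg_harsh_braking": 0.7},
    {"reg_plate": "UNKNOWN99", "avg_harsh_braking": 2.1},  # no matching policy
]

# Build an index on the normalised key, then join the external feed to it.
index = {normalise_plate(p["reg_plate"]): p["policy_id"] for p in internal_policies}

linked = [
    {"policy_id": index[key], **event}
    for event in telematics_feed
    if (key := normalise_plate(event["reg_plate"])) in index
]

print(linked)  # only events with a clear driver/vehicle link survive
```

Without the normalisation step, “ABC 123 GP” and “abc123gp” would never match – a small illustration of why data quality work precedes any Big Data insight.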

Secondly, companies must cut volumes down to manageable sizes by filtering poor-quality and irrelevant sources from their feeds before these enter the organization, or risk overwhelming management and storage costs. The same insurance company should not be interested in the driving behavior of an uninsured third party, for example.
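In practice this filtering is a gate at the boundary: records failing a quality check, or referring to parties outside the business’s interest, are dropped before storage. A minimal sketch, again with assumed field names and an assumed lookup of insured vehicles:

```python
# Hypothetical sketch: filtering a raw feed before it enters the organization.
# Records missing required fields, or referring to uninsured third parties,
# are rejected at the boundary rather than stored and managed.

REQUIRED_FIELDS = {"reg_plate", "timestamp", "speed_kmh"}
INSURED_PLATES = {"ABC123GP", "XYZ789GP"}  # assumed reference list

def is_ingestible(record: dict) -> bool:
    # Quality gate: every required field must be present and non-empty.
    if not all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS):
        return False
    # Relevance gate: uninsured third parties are of no interest.
    return record["reg_plate"] in INSURED_PLATES

raw_feed = [
    {"reg_plate": "ABC123GP", "timestamp": "2013-05-01T08:00", "speed_kmh": 92},
    {"reg_plate": "ABC123GP", "timestamp": "", "speed_kmh": 60},  # poor quality
    {"reg_plate": "THIRD001", "timestamp": "2013-05-01T08:05", "speed_kmh": 110},  # irrelevant
]

clean_feed = [r for r in raw_feed if is_ingestible(r)]
print(len(clean_feed))  # two of three records are filtered out
```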

Historically, companies have struggled to manage data quality, even without the complexities added by trendy new data sources. An enterprise Data Governance strategy should include a measurable enterprise Data Quality plan in order to minimize the impact of Big Data on analytics and operations.
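“Measurable” is the key word: a quality plan needs concrete metrics it can track over time. One common, simple metric is field completeness – the percentage of records with a value in each field. A sketch over an assumed customer table:

```python
# Hypothetical sketch of one measurable data quality metric:
# per-field completeness across a customer table, as a percentage.

customers = [
    {"name": "A. Smith", "phone": "555-0100", "address": None},
    {"name": "B. Jones", "phone": None,       "address": "1 Main Rd"},
    {"name": "C. Brown", "phone": "555-0102", "address": "2 High St"},
]

def completeness(rows: list[dict], field: str) -> float:
    """Percentage of rows where `field` is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return round(100 * filled / len(rows), 1)

for field in ("name", "phone", "address"):
    print(f"{field}: {completeness(customers, field)}% complete")
```

Tracked monthly, a metric like this turns “our customer data is bad” into a trend a governance program can act on.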

Big Data adds an additional challenge – companies must ensure that the goal of increased insight does not put them in breach of privacy legislation. Increasingly, global legislation is intended to protect the rights of individuals, and companies, from the unauthorized use of personal records for purposes for which they were not intended. Companies should ensure that their Data Governance programs address both the appropriate use of information and its quality.

Good quality enterprise data is an essential building block for successful analytics, whether for Big Data or otherwise.
