Is data quality giving you headaches?


The digital revolution has given rise to an explosion of data, as the volume of information shared, distributed and stored increases exponentially every day. An ever-growing list of regulatory compliance requirements adds to the burden, generating masses of additional data that need to be managed. This is further complicated by emerging trends such as big data, cloud data, change data capture, data integration, data analysis and so on. Add the fact that businesses must manage this data avalanche on an ever-tightening budget, and it is no wonder that data has become the source of so many business woes, causing the proverbial headache.

Compliance generates reams of data that need to be managed, adding complexity to an already overwhelming data environment. The reality is that compliance is a burning pain for organisations across all industries and sectors, with a long and continuously growing list of legislation and guidelines that includes Sarbanes-Oxley, King III, the Protection of Personal Information Bill (POPI), Basel II and Basel III, Solvency II, the Consumer Protection Act (CPA), the Financial Intelligence Centre Act (FICA), the Regulation of Interception of Communications Act (RICA), the Legal Entity Identifier (LEI), the Public Finance Management Act (PFMA) and, most recently, the Foreign Account Tax Compliance Act (FATCA).

A report recently issued by Aberdeen Research indicates that almost half of finance employees are “challenged by the fact that their organizations are leveraging risk and compliance data in different formats, making it difficult to compare data.” According to the report, complying with regulations is a key concern for CFOs, and a distressing number of respondents indicated that their existing IT infrastructure lacks the advanced capabilities needed to support governance, risk and compliance (GRC) initiatives.

This is far from the end of the problem, however. Gartner predicts further headaches, stating that “by 2016, 20% of CIOs in regulated industries will lose their jobs for failing to implement the discipline of information governance successfully”. Gartner also recommends that these regulated businesses invest in information archiving technology in order to bring data under control.

While Business Intelligence (BI) is touted as the answer to help businesses make sense of their data avalanche, the reality often falls far short of the promise. BI has become yet another headache for organisations, along with Master Data Management (MDM), Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Extract, Transform, Load (ETL) and a host of other tools. Despite the lauded ability of BI to improve decision making, the majority of BI implementations fail or never get off the ground.

One of the main reasons for this failure is that the underlying data is fragmented, duplicated, inaccurate, irrelevant or outdated. While BI has the potential to deliver fast, reliable information and ‘intelligence’, along with a single version of the truth, this entire house of cards balances on a single point of failure: the quality of the data.

Maintaining accurate data, which in turn facilitates compliance and paves the way for successful BI and MDM, is key to curing data headaches. However, this typically proves to be a major challenge, mainly due to inefficient processes or reliance on inappropriate technology. Different people may document the same information in different ways, and without processes and tools in place to ensure standardisation, the result is duplicated and inaccurate data.
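To make the standardisation and duplication problem concrete, here is a minimal sketch in Python of the kind of check a data quality process performs. The record fields (name, email, phone) and the normalisation and matching rules are illustrative assumptions only, not a description of any particular product or organisation's data.

```python
# Minimal sketch: standardise records and flag likely duplicates.
# Fields and rules below are illustrative assumptions; real matching
# logic depends entirely on the data at hand.
import re
from collections import defaultdict

def standardise(record):
    """Return a copy of the record with basic formatting normalised."""
    return {
        "name": " ".join(record.get("name", "").split()).title(),
        "email": record.get("email", "").strip().lower(),
        "phone": re.sub(r"\D", "", record.get("phone", "")),  # keep digits only
    }

def find_duplicates(records):
    """Group standardised records that share the same e-mail address."""
    groups = defaultdict(list)
    for rec in map(standardise, records):
        if rec["email"]:
            groups[rec["email"]].append(rec)
    return {email: recs for email, recs in groups.items() if len(recs) > 1}

if __name__ == "__main__":
    customers = [
        {"name": "jane  doe", "email": "Jane.Doe@example.com ", "phone": "+27 11 555-0100"},
        {"name": "Jane Doe", "email": "jane.doe@example.com", "phone": "0115550100"},
    ]
    # Both rows collapse to the same key after standardisation,
    # so they are reported as duplicates of one another.
    print(find_duplicates(customers))
```

Without even this simple level of standardisation, the two records above would sit in the database as two different customers, which is exactly how duplication and inaccuracy creep in.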

Fortunately, there is a remedy at hand: data governance, a combination of disciplines, improved processes and the right technology. Pragmatically applied, data governance helps businesses cut through the overload and identify and address the critical data issues that will drive the biggest returns, resulting in clean data that delivers accurate information. Quality data delivers insight based on fact rather than guesswork, addresses the myriad compliance regulations and is the key to successful BI, MDM, CRM and back-office implementations.

Solving the data quality problem will provide a painkiller for the majority of data headaches.

This post was originally published on the dataqualitymatters blog
