User demand for real-time insight is driving demand for real-time data quality

Discover the growing demand for real-time data quality and its impact on business intelligence. Explore how businesses are moving beyond traditional data integration methods and embracing real-time consolidation for better insights and decision-making.


A number of independent experts have commented recently on the need for an enterprise data quality approach that goes beyond trying to fix data as you load it into the data warehouse.

Streamline your data quality initiatives by understanding the key criteria for evaluating a data quality management tool, enabling effective data governance and decision-making processes.




An announcement by American pharmaceutical giant Pfizer Inc. that it is replacing traditional ETL with a data virtualisation approach is a response to the ongoing business need for rapid turnaround on data-critical applications. According to Pfizer BIS team lead Michael Linhares, traditional ETL development took too long and cost too much.


Demand for real-time consolidation of data

While not everybody is ready to give up their ETL environments, the demand for real-time consolidation of data goes beyond data virtualisation.

In her post Judgement Day for Data Quality, Forrester analyst Michele Goetz discusses other technologies, such as Hadoop processes and data appliances, that create and persist new data silos requiring management by BI professionals.

I agree with her that these new technologies place even stronger demands on data quality tools to “place a higher value on governance enablement and the ability to extend sophisticated and mature processing across the entire data management spectrum.”

Characteristics of Big Data Analytics

Of course, big data analytics, as it matures, is making increasing use of these technologies to provide insight on the fly.

Big data analytics is frequently characterised by a large number of small servers, each working in parallel to process a small amount of the total volume, which must then be brought together in order to provide meaningful insight.
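A minimal Python sketch of this pattern, purely illustrative (the worker count, record shape and the “region” field are assumptions, not details from the post): each small worker counts only its own slice of the data, and a consolidation step then merges the partial results into one meaningful total.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def partial_count(chunk):
    # Each small "server" processes only its share of the total volume.
    return Counter(record["region"] for record in chunk)

def consolidate(partials):
    # Bring the partial results together to produce a meaningful total.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

# Illustrative records; "region" is a hypothetical field.
records = [{"region": r} for r in ["EU", "US", "EU", "ZA", "US", "EU"]]
chunks = [records[i::3] for i in range(3)]  # split across three workers

with ThreadPoolExecutor(max_workers=3) as pool:
    partials = list(pool.map(partial_count, chunks))

totals = consolidate(partials)
print(totals)  # Counter({'EU': 3, 'US': 2, 'ZA': 1})
```

The point is the shape of the computation, not the scale: real platforms run the same map-then-merge step across hundreds of nodes, which is exactly where the consolidation stage becomes a data quality concern.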

This real-time consolidation of vast data sets requires real-time standardisation and matching of related records in order to derive meaning. Data validation at source must be extended to enable real-time validation of the new virtual data sources that are becoming the norm.
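To make the standardise-then-match idea concrete, here is a hedged sketch in Python; the field names, sources and cleaning rules are all illustrative assumptions, not a real product's logic. Records arriving from different silos are normalised to a common form, and records that standardise to the same key are grouped as matches.

```python
import re
from collections import defaultdict

def standardise(value):
    # Normalise a raw value so equivalent records compare equal.
    value = value.lower().strip()
    value = re.sub(r"[^\w\s]", "", value)  # drop punctuation
    value = re.sub(r"\s+", " ", value)     # collapse whitespace
    return value

def match_records(records):
    # Group records from different silos that standardise to the same key.
    groups = defaultdict(list)
    for rec in records:
        groups[standardise(rec["customer"])].append(rec)
    return groups

# Hypothetical records consolidated from different data silos.
records = [
    {"customer": "Pfizer Inc.", "source": "warehouse"},
    {"customer": "pfizer  inc", "source": "hadoop"},
    {"customer": "Acme Ltd", "source": "appliance"},
]
groups = match_records(records)
# The two Pfizer variants fall into one group; Acme stands alone.
```

Production matching engines use far richer techniques (phonetic keys, fuzzy scoring, survivorship rules), but the principle is the same: without standardisation applied at consolidation time, related records from separate silos never meet.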

As data integration shifts to real-time so must data quality initiatives. In other words, Pfizer may be leading a trend beyond batch-driven data integration and data cleansing.

Navigate the complexities of the data quality landscape by understanding key factors highlighted in Choosing a Data Quality tool, ensuring a strategic approach to tool selection.

Explore the capabilities of Precisely Data Integrity suite and its role in enhancing data quality with insights from the Precisely Data Integrity suite product sheet.

———–

Is a lack of trust inhibiting adoption of AI in South Africa? We explore the AI opportunity and challenges.

If you’re interested in learning more about why trustworthy analytics are so important, check out our post on Executives Want Analytics They Can Trust.



