The 1-10-100 rule

The 1-10-100 rule is a quality management concept developed by G. Labovitz and Y. Chang, used to quantify the hidden costs of poor quality.

When applying the concept to data quality, recognize that it is the principle, rather than the exact numbers, that carries over.

So how does it work?

The 1-10-100 rule refers to the hidden costs of waste associated with poor quality.

Remediation costs more than prevention

The principle suggests that the cost of fixing bad data is an order of magnitude greater than the cost of stopping it from entering the system in the first place.

These costs may be obvious: we may set up back-office teams responsible for validating and correcting errors created in the front office. In effect, we are paying to capture the same data twice.
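The escalation the rule describes can be sketched with some simple arithmetic. This is purely illustrative: the $1 / $10 / $100 unit costs are the rule's nominal ratios, not real figures, and the function names are invented for the example.

```python
# Illustrative sketch of the 1-10-100 cost escalation.
# Unit costs are the rule's nominal ratios, not real-world figures.
COST_PREVENTION = 1    # validate at the point of entry
COST_REMEDIATION = 10  # fix the error once it is in the system
COST_FAILURE = 100     # let the bad record cause a downstream failure

def total_cost(bad_records: int, strategy: str) -> int:
    """Total cost of handling `bad_records` under a given strategy."""
    unit_cost = {
        "prevent": COST_PREVENTION,
        "remediate": COST_REMEDIATION,
        "ignore": COST_FAILURE,
    }[strategy]
    return bad_records * unit_cost

# 1,000 bad records: prevention costs $1,000, remediation $10,000,
# and doing nothing $100,000.
for strategy in ("prevent", "remediate", "ignore"):
    print(strategy, total_cost(1_000, strategy))
```

Even if the real ratios in your organisation are 1-5-50 rather than 1-10-100, the shape of the curve is the same: each stage you defer the fix multiplies the cost.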

Failure costs more than remediation

Yet, the costs of remediation pale into insignificance when compared to the costs of leaving bad data in place.

Poor quality data impacts our ability to operate. If we invoice the incorrect amount then we don’t get paid.

If we deliver to the wrong address then we have to pay for another delivery.

If we provide the wrong risk assessment then we increase our chance of a bad debt.

Our focus should be on prevention

Far too many data quality initiatives are focused on remediation after the fact.

What is your company doing to stop bad data from entering your systems?
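One concrete form prevention can take is validating records at the point of entry, before they reach the system of record. The sketch below is a minimal, hypothetical example: the field names and rules are invented to illustrate the idea, not a standard.

```python
# Minimal sketch of validation at the point of entry: reject bad
# records before they enter the system. Field names and rules are
# hypothetical examples.
import re

def validate_customer(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is required")
    email = record.get("email", "")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("email is malformed")
    if not record.get("postcode", "").strip():
        errors.append("postcode is required")
    return errors

# The cheap "$1" check at entry that avoids the "$10" back-office fix
# or the "$100" failed delivery later.
record = {"name": "Ada Lovelace", "email": "ada@example.com", "postcode": "SW1A 1AA"}
print(validate_customer(record))  # an empty list: the record is accepted
```

Checks like these are cheap precisely because they run once, at capture time, while the person who knows the correct value is still on hand to fix it.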
