The Seven Habits of Highly Effective Data Quality

May 20, 2011
By William Sharp


I’ve been reading Stephen Covey’s The 7 Habits of Highly Effective People and I couldn’t help but notice the parallels between effective people and effective data management.  In the book, Covey argues that a set of principles, centered on self-discipline, leads to success and fulfillment.  Sounds great, right?

The seven habits include some cringe-inducing buzzwords, but let’s take a look at each one and its data quality doppelgänger.

Be Proactive

For years data quality has been a discipline striving to transform itself from reactive to proactive. In fact, the ROI of data quality programs centers on being more proactive: avoiding regulatory issues and costs and improving decision making.  It’s an understatement to say that data quality programs need to focus on taking the initiative and becoming proactive programs of change.

Proactive data quality means identifying and remediating data quality issues before they proliferate throughout the enterprise.  Simply put, proactive data quality is about having identification and remediation processes at data entry points and addressing issues at the source.
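As a minimal sketch of what a check at the data entry point might look like, consider validating a record before it is accepted, rather than cleansing it downstream.  The field names and rules here are hypothetical and deliberately simple:

```python
import re

def validate_customer(record):
    """Return a list of data quality issues found in the record."""
    issues = []
    if not record.get("name", "").strip():
        issues.append("missing name")
    # An illustrative US 5-digit ZIP pattern; real rules would be richer.
    if not re.fullmatch(r"\d{5}", record.get("zip", "")):
        issues.append("invalid zip")
    return issues

# Reject-at-source: a record with issues never proliferates.
record = {"name": "Jane Doe", "zip": "1234"}
print(validate_customer(record))  # the short ZIP is flagged here, at entry
```

The point isn’t the rules themselves but where they live: at the entry point, so bad data is caught once instead of remediated everywhere.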

Begin with the end in mind

Beginning with the end in mind brings a smile to my face.  This was practically the title of one of my first posts for this blog.  Without knowing where you need to end, your route to that end will almost inevitably be scattered and twisted.  For it is only by setting a clear destination that a clear path can be developed.  Often, in the world of data quality, setting a destination focuses on developing metrics and targets that will bring about positive change in the organization.

Put first things first

Putting first things first is about setting priorities and building a course of action that will address the prioritized list of objectives.  In other words, don’t focus on everything all at once; break large tasks down into smaller, more achievable parts.  This is especially useful when developing and implementing data quality programs because there are so many moving parts that need to be put in place simultaneously.

Think Win-Win

Win-wins in the data management / data quality arena are all about implementing rules that help multiple business units improve their data and its use.  There are some easy domains where one data quality service equates to a win-win.

Address validation is a prime example of the win-win scenario.  Every business unit benefits from more accurate customer addresses.  Address validation processes can be orchestrated in such a way that they accept different address sources and apply the same validation routines.  Not only is this a win-win, it is also cost-effective and generates a high rate of return on investment.
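As a sketch, one shared validation routine can serve multiple business units once each source’s records are mapped onto a common schema.  The field names and mappings below are invented for illustration:

```python
REQUIRED_FIELDS = ("street", "city", "state", "zip")

def normalize(address, field_map):
    """Map a source-specific record onto the shared address schema."""
    return {target: address.get(source, "").strip()
            for target, source in field_map.items()}

def validate_address(address):
    """Return True when all required fields are present and non-empty."""
    return all(address.get(f) for f in REQUIRED_FIELDS)

# Two business units, two record layouts, one validation routine.
crm_map = {"street": "addr1", "city": "city", "state": "st", "zip": "postal"}
billing_map = {"street": "street", "city": "city", "state": "state", "zip": "zip"}

crm_record = {"addr1": "10 Main St", "city": "Springfield", "st": "IL", "postal": "62701"}
print(validate_address(normalize(crm_record, crm_map)))  # prints True
```

Each unit pays only for its mapping; the validation logic is written, tested, and improved once.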

Seek First to Understand, Then to Be Understood

This one is pretty straightforward.  Data quality / data management is all about solving problems and building effective change.  You can’t be effective at solving a problem without first knowing what it is.  A more subtle point I’d like to make here is that all too often there is a tendency in the technology field to explain the intricacies of the solution.  Frankly, business people don’t care how you solve the issue, just that you solve it accurately.  Only understanding the issue ensures that you can do this.

Synergize

Cringe!  Worst buzzword ever?  Maybe.  In essence, synergy means bringing together a whole that is greater than the sum of its parts.  As described in the win-win section, synergies in data management / data quality are largely derived from building a solution that works for multiple business units in such a way that it produces a benefit greater than if the solution had been built for only one unit.

That said, building a solution that “chains” several beneficial processes together like address validation and duplicate reduction can also be thought of as a way of bringing together a whole greater than the sum of its parts.
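The chaining idea can be sketched as a simple pipeline of quality steps applied in order.  The two steps here are deliberately trivial stand-ins for real routines like address validation and duplicate reduction:

```python
def pipeline(records, steps):
    """Run records through a chain of data quality steps, in order."""
    for step in steps:
        records = step(records)
    return records

def trim_whitespace(records):
    """Stand-in for a cleansing step such as address standardization."""
    return [{k: v.strip() for k, v in r.items()} for r in records]

def drop_exact_duplicates(records):
    """Stand-in for duplicate reduction; keeps the first occurrence."""
    seen, unique = set(), []
    for r in records:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

rows = [{"name": " Ann "}, {"name": "Ann"}]
print(pipeline(rows, [trim_whitespace, drop_exact_duplicates]))
```

Note that the order matters: cleansing before deduplication lets the two records above be recognized as the same, which is exactly the "greater than the sum of its parts" effect.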

Sharpen the Saw

My personal favorite!  Sharpening the saw has to do with the continuous process of developing skills.  In part due to the wide range of data quality modules, there is always a need to sharpen the saw.  For example, I am currently working on producing more accurate matching techniques so I can be sure that I identify true duplicates while minimizing false positives.  In addition, I am always searching for more knowledge on address validation techniques.
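For illustration only: a similarity threshold is one common way to trade false positives against missed duplicates.  This sketch uses Python’s standard-library difflib rather than any particular matching product, and the 0.9 threshold is an arbitrary choice; raising it reduces false positives at the cost of missing some true duplicates:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_duplicates(names, threshold=0.9):
    """Return pairs of names whose similarity meets the threshold."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if similarity(names[i], names[j]) >= threshold:
                pairs.append((names[i], names[j]))
    return pairs

print(likely_duplicates(["William Sharp", "Wiliam Sharp", "Stephen Covey"]))
```

Tuning that threshold against known-good data is itself a saw-sharpening exercise.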

Sharpening the saw with regard to data quality processes is a way to revisit the existing solution and make it better.  This is an essential practice due to the growing number and varied nature of data sources continuously added to the enterprise landscape.


Effective people, projects, and strategies alike can learn a lot from Covey’s 7 habits.  I encourage those of you reading this post to try to implement these habits not only in yourself but also in your projects!

Thanks for taking the time to visit the weblog!

William Sharp

