Big Data … Little Data Quality

June 6, 2011
By William Sharp
[Image: "Big Data: water wordscape" by Marius B via Flickr]

Does Big Data Mean Better Data Quality?

Big Data is everywhere. Chances are you’ve used a big data solution today. But are big data solutions delivering big data quality?

High Availability versus High Data Quality

Typically, Big Data solutions are designed to ensure high availability. High availability rests on the idea that capturing and storing every data transaction matters more than verifying the uniqueness or accuracy of each transaction. Twitter and Facebook are two familiar examples of big data / high availability solutions.

To be clear: it is possible to configure a big data solution to validate uniqueness and accuracy. Doing so, however, means sacrificing some aspects of high availability. In that regard, big data and data quality are at odds.
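
As a minimal sketch of that trade-off (all names here are hypothetical; this is not any particular product’s API), consider a write path that checks every replica for uniqueness before accepting a record. A single unreachable node is enough to make the write fail:

    # Hypothetical sketch: enforcing uniqueness at write time costs availability.
    class Replica:
        def __init__(self, name, up=True):
            self.name, self.up, self.store = name, up, {}

        def has_key(self, key):
            if not self.up:
                raise ConnectionError(self.name + " is unreachable")
            return key in self.store

    def write_unique(replicas, key, value):
        # The uniqueness check needs an answer from *every* replica, so
        # consistency is guaranteed, but a single outage blocks the write.
        if any(r.has_key(key) for r in replicas):   # raises if a node is down
            return False                            # duplicate: rejected
        for r in replicas:
            r.store[key] = value
        return True

    replicas = [Replica("a"), Replica("b"), Replica("c", up=False)]
    try:
        write_unique(replicas, "user:42", "x@example.com")
    except ConnectionError as err:
        print("write unavailable:", err)   # accuracy bought at availability's expense

Drop the uniqueness check and the write always succeeds; that availability-first posture is the default in most big data stores.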

This is because one of the fundamental properties of high availability is that transactions are written to whichever node happens to be available. In this model, consistency of transactional data is sacrificed in the name of data capture. Most often, consistency is enforced eventually, on data reads rather than on data writes.
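
One common read-side mechanism for this is often called read repair: replicas are allowed to diverge at write time, and the copies are reconciled the next time the record is read. A minimal sketch, with plain dictionaries standing in for nodes (all names are illustrative):

    import time

    replicas = [{}, {}, {}]   # each dict stands in for one node's local store

    def write_any(key, value, node):
        # Accept the write on whichever node is reachable: availability first,
        # at the price of the replicas diverging until the next read.
        replicas[node][key] = (time.time(), value)

    def read_repaired(key):
        versions = [r[key] for r in replicas if key in r]
        latest = max(versions)            # (timestamp, value): newest wins
        for r in replicas:                # copy the winner back to stale nodes
            r[key] = latest
        return latest[1]

    write_any("user:42", "v1", node=0)    # nodes 1 and 2 never see this write
    write_any("user:42", "v2", node=1)    # a divergent copy appears on node 1
    print(read_repaired("user:42"))       # "v2"; all three nodes now agree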

In other words, at any given point in time a big data dataset may not be internally consistent. Even more troubling is the fact that most transactional conflicts are resolved by timestamp, which is to say that the most recently updated transaction is treated as the most accurate. That approach, obviously, is an issue that requires further examination.
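
To see why “latest timestamp wins” deserves scrutiny, consider what happens when one node’s clock runs a few seconds fast. A contrived sketch with fabricated timestamps:

    # Last-write-wins conflict resolution: the highest timestamp is simply
    # assumed to be the most accurate version of the record.
    def resolve(versions):
        return max(versions, key=lambda v: v[0])

    # Two concurrent updates to the same record from different nodes.
    # Node B's clock is 5 seconds fast, so its *earlier* edit looks newer.
    version_a = (1000.0, {"balance": 150})   # actually the second, correct update
    version_b = (1003.0, {"balance": 100})   # actually the first update, skewed clock

    print(resolve([version_a, version_b])[1])   # {'balance': 100} -- stale data wins

The stale write wins purely because its clock was ahead; nothing about the data itself made it more accurate.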

Room for improvement

As we examine big data solutions and learn more about implementing them, it is important to design more robust conflict resolution approaches so that big data also delivers big data quality.

More on that to come …



Thanks for taking the time to visit the weblog!

William Sharp

sharp@thedataqualitychronicle.org


3 Responses to Big Data … Little Data Quality

  1. John Owens on June 10, 2011 at 12:14 am

    Great post, William

    The gung-ho attitudes of “Just give us the data!” and “Big Data is Good Data” are major obstacles to data quality.

    What many people and enterprises fail to appreciate is that data of itself has no intrinsic value. It is only valuable when it conveys information, and then only if that information is significant to the enterprise, i.e. if it supports the Business Functions.

    So, rapidly collecting large amounts of data might be – and all too often is – a means of overwhelming an enterprise with garbage.

    Remember the old days of programming, when the warning was GIGO: Garbage In, Garbage Out? That is still true today.

    It is this failure to follow the fundamentals of business and data quality that is in danger of turning the Data Quality business into a Data Garbage business; in some quarters this has already happened.

    Regards
    John

  2. William Sharp on June 10, 2011 at 8:28 am

    John,
    I agree. As we move forward into the age of big data solutions, there also needs to be a message that collecting more means analyzing more: analyzing in the sense of verifying that the data is accurate and consistent.

  3. SnapLogic (@SnapLogic) on July 8, 2011 at 11:00 am

    Big Data … Little Data Quality – http://goo.gl/PBUfY via @dqchronicle
