I spoke to a client recently who had this question: “I need help developing a survey to ask employees around the company how they regard the various subjective metrics tied to data quality.”
The question got more curious. Apparently, the executive leadership, including the COO, was asking for definitions of, and progress on, subjective measures of data quality. This was curious because business executives usually ask about impact; it is lower down the organization that you get into concrete measures of data quality, and maybe even esoteric arguments about subjective measures.
I told the client that I don’t cover data quality in detail, though I know who does, and that I was not sure we had published any survey tools for such a task. But I offered a thought: Why bother with all that work?
I went on: What if the subjective measures all get scored or rated at 100%? What does the COO learn from this? The client responded that this was indeed a really good question.
I followed up: The data quality metrics under discussion seemed devoid of reference to anything other than a need to focus on data quality and a desire to get more use out of some recently purchased technology. I noted: If these metrics were all at 100% and the business outcome still failed, what then? What was the point of the data quality work?
I explained to the client:
- Stop this bottom-up, data-focused work. Starting with, focusing on, and ending with data in mind is a dead end. You might die on the vine trying to define and justify so many data quality metrics. And if there is a change in leadership or business performance, the whole project will get shut down, since no one could ever work out the value it added.
- Start with an outcome-based data governance program that:
  - Embeds the work of governing data (including the needed DQ policies) in prioritized business outcomes, challenges, and opportunities
  - Identifies, from those outcomes, the least amount of data that drives the most important ones
  - Targets the least amount of data quality work needed for that small set of data
  - Iterates (to learn more about which data matters most, and least)
The client agreed this was the “more mature” approach they wanted to take. “More mature?” I queried. This is not about being the most mature. This is about getting started with a winning approach. The other approach rarely ever works, and when it does, unlikely conditions have to persist for unnatural lengths of time. You will never get to 100% data quality, and nor do you need it. If the mission of the organization and its associated outcomes do not drive data quality or data governance, the likelihood of failure is beyond significant.