by Doug Laney | August 2, 2017
In 1999, NASA's $125 million Mars Climate Orbiter was notoriously lost because the navigation team at the Jet Propulsion Laboratory calculated in metric units while their colleagues at Lockheed Martin Astronautics supplied thruster data in imperial units (pound-force seconds rather than the expected newton-seconds).
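As a minimal sketch of that failure mode (not JPL's actual interface; the function and unit labels here are hypothetical), the danger is a raw number whose unit tag is implicit. Carrying the unit explicitly, and refusing to guess when it is missing, turns a silent factor-of-4.45 error into a loud one:

```python
# A pound-force second is ~4.45 newton-seconds, so the same raw number
# means very different things depending on which unit it was recorded in.
LBF_S_TO_N_S = 4.448222  # 1 lbf*s = 4.448222 N*s

def impulse_in_newton_seconds(value: float, unit: str) -> float:
    """Convert a thruster impulse reading to newton-seconds,
    raising rather than guessing when the unit label is unknown."""
    conversions = {"N*s": 1.0, "lbf*s": LBF_S_TO_N_S}
    if unit not in conversions:
        raise ValueError(f"unknown impulse unit: {unit!r}")
    return value * conversions[unit]

# The same reading of 10.0 differs by a factor of ~4.45:
print(impulse_in_newton_seconds(10.0, "lbf*s"))  # 44.48222
print(impulse_in_newton_seconds(10.0, "N*s"))    # 10.0
```

The point is not the conversion constant but the contract: data without its unit metadata is not yet data.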
Amazingly, nearly two decades later, companies still suffer similar failures: perhaps not as costly, but still costing far more than a simple fix would. For example, how much are errant product search results costing Amazon and other retailers in unnecessary returns, lost business, damaged reputation and eroded margins? Undoubtedly far more than it would cost to pay some teenager to code a solution one evening after school. Or, as my colleague Jamie Popkin suggested, "Smart retailers might just pass along the cost of returns by triggering reduced shelf space or promotions." Either way, "follow the data" has become a way to "follow the money."
Certainly product information brokers like Indix and natural language search capabilities from companies like EasyAsk can help. But at the core is data quality, plain and simple. Data quality consultant Martin Spratt has assessed that 23 percent of the workforce at a major Australian bank cannot perform its primary job adequately because of bad data, and that 13 percent of the workforce at an energy retailer and 24 percent at a health insurance provider suffer the same problem. Some business people spend up to 30 hours per week dealing with data quality issues, in some cases leading to hundreds of millions of dollars of previously unmeasured "commercial damage," Spratt told me. The larger concern is that IT people don't want to hear it because it makes them look bad, and besides, they're not compensated on good data the way others around the company are compensated on the quality of the assets they produce or manage. Spratt even demonstrated that he can predict employee attrition from a company's level of data quality. Particularly in the realm of Big Data, these issues and their economic impact are greatly amplified.
Indeed, bad product information is just one of several types of data quality issues. Gartner’s data quality assessment toolkit shows how to measure a dozen different data quality dimensions. And more specific to product data, my colleague Simon James Walker has laid out a best practice for the publication and synchronization of product data and its critical importance for digital business success.
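To make "measuring data quality" concrete, here is a toy illustration of two commonly cited dimensions, completeness and validity, scored over a handful of product records. The field names, records and rules are hypothetical, not drawn from Gartner's toolkit:

```python
# Hypothetical product records with typical defects: a blank name,
# a zero price, and a missing weight.
products = [
    {"sku": "A100", "name": "Kettle",  "price": 29.99, "weight_kg": 1.2},
    {"sku": "A101", "name": "",        "price": 0.0,   "weight_kg": None},
    {"sku": "A102", "name": "Toaster", "price": 45.50, "weight_kg": 2.0},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, rule):
    """Share of records whose field value passes a business rule."""
    return sum(1 for r in records if rule(r.get(field))) / len(records)

print(completeness(products, "name"))       # 2 of 3 names filled
print(validity(products, "price",
               lambda p: p is not None and p > 0))  # 2 of 3 prices valid
```

Even a crude scorecard like this, run nightly against a product catalog, is enough to trend quality over time and attach a number to the "commercial damage" conversation.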
Bottom line: If you want to land planetary rovers or land customers, you need to pay better attention to data governance, measuring data quality, master data management and synchronizing product data. In short: managing information as an actual company asset.
My book, Infonomics: How to Monetize, Manage and Measure Information for Competitive Advantage, is now available on Amazon (publication: September 2017).
Comments or opinions expressed on this blog are those of the individual contributors only, and do not necessarily represent the views of Gartner, Inc. or its management. Readers may copy and redistribute blog postings on other blogs, or otherwise for private, non-commercial or journalistic purposes, with attribution to Gartner. This content may not be used for any other purposes in any other formats or media. The content on this blog is provided on an "as-is" basis. Gartner shall not be liable for any damages whatsoever arising out of the content or use of this blog.