The blame for poor-quality data is too often laid at IT's door, when it should be the business taking responsibility, analysts have said.
Data quality becomes a problem because people in enterprises treat information not as an asset but as a by-product, according to Gartner research vice-president Ted Friedman. Duplicate versions of the same data, especially from legacy systems, create confusion and inaccuracies.
Where a consistent data model operates across the whole enterprise, Friedman said, there will be no problems with data quality. But "the environment doesn't sit still" and, when external applications such as SAP are running on enterprise systems, the business has no chance of enforcing such standards.
When data is not complete, correct and consistent, the business often puts the blame on IT, Friedman said.
"It's natural for organisations to think of data as being in the IT area," he said, adding: "Any data-quality problem is going to have to involve technology somewhere."
However, IT departments can't manage data quality by themselves, Friedman said: "IT is not capable to know what's good enough."
The business, by contrast, can determine what standard of data is required. "Business needs to be in the driver's seat," Friedman said. "At the moment, we feel that the focus on the topic is way, way too much in the IT camp."
To advance data quality, Friedman suggested the use of a data steward, who is responsible for benchmarking current levels of data quality and measuring the impact of bad data on the business. The data steward would look at the data-transfer processes, making sure, for instance, that the data passes through as few people as possible.
Data stewards would come from a business background but have good relations with IT, Friedman said. They would only be effective if held accountable for their progress and given bonuses for meeting quality targets, he added.
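The benchmarking role Friedman describes could, in practice, start with a couple of simple metrics. The sketch below is purely illustrative, not anything Gartner prescribes: it measures field completeness and the duplicate rate over a small set of hypothetical customer records (the record layout and field names are assumptions for the example).

```python
# Hypothetical sketch: two basic data-quality benchmarks a data
# steward might track -- field completeness and duplicate rate.
# Record layout and field names are illustrative assumptions.

def completeness(records, fields):
    """Fraction of required fields that are filled in, across all records."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f))
    return filled / total if total else 1.0

def duplicate_rate(records, key_fields):
    """Fraction of records whose key fields repeat an earlier record."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
    return dupes / len(records) if records else 0.0

customers = [
    {"id": 1, "name": "Acme Ltd", "email": "info@acme.example"},
    {"id": 2, "name": "Acme Ltd", "email": "info@acme.example"},  # legacy duplicate
    {"id": 3, "name": "Beta GmbH", "email": ""},                  # incomplete record
]

print(completeness(customers, ["name", "email"]))    # 5 of 6 fields filled
print(duplicate_rate(customers, ["name", "email"]))  # 1 of 3 records repeats
```

Tracked over time, numbers like these give the steward a baseline against which quality targets, and the bonuses tied to them, can be measured.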