The blame for poor quality data is too often laid at IT's door, when it should be the business taking responsibility, say analysts.
Data quality becomes a problem because people in enterprises treat information not as an asset but as a by-product, according to Gartner research VP Ted Friedman. Duplicate versions of the same data -- especially from legacy systems -- create confusion and inaccuracies.
Where a consistent data model operates across the whole enterprise, Friedman said, there will be no problems with data quality. But "the environment doesn't sit still", and once external applications such as SAP are running on enterprise systems, the business has little chance of enforcing such standards.
When data is not complete, correct and consistent, business often puts the blame on IT, Friedman said.
"It's natural for organisations to think of data as being in the IT area," he said, adding: "Any data quality problem is going to have to involve technology somewhere."
However, IT departments can't manage data quality by themselves, he said: "IT is not capable to know what's good enough."
The business, by contrast, can determine what standard of data is required. "Business needs to be in the driver's seat," Friedman said. "At the moment we feel that the focus on the topic is way way too much in the IT camp."
To advance data quality, Friedman suggests the use of a data steward, who is responsible for benchmarking current levels of data quality and measuring the impact on the business of bad data. The data steward looks at the data transfer processes, making sure, for instance, that the data passes through as few people as possible.
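As an illustration only -- none of this comes from Gartner or Friedman -- the kind of benchmarking a data steward might start with can be sketched in a few lines. The record layout, field names and metrics below are purely hypothetical; the point is simply that "complete" and "consistent" can be measured and tracked over time:

```python
# Hypothetical sketch of two baseline data quality metrics a data
# steward might track: completeness (how many required fields are
# filled) and duplication (how often key fields repeat).
# All field names and sample data are illustrative assumptions.

def completeness(records, required_fields):
    """Fraction of required fields that are non-empty across all records."""
    total = len(records) * len(required_fields)
    filled = sum(1 for r in records for f in required_fields if r.get(f))
    return filled / total if total else 1.0

def duplicate_rate(records, key_fields):
    """Fraction of records whose key fields duplicate an earlier record."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records) if records else 0.0

# Toy batch: one record has a missing email, one duplicates another.
records = [
    {"id": "1", "name": "Acme", "email": "info@acme.example"},
    {"id": "2", "name": "Beta", "email": ""},
    {"id": "3", "name": "Acme", "email": "info@acme.example"},
]

print(completeness(records, ["id", "name", "email"]))  # 8 of 9 fields filled
print(duplicate_rate(records, ["name", "email"]))      # 1 of 3 records duplicated
```

Reporting figures like these against agreed targets is one concrete way a steward could make "good enough" a business decision rather than an IT guess.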
Data stewards will come from a business background but have good relations with IT, Friedman said. They will only be effective if they are held accountable for their progress and receive bonuses for meeting quality targets.