
Data monitoring from the pilot's seat

Commentary--Just as a plane's control panel has all the necessary information for a safe flight, businesses need an easy way to monitor their data.
Written by Tony Fisher, Contributor
Anyone who has ever flown a plane--or even glanced into a cockpit when boarding a commercial flight--can appreciate the complex array of gauges and monitors that the pilot must check. All the data about a plane's speed, course, fuel and other details are available at a glance, each giving the pilot the information necessary to make sound, safe decisions.


Similarly, organizations rely on data to provide the foundation for business decisions. For years, companies have implemented business intelligence programs to achieve one goal: make better decisions from their corporate information. Many companies have discovered one inescapable truth: it's impossible to make an informed decision based on outdated or erroneous information. Just as a pilot needs to monitor the health of the aircraft, organizations need to constantly gauge the health of their data.

Companies worldwide are increasingly aware of the importance of data quality in business intelligence programs, customer relationship management (CRM) initiatives and enterprise resource planning (ERP) systems. However, companies often believe that after cleaning data once, they have solved their data problems. But new data arriving at a data warehouse can be outdated or inconsistent, compromising the "intelligence" of business intelligence initiatives.

Put simply, building and keeping good data on customers, prospects, products and inventory takes constant vigilance. To manage data effectively, an organization must institute a data management program based on continual, routine monitoring of data to increase control over data quality.

'Once and done' is not enough
The impact of "data decay" can influence--and hinder--many enterprise initiatives. Imagine a manufacturing company that builds a data warehouse to serve as a single repository for all of its information about customers, products and inventory. From that data, it can uncover trends about customer adoption, resource allocation and future needs.

After a review of the data, this company finds that new, non-standard information is constantly arriving at the repository. The effect of this bad data may not be felt until much later. Whenever the company explores this data to identify patterns or tendencies, the presence of bad data can skew the results.

The solution for building high-quality corporate data on an ongoing basis is data monitoring. With data monitoring, technology and business users can create rules to examine data automatically to uncover problems as they occur. These users can also chart metrics related to data quality on a periodic basis and begin to address some of the underlying reasons that bad data is being collected in the first place.
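To make the idea concrete, here is a minimal sketch of the kind of rule-based check such a program might run on incoming records. The field names and rules are hypothetical examples, not DataFlux's actual rule syntax; real monitoring tools let business users define comparable rules declaratively.

```python
import re

# Hypothetical business rules: each maps a field name to a validity check.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v)),
    "state":       lambda v: v in {"NC", "NY", "CA", "TX"},
    "quantity":    lambda v: v.isdigit() and int(v) > 0,
}

def check_record(record):
    """Return a list of (field, value) pairs that violate a rule."""
    return [(field, record.get(field, ""))
            for field, rule in RULES.items()
            if not rule(record.get(field, ""))]

# Simulated incoming load: one clean record, one non-standard record.
incoming = [
    {"customer_id": "C001234", "state": "NC", "quantity": "5"},
    {"customer_id": "1234", "state": "N. Carolina", "quantity": "0"},
]

for rec in incoming:
    problems = check_record(rec)
    if problems:
        print("flagged:", problems)
```

Running checks like these at load time, rather than after the data warehouse is populated, is what distinguishes monitoring from a one-time cleanup.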

The role of data monitoring
Process improvement programs such as Six Sigma or Total Quality Management (TQM) have consistently addressed the need for a control mechanism to ensure the quality of finished products. These methodologies go beyond analyzing and improving the manufacturing process; they institute controls to prevent similar problems from occurring in the future.

Companies need to apply similar tactics to another valuable resource: data. Instead of loading questionable information into a data warehouse, data monitoring puts checks and controls on incoming information to keep high levels of data quality.

A data monitoring regimen can accomplish a number of tasks, such as:

• Detecting problems from incoming data. Since data warehouses typically receive periodic loads, companies can validate each load against established business rules to uncover and address data integrity issues--before they cause problems in business intelligence programs.
• Generating instant alerts. Set up automated system notifications and emails to flag problematic data as a new, inconsistent record enters the system.
• Identifying trends in data quality metrics. View ongoing statistics about data to see when the value of data starts to decline.
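The third task above--tracking quality metrics over time--can be sketched with a simple pass-rate series. The numbers and the alert rule here are illustrative assumptions, not a standard metric: the idea is simply to flag a load whose quality drops noticeably below the running average.

```python
def declining(history, threshold=0.05):
    """Return indices of loads whose pass rate fell more than
    `threshold` below the running average of earlier loads.
    (An illustrative alert rule, not an industry standard.)"""
    alerts = []
    for i in range(1, len(history)):
        running_avg = sum(history[:i]) / i
        if history[i] < running_avg - threshold:
            alerts.append(i)
    return alerts

# Simulated fraction of records passing all checks in five
# successive warehouse loads.
history = [0.98, 0.97, 0.96, 0.91, 0.84]

print(declining(history))  # prints [3, 4]: the two loads where quality slipped
```

Charting a metric like this over time is what lets users see data decay beginning, instead of discovering it months later in a skewed report.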

Data monitoring extends the reach of traditional data quality programs by making good data a corporate priority. When data does get out of control, users know immediately--and they can react to problems before the quality of the data declines.

Add monitoring to data management initiatives
For organizations that have already started an effort to improve data quality, most of the elements are already in place to build a data monitoring program. In fact, data monitoring is an extension of the effort required to get data into a reliable state in the first place. The same business rules used to cleanse, standardize and verify data in the initial data quality project can serve as the rules to examine and flag data integrity issues over time.

Building consistent, accurate and reliable data is not easy. Periodic fixes will only provide temporary relief from the various problems that can arise because of bad data. With data monitoring, companies can better control their data and build more reliable information to support any future business intelligence efforts.

Biography
Tony Fisher is president and general manager of DataFlux, a SAS company providing data management solutions. DataFlux is headquartered in Cary, N.C.
