Are organizations drowning in their own application data?

New survey finds most organizations lack a coherent strategy to deal with all the new data surging into their enterprises, and don't know how to address the resulting performance issues.
Written by Joe McKendrick, Contributing Writer

No question about it, we face an information tsunami. And most organizations do not yet have a coherent strategy to deal with all the new data surging into their enterprises.

One enterprise resource planning (ERP) system is plenty to keep megabytes' worth of reports churning. There are many organizations out there with up to 20 ERP systems, as a result of acquisitions, mergers, and diversified global operations. Add to that all the data now pouring in from the Web and devices such as RFID tags.

What's the response been to handling all this information? When applications are brought to their knees by all the data, the typical response is to throw more hardware at the problem. Now there's something else to consider as well -- thanks to legal issues and regulatory mandates, organizations need to keep all the data that's brought in, and keep it somewhere for a long time in case it's needed again.

As part of my work with Unisphere Research/Information Today Inc., I recently had the opportunity to help develop and publish a survey that explored data management challenges among 277 enterprise application managers and professionals affiliated with the Oracle Applications Users Group (OAUG).  (Full survey report available for download here -- registration required.) The survey, underwritten by Informatica, looked at the issues around application information lifecycle management, or ILM.

The survey finds that awareness of ILM as a strategy is currently low. Just under three out of 10 enterprises have adopted some variation of ILM. Another 16 percent are considering such plans.

ILM offers measures for properly defining, managing, and storing data -- from original creation to final disposition.  For example, in healthcare organizations, some records need to be maintained and stored for the lifetime of a patient. Many organizations, ever conscious of legal ramifications, intend to store email and other communications documents forever -- and that's a very long time. Though disk space is cheap, there's still a cost to every bit and byte of data stored.

The survey confirms that growing volumes of transaction data are being digitally captured and stored, along with unstructured forms of data files such as email, video and graphics. Adding to this tsunami are multiple copies of all this data being stored throughout organizations. At the same time, increasingly tight mandates and regulations put the onus on organizations to maintain this data and keep it available for years to come.

Close to a third of enterprises now need to support more than a terabyte’s worth of data in their applications, but just as many respondents don’t have a grasp on the data volumes within their enterprise applications. Indeed, enterprises already are feeling the impact of uncontrolled data growth on overall performance. Nine out of 10 respondents say this is an issue, and only one out of four report they currently meet all service-level agreements.

Most respondents try to combat performance issues with solutions of limited effectiveness such as tuning the application stack, which yields diminishing returns, and upgrading or expanding their hardware environments, which adds complexity and costs.

  • 87 percent blame their performance issues on "data growth"
  • 27 percent are currently meeting all service level agreements

Maintenance costs are disproportionate to the usefulness of the application. The majority of those surveyed have no formal method for legacy application retirement.

  • 42 percent require one to five full-time employees to maintain a "legacy application"
  • One in seven requires even more headcount, and 14 percent devote a tenth of their annual IT budget to maintaining such applications

Another issue is the use of full copies of production data in internal, offshore and outsourced development and test environments, a practice that increases both enterprise data volumes and business risk. Additionally, the study concludes that "more enterprises need to better ensure that data is stripped of any identifiers that could expose sensitive data on customers and partners."

  • 75 percent make up to five copies of live production data for non-production purposes
  • 78 percent use real production data in non-production environments
  • Only 31 percent use masking to hide confidential information
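The masking the survey refers to can be as simple as replacing sensitive identifiers with irreversible tokens before a production copy leaves for a test environment. A minimal sketch in Python (the field names and salt are hypothetical, not from the survey):

```python
import hashlib

def mask_value(value: str, salt: str = "example-salt") -> str:
    """Replace a sensitive value with an irreversible token.
    Salted hashing keeps the token stable across rows, so joins
    in the test environment still work without exposing the original."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of a production record that is safe to hand
    to internal, offshore, or outsourced development teams."""
    return {
        key: mask_value(val) if key in sensitive_fields else val
        for key, val in record.items()
    }

# Hypothetical customer row copied from a production system
row = {"customer_id": "C-1001", "name": "Jane Doe", "region": "EMEA"}
masked = mask_record(row, sensitive_fields={"name"})
```

Real masking tools go further (format-preserving encryption, referential integrity across tables), but even a one-way token like this strips the identifiers the study warns about.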

The data growth problem is compounded by mandates and policies that require data to be kept accessible for extended periods.

  • 60 percent keep data for seven or more years
  • 16 percent keep it "forever"
  • 66 percent say that archived data should be readily available as needed
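Retention rules like these eventually have to be encoded in archiving and disposition jobs. A minimal sketch of such a policy check, with hypothetical policy values (the survey does not prescribe an implementation):

```python
from datetime import date

def past_retention(created: date, retention_years, today: date) -> bool:
    """True if a record may be disposed of under the retention policy.
    A retention of None models the 'keep forever' rule that 16 percent
    of surveyed organizations report using."""
    if retention_years is None:
        return False
    # Approximate a year as 365 days; close enough for a policy check
    return (today - created).days >= retention_years * 365

# Hypothetical seven-year policy checked against two records
seven_years = 7
old_record = date(2000, 1, 1)
recent_record = date(2009, 6, 1)
```

The point of automating the check is the flip side of the bullet above: data inside the retention window must stay readily available, so anything not yet past retention belongs in an accessible archive tier, not offline tape.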

Corporate systems will only be taxed further by huge surges of data moving through enterprises. A deliberate, well-thought-out, end-to-end strategy for how information is managed and retired needs to be put in place to keep these systems performing up to par.

This post was originally published on Smartplanet.com
