Data virtualization: 6 best practices to help the business 'get it'

Summary: Data virtualization engages the entire enterprise, and challenges tend to be more organizational and cultural than technical.

Something that doesn't get talked about enough in the service orientation world is data virtualization: the ability to pull data from various sources into an abstracted service layer, rather than having services or applications tap live production databases directly. This reduces the need to physically replicate and store data, and provides a common interface for all applications consuming the data, especially BI, analytics, and transaction systems.
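The idea can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the source names and fields (`query_crm`, `query_billing`, `cust_id`) are hypothetical stand-ins for live systems.

```python
def query_crm():
    # Stand-in for a query against a live CRM database.
    return [{"cust_id": 1, "name": "Acme"}]

def query_billing():
    # Stand-in for a query against a live billing system.
    return [{"cust_id": 1, "balance": 250.0}]

def virtual_customer_view():
    """Federate both sources into one abstracted record set on demand,
    so consumers never touch the production databases directly."""
    billing = {row["cust_id"]: row for row in query_billing()}
    return [
        {**crm_row, "balance": billing.get(crm_row["cust_id"], {}).get("balance")}
        for crm_row in query_crm()
    ]
```

A BI tool or transaction service would call `virtual_customer_view()`; the underlying sources can be moved or re-platformed without touching the consumers.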

The whys and hows of data virtualization are explored by Judith Davis and Robert Eve in a new book, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. As with any service technology engagement, data virtualization involves a lot of players across the enterprise, so challenges tend to be more organizational and cultural than technical.

Davis and Eve outline 6 key best practices anyone undertaking a data virtualization effort needs to consider:

1) Centralize responsibility for data virtualization. "The key benefit here is the ability to advance the effort quickly and to take on bigger concepts, such as defining common canonicals and implementing an intelligent storage component," the authors say.

2) Agree on and implement a common data model. "This will ensure consistent, high-quality data, make business users more confident in the data, and make IT staff more agile and productive."
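What "a common data model" means in practice can be sketched as a mapping layer: every source schema is translated to one agreed canonical model before consumers see the data. The source and field names below are hypothetical examples, not from the book.

```python
# Hypothetical field mappings from two source schemas to one canonical model.
SOURCE_MAPPINGS = {
    "crm":     {"cust_id": "customer_id", "name": "full_name"},
    "billing": {"CUSTOMER_NO": "customer_id", "CUST_NAME": "full_name"},
}

def to_canonical(source, record):
    """Rename a source record's fields to the shared canonical model,
    dropping any fields the model does not define."""
    mapping = SOURCE_MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

to_canonical("billing", {"CUSTOMER_NO": 7, "CUST_NAME": "Acme"})
# → {"customer_id": 7, "full_name": "Acme"}
```

Because every consumer sees only canonical field names, the agreement on those names (the governance work the authors describe next) matters more than the mapping code itself.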

3) Establish a governance approach. "This needs to include how to manage the data virtualization environment. Key issues are who is responsible for the shared infrastructure and for shared services."

4) Educate the business side on the benefits of data virtualization. "Allocate time to consult with business users and make sure they understand the data," Davis and Eve advise. "Establish an ongoing effort to make data virtualization acceptable to other areas of the organization."

5) Pay attention to performance tuning and scalability. "Tune performance and test solution scalability early in the development process. Consider bringing in massively parallel processing capability to handle query performance on high-volume data. Accommodate the fact that users are unpredictable on ad hoc analysis and reporting."

6) Take a phased approach to implementing data virtualization. "First abstract the data sources, then layer the BI applications on top and gradually implement the more advanced federation capabilities of data virtualization."


About

Joe McKendrick is an author and independent analyst who tracks the impact of information technology on management and markets. Joe is co-author, along with 16 leading industry leaders and thinkers, of the SOA Manifesto, which outlines the values and guiding principles of service orientation. He speaks frequently on cloud, SOA, data, and...

