
Robert Eve



A Maturity Model for Measuring Data Virtualization Platforms

Increasing IT agility, improving efficiency and staff productivity, and reducing costs

Data virtualization has become a hot topic as enterprises and government agencies add this maturing technology to their data integration toolkits in pursuit of greater IT agility and lower costs.

Rising key performance indicators (KPIs) on both the demand and supply sides tell this story clearly. Demand has grown fivefold[1], while ETL, ESB and BI vendors are extending their existing data integration offerings and bringing new data virtualization wares to market. As a result, enterprises and government agencies are seeking a way to measure and compare the available data virtualization platforms and determine which is most appropriate for their information architectures.

A Model for Measuring Data Virtualization Platform Maturity
To assess data virtualization offerings in a systematic way, we developed a Data Virtualization Platform Maturity Model. This model has two critical dimensions. The first uses a five-stage maturity timeline to provide a common framework for measuring the various phases typical in software innovation. The second dimension looks at key functionality categories that, when successfully combined, create viable data virtualization platforms.

In the first dimension, the five maturity stages are defined as follows:

  • Entry Level: First product release that implements a minimal set of functionality to credibly enter the market.
  • Limited: Follow-on product release(s), aimed at satisfying initial customer demands within narrow (often vertical market) use-cases.
  • Intermediate: Product releases where the functionality expands rapidly based on traction in the marketplace. Feature rich, these releases address a growing market and an expanding set of use cases.
  • Advanced: Product releases addressing more complex use cases as well as supporting large-scale, enterprise-wide infrastructure requirements.
  • Comprehensive: Mature product releases that increase functional depth and expand market penetration, often incorporating functionality from adjacent areas.

The relation between data virtualization platform maturity and time can be seen in Figure 1.

Figure 1: Product Maturity over Time

Key Functionality Categories
The following eight functionality categories, in combination, create viable data virtualization platforms. These categories are derived from Composite Software's hundreds of person-years of R&D and millions of hours of operational deployment at Global 2000 enterprises and government agencies:

  • Query Processing: At its core, data virtualization's primary purpose is on-demand query of widely dispersed enterprise data. Consequently, data virtualization platforms must ensure these queries are efficient and responsive. If the high-performance query processing engine is immature or poorly architected, the rest of the functionality is of little consequence. Maturity is typically measured by the breadth and efficiency of optimization algorithms.
  • Caching: Traditional data integration worked by periodically consolidating (or staging) data in physical stores. In contrast, data virtualization platforms dynamically combine data in-memory, on-demand. Caching addresses the middle ground between these two approaches by enabling optional pre-materialization of queries' result sets. This flexibility can improve query performance, work around unavailable sources, reduce source system loads, and more. Maturity is measured by the breadth of caching options across factors such as triggering, storage, distribution, update, etc.
  • Data Source Access: There are a wide variety of structured and semi-structured data sources in a typical large enterprise. Data virtualization platforms must reach and extract data efficiently from all of them. Further, they must include methods to programmatically extend data source access to handle unique, non-standard data sources. Maturity is measured in the breadth of data source formats and protocols supported.
  • Transformation (includes Data Quality): Because source data is rarely a 100 percent match with data consumer needs, data virtualization platforms must transform and improve data, typically abstracting disparate source data into standardized canonical models for easier sharing by multiple consumers. Maturity is measured in the ease of use, breadth, flexibility and extensibility of transformation functions.
  • Data Delivery: Enterprise end-users consume data using a wide variety of applications, visualization tools and analytics. Data virtualization platforms must deliver data to the consumers using the standards-based data access mechanisms these consumers require. Further, they must enable delivery of common data to different consumers via different methods. Examples include an XML document via SOAP, and a relational view via ODBC. Maturity is measured in the breadth of data consumer formats and protocols supported.
  • Security: Data virtualization platforms must secure the data that passes through them. Deploying data virtualization should not force reinvention of existing well-developed security policies; it should leverage the standards and security frameworks already implemented in the enterprise. Maturity is measured in the breadth of authentication, authorization and encryption standards supported as well as a high degree of transparency.
  • Developer Tools: Design-time productivity with its concomitant faster time-to-solution is one of data virtualization's biggest benefits. To ensure developer adoption, the tools must be intuitive to use and standards-based. Further, they must automate work steps via code generation, in-line testing, tight links to the source control system and more. Maturity is measured by the degree that data virtualization platforms make easy things easy and hard things possible.
  • Enterprise-Scale Operation: Because data virtualization serves critical business needs 24x7x365, operational support is a core requirement in enterprise data virtualization deployments. Data virtualization platforms must be highly deployable, reliable, available, scalable, manageable and maintainable. Maturity is measured in the breadth and depth of operational support capabilities.
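Two of the categories above, query processing and caching, can be illustrated together. The sketch below is a toy "virtual view" that joins two stand-in sources in memory on demand, and serves a cached result set within a time-to-live window to reduce source-system load. All names here (VirtualView, the order and customer data, ttl_seconds) are invented for illustration; a real data virtualization platform adds cost-based optimization, query pushdown, security and much more.

```python
import time

class VirtualView:
    """Toy federated view: joins two sources in memory, with optional
    result-set caching. Illustrative only; not any vendor's API."""

    def __init__(self, ttl_seconds=60):
        self.ttl_seconds = ttl_seconds
        self._cache = None  # (timestamp, rows)

    def _fetch_orders(self):
        # Stand-in for a relational source reached via ODBC/JDBC.
        return [(1, "widget", 100), (2, "gadget", 250)]

    def _fetch_customers(self):
        # Stand-in for a second, independently owned source.
        return {1: "Acme Corp", 2: "Globex"}

    def query(self):
        # Serve from cache while fresh, instead of re-querying sources.
        if self._cache and time.time() - self._cache[0] < self.ttl_seconds:
            return self._cache[1]
        customers = self._fetch_customers()
        # In-memory join: combine data on demand rather than staging it
        # in a physical store, as traditional consolidation would.
        rows = [(cust_id, customers[cust_id], item, amount)
                for cust_id, item, amount in self._fetch_orders()]
        self._cache = (time.time(), rows)
        return rows

view = VirtualView()
print(view.query()[0])  # → (1, 'Acme Corp', 'widget', 100)
```

The cache here is deliberately simple (one result set, time-based expiry); the maturity dimension in the Caching category above is precisely about how far beyond this a platform goes, across triggering, storage, distribution and update options.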

Drilling Down Using Query Processing
By overlaying the five stages across the eight categories, enterprises and government agencies can use the Data Virtualization Platform Maturity Model to form a comprehensive listing of key capabilities by stage.

Figure 2 provides an example of how one of the categories, query processing, maps into the five stages listed above.

Figure 2: Query Processing Maturity

This comprehensive detail is helpful both during first-time data virtualization solution evaluation and during ongoing use. Examples of ongoing evaluation include:

  • When developing a data virtualization capabilities adoption roadmap
  • When aligning staff development, release deployment and related plans with the adoption roadmap
  • When measuring the viability of the selected data virtualization offering over time
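Overlaying the five stages on the eight categories can be reduced to a simple scoring exercise. The sketch below is a hypothetical illustration, not a real assessment: the stage and category names come from this article, but the example scores are invented placeholders. It applies a weakest-link rule, on the reasoning that a platform is only as mature as its least-developed required capability.

```python
# Five maturity stages and eight functionality categories, as named
# in this article. Scores run 1 (Entry Level) through 5 (Comprehensive).
STAGES = ["Entry Level", "Limited", "Intermediate", "Advanced", "Comprehensive"]

CATEGORIES = [
    "Query Processing", "Caching", "Data Source Access", "Transformation",
    "Data Delivery", "Security", "Developer Tools",
    "Enterprise-Scale Operation",
]

def overall_maturity(scores):
    """Return the lowest stage reached across all categories:
    a weakest-link view of platform maturity."""
    missing = set(CATEGORIES) - set(scores)
    if missing:
        raise ValueError(f"unscored categories: {sorted(missing)}")
    weakest = min(scores.values())
    return STAGES[weakest - 1]

# Invented example: strong across the board except caching.
example = {c: 4 for c in CATEGORIES}
example["Caching"] = 2
print(overall_maturity(example))  # → Limited
```

A weighted average is an equally defensible design choice when some categories matter less for a given architecture; the weakest-link rule simply makes gaps impossible to hide.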

Data virtualization offers the potential to increase IT agility, improve efficiency and staff productivity, and reduce costs by eliminating steps otherwise required by physical consolidation methods. In addition to well-established, pure-play data virtualization platform suppliers, vendors with expertise in complementary technology areas have recently begun introducing data virtualization offerings, further expanding the options available. The Data Virtualization Platform Maturity Model presented in this article provides a systematic approach for evaluating those myriad options.


  1. Magic Quadrant for Data Integration Tools, Gartner Research, November 2009, ID: G00171986.

More Stories By David Besemer

David Besemer, CTO of Composite Software, Inc., drives the technical vision of the company. Customers in a diverse range of industries, including financial services, pharmaceuticals, manufacturing and consumer products/services, use Composite's Enterprise Information Integration (EII) solutions to create individual data-services layers, enabling users to find, access, combine and deliver on-demand information from critical and disparate sources located across the enterprise, including SAP, Siebel, Oracle and other applications.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
