Midway through HP’s 10 April announcement of its Converged Cloud portfolio sits the release of version 2.0 of HP’s Service Virtualization software, for testing applications against simulated services in a virtualized environment. Although HP places this in the context of cloud and mobile applications, it in fact addresses a long-standing and ever-growing problem in both functional and performance testing of many types of applications.

Test environments for heterogeneous applications with complex interfaces are always a challenge. It doesn’t matter whether the interface is internal or external: if it involves coordinating another organization’s activities and resources around your test schedule, synchronizing test data and system availability becomes an administrative and technical nightmare. Even though technologies like web services and messaging middleware have well-defined interfaces that should, in theory, simplify matters, the sheer variety of technologies for services, applications and middleware means the problem just keeps growing.

IBM’s recent acquisition of Green Hat and HP’s release of Service Virtualization 2.0 are recognition that there’s a problem worth solving.

At its core, HP Service Virtualization provides a rich set of features for virtualizing systems by simulating their interfaces – both data and behaviour. This has nothing to do with hardware or operating system virtualization: here, the software on the other side of an interface simply doesn’t exist. Instead, a Service Virtualization Server accepts and returns the traffic over the interface that would normally flow to the real application, middleware or service. The product’s Designer component is used before testing to define the format and behaviour of the virtualized interface, which then runs on the Server.
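To make the idea concrete, here is a minimal, generic sketch of what a virtualized service boils down to: a stand-in endpoint that accepts the traffic the application under test would normally send to a real dependency and answers with configured, canned responses. This is purely illustrative – it is not HP’s product or API, and the path and data are invented for the example.

```python
# Generic illustration of a "virtualized" service: a stand-in HTTP endpoint
# that answers in place of a real back-end dependency. Conceptual sketch only;
# not HP Service Virtualization's own mechanism or API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned data and behaviour that a designer would normally configure:
# request path -> simulated response body (hypothetical example data).
CANNED_RESPONSES = {
    "/accounts/12345": {"accountId": "12345", "status": "ACTIVE", "balance": 250.00},
}

class VirtualServiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # The application under test is pointed at this endpoint instead of the
    # real service, so tests can run even when that system is unavailable.
    HTTPServer(("localhost", 8080), VirtualServiceHandler).serve_forever()
```

The value proposition is in doing this declaratively, at scale, and across many protocols rather than hand-coding stubs like the one above for every interface.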

HP Service Virtualization 1.0 and 1.1 appeared in 2011 with support for the most common web services protocols. Version 2.0 adds further protocols (notably MQ), support for multiple endpoints and response types, and service description learning and editing. The solution is targeted at both functional and performance testers, with HP claiming that, compared with competitors, it is particularly strong for performance testing (naturally in combination with Performance Center and LoadRunner). This is because you can not only simulate the functional behaviour of the virtualized interface, but also model its performance.
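Modelling performance, in the simplest terms, means the virtualized endpoint replies with realistic response times as well as realistic data. A rough, generic sketch of that idea – again not how HP implements it, and with made-up latency figures – might look like this, wrapping the canned-response logic from the earlier example:

```python
# Generic sketch of a performance model for a virtualized endpoint: before
# replying, the stub sleeps for a latency sampled from a configured
# distribution, so load tests see response times resembling the real system.
import random
import time

# Hypothetical performance profile for the simulated dependency.
MEAN_LATENCY_S = 0.250    # assumed average response time of the real system
STDDEV_LATENCY_S = 0.075  # assumed variability in response time

def simulated_latency() -> float:
    """Sample a response delay, never returning a negative value."""
    return max(0.0, random.gauss(MEAN_LATENCY_S, STDDEV_LATENCY_S))

def respond_with_modelled_performance(build_response):
    """Delay according to the latency model, then return the canned response."""
    time.sleep(simulated_latency())
    return build_response()
```

In the earlier stub, a call like this would sit in the request handler just before the canned payload is written back, so a load test driven by LoadRunner or Performance Center sees plausible timings from the missing system.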

There are a few aspects of the product that I have doubts about. The pricing of both Server and Designer licences places this product firmly in the domain of tools for specialists. That might be appropriate for performance testing, but not for functional testing. The workflow – resting on a small number of Designers to configure an interface – sounds like a sure-fire way to create bottlenecks. You could argue that this ensures reusability, but it seems very out of step with Agile and post-Agile practices. And finally, although HP proudly declares that interface definition is a ‘script-free’, entirely declarative process, I’ll bet a heap of 4GLs and a ton of keyword-driven testing tools that HP will be forced to introduce some sort of break-out hooks for scripting or API calls to handle unusually complex situations.

Nevertheless, the attention being paid to solving a killer of a problem – how to get a representative test environment in complex, heterogeneous environments – is hugely welcome.