Back on 8th September 2014, I attended the IBM Systems and Technology Group Analyst Insights conference in Old Greenwich, Connecticut. This is IBM’s annual opportunity to lay out where it sees the future for STG. One reason this year was especially important is that it was the last such conference before the entire x86 division was finally acquired by Lenovo.

However, it was not the Lenovo deal that provided one of the key moments of the conference. That came instead from IBM’s announcement of a major strategy shift in how it talks about its technologies to customers.

Cross-messaging causing confusion for customers

One of the challenges for IBM over the last eighteen months in particular has been the cross-messaging between System x, POWER and System z. While IBM has been marketing its CAMS (Cloud, Analytics, Mobile and Social media) story, each of the hardware platform teams has been hard at work telling customers why its platform is the only, or at least the best, platform for CAMS.

Each of the platforms shares common elements, such as storage, software and security, but the often confused messaging as to which platform to buy has been a problem. Nowhere was this more evident than in the big data and analytics story, where customers were faced with moving their data between platforms in order to use the tools on the platform they had purchased.

The new model – take the app to the data

What happened in Old Greenwich is that IBM has realised the problem and changed the story. Instead of “take the data to the application”, the message is now “take the application to the data”. It might seem a subtlety, a nuance or a splitting of hairs, but in reality it is far more than that.

It shows that IBM is getting back to its roots and taking advantage of technology it originally created to help customers get the most out of its platforms. However, it also opens up some significant questions that IBM has to address if it is to make this new approach work. Some of those questions may require serious internal business restructuring and a change of business approach from IBM.

Taking the application to the data makes sense. Moving vast amounts of data around the network requires a lot of bandwidth, and a significant amount of processing power is needed to transform the data and ensure it is accurate and usable. All of this takes time and, more importantly, power; as any datacentre owner knows, power equals money out the door in energy and cooling costs.

Packaging applications into a more portable form is something that virtualisation makes possible. For System z, this could be as simple as installing the application into an LPAR (logical partition) and then moving it to the relevant System z device. More likely, it will be done through Linux on System z, where IBM can use container technology, which behaves much like a lightweight virtual machine.

For IBM POWER, there are two options. When running little-endian (x86-compatible) Linux, the easiest solution will be to use a product such as Docker, which would allow customers to create a single Docker container for an application and move it across any platform on which they are running little-endian Linux. When running the IBM i operating system, IBM will need to provide users with an alternative container approach.

What this achieves is the ability to deploy applications wherever they are needed, something that operations teams have been doing with virtualisation for ages. The bandwidth required for the containers/LPARs is extremely low compared to moving terabytes or even petabytes of data around. There is also little additional processing power involved, and therefore no extra heat and cooling cost, provided that operations teams keep control of the number of application instances that are created.
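To put rough numbers on the bandwidth argument, the back-of-the-envelope sketch below compares moving a large dataset with moving a container image over the same link. The dataset size, image size and link speed are illustrative assumptions on my part, not IBM figures.

    # Back-of-the-envelope comparison: moving the data vs moving the application.
    # All figures below are illustrative assumptions, not measured values.

    DATASET_BYTES = 500 * 10**12      # assume a 500 TB analytics dataset
    CONTAINER_BYTES = 2 * 10**9       # assume a 2 GB container/LPAR image
    LINK_BITS_PER_SEC = 10 * 10**9    # assume a dedicated 10 Gbit/s link

    def transfer_hours(size_bytes, link_bps=LINK_BITS_PER_SEC):
        """Hours to push size_bytes over the link at a full, sustained rate."""
        return (size_bytes * 8) / link_bps / 3600

    print("Move the data:        %10.2f hours" % transfer_hours(DATASET_BYTES))
    print("Move the application: %10.5f hours" % transfer_hours(CONTAINER_BYTES))
    # Roughly 111 hours for the data versus under two seconds for the container,
    # before any time is spent transforming and validating the moved data.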

There are other benefits. Once everything is containerised, it can be installed through a self-service menu that tracks installs and usage, and applications can be easily updated through the use of golden images. This dramatically lowers management costs and improves patch management and security.
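As a minimal sketch of what such a self-service catalogue might track, here is one possible shape for the bookkeeping; the application names and version numbers are hypothetical.

    # Minimal sketch of a self-service catalogue that tracks installs and
    # flags deployments that have fallen behind the golden image.
    # Application names and versions are hypothetical examples.

    from collections import defaultdict

    class Catalogue:
        def __init__(self):
            self.golden = {}                      # app name -> current golden image version
            self.deployments = defaultdict(list)  # app name -> (host, version) actually deployed

        def publish_golden(self, app, version):
            self.golden[app] = version

        def deploy(self, app, host):
            version = self.golden[app]            # self-service installs always take the golden image
            self.deployments[app].append((host, version))
            return version

        def stale(self, app):
            """Deployments still running an older image after a golden update."""
            return [(h, v) for h, v in self.deployments[app] if v != self.golden[app]]

    cat = Catalogue()
    cat.publish_golden("db2-blu", "10.5.3")
    cat.deploy("db2-blu", "lpar-07")
    cat.publish_golden("db2-blu", "10.5.4")       # patch released: new golden image
    print(cat.stale("db2-blu"))                   # -> [('lpar-07', '10.5.3')]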

Of course, none of this is new. Anyone who has worked in a highly virtualised environment for the last decade, or even 15 years, will recognise this approach. What is new is that IBM is now promoting a story around the location of the data, rather than the location of the application.

A new approach to licensing applications is required

Where this requires a rethink of the business is in licensing. The ability to move the application to the data means that companies could very quickly exceed their licence count and, more worryingly for the CFO and CIO, find themselves facing some pretty high costs for application usage.

This is where IBM will need to rethink its current business model to make this really work as a mass-market customer solution. Applications will need granular charging, where they are billed by the day, hour or even minute of use. This is something that is common for cloud hardware, but not yet common for cloud software.

If a customer has a master licence for an application, for example IBM DB2 with BLU Acceleration, it could then allow users to deploy that application to solve problems as many times as they want, because the cost will relate to just that job. It means that charges can be applied back to departments, or at least shown in terms of usage, just as those departments already account for their use of cloud software.
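A minimal sketch of what per-minute metering with departmental charge-back could look like follows; the rate and the usage entries are hypothetical, and real pricing would of course be IBM’s to set.

    # Minimal sketch of per-minute licence metering with departmental
    # charge-back, assuming a flat (hypothetical) rate per minute of use.

    RATE_PER_MINUTE = 0.05   # assumed rate in dollars, purely illustrative

    usage_log = [
        # (department, application, minutes used) -- illustrative entries
        ("marketing", "db2-blu", 90),
        ("finance",   "db2-blu", 480),
        ("marketing", "db2-blu", 30),
    ]

    charges = {}
    for dept, app, minutes in usage_log:
        charges[dept] = charges.get(dept, 0.0) + minutes * RATE_PER_MINUTE

    for dept, cost in sorted(charges.items()):
        print("%-10s $%.2f" % (dept, cost))
    # finance    $24.00
    # marketing  $6.00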

Provided IBM does not get hugely greedy and set the period of use or the cost per period too high, this should also encourage a lot more use of applications. That would benefit IBM substantially in terms of the bottom line for the software division. It would also allow IBM to take a lead in enterprise application charging and provide a seamless cost model across on-premises and cloud software.

There is an even bigger cloud opportunity here. Customers using IBM’s hybrid cloud offering could have access to a large library of software that they would never otherwise have purchased. That software would be curated by the cloud provider’s operations team, ensuring it was properly patched and secure. The end user would then select the software and the period of use, and the container would be deployed to the private cloud to run as required.

Only time will tell

The only real question left unanswered is “Can IBM really change its licensing model and deliver this?” If the answer turns out to be no, it will not be long before users return to moving the data to the application, because the cost of moving and transforming data is something they never get to see.