Flexible terms

In information technology, as in life, long-awaited good news is all too frequently accompanied by a sobering dash of unlooked-for bad news.

Today, the long-awaited good news is that 21st century systems technology is starting to support the flexibility that customers need to match their IT resources more closely to their business requirements. The bad news is that the full realisation of this long-awaited flexibility is still being hampered by the relative inflexibility of 20th century software licensing.

Certainly, from a technological perspective, enterprise IT users have never had it so good. After years of bemoaning the constraints placed on business agility by inflexible, monolithic IT systems, and the glacial pace of new system development, testing and deployment, today’s IT users are awash with adaptive opportunities.

In the data centre, virtualisation and autonomic systems technologies are breaking the chains that have traditionally bound “stove-piped” application stacks to grossly under-optimised and rigidly configured physical resources. Further up the application stack, component-based development platforms and web services, harnessed to service-oriented architecture (SOA), are starting to have a similarly liberating impact on the creation and delivery of new business services.

To be sure, there are still plenty of technology puzzles that have to be solved before IT is entirely subordinated to the requirements of its business users. However, as a senior IT executive from a leading UK financial services group recently told Information Age: “It feels like we’re in touching distance of the future – to being where we have always wanted to be. Where we have precise control over IT resources and can closely map them against the real needs of our business processes.”

But, “it’s still difficult,” he added, “and it’s not made easier by software suppliers that still license products in ways that suit their own business processes, but make it very difficult for me to do the same.”

The software suppliers that he and other IT executives perceive to be part of the problem are those whose licensing models continue to conform to rigid 20th century ideas of IT value, even whilst their products are encouraging customers to explore the more flexible opportunities offered by 21st century systems. Such suppliers are far from being a small minority.

Indeed, although it sometimes seems as though vendors deliver new software licensing models almost as often as they release new products, most of them – even today’s vaunted new software-as-a-service approach – are really only variations on two long-established approaches, says David Mitchell, head of software research at industry advisory group Ovum.

“They [software vendors] have all these patented new technologies that are creating greater flexibility, but software licensing is still really done the same way it has always been done – by CPU and named user”, says Mitchell.

This mismatch between the adaptability of software and the relatively intractable nature of the terms and conditions that govern its use is already more than just an inconvenience for customers, says Mitchell. It has encouraged companies to explore an ever-widening variety of licence models, many of which, such as concurrent licensing, have proven difficult or, for some organisations, impossible to manage effectively.

The extent of the existing problem is reflected in the results of a recent Ovum survey of the status of its customers’ licence management. “Between 70% and 80% of CIOs don’t know the status of their license compliance. They have absolutely no idea whether they are under-provisioned, or over-provisioned,” says Mitchell.

This is not an encouraging picture. Apart from the obvious implications for the ability of many organisations to meet their industry and financial regulatory obligations, it also speaks volumes about the difficulty customers face in matching their software costs to their real business requirements.

And, as Mitchell points out, these requirements are currently relatively static. In the near future, as businesses exploit virtualisation and SOA to build systems that have the potential to be constantly changing – dynamically accessing different physical and logical resources on a daily, hourly or even minute-by-minute basis – already confusing and inadequate licensing models will become an even greater obstacle to accurate business and IT alignment.

Virtual licences?

Some sense of what lies ahead can already be gleaned from some of the early problems that companies have encountered with the deployment of virtualisation technology.

Thanks to products such as VMware’s ESX server, says Mitchell, “customers today can routinely expect to run multiple instances of a database on one CPU”. This makes good business sense, enabling companies to realise the optimal value from their underlying server hardware and, because database vendors typically license their products on a per-socket (or per-CPU) basis, it can be done without necessarily incurring any licence premium.

However, virtualisation can be deployed in a variety of ways, and at different levels in the application stack, and its use can have widely differing implications for software licensing. Instead of creating multiple instances of a database on one CPU, for instance, a company might prefer to retain a single image of a key data asset, but configure it against multiple CPUs, or even against CPUs that support multiple cores.

Is a dual-core CPU still a single processor, or is it really two computers condensed onto a single chip? Two years ago, when Intel introduced its first dual-core chips, some software vendors, notably Oracle, took the latter view, and warned their customers that they intended to charge them accordingly.

In the eyes of many of its customers, Oracle’s decision to insist that using a dual-core chip meant buying two database licences was unfair. It penalised them for investing in state-of-the-art hardware platforms, in much the same way that mainframe users had once been “gouged” by application vendors whenever they had the temerity to upgrade their CPUs.

Oracle, of course, argued that – leaving aside the technical arguments over whether one chip can ever really be two computers – by deploying multi-core processors customers were deriving significantly greater value from Oracle products, so it was entitled to charge a premium to reflect this.

After a short but rancorous dispute with its major customers, Oracle backed down – but not completely. In a statement designed to mollify its customers, the company declared that it was committed to providing “simple and flexible pricing models to meet our customers’ needs”. Accordingly, it would “continue to recognise each core [in a multi-core chip] as a separate processor.” But, “for the purposes of counting the number of processors that require licensing, the number of cores in a multi-core chip shall now be multiplied by 0.75.”
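For readers trying to work out what that 0.75 multiplier means in practice, the sketch below walks through the arithmetic for a hypothetical server. It assumes, purely for illustration, that fractional results are rounded up to a whole licence; the statement quoted above does not spell out the rounding rule.

import math

def processor_licences(chips, cores_per_chip, core_factor=0.75):
    """Count processor licences under a per-core multiplier scheme.

    Each core is treated as a separate processor, and the total core
    count is scaled by the vendor's core factor (0.75 in the revised
    policy quoted above). Rounding fractional results up to a whole
    licence is an assumption made here for illustration only.
    """
    total_cores = chips * cores_per_chip
    return math.ceil(total_cores * core_factor)

# A hypothetical server with four dual-core chips:
# 4 chips x 2 cores = 8 cores; 8 x 0.75 = 6 processor licences.
print(processor_licences(chips=4, cores_per_chip=2))   # -> 6

# For comparison: counting every core as a full processor would mean
# 8 licences, while the per-socket view preferred by customers would
# mean only 4.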

Oracle’s clash with its customers may have been the first, and so far the most high-profile, example of suppliers and customers wrangling over where the new boundaries of software licensing need to be drawn, but it is certainly not likely to be the last. On the contrary, most industry analysts agree that Oracle’s experience is only the tip of a much larger, if far less well publicised, iceberg.

Throughout the software industry, from vendors of systems software all the way up the stack to the major packaged application software companies, suppliers are experimenting with new licensing models for the next generation of agile systems technology. “Everyone is taking a stab at working it out. The trouble is that everyone is trying to do it at their own level of the stack,” says Mitchell.

Ultimately, standard flexible methods of charging for software may emerge that are as widely accepted and understood as today’s per-processor and per-user models, but it may be some time before they do.

Arm yourself

As things stand today, according to Forrester Research licensing analysts Julie Giera and John Rymer, any company planning to deploy multi-core processors and create a virtualised infrastructure had better be prepared. “Bring a pricing analyst and a stack of product catalogues with you when it comes time to buy the software,” they recommended in a recent report on software pricing trends.

“Each major vendor has its own formula for dealing with multi-core processors, with pure middleware vendors favouring schemes that run the risk of raising current prices. Worse, these software vendors reserve the right to change the relative value of a processor core in the future, as industry-standard ratings [of different chip vendors’ core architectures] don’t exist.”

Giera and Rymer’s report might be focussed primarily on the problems faced by organisations planning to deploy multi-core processor architectures, but in most respects, the problems it highlights are already familiar to buyers of all categories of business software. In particular, no organisation that has started down the route towards a virtualised, service-oriented IT infrastructure will argue with their key observation: that the industry’s creation of a plethora of solutions is a major new source of complexity for systems planners.

Indeed, although customers largely welcome the effort that suppliers are making to create licence models that are more in tune with modern business needs and technological capabilities, it’s also clear that they wish the process could be faster, and more orderly.

However, software suppliers still can’t afford to be complacent, or to underestimate how keenly their customers are observing the evolution of their software licensing policies. Currently, says Willy Ross, UK CEO of application virtualisation middleware vendor Data Synapse, some application software suppliers are being sheltered from the full impact that virtualisation may have on their licence revenues because their customers are at an early stage of their virtual technology deployment.

For the time being, the increased flexibility and lower operating costs that customers are realising by using virtualisation to “sweat their physical assets” are more than enough to overcome worries about consequent increases in licence costs. However, this may not be the case for much longer. “As people begin to enter the next phase of their virtualisation strategy, they will be looking elsewhere for more savings,” says Ross. In a year or two, perhaps even sooner, application vendors that haven’t got their licensing models right could be facing some very tough negotiations.

Dow Chemical: Awaiting the end game

At Dow Chemical, chief architect Mark Fenske is currently pushing through the creation of the company’s next-generation enterprise architecture. The aim of the project is to provide Dow with a highly adaptable, secure and automated corporate IT infrastructure capable of delivering new services quickly, predictably and at the lowest possible cost.

“Virtualisation is very important to us in achieving these goals,” says Fenske, and the company has already collapsed its application server footprint by a factor of 5:1 over the last 18 months. Technically, he says, Dow is now “comfortable” with virtualisation technology, and he is personally confident that the industry is on course to iron out outstanding technology issues reasonably soon. However, he is not as confident about the prospect of an early resolution to the industry’s licensing issues.

“Generally, we feel that the pricing and licensing in the industry needs to change,” says Fenske and, just as importantly, software suppliers need to do more to harmonise the way they tackle the issue. Currently, he says, “we have pricing that’s based around revenue, pricing based on users, we have licences that are based on the number of CPUs that a product is installed on, and two or three more [different models].”

Fenske actually takes a sympathetic view of the licensing challenges that vendors are facing. “I think the software industry, particularly the application software suppliers, are in the early stages of figuring out how to offer customers flexible terms and conditions, and at the same time not upset the predictability of their software revenues.”

However, Fenske’s sympathy for software vendors only goes so far, and does not extend to supporting the constantly shifting policies that many suppliers are adopting as they search for the right business model. “When we draw up plans for new applications at Dow, we’re planning for the very long-term, and we look at the cost implications of new software over a period of between five and 10 years.”

At the moment, he says, many suppliers are reluctant or unable to make licence term commitments that extend this far into the future. Consequently, he says, “there is a lack of transparency in the end game that is making people nervous.”

So far, this nervousness is not so pronounced that it is dissuading companies such as Dow Chemical from moving ahead with projects based on virtualisation and service-oriented component technologies. “We see very real cost savings to be made from virtualisation today,” says Fenske, and he is confident that his main software suppliers – IBM, Hewlett-Packard and Microsoft – are heading in the right direction in licensing terms.

Further reading in Information Age

Software licensing model shift – May 2007

Pete Swabey
