Power plays in the Utility Computing sector

The apogee of utility computing is metered billing for hardware, software and support services. “But there is no question that it will be difficult to do,” says Alex Tatham, vice president of global software distribution at Bell Microproducts, one of the biggest software resellers in the UK.

That is not to say that the technology does not exist to monitor usage, to bill users on an hourly basis and to keep an audited record of activity – even for software products. For some time, in fact, outsourcing companies and hardware suppliers have offered customers the chance to pay for processing power and storage capacity on a monthly usage basis. Hewlett-Packard, for example, has a fee structure whereby customers pay a fixed monthly charge of 50% of a constant rate, plus a variable cost based on the average capacity used during the month.
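To make the mechanics concrete, the short sketch below computes a bill of that shape: a fixed base charge plus a variable element tied to the month's average utilisation. The rate, the split between the fixed and variable elements, and the utilisation figure are illustrative assumptions, not details of HP's actual pricing.

```python
# Illustrative sketch only: a metered bill made up of a fixed base charge plus a
# variable element tied to the average capacity used during the month.
# The rate, the fixed/variable split and the utilisation figure are assumptions.

def monthly_bill(full_rate: float, avg_utilisation: float, fixed_share: float = 0.5) -> float:
    """Return the charge for one resource for one month.

    full_rate       -- notional monthly cost of the resource at 100% utilisation
    avg_utilisation -- average fraction of capacity used this month (0.0 to 1.0)
    fixed_share     -- portion of full_rate billed regardless of usage
    """
    fixed_charge = fixed_share * full_rate
    variable_charge = (1.0 - fixed_share) * full_rate * avg_utilisation
    return fixed_charge + variable_charge


if __name__ == "__main__":
    # A server with a notional full rate of 1,000 per month, averaging 30% utilisation.
    print(f"Monthly bill: {monthly_bill(1000.0, 0.30):,.2f}")
```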

There is concern, however, over the impact of monitoring and auditing software on overall application performance. Some argue that the footprint of such applications is too large and that they slow down transactions.

But the real reason for resistance to metered billing has nothing to do with technology. Many software suppliers, and customers, are unwilling to change the economics of software provision – particularly at a time of depressed spending and tight budgets. Today, with most software bought on a per-processor or per-seat basis, suppliers and customers have a relatively straightforward method of forecasting spend. Usage charging is likely to be much more volatile, especially in the short-term.

Nevertheless, supporters say the momentum behind utility computing is unstoppable. “The world is changing and the software industry has got to respond to that,” says Kieran Lees, managing director of northern Europe for Platform Computing, a specialist software company that has been selling middleware for distributed computing for a decade.

There are signs that the industry is already beginning to respond, albeit gradually. Tatham says he is now being asked to facilitate software distribution agreements that allow for monthly billing. Some suppliers, including Computer Associates, have begun to favour subscription-based licensing agreements. Others, including Oracle, back the software-as-a-service business model made famous – or notorious – by dozens of application service providers. Microsoft, meanwhile, has been gearing itself up to move to annuity billing.

If these giants of the industry lead the charge to pay-per-use billing, says Tatham, then others will be forced to follow.

Not so long ago in business history, factories generated their own power. Major manufacturers throughout the UK invested huge sums in building large electricity generators to ensure a constant and plentiful supply of power; smaller facilities spent less and risked running short.

Then, in the 1920s, came the beginnings of a national grid of electricity transmission. Power companies exchanged their surplus power to better balance supplies and recast themselves as utilities that billed customers on a usage basis. But the idea was not an immediate success – for years many companies maintained their own generators, reluctant to relinquish control over such a critical resource.

Today, UK businesses face a similar dilemma in relation to information technology. Should they continue to build up huge arsenals of computing equipment in order to generate sufficient capacity to see them through periods of peak demand? Or would it be more cost-effective for them to tap into a shared computing resource, scaling capacity up and down as needed and paying for technology on a metered basis?

The provision of on-tap information technology services is, of course, a great deal more complicated than electricity supply, but most organisations are becoming aware of the value in moving to the use of shared computing resources – whether through the more efficient organisation and distribution of internal computing or by engaging a third-party technology services provider.

Take the cost of running hardware. Analysts have concluded that, at the average company, individual systems use only between 10% and 50% of their processing power at any point during the day. The same goes for storage devices.

That means that companies applying a utility computing model, whereby key resources are shared, should be able to get by with a fifth fewer servers and a smaller support staff.
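The arithmetic behind that kind of estimate can be sketched in a few lines. The fleet size and utilisation figures below are illustrative assumptions chosen to reproduce roughly the "fifth fewer" figure, not data from the analysts cited.

```python
# Rough consolidation arithmetic: if each server carries only part of its capacity,
# pooling the load onto shared servers running at a higher target utilisation
# reduces the number of servers needed. All figures here are illustrative assumptions.
import math

def servers_after_pooling(fleet_size: int, avg_utilisation: float, target_utilisation: float) -> int:
    """Estimate how many shared servers are needed to carry the same aggregate load."""
    aggregate_load = fleet_size * avg_utilisation           # work the fleet actually does
    return math.ceil(aggregate_load / target_utilisation)   # servers needed at the higher utilisation


if __name__ == "__main__":
    before = 100
    after = servers_after_pooling(before, avg_utilisation=0.40, target_utilisation=0.50)
    saving = (before - after) / before
    print(f"{before} servers -> {after} shared servers ({saving:.0%} fewer)")
```

Under these assumptions the saving is a fifth of the fleet; at the lower utilisation levels analysts describe, the reduction would be considerably larger.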

IDC, the IT market research company, talks of overall cost savings of up to 60% as a result of implementing utility computing. IBM, a key advocate of utility computing – or computing on-demand as the systems giant calls it – does not have detailed statistics on payback or cost of ownership, but sales executives say that a high street bank should make a return inside 12 months and cut its IT costs by between 25% and 30% within three to five years.

Then there is the prospect of recouping wasteful spending – analysts at Gartner, for example, believe that 70% of companies that have bought high-end servers rarely use all of those systems’ functions.

Productivity also ought to benefit, thanks to improved application availability and response times. The sharing of resources, say utility computing proponents, will allow companies to be far more responsive to business opportunities, scaling up or down their system requirements as needed. That, in theory at least, will help reduce the time-to-market of new products.

In addition, utility computing can enable applications – and even entire businesses – that would otherwise be untenable because of the capital expenditure required for large-scale computing resources.

There is also the potential reduction in software costs – utility computing is likely to lead to the biggest reform in software licensing since the shift to client/server architectures more than a decade ago.

Admittedly, some fresh investment in technology is needed to make the move to utility computing feasible, but once the new infrastructure is in place IT costs will fall – permanently. If anything, the rationale seems even clearer-cut in a spending downturn than in a time of plentiful budgets.

Human hurdles

Cultural resistance to utility computing will almost certainly mean that the old arguments about outsourcing are replayed.

Utility computing poses an implicit threat to the job security of internal IT staff. But there is an important distinction between utility computing and outsourcing. In the traditional outsourcing market, the threat to IT staff is mitigated by the fact that some employees are transferred to the outsourcer as part of the deal. One of the underlying principles of utility computing is that there is no staff transfer.

That could make the case for applying utility computing on an internal basis, as opposed to engaging a third-party provider, much stronger. But even then, it is likely to meet resistance.

In many companies, IT budgets are allocated at a line-of-business or department level. And just as many hot-desking initiatives have fallen down because employees are unwilling to share their personal workspace, so internal utility computing projects could founder on the reluctance of different departments to share servers and storage.

The minority is right

So where are all the customers? The rationale may be compelling but few companies have yet been stirred into action.

With some notable exceptions, only the great financial institutions are signing up for utility computing. Large sections of the IT industry are sceptical, and even utility computing advocates like IBM are tight-lipped about revenues and customer references. The adoption of utility computing seems certain to be as cautious as the move to on-demand electricity was in the first half of the twentieth century.

A damaging split within the supplier community is not helping. Proving the business case of utility computing – and that goes for suppliers as well as customers – has so far divided the IT industry. Most vendors have still to decide whether to support it, and a few are firmly against. In the user community, many are rallying against the cultural changes that utility computing demands.

Dell Computer is one company with major reservations. “It’s unclear to us how much of the promotion of utility computing is a self-serving way of changing the rules of the game to benefit [competing vendors] instead of ultimately being a good thing for customers,” Gary Cotshott, the vice president and general manager of Dell’s services business, has said.

His thinking, set out in an interview almost a year ago, underscores the polarity of opinion.

Many observers agree, saying that by supporting utility computing, the big suppliers sense an opportunity to revive the era of the single vendor lock-in, and hence to ramp up prices. “That is why you hear some suppliers saying: ‘We can do utility computing, but you have to buy it from us,'” says Michael Hjalsted, a marketing director for the systems division of Unisys.

All change

At a technological level, the shift to utility computing is not quite the revolution that it appears. The idea may seem new, but its foundations are as old as distributed computing, which enterprises have been deploying for over a decade. What is more, in many cases utility computing will be just the next stage in the evolution of traditional IT outsourcing and managed services.

What is revolutionary, however, is the lasting effect it is likely to have on the IT industry. “It will dramatically alter the landscape of players and change the process of buying and selling IT services,” says David Tapper, an IDC analyst. “Service providers will consolidate down to just a handful, as only a few will have the economies of scale, the breadth of capabilities and deep enough pockets to successfully deliver on such a service model.”

The consequences for customers of such large-scale consolidation can only be guessed at. Less competition traditionally leads to higher prices, but that may be offset by the commoditisation of hardware and software products that utility computing is expected to accelerate.

In addition, the externalised utility computing model – where companies outsource total responsibility for computer service provision rather than simply employing solutions internally to improve the distribution and management of resources – will reduce the need for third-party consulting and integration services. “Customers will push more of the responsibility for the planning, design and integration of systems back on to providers,” says one analyst. That spells fresh danger for IT services companies and could force them to cut fees even further.

There may also be important consequences for the supplier-customer relationship. The balance of power could move away from traditional IT systems and services suppliers, such as IBM and EDS, towards the owners of the ‘network’ over which the services are delivered. That is likely to favour the world’s great telecommunications companies, such as AT&T of the US, BT of the UK and Deutsche Telekom of Germany.

In a recent IDC poll of US enterprises, roughly two-thirds of companies expressed an interest in utility computing and most of those expected IBM to be the most important vendor (see table). But behind IBM, those surveyed felt AT&T was as likely as HP to be the second most prominent utility company. Perhaps one reason is that the involvement of telecoms companies is expected to force IT prices even lower. Many believe that network operators will price their services very keenly, due both to their large economies of scale and their determination to offer bundled voice and data services.

It is not only the name and background of the IT service provider that could change. Utility computing will ultimately transform the nature of the provider-customer relationship. “A utility service, by definition, creates a more ‘arm’s reach’ type of relationship with the customer,” says Tapper.

The industry is clearly a long way off this today, with most suppliers eager to work very closely with early adopters. Over time, however, there are likely to be marked differences between the suppliers, and some can be expected to charge a premium for agreeing to maintain a more consultative relationship, particularly when hosting business processes. Clearly much will depend on the ‘rules of engagement’ now being set by the early providers, such as IBM, HP and Sun Microsystems.

One fear is that customers will distrust the motives of providers until there are ubiquitous broadband networks and universal agreement on standards.

But the big suppliers deny any ulterior motives. “We would prefer a customer to migrate to our technology, because we think it is the best, but we will run our competitors’ technology,” says Roy Cochrane, IBM business development executive for the European ‘ebusiness on-demand’ division. “If anything, that does not always make us very popular within the [IBM] organisation.”

Moreover, some analysts dismiss comments by Dell executives as ‘sour grapes’, claiming that the company’s commodity-level systems strategy means it is not yet in a position to provide high-end utility computing ‘solutions’.

In any case, even the largest suppliers recognise that they could not sustain single vendor lock-in, even if they wanted to.

A $10bn punt

Given the slow build-up of early adopters, the investments in the technology by the big suppliers seem to have an element of market-making to them. Thus, when the CEO of IBM, Sam Palmisano, told a press conference in October 2002 that he was placing a $10 billion bet on utility computing, or ‘ebusiness on demand’, the move represented one of the biggest wagers in business history.

Certainly, he was in a hurry to spend the money – some $3.5 billion on IT services company PwC Consulting, for example, as well as untold millions developing utility computing technologies, overhauling IBM’s network of data centres and setting up a number of ‘on-demand development centres’ around the world.

Nor is that confidence misguided. One discipline fashionable in business schools is ‘complexity theory’ – the idea that as systems become more complex, they grow autonomous and begin to organise themselves. The typical data centre has certainly become more complex, but it has not yet developed those qualities of autonomy and self-organisation – compelling IT directors to throw people at the problem instead. That is an “embarrassingly inefficient” solution, say analysts at Grid Technology Partners, a Californian grid-computing specialist.

Utility computing provides an answer. Sharing resources and adding self-healing functions – two important tenets of utility computing – will increase technological complexity while reducing the cost of running a data centre. And therein lies its appeal.

But if utility computing is the answer to complexity, it follows that a business applying its methods must already have sufficiently complex IT. “Organisations have to have enough complexity to be able to justify the [entry] cost – from 30 servers and up,” says Gartner analyst Jim Cassell. “If you have only three servers, or if the servers are all dedicated to simple activity, utility computing doesn’t do anything for you.”

Nevertheless, service providers are optimistic that, during the economic downturn, they will be able to capture customers who would not ordinarily have considered outsourcing as a way of reducing their internal IT costs. Most customers have still to be persuaded. But it is inevitable that utility computing will be widely applied. The business rationale is simply too strong to ignore.

“There are many benefits to utility computing, but the force behind it can only come through demand and pressure from customers… it will be several years before it is realised in its complete form,” says Hjalsted.

Favoured utility computing service providers*
IBM 74%
AT&T 29%
HP 29%
EDS 15%
Sun 15%
Source: IDC. *US enterprises were asked to name three companies they felt were well placed to bring utility computing to market.