Turning IT into a utility

Since the beginning of 2002, one item has dominated the agenda of the major computer vendors: the rationalisation of IT architectures to deliver computing power as a dependable, efficient service. What has driven that focus at companies such as IBM, HP and Sun Microsystems is pressure from customers.

To address Y2K issues, to take advantage of open systems, to gear up for the Internet boom, and to devolve IT decision-making to departments, organisations had invested heavily over the previous five years, organically growing their infrastructures to hundreds or even thousands of systems, many of them incompatible and most running at a fraction of their full capacity.

Utility computing, along with the related approaches of grid and autonomic computing, enables suppliers to attack that complexity and, for the first time, allows organisations to consume computing power as an on-demand service akin to electricity or other utilities.

How it works

Utility computing leverages a host of emerging technologies to reduce computing infrastructure complexity and deliver IT as a scalable, efficient service.

That is achieved by consolidating and networking together core server and storage resources. Virtualisation software enables administrators to view these as a single pool of resources, and provisioning software can then rapidly allocate to applications the processing resources they require to meet fluctuating demand.

Systems management software automates much of this resource allocation, while, at a higher level, policy management tools can reconfigure and serve up resources automatically when pre-defined thresholds are met.
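
As a rough sketch of how such a threshold-based policy might be expressed, consider the following; the class names, thresholds and monitoring figures are invented for illustration and do not represent any vendor's actual product:

```python
# Hypothetical sketch of policy-driven provisioning from a shared pool.
# All names and figures are illustrative, not any vendor's API.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    servers: int = 1               # servers currently allocated
    cpu_utilisation: float = 0.0   # 0.0 - 1.0, reported by monitoring

@dataclass
class ResourcePool:
    free_servers: int              # consolidated, virtualised capacity

    def allocate(self, app: Application, count: int = 1) -> None:
        granted = min(count, self.free_servers)
        self.free_servers -= granted
        app.servers += granted

    def release(self, app: Application, count: int = 1) -> None:
        returned = min(count, app.servers - 1)   # keep at least one server
        self.free_servers += returned
        app.servers -= returned

def apply_policy(pool: ResourcePool, app: Application,
                 high: float = 0.8, low: float = 0.3) -> None:
    """Reconfigure resources when pre-defined thresholds are crossed."""
    if app.cpu_utilisation > high:
        pool.allocate(app, 1)      # demand spike: serve up another server
    elif app.cpu_utilisation < low:
        pool.release(app, 1)       # demand has fallen: return capacity

pool = ResourcePool(free_servers=10)
billing = Application("billing", servers=2, cpu_utilisation=0.92)
apply_policy(pool, billing)
print(billing.servers, pool.free_servers)   # 3 9 - one extra server provisioned
```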

Metering software tracks usage levels, so that different departments are billed only for the resources they consume.
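
A toy chargeback calculation illustrates the principle; the departments, CPU-hour figures and rate below are invented purely for the example:

```python
# Illustrative metering and chargeback calculation; the rate and the
# usage records are made up for the example.

from collections import defaultdict

RATE_PER_CPU_HOUR = 0.12   # hypothetical internal chargeback rate

# (department, cpu_hours) samples collected by the metering software
usage_log = [
    ("finance", 120.0),
    ("marketing", 35.5),
    ("finance", 80.0),
    ("logistics", 210.25),
]

def monthly_bill(log):
    """Aggregate metered usage and bill each department for what it consumed."""
    totals = defaultdict(float)
    for department, cpu_hours in log:
        totals[department] += cpu_hours
    return {dept: round(hours * RATE_PER_CPU_HOUR, 2)
            for dept, hours in totals.items()}

print(monthly_bill(usage_log))
# {'finance': 24.0, 'marketing': 4.26, 'logistics': 25.23}
```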

Two related models complement the utility computing approach. Grid computing spreads the processing workload of a specific computing task over a network of servers or even PCs, returning the results of the distributed processing to the originating server.
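
The sketch below shows the grid idea in miniature, using worker processes on one machine to stand in for networked servers or PCs; the function names and chunking scheme are illustrative only:

```python
# Minimal sketch of the grid model: one task split into chunks, farmed out
# to worker processes (standing in for remote nodes), with the partial
# results gathered back on the originating machine.

from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    """Work done on a 'remote' node; here, just sum the squares of the chunk."""
    return sum(x * x for x in chunk)

def run_on_grid(data, workers=4):
    chunk_size = max(1, len(data) // workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=workers) as grid:
        partial_results = list(grid.map(process_chunk, chunks))
    # The originating server combines the distributed partial results.
    return sum(partial_results)

if __name__ == "__main__":
    print(run_on_grid(list(range(1_000_000))))
```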

Additionally, autonomic computing extends traditional fault-tolerant approaches to the network, enabling servers to protect their environment by passing their workload to other devices when a failure occurs.
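
In simplified form, that failover logic might resemble the following sketch, in which a monitoring routine moves workloads off a failed server onto the least-loaded healthy peer; the server names and data structures are invented for illustration:

```python
# Hedged sketch of the autonomic idea: when a health check fails, the
# system moves that server's workload onto a surviving peer without
# operator intervention. Everything here is illustrative.

servers = {
    "app-01": {"healthy": True,  "workload": ["orders", "payments"]},
    "app-02": {"healthy": True,  "workload": ["reporting"]},
    "app-03": {"healthy": False, "workload": ["catalogue"]},   # just failed
}

def fail_over(fleet):
    """Move workloads off failed servers onto the least-loaded healthy peer."""
    healthy = [name for name, s in fleet.items() if s["healthy"]]
    for name, server in fleet.items():
        if server["healthy"] or not server["workload"]:
            continue
        target = min(healthy, key=lambda n: len(fleet[n]["workload"]))
        fleet[target]["workload"].extend(server["workload"])
        server["workload"] = []    # the failed node now carries nothing

fail_over(servers)
print(servers["app-02"]["workload"])   # ['reporting', 'catalogue']
```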


Business drivers

The business drivers behind the move to utility computing can be summed up in two words: complexity and cost.

"Today’s server infrastructures are a mess," says Galen Schreck, an analyst at Forrester Research. As organisations deployed applications over the late 1990s and into this decade, they built vast server farms and storage silos that have been expensive to administer, under-utilised, and inflexible when it comes to deploying new applications.

Server consolidation has only gone so far, failing to untie applications from specific hosts.

Organisations now want to treat their servers and storage as a single, virtual system, and to have the flexibility to move workloads around so that those resources are exploited fully and cost-effectively.


Paybacks

By building a single pool of resources that can be tapped on demand, organisations should be able to:

  • lower their expenditure on servers, storage and IT administration;

  • match IT resources to changing business demands and implement new services more rapidly;

  • meter and bill usage accurately;

  • reduce risk by paying only for the IT resources they consume.


Data centre efficiency

Executives such as Sam Palmisano, CEO of IBM, and Scott McNealy, CEO of Sun, predict that utility or on-demand computing will drive new economics in the IT department. Sun, for example, forecasts a massive effect on data centre efficiency:

Greater utilisation: system utilisation rates will rise from an average of 6%-15% to over 80%.

Lower administration overheads: the average number of servers managed by each systems administrator will jump from 15-30 to in excess of 500; each database administrator will be able to manage 100 times more data; and network administrators should be able to oversee over 500 ports, rather than the 50-100 they manage today.

Time to service deployment: with the help of provisioning software, the time required to bring a new service online will be reduced from weeks to a matter of hours.
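
To put the first of those claims in perspective, a rough back-of-envelope calculation using the mid-point of Sun's quoted range (the workload figure itself is invented) shows how sharply the server count could fall for the same demand:

```python
# Back-of-envelope illustration of the utilisation claim, using the
# mid-point of the quoted 6%-15% range; the workload figure is invented.

workload_cpu_units = 100          # hypothetical total demand
today_utilisation = 0.10          # roughly the mid-point of 6%-15%
target_utilisation = 0.80

servers_today = workload_cpu_units / today_utilisation      # 1000
servers_target = workload_cpu_units / target_utilisation    # 125

print(f"Servers needed today: {servers_today:.0f}")
print(f"Servers needed at 80% utilisation: {servers_target:.0f}")
# The same workload runs on roughly an eighth of the hardware.
```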


Early adopters

Utility computing is at the proof-of-concept stage, so vendors are still building their reference site lists.

IBM cites financial services companies JP Morgan Chase and American Express as its flagship users, while Hewlett-Packard points to Kelloggs and Jaguar Racing. There is also a growing list of commercial users of grid computing, including engine builder Pratt & Whitney and share dealer Charles Schwab.


The name game

The rush to establish mindshare – and early market leadership – in utility computing has had vendors, analysts and consultants scrambling to stake out their own definitions of the new approach.

After pioneering grid computing and autonomic computing, IBM has chosen to focus on the business deliverables by grouping together its initiatives under the concept of ‘E-business on demand’.

Perhaps its reluctance to embrace the utility term stems from the fact that Hewlett-Packard was first to market with its Utility Data Center product in 2001 and has tried to take ownership of the utility computing catchphrase ever since.

In comparison, Sun Microsystems seems to want to cover all bases, saying its N1 architecture is "an extension of both the grid and utility models". Sun itself sells a Grid Engine, but has yet to coin a label for what N1 is, although it has been flirting with "Just-in-time computing".

Several analyst groups want to go even further. Forrester Research, for example, refers to the new architecture as Fabric-Based Computing. It defines that as "a computing model that provides utility-like processing power on demand using a non-proprietary high-speed network", putting great emphasis on the construction of a fast network backbone for moving application modules and data around.

Outdoing everyone in its contrariness, though, is analyst group IDC, which refers to the new approach to IT delivery as ‘computing utility services’.


Quote/unquote

What the suppliers say

"With utility computing, we are building computers out of the network. The hardware is easy; the hardest part is systems software and virtualisation."
Sun Microsystems CEO Scott McNealy

"[We are making] a $10 billion bet, to put all of these things together to make on-demand e-business a reality. A bold bet, yes. A risky bet? I don’t think so."
IBM CEO Sam Palmisano

What the analysts say

"To combat spiralling IT complexity, data centres will be transformed into networked utilities."
Galen Schreck, Forrester Research  
