Utility computing ‘will change the IT industry’

10 January 2003

Petroleum Geo-Services (PGS), the flagship customer for IBM’s new utility-based supercomputing strategy, has said that the ‘on-demand’ technology model could spark a major change in the way that IT is bought and deployed.

Chris Usher, president of worldwide data processing at PGS, said he expects the company to save up to $1.5 million (€1.4m) in 2003 by outsourcing peaks in demand for supercomputing processing to IBM’s new utility computing facility in Poughkeepsie, New York.

PGS is a $1.2 billion (€1.14bn) revenue company that provides geophysical and oil production services.

The facility was built by IBM to cater for the needs of industries such as petroleum services, life sciences and digital media, which often need to perform numerically intensive calculations at short notice. The facility is part of IBM’s ‘on-demand’ initiative, which uses technologies such as utility and grid computing to provide low-cost processing power as and when organisations need it.

“Customers in some sectors want access to large scale computing power in short bursts,” said David Turek, vice president of IBM Supercomputing. “The ability to buy computing power on-demand allows customers to save on server maintenance, management and to scale their infrastructure rapidly, in response to business demands,” he added.

For example, PGS is using the facility to provide additional processing power for a three-month project in Mexico. In addition to the cost savings, Usher said that the ‘on-demand’ service had removed the 6-8 weeks it usually takes to provision in-house supercomputing power for such projects.

According to Usher, such benefits will quickly persuade many of PGS’s competitors to sign up to similar services – particularly in view of the current uncertainty surrounding petroleum prices. “Right now, everyone is worrying about petroleum costs… and it would be very effective to pool [IT processing capacity] amongst competitors,” he said.

Usher added that PGS will consider moving more of its supercomputing infrastructure over to IBM during 2003, depending on the success of this project.

As part of its ‘on-demand’ initiative, IBM is also offering more conventional enterprise applications to medium-sized and large organisations based on its utility computing technology. IBM Global Services has attributed its success in winning major outsourcing deals with financial services giants JP Morgan Chase and Deutsche Bank to its adoption of utility computing, which it says enabled it to offer a bigger ‘pay as you go’ element in the contracts.

However, PGS’s Usher points out that ‘on-demand’ will not be ideal in every situation. Although outsourcing applications to a utility computing infrastructure will almost certainly be cheaper for a company that has no spare capacity to scale up its in-house IT, Usher said that the cost of outsourcing supercomputing processing becomes unattractive after six to eight months, compared with the cost of running the same capacity in-house.

The February issue of Information Age will include a special report on utility computing.