Service-centric Computing – A disruptive force

“We are,” says Dr. John Manley, director of Hewlett-Packard’s Bristol-based utility computing laboratory, “at the beginning of the fifth age of computing.” Like the Internet era it succeeds, this new era of ‘service-centric computing’ will revolutionise business computing, increasing the power available even to small companies and individuals and reshaping the global IT industry in the process.

With such momentous changes afoot, vendors are urgently researching two key questions: what technologies will drive the service-centric computing era, and what will the IT industry look like once they have been found?

So far, HP’s exploration of the first question has centred on the development of concepts such as grids, utility data centres, ‘on-demand’ computing, virtualisation and provisioning. However, the efforts to answer the second question – on the future shape of a utility-based industry – are altogether less mature.

Certainly, it is no surprise to find that Manley’s group at HP is pushing hard to refine the virtualisation technology required to build a service-centric computing world. “We are transforming the whole of the HP Labs organisation into an ‘adaptive enterprise’ built on a utility infrastructure,” says Manley.

To that end, HP has consolidated its three Bristol data centres down to one, and is on course to increase its rack-mounted machines from 500 to 2,000. They will support all of the 170 principal applications used by HP Labs, allowing HP to benefit from substantial savings.

However, the company’s ultimate objective is to provide a prototype platform for the adaptive enterprise environment.

‘Utility farms’

The whole adaptive enterprise concept seeks to create a computing infrastructure that is so flexible that it can be quickly and dynamically mapped to an organisation’s changing business requirements. Pooling servers, storage and network devices into a single virtual resource is the first step; the next is to provide management tools that allow this resource to be sliced and diced to meet shifting demand with minimal, and ultimately no, human intervention.

To do this HP is developing the concept of templated systems or ‘farms’. Using a graphical management console, administrators construct these farms by dragging and dropping icons representing servers, routers, disks and, eventually, application software. These templates are then submitted to a super-administrator and, so long as the resources are available within the resource pool, and the request complies with predefined business use parameters, the desired resource package is automatically made available.

Using this templated method, HP is able to demonstrate from scratch the design, construction and ‘ignition’ of a web server capability in roughly 20 minutes. And because the process involves creating a template, “it is fast, repeatable, reliable and very flexible”, says Manley.

HP’s research suggests that this kind of flexible ‘utility computing fabric’ may have a dramatic long-term impact on IT practice and the supply-side industry.

Horsepower on tap

Bristol is a global centre of the animation industry, home to a number of the kind of small- to medium-sized companies that typically inhabit this sector. To test the utility service business model against its new utility fabric, HP invited one of these companies, 422, to create a short animated film, using the HP data centre to handle the graphics rendering.

Rendering is the most compute-intensive part of animation. A single frame can take two hours of computation, and film runs at 25 frames per second. Given that a finished film may use only 15% of the total frames created, even a four-minute film could consume 40,000 frames – or 80,000 hours of compute time.
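These figures can be reproduced with a quick back-of-the-envelope calculation (a sketch, assuming a strict 15% usage rate and the two-hour-per-frame render time quoted above):

```python
# Back-of-the-envelope check of the rendering figures:
fps = 25                # frames per second of finished film
film_seconds = 4 * 60   # a four-minute film
hours_per_frame = 2     # render time quoted per frame

final_frames = fps * film_seconds    # frames in the finished cut
total_frames = final_frames / 0.15   # only 15% of rendered frames survive
compute_hours = total_frames * hours_per_frame

print(final_frames)           # 6000
print(round(total_frames))    # 40000
print(round(compute_hours))   # 80000
```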

Using HP’s utility data centre, 422 was able to remotely access computing power as it was required – at one point its animators were using a virtual machine containing 104 processors. According to Andy Davies-Coward, 422’s chief creative officer, a 6,000 frame film that would normally have taken the company 50 days to create was completed in 17 days.
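A rough calculation shows why pooled processors matter here. This sketch assumes the two-hour-per-frame render time quoted earlier, perfect parallelism and round-the-clock running, none of which real jobs fully achieve:

```python
# Idealised wall-clock time for rendering a 6,000-frame film on 104 processors.
frames = 6000
hours_per_frame = 2
processors = 104

cpu_hours = frames * hours_per_frame   # 12,000 CPU-hours of rendering
wall_hours = cpu_hours / processors    # ideal wall-clock time, fully parallel
print(round(wall_hours, 1))            # ~115.4 hours, i.e. under five days
```

On those idealised assumptions the rendering itself fits into under five days; the observed 50-to-17-day reduction (roughly a threefold speed-up) reflects the fact that rendering is only one part of the production schedule.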

The significance of this for the animation business is enormous. It means that small companies will be able to afford the same computing resources that have previously only been available to industry giants such as Pixar. But what does it mean for the IT industry?

For vendors of applications such as Maya, the rendering software used by 422, it means that more companies can now afford the platforms needed to make their applications practicable. For telecoms operators, it suggests that there will genuinely be a vibrant market for heavy-duty broadband services. And, for a myriad of other smaller service providers and aggregators, a whole new set of opportunities may be about to emerge.

Ironically, when the utility role of major hardware players is considered, the picture starts to look a little more threatening. Providing the raw resources – ‘the bits, bytes and cycles’ – that drive utility computing services is likely to be a very low-margin business, but it is precisely the part of the utility computing equation that companies such as HP are currently best equipped to deliver.

This must give HP, IBM, Sun and even Dell pause for thought. As Manley points out, “[Today] there is a lot of talk about cost reduction. I think utility computing is really about disrupting industry structures, business models and ways of working.”

Ben Rossi