Laying the foundations for virtualisation

For too long, the IT function has been a laggard, unable to respond sure-footedly to the shifting sands of the enterprise. So says Pat Gelsinger, the tech industry veteran currently chief of virtualisation maven VMware.

As he strode across the stage in Barcelona at the firm's annual European user gathering this autumn, he had a singular message for the assembled throng: 'It's time we had IT infrastructure that operates at the speed of business,' he exhorted.

According to Gelsinger, the only way to deliver an IT function that can react instantaneously to business demands is through the soup-to-nuts implementation of virtualisation software across the datacentre.

That means decoupling the applications at the heart of the enterprise from the physical hardware of the datacentre. 'Virtualisation has been the most powerful tool for IT over the last decade for transformation and cost savings,' Gelsinger told his audience.

By extending virtualisation to cover 100% of an enterprise's compute function – desktop and datacentre applications alike – as well as its network and storage, all wrapped up in management tools, firms can deliver on-demand IT, argues Gelsinger, making the IT function both responsive and cost-effective.

And in the event that a datacentre hits peak capacity, this all-encompassing use of virtualisation can allow firms to seamlessly add capacity from a separate facility – or even a public cloud. The result is dubbed the software-defined datacentre (SDDC).

This vision has been enthusiastically embraced by some industry watchers. Forrester Research analyst Richard Fichera argues that the integration of legacy IT infrastructure with newer virtual machine-based and cloud architectures will see the SDDC become the defining datacentre architecture over the next five years.

Yet while the SDDC provides a seductive image of how the IT function might evolve to meet whatever challenges the business throws at it, the path to achieving it is by no means smooth. Indeed, those considering adopting an SDDC approach could face the uncomfortable realisation that the best place to start is not with the datacentres they currently have.

Facilities originally built for mainframes, or perhaps client-server architectures, may struggle to adapt to this new software-defined world.

To understand some of the problems on the horizon, it's instructive to consider how server virtualisation has evolved.

In its 2013 State of Virtualization report, storage software firm DataCore estimated that two-thirds of enterprises now have more than half of their servers running virtualised mission-critical applications. That proportion is expected to rise to 80% by the year's end. Gelsinger's vision of 100% of applications running on virtual machines may be realised in the not-too-distant future.

But the increasing proportion of applications running inside virtual machines has some very real-world implications for the physical structure of the datacentre, says Matthew Baynes, enterprise sales director at datacentre infrastructure firm Schneider Electric. As server utilisation rates rise, there is the potential to create unexpected hot spots within the datacentre.

The whole notion of server virtualisation is predicated on achieving higher CPU utilisation rates on the remaining physical hosts. But in doing so, those servers draw more power. Furthermore, virtualised servers tend to be housed in the same part of the datacentre, creating localised high-density areas and potential hot spots. Unless firms plan for additional cooling in virtualised environments, they may be in for a nasty surprise, says Baynes.
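
As a rough illustration of the effect Baynes describes, consider what a 5:1 consolidation does to per-rack power density. The figures below are hypothetical, chosen purely to make the arithmetic visible:

```python
# Back-of-envelope sketch of how consolidation concentrates heat.
# All figures are illustrative assumptions, not vendor data.

servers_before = 100         # lightly loaded physical servers, spread out
watts_per_idle_server = 150  # assumed draw per lightly utilised server
racks_before = 10

servers_after = 20           # hosts remaining after 5:1 consolidation
watts_per_busy_host = 450    # assumed draw per heavily utilised host
racks_after = 2              # consolidated hosts packed into fewer racks

density_before = servers_before * watts_per_idle_server / racks_before
density_after = servers_after * watts_per_busy_host / racks_after

print(f"Per-rack load before: {density_before / 1000:.1f} kW")  # 1.5 kW
print(f"Per-rack load after:  {density_after / 1000:.1f} kW")   # 4.5 kW

# Total facility draw falls (15 kW -> 9 kW), yet the load on the two
# surviving racks triples - the localised hot spot Baynes warns of.
```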

And while server and even desktop virtualisation has become relatively commonplace in today's enterprise, software-defined networking and virtualised storage provision both promise to bring still more of the datacentre infrastructure under the virtualisation umbrella.

On the storage side, several emerging vendors, such as Nutanix, SimpliVity, Tintri and Virsto, are transforming physical storage infrastructure. This approach is particularly suited to firms considering virtual desktop clusters, Hadoop and big data environments, and private clouds, all of which feature prominently on IT leaders' to-do lists, says Forrester's Fichera.

'The next evolution of IT infrastructure architecture is happening now, shifting from a server-centric to a storage-centric approach,' he says.

These storage-centric products integrate x86 server architecture with storage capabilities, providing firms with a path to the SDDC, argues Fichera. They allow firms to increase their levels of storage virtualisation as they see fit, he adds.

For too long storage has been a major headache for datacentre operators, says Howard Ting, marketing chief of Nutanix. 'What's happening today is that firms are moving away from the restrictions of dedicated rigid [storage] assets to a more fluid, agile approach.'

But whereas server virtualisation introduced the possibility of increased heat generation in some aisles, Nutanix appliances, which combine compute and storage capabilities, help reduce the need for cooling, argues Ting.

Nutanix appliances can dramatically reduce the power drawn in the datacentre, he argues, because they replace dedicated server and storage systems, each of which typically has its own fans and CPUs. 'Our appliances actually see power and cooling requirements reduce,' he claims.

Nonetheless, the SDDC presents a further challenge for datacentre energy consumption, even if less power and cooling are needed in the rack, warns Schneider's Baynes. This is the power usage effectiveness (PUE) problem of the SDDC, he adds.

In many existing datacentres, power and cooling infrastructure were baked in at the outset, with particular demands in mind. So if the power and cooling requirements of the racks drop thanks to virtualisation, the facility is still geared up to deliver more than is needed – and the PUE rating gets worse.
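
A simple worked example makes the arithmetic plain. PUE is total facility power divided by IT equipment power; the figures below are assumed, and the facility overhead is treated as fixed, which legacy plant roughly approximates:

```python
# PUE = total facility power / IT equipment power.
# Assumed figures, for illustration only.

facility_overhead_kw = 500  # cooling, UPS losses, lighting - sized at build time

it_load_before_kw = 1000    # the IT load the facility was designed for
it_load_after_kw = 600      # the IT load once virtualisation thins the racks

pue_before = (it_load_before_kw + facility_overhead_kw) / it_load_before_kw
pue_after = (it_load_after_kw + facility_overhead_kw) / it_load_after_kw

print(f"PUE before consolidation: {pue_before:.2f}")  # 1.50
print(f"PUE after consolidation:  {pue_after:.2f}")   # 1.83

# The facility draws less power overall, but the fixed overhead is now
# spread across a smaller IT load - so the headline PUE figure worsens.
```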

'For many regulators looking at datacentre energy use, PUE has become the touchstone,' says Baynes.

This is a particular issue for legacy facilities; more recently, firms have adopted a more modular approach to datacentre design, which is better able to account for fluctuations in power demand, providing only the minimum level of cooling needed, adds Baynes.

Nonetheless, legacy datacentres are an everyday reality for most enterprises. To fill the void, a new breed of datacentre infrastructure management (DCIM) tools is emerging, such as those from Optimum Path.

These tools are intended to enable IT decision-makers to understand the power characteristics of their IT equipment – whether that be storage, compute or networking. That insight can then be used to forecast power and cooling requirements with the datacentre operating under varying conditions.
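
The kind of calculation such tools automate can be sketched in a few lines. The model below is a deliberately simplified illustration – the linear utilisation-to-power model and every constant in it are assumptions, not Optimum Path's actual product logic:

```python
# Minimal sketch of a DCIM-style power and cooling forecast.

def device_power_kw(idle_kw: float, peak_kw: float, utilisation: float) -> float:
    """First-order model: interpolate linearly between idle and peak draw."""
    return idle_kw + (peak_kw - idle_kw) * utilisation

# Inventory of (class, idle kW, peak kW, forecast utilisation) - assumed values.
inventory = [
    ("compute", 0.15, 0.45, 0.80),
    ("storage", 0.30, 0.50, 0.60),
    ("network", 0.10, 0.20, 0.40),
]

it_load_kw = sum(device_power_kw(idle, peak, util)
                 for _, idle, peak, util in inventory)

# Crude rule of thumb: cooling capacity sized to match the IT load.
cooling_load_kw = it_load_kw * 1.0

print(f"Forecast IT load:      {it_load_kw:.2f} kW")
print(f"Forecast cooling load: {cooling_load_kw:.2f} kW")
```

Re-running the same model under different utilisation forecasts shows how requirements shift as workloads are consolidated or migrated – the 'varying conditions' these tools are designed to explore.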

Indeed, some of these new DCIM systems even promise to check energy prices in real time, enabling datacentre operators to make their facilities still more cost effective. So for firms that truly are hellbent on embracing the SDDC, is yet more software what they really need to keep things ticking over smoothly?
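
In outline, a price-aware scheduling feature might look something like the sketch below – a hypothetical illustration with a stubbed-out price feed, not a description of any shipping DCIM product:

```python
# Hypothetical sketch: defer flexible batch work to the cheapest hour.
# Real tools would pull live tariff data; this feed is assumed.

hourly_prices_gbp_per_mwh = [92, 88, 85, 83, 90, 110, 140, 160]

def cheapest_hour(prices: list[float]) -> int:
    """Return the index of the lowest-priced hour in the window."""
    return min(range(len(prices)), key=lambda h: prices[h])

hour = cheapest_hour(hourly_prices_gbp_per_mwh)
print(f"Schedule deferrable workloads for hour {hour} "
      f"at £{hourly_prices_gbp_per_mwh[hour]}/MWh")
```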

Not everybody is altogether convinced. Analyst house 451 Research suggests that spending on DCIM tools is set to rise significantly over the coming years, topping $1.8bn by 2016, up from $621m in 2013.

While the benefits of DCIM may seem readily apparent, notes Rhonda Ascierto, a research manager at 451 Research, it is a cost that many businesses have not accounted for in their initial datacentre planning. Furthermore, 'because DCIM crosses the typical divide between IT and facilities departments within companies, there are questions over whose budget should pay for DCIM,' she adds.

If software truly is the panacea for the complexity and cost of the traditional datacentre, implementing it looks like being the single biggest challenge.
