Looking out over Camp Williams, 25 miles south of Salt Lake City, Utah, it’s possible to glimpse one vision of the future of the data centre. There, work has just begun on the construction of the US National Security Agency’s $1.2 billion state-of-the-art data centre.
When complete, the 100,000 sq ft space will become the nerve centre of US intelligence operations. The scale of this particular project says something of the pivotal role that data and the facilities that house it play in national security today.
Monolithic facilities such as this, however, remain the exception. In the UK, there are only a limited number of sites that could meet the power and telecoms requirements of such an ambitious project. Indeed, competition for data centre space in the country remains fierce, despite the sluggish economy. That is evident in the prices being charged by third-party data centre providers, which have been steadily raising rents. And according to market watcher The Broad Group, third-party data centre space providers will increase their revenues by 17% in the coming year.
But it is not just supply and demand economics that are shaping the UK’s data centre landscape. Within the walls of the data centre a number of technical challenges are forcing IT leaders to evaluate whether existing facilities are fit for purpose – or can be made so.
Data centre efficiency has long been on IT chiefs’ radar, but it has taken on additional significance in the past year. A case in point is multinational oil giant Shell, which last year kicked off a global energy efficiency project focused on improving its IT infrastructure.
Energy efficiency has been pushed further to the fore thanks to government intervention. The first phase of the UK’s Carbon Reduction Commitment (CRC) Energy Efficiency Scheme – a mandatory, auction-based cap-and-trade scheme – came into force in April 2010. This covered 5,000 of the UK’s largest energy users, forcing them to measure and report on their energy consumption and carbon emissions.
However, following the coalition government’s Comprehensive Spending Review, the carbon-trading phase of the scheme has been pushed back from 2013/14 to 2014/15. That buys IT leaders some time to ensure that their data centres are operating as efficiently as possible. Still, the financial imperative fuelling that efficiency drive – the impetus to analyse the computational power gained for each unit of energy consumed – remains pressing.
The Green Grid, an IT industry consortium focused on green data centres, has led the way in providing ways to measure energy use. Its Power Usage Effectiveness (PUE) rating system is rapidly becoming the de facto measurement of data centre efficiency. In 2010, the Green Grid sought to make it easier to quantify data centre operations in terms of carbon footprint. It published its system – known as Carbon Usage Effectiveness (CUE) – for determining the volume of greenhouse gas emissions in delivering work from the IT equipment in a data centre. CUE should help business leaders to understand the extent to which their IT operations will impact upon their carbon allowances.
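Both Green Grid metrics boil down to simple ratios: PUE divides total facility energy by the energy delivered to IT equipment, while CUE divides the facility’s greenhouse gas emissions by that same IT energy. A minimal sketch of the arithmetic – the figures below are illustrative assumptions, not Green Grid data:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy.
    The ideal is 1.0; legacy facilities often run at 2.0 or higher."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: kgCO2e emitted per kWh of IT energy."""
    return total_co2_kg / it_equipment_kwh

# Hypothetical annual figures for a mid-sized facility (assumed):
facility_kwh = 1_800_000   # energy drawn by the whole facility
it_kwh = 1_000_000         # energy consumed by the IT equipment alone
grid_factor = 0.5          # kgCO2e per kWh of grid electricity (assumed)

print(pue(facility_kwh, it_kwh))                 # 1.8
print(cue(facility_kwh * grid_factor, it_kwh))   # 0.9
```

Note that when all of a facility’s energy comes from one source, CUE is simply the grid’s emission factor multiplied by PUE – which is why cutting PUE feeds directly into a lower carbon bill.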
Small and dense
When IT managers come to scrutinise their data centres, the opportunities for wringing out efficiency gains are often obvious. Typically, large parts of the x86 server estate remain underutilised, while sparsely populated racks help to dissipate heat but gobble up floor space.
The past year has seen an increasing emphasis on small, dense data centre designs, reports analyst company Gartner. It predicts that new data centres will be able to provide a 300% increase in computing capacity while taking up 60% less space than existing data centres.
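Gartner’s figures imply a striking jump in compute density. A quick back-of-envelope check (the percentages are Gartner’s; the arithmetic is ours):

```python
capacity_gain = 1 + 3.0    # a 300% increase means four times the capacity
space_fraction = 1 - 0.6   # 60% less space means 40% of the floor area

# Computing capacity per unit of floor space rises by this factor:
density_multiplier = capacity_gain / space_fraction
print(density_multiplier)  # roughly a tenfold increase in density
```

A tenfold rise in capacity per square foot is what drives the cooling and airflow challenges discussed below.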
The transition to a compact, high-density data centre of the future presents today’s IT leaders with some interesting new challenges, not least of which is getting up to speed with non-IT related issues such as fluid dynamics. The flow of air – or even water – through a data centre is a critical component of its efficiency, ensuring that heat dissipation takes place as effectively as possible.
For some organisations, such as Formula 1 racing team Lotus Renault, the most effective way to deliver cooling has been to build the data centre underground, where ambient temperatures are reliably low.
Another approach is to harness fresh-air cooling. By using fresh air to cool its warm exhaust air, Hewlett-Packard has cut the carbon footprint of its newly opened 300,000 sq ft data centre in Wynyard, near Newcastle upon Tyne, by 12,500 tonnes a year, compared with what it might otherwise have been.
And according to market watcher Pike Research, power and cooling infrastructure has become the fastest-rising line item on the data centre budget. Over the next five years, it predicts that this will account for nearly half of all data centre spending.
But the major trend in today’s data centre continues to be virtualisation, as IT leaders aim to drive the utilisation of their server estate ever higher. The use of virtualisation technology from the likes of VMware, Microsoft and Citrix has been gaining momentum for several years. But as it has become a common feature of the data centre, IT chiefs are growing accustomed to the knock-on effect that virtual infrastructure has on physical systems. Over the past year, there has been a growing recognition that existing networks, whose design predates the fashion for server virtualisation, need an overhaul.
The New York Stock Exchange is emblematic of how some in the industry think the problem can be resolved. There, a programme is under way to flatten its network infrastructure, taking out network tiers. The result will be lower latency within the network, which in turn makes tasks such as migrating live virtual machines possible.
So while organisations with the clout of the US National Security Agency may be building massive new megastructures, for most data centre operators 2011 will be about continuing to squeeze efficiency from existing facilities and making new-builds as compact as possible.