It used to be that executives regarded their organisation’s data centre as little more than an expensive shed for housing the company computers. Today, though, as IT’s role as the engine room of corporate performance has been widely recognised, those data centres are seen in a different light: as highly valuable strategic assets. But as the data centre has garnered greater executive attention, it is becoming apparent that a further transformation is needed: from shed to greenhouse.
Radical changes in server technology are redefining the requirements of most data centres. Where once the primary consideration was physical space, today’s priorities are power, cooling and energy efficiency. Blade servers and high-density racks are proving to be game-changing technologies – but while they offer businesses welcome flexibility in their infrastructure, their effect on the data centre is profound. The ability to get enough power in and the excess heat out is redefining the art of data centre design.
Power-hungry and heat-generating servers demand a great deal of additional equipment to support them – souped-up air conditioning systems or even water-cooled chassis. On the whole, retro-fitting existing data centres to deal with these demands is simply too costly and too complex to be viable, says Rakesh Kumar, a research vice president at IT advisory group Gartner. That means, for many businesses, building new data centres.
Data centre design has to start with what might seem like a non-strategic decision: where to put it. Traditionally, data centres have been housed in close proximity to business operations – often in basements and out-buildings. But the wisdom of this is being questioned.
“So far, electricity use in data centres has been ignored by regulators. But if governments want to meet carbon emission targets, that will change.”
Andrew Jay, CB Richard Ellis
The shortage of all types of office space within busy metropolitan areas has pushed up accommodation costs. And in balancing the need to house employees against the need to find additional space for the company’s servers, cubicles and water coolers are deemed less of an overhead than environmentally controlled clean rooms and reinforced, raised flooring. As a result, many companies are looking outside their city centre headquarters for new IT sites.
Lessons from those that have already gone down that road suggest that there are some simple ground rules that can be applied when re-siting a data centre.
Firstly, there is the need for high-bandwidth network links. Richard Warley, European managing director of data centre hosting company Savvis, says ensuring access to two ‘fat network’ pipes is common, but three is ideal. “You’d be surprised how often we see network outages, and when it comes to transporting business-critical data, you can’t afford to be cut off.”
While business leaders may have given up on city centre spots, location remains a critical decision point: sound business continuity reasons rule out siting any data centre on land on or near a flood plain – as many parts of South East England are. For the same reasons, sites located on or near flight paths are dismissed. And any point where insufficient power can be drawn from the national grid is also struck from the list.
The specific requirements of a data centre, and the number of organisations chasing sites, is giving rise to a shortage of suitable land, warns Gartner’s Kumar. “A lot of it has been taken already; property developers have been wise to the fact and have been speculatively acquiring it.”
This atmosphere has sparked some hoarding – having found a suitable plot, organisations often decide to grab as much acreage as possible to ensure room for future expansion. But while that may, on the face of it, appear to be an extravagant approach, it is in fact a cunning, cost-effective ploy, says Kumar.
Tax-saving technologies
As part of the Enhanced Capital Allowance Scheme, UK businesses can claim tax relief on investments made in energy-efficient equipment. Under the scheme, businesses can claim 100% first-year capital allowances for qualifying purchases.
The list of qualifying technologies was broadened in September 2006, making it easier than ever for businesses to invest in energy-efficient data centres. It includes such items as efficient generators and coolers, heat-scavenging equipment and monitoring systems.
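The appeal of a 100% first-year allowance is easiest to see as arithmetic: the relief is accelerated rather than increased. The sketch below compares year-one tax relief under the scheme with an ordinary writing-down allowance; the 25% writing-down rate and 30% corporation tax rate are illustrative assumptions, not figures from the scheme itself.

```python
def year_one_tax_saving(spend, tax_rate, first_year_allowance):
    """Tax saved in year one: spend deductible in year one times the tax rate."""
    return spend * first_year_allowance * tax_rate

# Illustrative assumptions: £100,000 of qualifying equipment, 30% corporation
# tax, and a 25% ordinary writing-down allowance for comparison.
spend, tax_rate = 100_000, 0.30

eca = year_one_tax_saving(spend, tax_rate, 1.00)   # 100% first-year allowance
wda = year_one_tax_saving(spend, tax_rate, 0.25)   # ordinary writing-down allowance

print(round(eca))  # → 30000: relief taken in year one under the scheme
print(round(wda))  # → 7500: year-one relief under the ordinary allowance
```

The total relief over the asset’s life is the same either way; the scheme’s benefit is purely one of cash flow, bringing the deduction forward to the year of purchase.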
For anyone looking at building a new data centre, such investments should already be a no-brainer, says Andrew Jay, of construction services group CB Richard Ellis. The tax incentive scheme simply adds to the allure. Equipment such as air-to-air recovery machinery – which can be used to capture and reuse heat – has “a very quick” payback, says Jay.
At Sungard Availability’s London Technology Centre, based near Heathrow airport, such equipment has helped the company in its efforts to meet energy efficiency targets. “Fluctuating power costs make it a pretty straightforward investment case,” says Sungard’s services development manager Chris Higgins.
But not everyone is convinced of the value of such equipment in the data centre: most businesses simply want heat out of the building, and there is limited appetite for the surplus.
While much has been made of the possibilities of putting the heat generated by server farms to better use – in theory it could be transferred to nearby hospitals or residential units – such ideas receive short shrift from experts such as Ian Bitterlin, vice-president at generator maker Active Power.
The heat generated within a data centre is ‘low-grade’, he explains: it needs to be used on the spot. “About the only profitable use for this heat would be to grow cannabis,” he says.
While this land-banking approach means large up-front investment, the value of the land is only going to increase – and the capital outlay can be written down over the five-to-ten-year lifespan of the data centre.
Before any building can start, however, technology decision-makers also need to fully understand the business purpose of their data centre, particularly around the question of uptime.
The data centre think tank, the Uptime Institute, lists four classes of data centre, based on their availability. The highest class, Tier 4, is one that is available for at least 99.995% of the time. That equates to about half an hour’s downtime a year for a facility running 24/7.
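The “half an hour” figure follows directly from the availability percentage: downtime is simply the unavailable fraction of a year. A quick sketch of the arithmetic:

```python
def annual_downtime_minutes(availability):
    """Minutes of downtime per year for a 24/7 facility at a given availability."""
    minutes_per_year = 365.25 * 24 * 60
    return (1 - availability) * minutes_per_year

# Tier 4: at least 99.995% availability
print(round(annual_downtime_minutes(0.99995), 1))  # → 26.3 (about half an hour)
```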
The level of redundancy required to meet those performance criteria is immense, and managers would be well advised to carefully consider their needs before committing to such top-of-the-range facilities.
Sungard Availability Services operates a Tier 4 data centre in the industrial estates surrounding London’s Heathrow Airport. The 130,000 sq ft site can deliver up to 12 megawatts of power; in the event of a power cut, it has 5,000 batteries to supply an immediate failover, after which four diesel-powered generators kick in. Sungard says it has enough diesel in on-site fuel tanks to keep the site running for 72 hours.
To remove the heat from the servers and other equipment, the site has an air conditioning system filled with 135,000 litres of chilled water. And to safeguard the building, there is an extensive IP-based CCTV system, identity card-controlled entry gates, as well as raised mounds, high-security fencing and a manned security gate. “Facilities like this don’t come cheap,” says Chris Higgins, services development manager at Sungard.
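That 72-hour claim implies a substantial fuel store. As a back-of-envelope sketch – assuming the site draws its full 12MW and that a large diesel generator set burns roughly 0.27 litres of fuel per kWh generated (an illustrative figure, not one supplied by Sungard):

```python
def diesel_litres_needed(load_kw, hours, litres_per_kwh=0.27):
    """Fuel required to carry a given electrical load for a given duration,
    at an assumed generator burn rate."""
    return load_kw * hours * litres_per_kwh

# 12MW carried for 72 hours at the assumed burn rate:
print(round(diesel_litres_needed(12_000, 72)))  # → 233280 litres
```

In practice the site would rarely run at full load, so the real requirement would be lower – but the sketch shows why on-site fuel tanks are a major element of Tier 4 design.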
Not every company wants to go that far, but even with the new data centre site located and the budget in place, there are further hurdles, not least of which is obtaining planning permission. Local authorities have woken up to the practical considerations that surround data centres, says Andrew Jay, a senior director at building services group CB Richard Ellis. Noise issues can be a concern, he says: recently one of his clients was required to ensure that the data centre cooling system would be 14 decibels quieter than typical installations, to minimise disruptions to neighbours. Consequently, “the cost of the chiller went up 50%.”
But that is small beer compared to some of the looming building regulations. Energy efficiency is a hot topic: the Greater London Authority, led by the Mayor Ken Livingstone, has recently established new rules governing large building projects that mandate the use of renewable energy sources or require the owners to demonstrate the building is highly energy efficient.
Environmental considerations are already beginning to bite: Jay describes how one client calculated how many wind turbines it would need to meet a target of sourcing 10% of its energy from renewables.
“Once the council was presented with the plan for the building of 32, 45-metre-high turbines at a site in West London, they soon came round to negotiating an alternative,” says Jay.
Such local authority rulings could just be the start. To date, national regulators have ignored the huge levels of power being drawn by corporate data centres, much of which is frequently used inefficiently. But as governments – the UK included – struggle to meet the carbon emission targets they have signed up to under the Kyoto Protocol, they are likely to turn their attention to heavy users, says Andrew Jay. That is likely to mean legislation.
Regulation may be less likely if the industry – both vendors and end users – can show it has taken steps to address inefficiency. But there is a long way to go, says Ron Mann, a director of enterprise infrastructure services at IT systems and services company Hewlett-Packard. While vendors are gradually coming to market with more power-efficient servers and cooling systems, many data centre managers are still ignoring best-practice guidelines, such as how to position racks to minimise heat build-up.
“You’d be amazed at the number of data centres I’ve walked into where they’ve ignored basic things like hot aisle, cold aisle,” he says.
However, despite the low starting point, there are today some examples of cutting-edge energy-saving data centres in operation. The US construction services company EIS Swinerton now gets 34% of the energy needed to power its Californian data centre from solar panels, saving itself 40% on electricity bills. It has calculated that this has reduced carbon dioxide emissions by 199,172 pounds (roughly 90 tonnes) each year.
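The CO2 figure gives a rough sense of how much grid electricity the panels displace. Dividing the stated saving by an assumed grid emissions factor – here 1.3 pounds of CO2 per kWh, an illustrative value of that era, not one supplied by Swinerton – gives the implied annual output:

```python
def implied_kwh(co2_saved_lb, lb_co2_per_kwh=1.3):
    """Grid electricity displaced, implied by a CO2 saving and an assumed
    grid emissions factor."""
    return co2_saved_lb / lb_co2_per_kwh

print(round(implied_kwh(199_172)))  # → 153209 kWh per year
```

On that assumption, the panels would be displacing something in the region of 150,000 kWh of grid electricity a year; a cleaner grid mix would imply a proportionally larger figure.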
Elsewhere, one British insurer is conducting a feasibility study on the use of wind farms to reduce its reliance on the electricity grid. In both cases, the rising cost of energy and the potential for cost savings has proved attractive.
But such cases are thin on the ground, says Gartner’s Kumar. There is “definitely interest” in pursuing more energy-efficient strategies, “but as an industry, we’re still at the very early stages,” he adds.
While government regulation is frequently unwelcome and cumbersome, some of the early efforts to encourage energy efficiency are positive, says CB Richard Ellis’ Jay. The introduction of the Enhanced Capital Allowance scheme, where businesses can claim 100% first-year capital allowances on investments in energy-saving technologies and products, should encourage take-up (see box ‘Tax-saving technologies’).
Long-term change
However, while money – and planet – saving initiatives may seem like sound investments today, it is difficult to predict how this will play out in the longer term. Historically, data centre investments could be written off over as much as 20 years. But given such time-frames, will today’s priorities remain valid?
Kumar doesn’t think so. The energy crisis impacting today’s data centres may not last beyond 2011, he says: with companies such as IBM, Sun Microsystems and Hewlett-Packard all investing in energy-efficient servers, the accelerating demand for power will be curtailed.
Others disagree. HP’s Mann believes that technological advances may see energy demand reach a plateau in that timescale, but the levelling-off will be temporary. “You have to remember, this is being driven by applications. Businesses are demanding faster, more powerful applications, and that isn’t going to stop.” With the likelihood that energy costs will continue to spiral upwards, business leaders need to get smarter about energy efficiency, he adds.
But with little agreement on how much electricity a data centre is likely to require in five or more years, there is confusion about how to safely invest such large amounts in building a data centre for the long term.
In the near term, IT organisations should expect to incorporate water-cooling facilities into their data centre designs, says Kumar. However, other decisions are not so clear-cut.
The complexities of data centres have become so great that IT decision makers need to re-examine the very basic principles that have governed the need for such monolithic structures, says Paul Leonard, data centre marketing manager at Sun. His vision of the future is of a virtualised infrastructure on a large scale: small, relatively inexpensive groups of servers linked globally. “We’re a long way from being there yet,” he admits, “but it’s definitely where we are heading, and CIOs need to plan how to get there.”
The other grand vision for the data centre being touted by technology futurists, such as Nicholas Carr, is based on the concept of utility computing: outsourcing the entire infrastructure and buying back computing power on a usage basis. It may be an appealing vision for some, but for industries such as banking, the prospect of losing control of critical data is simply unpalatable. For these businesses, the problems of designing a future-proof data centre will persist.