Cool dudes: how the UK's data centres can get smarter at being greener in summer

According to the Met Office, April 2015 was the sunniest April on record in the UK. It's a declaration we in Britain hear on a regular basis – wettest, coldest, hottest since records began – and undoubtedly, despite the end of May being a tad chilly and windy, the weather is now hotting up. I expect there will be plenty more pronouncements about our favourite topic in the weeks to come.

However, while these statements may appear to be fillers on quiet news days, we shouldn't mock the impact this little island's climate can have on modern-day business. The majority of companies across the country, whether enterprise or SME, rely on technology to function, and while the advent of cloud platforms and services means less infrastructure onsite, the software, services, applications and platforms utilised by so many still have to reside somewhere.

That 'somewhere' is the data centre, and at present there are estimated to be 231 data centres in the UK. But what have over 200 data centres got to do with the weather? Recent estimates suggest that about 1% of the energy consumed by the entire world each year, and 5% of Europe's annual energy bill, is spent just on cooling computers.

> See also: Forget big data: energy-savvy businesses need green data

Combine that with other comparisons – such as data centres being in the same league as wine producers in terms of water consumed for cooling – and you start to get an idea of the environmental implications. So while we may have less hardware and kit at our companies' physical locations, those computers, servers and mainframes still reside in a building somewhere in this temperamental climate.

Combine our erratic weather, the increase in cloud adoption and the fact that the data centre industry is now second only to the airline sector in overall carbon footprint, and you'll recognise the need for smarter, greener data centres – and why we take note of the weather forecasts!

Yet I’m pleased to report that all is not lost and many of us are striving to make the resource-hungry data centre a thing of the past. There are several steps the industry can take to become more environmentally responsible, cost effective and efficient.

Power Usage Effectiveness (PUE)

First of all, you need to know the PUE of the data centre in question. PUE is the most commonly quoted measure of how efficient a data centre actually is, and is calculated by dividing the total amount of power the data centre consumes by the amount used by its IT equipment alone.
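As a quick illustration of the arithmetic – the kilowatt figures below are hypothetical, not real meter readings – the calculation looks like this:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# Hypothetical facility drawing 1,000 kW in total, 500 kW of it for IT equipment
print(pue(1000, 500))  # 2.0 -> the overheads (cooling, lighting...) match the IT load
print(pue(550, 500))   # 1.1 -> just 10% overhead on top of the IT load
```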

A PUE score of around 2 for a data centre would be considered 'average', while a PUE of 1.5 is deemed 'efficient'. Anything below 1.5 was, until recently, difficult to achieve, but by aspiring to do better it is possible. 4D-DC has achieved a PUE of 1.1 by combining all of the following:

Hot and cold aisles

The first significant shift in data centre design towards a more efficient cooling environment was the introduction of hot and cold aisles. Rather than trying to bring the ambient air temperature of the whole data centre down to 19°C – known as the brute force method – racks and equipment are arranged so that cold air from the sub-floor plenum is delivered to the front of the servers and storage area networks (SANs) in the 'cold' aisle, while the hot exhaust air is concentrated into a specific area, the 'hot' aisle, for the computer room air conditioners (CRACs) to handle.

The next logical step in this design was to physically segregate the hot and cold air, and thus cold aisle corridors were born, bringing with them the first significant improvement in cooling efficiency. At 4D-DC, retrofitting cold aisle corridors has improved efficiency by 15-20%.
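To see what a saving on that scale means for the facility as a whole, here is a rough sketch – the load figures are assumptions for illustration, not 4D-DC's actual measurements:

```python
# Hypothetical facility: 500 kW of IT load, 400 kW of cooling, 100 kW other overhead
it_kw, cooling_kw, other_kw = 500.0, 400.0, 100.0

before = (it_kw + cooling_kw + other_kw) / it_kw  # PUE 2.0

# Apply the 15-20% cooling saving reported for cold aisle corridors
for saving in (0.15, 0.20):
    after = (it_kw + cooling_kw * (1 - saving) + other_kw) / it_kw
    print(f"{saving:.0%} cooling saving: PUE {before:.2f} -> {after:.2f}")
# 15%: 2.00 -> 1.88   20%: 2.00 -> 1.84
```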

Evaporative cooling and chillers

More recently, a new cooling technology has emerged that is set to significantly disrupt traditional cooling methods, both financially and environmentally: evaporative cooling, also sometimes referred to as adiabatic cooling.

As with all good ideas, nature has been using this very process for millions of years. Just as people (and all mammals, for that matter) naturally sweat during the summer to cool themselves down, this technology draws warm ambient air through a wetted filter or fine mist, which causes some of the water to evaporate and cools the ambient air down.

Compared with traditional 'compressor' based air conditioning units, evaporative chillers use up to 90% less power – an efficiency figure previously impossible with traditional cooling methods. Even big US companies such as Facebook and Google have recently built new data centres that run exclusively on evaporative cooling.

But what about all the water used? On a very hot day in the UK, a typical evaporative cooler will use an estimated 100 litres per hour. That sounds like a lot, but to put it in perspective, taking a bath or a power shower uses around 80 litres and washing your car with a hosepipe uses between 400 and 480 litres.
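A quick back-of-envelope comparison using those figures – the eight-hour duty cycle is an assumption for illustration:

```python
# Back-of-envelope water use, based on the figures quoted above
cooler_l_per_hour = 100        # typical evaporative cooler on a very hot UK day
bath_l = 80                    # one bath or power shower
car_wash_l = (400, 480)        # hosepipe car wash, low and high estimates

hours = 8                      # assumed hottest part of the day
daily_l = cooler_l_per_hour * hours
print(f"{daily_l} litres/day: about {daily_l / bath_l:.0f} baths, "
      f"or {daily_l / car_wash_l[1]:.1f}-{daily_l / car_wash_l[0]:.1f} car washes")
# 800 litres/day: about 10 baths, or 1.7-2.0 car washes
```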

To further minimise water wastage, evaporative chillers do clever things such as recirculating a reservoir of water so that any water that isn't evaporated can be reused. Waste water (water that has been recirculated for a while) can even be drained off and used as 'grey water' in things such as toilets.

On a like-for-like basis, even taking into account the carbon cost of water production, evaporative cooling is still significantly more efficient than traditional cooling systems.

Wind in your hair

While the biggest energy-consuming components of a refrigerant-based CRAC unit are the compressors (which usually can't be improved upon without great expense), the second most power-hungry parts are usually the CRAC fans themselves, which circulate the air around the data floor.

Traditionally, and certainly on older CRACs, these fans work at just one speed, regardless of the level of cooling required. Not only does the CRAC fan run at 100% speed regardless of the heat load (thus drawing the maximum amount of power all the time), it also generates heat itself, which in turn needs to be cooled.

This particular issue can be solved relatively cheaply by installing variable speed fans. This may seem simple enough to do, but qualified AC engineers are still required, as the control system of the CRAC needs to be updated to modulate fan speed in relation to the amount of cooling required.
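Variable speed fans pay off disproportionately because, per the standard fan affinity laws, fan power scales roughly with the cube of fan speed – so even a modest slowdown yields an outsized saving. A rough sketch of the effect, where the 10 kW full-speed rating is a hypothetical figure:

```python
# Fan affinity law: power varies roughly with the cube of fan speed
rated_fan_kw = 10.0  # hypothetical CRAC fan bank at 100% speed

for speed in (1.0, 0.9, 0.8, 0.7):
    power_kw = rated_fan_kw * speed ** 3
    print(f"{speed:.0%} speed -> {power_kw:4.1f} kW ({1 - speed ** 3:.0%} saving)")
# 90% speed -> 7.3 kW (27% saving); 70% speed -> 3.4 kW (66% saving)
```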

Plugging the gaps

Even cheaper quick wins include methods of reducing cold air leakage. If you open your fridge at home, the compressor keeping the food cool has to work extra hard as all the cold air is spilling out into the warmer ambient air of the kitchen. The same is true of cold air in a cold aisle and any gaps that lead into the hot aisle. If the cold air isn’t being directed through hardware, it’s being wasted.

> See also: How the Internet of Things will forever change the data centre

Brush grommets, which seal the holes through which power and data cables enter a rack, are one good way of reducing leakage, as are blanking plates, which can be fitted above and below servers to help prevent air seeping out. Installing grommets, adding blanking plates and fitting expanding foam strips between racks resulted in a 5-6% jump in efficiency across the 4D-DC data floor.

Some bright ideas

Not many people know that you can buy LED tube lighting that (with a minor modification) will fit into old-style fluorescent light fittings. The light quality is excellent and, even though they're more expensive per unit in the short run, they use half the electricity and generate half the heat, meaning less work for your CRACs!

Combined with passive infrared (PIR) activated lighting, these two changes alone can both reduce your carbon footprint and provide a return on investment in under two years. At 4D-DC we had 148 lightbulbs to deal with, and by making this change we halved the cost of lighting and reduced the heat load created.
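A simple payback sketch shows how a figure inside two years can fall out – the wattages, tube premium and electricity rate below are assumptions for illustration, not 4D-DC's actual numbers:

```python
# Hypothetical payback for swapping fluorescent tubes for LED equivalents
tubes = 148                    # number of fittings, as at 4D-DC
fluorescent_w, led_w = 58, 29  # assumed wattages; the LED uses roughly half
premium_per_tube = 10.0        # assumed extra cost of an LED tube (GBP)
price_per_kwh = 0.12           # assumed electricity price (GBP per kWh)
hours_per_year = 24 * 365      # a data floor lit around the clock

saved_kwh = tubes * (fluorescent_w - led_w) / 1000 * hours_per_year
saved_gbp = saved_kwh * price_per_kwh
payback_years = tubes * premium_per_tube / saved_gbp
print(f"~£{saved_gbp:,.0f} saved per year; payback in {payback_years:.2f} years")
# ≈ £4,512/year on these assumptions, before even counting the reduced CRAC load
```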

So whether you operate a data centre yourself, or your business is making the move towards cloud and colocation and reducing your carbon footprint is a priority, think carefully. Selecting the right partner and tools for the job can stop you getting hot under the collar.
