Free cooling in data centres
Natural cooling is increasingly popular
A survey has found that half of respondents are now using natural cooling to save energy and cost, and 25% are considering adopting the technology in the future.
A triumph for the natural cooling movement and the environment, then? Well, not quite: the survey also contained a curious twist. Despite the use of natural cooling, it concluded that there was little difference between the reported Power Usage Effectiveness (PUE) of data centres using natural cooling and those that do not.
On close examination of the data, perhaps the survey was being a bit hard on natural cooling in indicating that there was no statistically significant difference: the average PUE of surveyed data centres using natural cooling was 1.56; the average for data centres not using natural cooling was 1.83. That difference will still result in total electricity cost savings of millions of pounds over the lifetime of those data centres, and carbon reductions running into the thousands of tonnes.
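To see why a 0.27 PUE gap is worth millions, it helps to put rough numbers on it. The sketch below uses purely illustrative assumptions (a 1 MW IT load, £0.15 per kWh, a ten-year facility lifetime) rather than figures from the survey:

```python
# Rough illustration of the savings implied by the survey's PUE gap.
# The IT load, electricity price, and lifetime are illustrative
# assumptions, not figures from the survey.

def annual_facility_kwh(it_load_kw: float, pue: float) -> float:
    """Total facility energy for one year at a given PUE."""
    return it_load_kw * pue * 24 * 365

IT_LOAD_KW = 1000       # assumed 1 MW of IT load
PRICE_PER_KWH = 0.15    # assumed electricity price, £/kWh
LIFETIME_YEARS = 10     # assumed facility lifetime

with_free_cooling = annual_facility_kwh(IT_LOAD_KW, 1.56)
without_free_cooling = annual_facility_kwh(IT_LOAD_KW, 1.83)
saving = (without_free_cooling - with_free_cooling) * PRICE_PER_KWH * LIFETIME_YEARS
print(f"Lifetime electricity saving: £{saving:,.0f}")  # roughly £3.5m
```

Even at these modest assumed figures the gap compounds to several million pounds, which is why the retrofit business case deserves a close look.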
The survey also indicates that one of the main reasons for not implementing natural cooling was the cost of retrofitting the system to an existing data centre. Having recently replaced an ageing CRAC system with a Stulz free cooling system, we can offer a number of tips on both building the business case for retrofitting free cooling and maximising energy savings over its life:
1. Understand the climate and remember that a temperate climate is ideal. Despite the current interest in locating data centres in Scandinavia and Iceland, it's worth noting that very cold weather may require heating of the outside air drawn into the data centre. We have found that in recent very cold weather, our free cooling system has increased energy consumption rather than decreased it.
2. Understand and manage your equipment lifecycles continually and effectively, and understand the TCO of managing and maintaining your existing systems. It may not make financial sense to 'rip and replace' an existing system today, but over time it may become more and more attractive, particularly as IT equipment density and power consumption increase.
3. Do the data centre basics right: it’s no good spending heavily on replacing your air conditioning if there are high levels of air flow leakage or wrongly positioned perforated tiles.
4. Consider deploying free cooling in a modular manner in your data centre to complement existing cooling. If you have space, it may be practical to retain an existing CRAC system for summer use, adding a free cooling solution alongside. Control can be a challenge with old and new systems, but it could be as simple as running the older units on a slightly higher set point.
5. Understand your IT loads and densities, both existing and future, particularly in different halls or suites in your data centre, and match cooling accordingly. This will reduce the risk of having to revisit the new system later at added cost.
6. Buy a cooling system with variable speed drives. Our units are fitted with Electronically Commutated (EC) fans that are variable speed, so the fans can slow down to a minimum speed when not required. As the fan laws dictate that power input is proportional to the cube of fan speed, any reduction in fan speed yields a large reduction in the CRAC unit's power input. Halve the fan speed and the fan power input is just one eighth.
7. Maximise the use of free cooling by ensuring that it is operational for as many hours as possible during the year, outside temperature permitting: set targets for use, measure use on a weekly or monthly basis, and incentivise individual data centre managers to meet those targets, potentially with financial rewards.
8. Huge strides have been made by IT vendors in increasing the operating temperature range of equipment. Take a holistic view of your energy consumption when refreshing IT equipment, by ensuring that you understand the operating temperature range of your IT equipment. To maximise efficiency, be willing to align the operating parameters of your IT equipment with that of your natural cooling system.
9. Manage and monitor your airflows: deploy a plenum return ceiling, aisle containment, etc. This raises the return air temperature and allows the supply air set point to be raised a few degrees, both of which maximise the efficiency and operating hours of free cooling.
10. Measure, monitor and maintain: you need to check PUE, temperature set points, etc. constantly to make sure everything in your data centre is running as it should. Don't wait for the electricity bill to find out!

In conclusion, free cooling is gaining favour, and it can deliver substantial cost savings and environmental gains, but like any other technology it needs to be implemented and used correctly to maximise efficiency savings.
Looking to gain a better understanding of data centre free cooling systems? Contact us on 01993 774444 or email email@example.com