Why Cooling is Key
A Guide to Data Centre Cooling
When servers are running, they generate a great deal of heat which, if not managed properly, can ruin your data centre very quickly. The problem compounds as hot exhaust air is drawn back into the server inlets, raising temperatures further and creating risks such as downtime from thermal shutdown and damage to the servers' electrical components. This is why the design of the cooling system is so important.
It’s possible to run the inlet temperature much higher, at around 27°C. However, this will typically cause the fans in most server hardware to run faster and use more power, negating any cost savings from the cooling system. Reconfiguring the server fan control algorithms can help mitigate this issue.
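The fan-power penalty described above can be illustrated with the well-known fan affinity law, under which fan power scales roughly with the cube of fan speed. The speed increase below is an illustrative assumption, not vendor data:

```python
# Sketch: why faster server fans can erase cooling savings.
# Fan affinity law: power ratio P2/P1 ~= (N2/N1)**3.
# The 20% speed increase is an illustrative assumption.

def fan_power_ratio(new_speed: float, base_speed: float) -> float:
    """Relative fan power when fan speed changes."""
    return (new_speed / base_speed) ** 3

# If a warmer inlet makes server fans run 20% faster:
ratio = fan_power_ratio(1.2, 1.0)
print(f"Fan power increases by a factor of {ratio:.2f}")  # ~1.73x
```

Even a modest speed increase therefore costs disproportionately more fan power, which is why the fan control curves matter when raising inlet temperatures.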
When working out your data centre cooling needs, the general rule is that every kilowatt of power consumed by your hardware needs to be matched by a kilowatt of cooling. So a server room running at 5kW means the cooling systems need 5kW of net cooling capacity. This is a slight overestimation of what is strictly required, but it’s always wise to allow extra capacity to ensure you don’t suffer problems with excess heat.
To accurately calculate the design duty of your cooling system, you also need to allow for additional heat gains, such as those from electrical systems like switchgear and from lighting.
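The rule of thumb above can be sketched as a simple calculation. The ancillary gain figure and the 10% safety margin below are illustrative assumptions; survey your own room for real values:

```python
# Sketch: estimating design cooling duty from IT load plus ancillary heat gains.
# Rule of thumb from the text: 1 kW of IT load needs ~1 kW of net cooling.
# The ancillary gains and margin figures are illustrative assumptions.

def design_cooling_duty_kw(it_load_kw: float,
                           ancillary_gains_kw: float,
                           safety_margin: float = 0.10) -> float:
    """Total cooling duty: IT load plus other heat gains, plus a margin."""
    return (it_load_kw + ancillary_gains_kw) * (1 + safety_margin)

# 5 kW of servers, plus ~0.5 kW from switchgear and lighting, 10% margin:
duty = design_cooling_duty_kw(5.0, 0.5)
print(f"Design cooling duty: {duty:.2f} kW")  # 6.05 kW
```

The margin parameter reflects the article's advice to allow extra capacity rather than size the plant exactly to the measured load.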
The efficiency of your data centre cooling system is also important, from both a financial and an environmental viewpoint. The more efficient the cooling system, the less energy it uses and the less money it costs to run. There are several ways you can make your data centre more efficient, such as careful initial design, using a hot and cold aisle layout, and even using fresh air data centre cooling to provide free cooling through the colder months.