Keeping it cool

Patrick Giangrosso, general manager at Coolcentric, put it best when he said that data centres are at a crossroads. One of the most critical issues being faced, he explained, is that as the average heat density per enclosure increases, it becomes increasingly difficult to provide the cooling necessary to properly operate the data centre. In some cases, the cooling that must be provided is twice what the heat loads require, due to inefficiencies within the cooling system. Giangrosso added that existing data centres need an easily implementable, non-disruptive cooling solution that can accommodate existing infrastructure, completely satisfy today’s cooling needs, and scale as technology and users’ resource requirements grow.

Meeting all those criteria is no small undertaking, as there are several factors at play. ‘Increased density is a hugely common system requirement – one that has led manufacturers to introduce a new generation of multi-core processors and densely packed, highly integrated server components like the Dell PowerEdge C8000 and Supermicro FatTwin series,’ said Dr Alex Ninaber, technical director at ClusterVision. ‘While addressing the performance-per-footprint requirements, these high-density, multi-function components can significantly challenge the effectiveness and power efficiency of standard on-board cooling.

‘From an economic perspective, HPC consumers are also increasingly aware of their total cost of ownership. With energy costs and heat dissipation accounting for up to 40 per cent of the total ongoing power budget, people are understandably looking for improvements both in cooling performance and the overall power-efficiency. Liquid heat-exchange and oil-submersion cooling can be particularly cost-effective for new data centres where the high inlet temperatures and re-use of the drawn heat can be used to optimise the overall thermal management of the installation,’ he added.

Looking to liquid

According to Graham Whitmore, president of Motivair, in the past few years there has been an increased focus on rack-level water cooling for HPC clusters as water and refrigerant cooling systems replaced air cooling, which became less effective for higher density loads. ‘Rack-mounted coolers typically operate with cooling water supply temperatures from 60 to 75°F – always above the data centre dew point temperature,’ he said. ‘An outdoor water-cooling source of chillers, cooling towers, or aquifer water is required to reject the heat from rack-mounted coolers, which are non-invasive and easily installed on new racks or retro-fitted to existing racks.

‘In-server water-cooling can be achieved with warmer water because the contact points (cooling pads) on the circuit boards replace the standard OEM heat sinks, where the board temperatures are highest. The cooling source for warmer water can be an outdoor radiator or cooling tower with pumped water, and no refrigeration is needed for the higher water temperature. But installation must be incorporated into the servers and requires direct water connections to all the circuit boards via water capillaries and headers inside the racks and the servers,’ said Whitmore.

Stephen Empedocles, director of business development at Asetek, added that liquid-cooling is a very different approach to the one being taken by most data centres today: ‘Water is approximately 4,000 times more efficient at removing heat than air. And yet, only a tiny fraction of data centres today use liquid cooling.’ He stated that, although liquid cooling isn’t new, it’s also not very pervasive and tends to be restricted to a handful of the highest-performance supercomputing clusters. He attributed this to the upfront cost and the complexity of operating and maintaining the highly customised equipment.
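That ‘approximately 4,000 times’ figure can be sanity-checked with a quick comparison of how much heat each fluid carries per unit volume. The sketch below is a back-of-envelope illustration only, using standard room-temperature property values rather than anything supplied by Asetek:

```python
# Back-of-envelope comparison of how much heat a cubic metre of coolant
# can carry per kelvin of temperature rise. Property values are
# approximate, for water and air at roughly room temperature.

cp_water = 4186.0    # specific heat of water, J/(kg*K)
rho_water = 998.0    # density of water, kg/m^3
cp_air = 1005.0      # specific heat of air, J/(kg*K)
rho_air = 1.2        # density of air, kg/m^3

vol_capacity_water = cp_water * rho_water   # ~4.2 MJ per m^3 per K
vol_capacity_air = cp_air * rho_air         # ~1.2 kJ per m^3 per K

ratio = vol_capacity_water / vol_capacity_air
print(f"Water carries roughly {ratio:,.0f}x more heat per unit volume than air")
```

The result lands in the mid-thousands, the same order of magnitude as the figure quoted above.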

Going under

The term ‘liquid-cooling’ covers a spectrum of solutions using a range of temperatures and methods. Fully submerging servers in racks filled with a dielectric coolant is the approach being taken by Green Revolution Cooling. The company’s CEO and founder, Christiaan Best, explained the technology: ‘Components are completely submerged in a safe, non-toxic oil-based coolant that circulates at 40°C or higher – depending on whether it is being used for heat recapture. The figures are pretty impressive: the removal of fans and cooler components represents a server power reduction of roughly 20 per cent at the server level, while the cooling system itself uses just one to two per cent of the total computing power.’ Best said that the technology lowers the power consumption of the server and the power usage effectiveness (PUE) of the system, and provides compelling value starting at 6 kW per rack.
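To see how those percentages feed into PUE, the sketch below works through an illustrative comparison. The 20 per cent fan saving and the one-to-two per cent pump overhead are the figures quoted above; the 50 per cent overhead assumed for a legacy air-cooled room is a hypothetical baseline chosen purely for illustration, not a number from Green Revolution Cooling.

```python
# Illustrative PUE comparison using the percentages quoted above.
# The air-cooled baseline overhead (50%) is an assumed figure for this sketch.

it_load_air_kw = 100.0        # IT load of an air-cooled rack row, fans included
fan_saving = 0.20             # ~20% of server power saved by removing fans
immersion_overhead = 0.02     # pumps use ~1-2% of the total computing power
air_cooling_overhead = 0.50   # assumed overhead for legacy room-level air cooling

it_load_immersion_kw = it_load_air_kw * (1.0 - fan_saving)

# PUE = total facility power / IT equipment power
pue_air = (it_load_air_kw * (1.0 + air_cooling_overhead)) / it_load_air_kw
pue_immersion = (it_load_immersion_kw * (1.0 + immersion_overhead)) / it_load_immersion_kw

total_air_kw = it_load_air_kw * pue_air
total_immersion_kw = it_load_immersion_kw * pue_immersion

print(f"Air-cooled: {total_air_kw:.0f} kW total, PUE {pue_air:.2f}")
print(f"Immersion:  {total_immersion_kw:.0f} kW total, PUE {pue_immersion:.2f}")
```

Under these assumptions the same computing work draws roughly 150 kW air-cooled versus around 82 kW immersed, which is the kind of combined server-power and PUE gain Best describes.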

Cluster specialist ClusterVision is working with Green Revolution Cooling to design and engineer customised solutions using the full-contact oil-cooling technology. ClusterVision’s Alex Ninaber explained that the combination of the fluid’s high thermal conductivity and its intimate contact with the server components results in very effective heat dissipation.

The coolant returns to the rack unit at a reduced temperature, lowering the overall temperature of the rack and delivering consistent and uniform cooling to the servers. ‘While this is not necessarily a solution for all of our customers’ current cooling challenges, we are already seeing that this new technology can deliver both a highly effective cooling performance and energy-efficiency gains in excess of 90 per cent over standard air-alone methods,’ said Ninaber.

Taking an alternative approach is Motivair, whose Motivair Chilled Door solution is an active rear door cooler, which includes a chilled water cooling coil; multiple, individually fused EC fans; a motorised water valve; and a PLC to control air and water flow with remote communication via LON, BACnet or Modbus. Whitmore explained: ‘The multiple, hot-swappable fans provide total airflow redundancy. The Chilled Door has the ability to match the maximum server airflow and automatically modulate both air and water flow to precisely match changing server loads. Cooling capacity is currently up to 45 kW per standard 600 mm x 42U rack, using up to 75°F water.’ Addressing concerns that exist surrounding the introduction of water into an electronic device, a Motivair Leak Prevention System (LPS) is standard equipment. On detection of water under a Chilled Door, the Motivair LPS simultaneously switches the fans to full speed, isolates the water supply to the door and sets off the alarm locally and remotely.

The adjoining Chilled Doors automatically increase cooling capacity to manage the additional load, while the Chilled Door in LPS alarm continues to cool the rack with maximum flow of room temperature air.
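To put the 45 kW figure in perspective, the sketch below estimates the chilled-water flow a rear-door cooler of that capacity would need. The 45 kW load and 75°F supply temperature come from Motivair’s figures above; the 12°F water temperature rise across the coil is an assumption made purely for illustration.

```python
# Rough chilled-water flow needed to carry 45 kW out of a rack-level cooler.
# Q = m_dot * cp * dT, rearranged for the mass flow rate m_dot.

heat_load_w = 45_000.0        # rack heat load, W (from the figure quoted above)
cp_water = 4186.0             # specific heat of water, J/(kg*K)
delta_t_k = 12.0 * 5.0 / 9.0  # assumed 12 F water temperature rise, in kelvin

mass_flow_kg_s = heat_load_w / (cp_water * delta_t_k)
flow_l_min = mass_flow_kg_s * 60.0   # water is ~1 kg per litre
flow_us_gpm = flow_l_min / 3.785     # litres per minute to US gallons per minute

print(f"Roughly {flow_l_min:.0f} L/min ({flow_us_gpm:.0f} US gpm) of chilled water")
```

Under those assumptions a single fully loaded door needs on the order of 100 litres of chilled water per minute, which is why the building-side water loop matters as much as the door itself.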

Hot or cold?

The temperature of the liquid is an important factor, as Asetek’s Stephen Empedocles explained: ‘The difference between warm and hot water cooling is not so much in the server cooling efficiency – in both cases they can capture 100 per cent of the heat. The difference is the ability to recover and re-use the waste heat. The hotter the water, the more energy can be extracted.’

Asetek’s direct-to-chip solution uses liquid running with an input temperature of 105°F and an output of 140°F. An integrated micro-pump cold plate transfers heat directly from the microprocessors into the liquid, and because the unit is the same size as the heat sink it replaces, it’s a simple drop-in installation.

There are also thin liquid channels that fit between each of the memory cards in the server. The hot water goes into the rack-cooling distribution unit, which fits on the back of a server rack and contains a series of liquid-to-liquid heat exchangers. A secondary loop picks up the heat and carries it out of the building, where it can be cooled against the outdoor ambient air.
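A similar back-of-envelope calculation shows how much heat that temperature rise lets each unit of coolant flow carry away. The sketch below uses only the 105°F inlet and 140°F outlet temperatures quoted above, with standard water properties; the flow figures are illustrative, not Asetek specifications.

```python
# Heat carried per unit of coolant flow at the quoted direct-to-chip
# temperatures (105 F in, 140 F out). Illustrative estimate only.

def f_to_c(temp_f):
    """Convert a Fahrenheit temperature to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

cp_water = 4186.0                          # specific heat of water, J/(kg*K)
delta_t_k = f_to_c(140.0) - f_to_c(105.0)  # ~19.4 K rise across the loop

# Q = m_dot * cp * dT: heat absorbed per kg/s of coolant flow
kw_per_kg_s = cp_water * delta_t_k / 1000.0
litres_per_min_per_kw = 60.0 / kw_per_kg_s   # water is ~1 kg per litre

print(f"Each kg/s of flow carries about {kw_per_kg_s:.0f} kW of heat")
print(f"That is under {litres_per_min_per_kw:.2f} L/min of coolant per kW of server load")
```

With a rise of roughly 19 K, each kilogram per second of flow carries about 80 kW, so only modest flow rates are needed per server, and the 140°F return water is warm enough to be worth recovering.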

Also taking the direct contact liquid-cooling approach is CoolIT. Geoff Lyon, CEO/CTO, commented that the company’s solution differs from those using chilled water and large heat exchangers, either in a rear door or in-row configuration.

The solution uses cold plates and pumps, with the liquid drawing heat directly from the CPU, GPU, RAM and other hot components. Lyon added that one current design challenge is making solutions compatible with server architectures that have evolved around air cooling.

The fact that the industry has previously relied so heavily on air cooling presents somewhat of an adjustment for those considering liquid, as highlighted by Paul Wright, president of LiquidCool Solutions: ‘There are concerns around the introduction of water into any electronic device.

‘A water-based system can have hundreds if not thousands of potential failure points; all it takes is one small leak to create a significant problem.’

LiquidCool Solutions (LCS) employs directed or ‘intelligent’ flow submersion cooling of virtually any electronic device, not just servers, using an eco-friendly, non-volatile dielectric carbon-based oil solution.

‘In a worst-case scenario, if we were to have a leak, you would have a small spill to wipe up. We don’t need to introduce water into the device or facility,’ he added. Wright also commented that facility footprint can be reduced by 50 per cent or more, and the energy needed to operate and cool IT equipment can be reduced by 40 per cent or more.

Offering further reassurance with its solution is Iceotope, whose coolant is not a hydrocarbon – in fact, it is closer to a fire extinguisher than anything else and is actually used in fire-suppression systems. This primary coolant is in contact with every single component that generates heat. As an ultra-convective material, it passively regulates its own flow rate, without the need for pumps.

Despite all the benefits outlined in this article, liquid-cooling still has a lot of ground to cover in terms of adoption.

However, as Wright commented: ‘Liquid-cooling is beginning to gain traction; no-one denies the physics and advantages of liquid over air in removing and handling heat.

‘Even though the industry is averse to change, as most industries are, liquid-cooling’s day is coming.’


