
Image: artisteer, Getty Images/iStockphoto

Liquid cooling for high-performance computers is not new, but today’s implementations are different from anything most system administrators have ever used.

You could sort corporate IT workers into three categories: those who may recall liquid cooling on big iron, those raised among air-cooled commodity servers, and those too young to know big iron but who may know liquid cooling from modern gaming PCs.


“The industry got away from it for a little while during an era of clustered computing, where there wasn’t a lot of specialization between one vendor’s servers and another vendor’s servers. All computing was thought to be the same and there wasn’t a lot of differentiation,” but now, “all of these data centers are looking for differentiated levels of performance again,” said Addison Snell, CEO of market research company Intersect360, which specializes in high-performance computing.


Companies such as Asetek, CoolIT, and Green Revolution Cooling are independent leaders in liquid cooling systems for high-end servers and data center racks. There are also liquid cooling systems from some of the name-brand server manufacturers, including Lenovo, which recently announced new products in that space.

The return to liquid cooling is driven by processors that generate more heat; rising electricity costs for air conditioning; increasing power draw in servers and racks; and the diminishing returns of stronger fans, which themselves draw more electricity and negate some of their advantage, explained Scott Tease, executive director of high-performance computing and artificial intelligence in Lenovo’s data center group.
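
The fan point can be made concrete with the fan affinity laws: airflow scales roughly linearly with fan speed, while fan power rises roughly with the cube of fan speed. The short Python sketch below uses a hypothetical 10 W fan running at 5,000 RPM as its baseline; the figures are illustrative assumptions, not vendor data.

```python
# Rough illustration of the diminishing returns of stronger fans.
# Fan affinity laws: airflow ~ fan speed, fan power ~ fan speed cubed.
# Baseline figures below are hypothetical, for illustration only.

BASE_SPEED_RPM = 5_000   # assumed baseline fan speed
BASE_POWER_W = 10.0      # assumed power draw of one fan at that speed

def fan_power(speed_rpm: float) -> float:
    """Estimate fan power in watts at a given speed using the cube law."""
    return BASE_POWER_W * (speed_rpm / BASE_SPEED_RPM) ** 3

for multiplier in (1.0, 1.25, 1.5, 2.0):
    watts = fan_power(BASE_SPEED_RPM * multiplier)
    print(f"{multiplier:.2f}x airflow -> ~{watts:.1f} W per fan")
```

In this model, doubling the airflow costs roughly eight times the fan power, which is the diminishing return Tease describes.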

Tease offered useful advice for those about to embark down the liquid-cooled path:

  • A common mistake is poor communication between the facilities-focused people who operate data centers and the people who work in IT. “The most important thing is to make that connection,” Tease said. “That is best practice today. It doesn’t happen as much as it should, but it’s going to happen more in the future.”
  • Safety is important. Water should be supplemented with a cleaning additive every few months; this helps minimize the damage if there’s a leak or flood.
  • Server cooling add-on hardware takes up space, so one drawback is you may need more servers to do the same job. For example, it may not be ideal for large storage installations, but it’s good for transactional servers or math-intensive applications that can run hot due to constant activity.
  • There are some servers that may not be compatible with liquid cooling at all. For example, if the RAM chips are very close together, then there may not be space for the cooling pipes.
  • If your data center is in a location with low-cost power, then liquid cooling may not be cost-effective; a back-of-envelope payback sketch follows this list.
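
Tease’s last point lends itself to a quick payback estimate. The sketch below is a back-of-envelope model built entirely on made-up assumptions (a 200 kW IT load, a PUE of 1.6 for air cooling versus 1.2 after a liquid-cooling retrofit, and a $250,000 one-time retrofit cost); it simply shows how cheaper electricity stretches the payback period.

```python
# Back-of-envelope payback model for the low-cost-power caveat above.
# Every figure here is a hypothetical assumption, not vendor data.

IT_LOAD_KW = 200.0           # assumed IT load of the racks being cooled
PUE_AIR = 1.6                # assumed PUE with traditional air cooling
PUE_LIQUID = 1.2             # assumed PUE after a liquid-cooling retrofit
RETROFIT_COST_USD = 250_000  # assumed one-time retrofit cost
HOURS_PER_YEAR = 8_760

def annual_overhead_cost(it_load_kw: float, pue: float, usd_per_kwh: float) -> float:
    """Yearly cost of the facility overhead (everything above the IT load)."""
    overhead_kw = it_load_kw * (pue - 1.0)
    return overhead_kw * HOURS_PER_YEAR * usd_per_kwh

for usd_per_kwh in (0.04, 0.08, 0.12, 0.20):  # cheap to expensive power
    savings = (annual_overhead_cost(IT_LOAD_KW, PUE_AIR, usd_per_kwh)
               - annual_overhead_cost(IT_LOAD_KW, PUE_LIQUID, usd_per_kwh))
    payback_years = RETROFIT_COST_USD / savings
    print(f"${usd_per_kwh:.2f}/kWh: saves ~${savings:,.0f}/yr, "
          f"payback ~{payback_years:.1f} yr")
```

Under these assumptions the retrofit pays for itself in under two years at $0.20 per kWh but takes almost nine years at $0.04, which is why cheap local power weakens the case.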


Intersect360’s Snell added some of his own advice:

  • Before starting, have a clear understanding of your goals—is it to save money, improve cooling efficiency, both, or something else?
  • You might need separate water chilling units. (Lenovo’s Tease noted that some of his products work on unchilled water.)
  • Generally, understand that modern liquid cooling is a step forward, not a step backward. It’s not unlike mainframes, COBOL, or tape drives in that such technologies remain in use because they work very well, not because companies are foolish.

“Liquid cooling is traditionally associated with custom architectures [as] a specialized thing that only runs one or two specific applications,” Snell said. “That’s not the world of high-performance computing today. Cray is still out there, but these systems are largely made from industry-standard components.”
