The mainstreaming of artificial intelligence, blockchain, and other emerging technologies could force data centers to turn to liquid cooling.
Ever the industry leader, Google recently introduced liquid cooling into the data centers it uses to host a wide range of applications—including Gmail and Google Photos—many of which feature an ever-expanding set of artificial intelligence (AI) capabilities. This move was precipitated by the tech giant’s 2018 rollout of AI-friendly Tensor Processing Unit 3.0 chips across its entire data center infrastructure. These powerful chips consume immense amounts of energy, and therefore necessitate the use of cutting-edge liquid cooling in lieu of traditional air-based cooling techniques.
Liquid cooling is not a new concept. In fact, in the ’80s and ’90s, some of the original data centers used liquid-based methods to cool mainframe infrastructures. However, for much of the last two decades, most data centers have shied away from liquid-based cooling techniques.
In short, liquid-based cooling uses water-cooled racks to lower server temperatures. Because water conducts electricity, it is never brought into direct contact with any server components; instead, it is contained in basins, funneled through pipes, and circulated by pumps, often via a cooling tower. In other words, in a liquid cooling system, chilled water is routed alongside server cases to draw heat away from the hardware inside.
Liquid immersion cooling is a slightly different approach in which a liquid coolant is allowed to flow directly across hot server components. The servers are fully immersed in the coolant, a dielectric fluid that does not conduct electricity. The drawback to using this fluid is that it can damage certain system components if used improperly.
Generally speaking, both approaches to liquid cooling tend to be more efficient than standard air cooling. First and foremost, liquid cooling vendors claim their solutions significantly reduce data center power consumption and Power Usage Effectiveness (PUE). Additionally, because liquid cooling is extremely efficient, data center managers can increase their processing per square foot beyond what’s possible when using forced air cooling. This opens up valuable floor space, increasing the capacity of the data center.
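The PUE metric mentioned above is simply total facility power divided by the power delivered to IT equipment, so a value closer to 1.0 means less energy lost to cooling and power-delivery overhead. A minimal sketch of the calculation (all figures are illustrative, not drawn from any particular facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal, meaning zero energy is spent on
    cooling, lighting, and power distribution overhead.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative: an air-cooled hall drawing 1,500 kW in total to serve a
# 1,000 kW IT load has a PUE of 1.5; shrinking the cooling and overhead
# draw to 100 kW after a liquid-cooling retrofit would bring PUE to 1.1.
print(round(pue(1500, 1000), 2))  # 1.5
print(round(pue(1100, 1000), 2))  # 1.1
```

Because every point of PUE reduction is pure overhead eliminated, even a modest drop compounds into large savings at data-center scale.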
For colocation centers seeking to attract business with promises of better efficiency, liquid cooling can be a powerful market differentiator. Liquid-cooled servers run cooler than traditionally cooled servers, which enables considerable performance increases and/or cost reductions. In fact, some data centers have managed to cut their energy costs by 56 percent just by switching to liquid cooling.
Of course, liquid cooling systems don’t come without their drawbacks.
In reality, not all data centers will achieve massive energy savings by switching to liquid-based cooling. The degree to which a switch to liquid cooling will lower energy costs is contingent upon the type of cooling technology used and the unique architecture of the data center in question.
Further, it’s next to impossible for an existing data center to convert to liquid cooling in one fell swoop—doing so is a complex process that requires experienced professionals to convert one rack at a time. Indeed, despite its near-unlimited resources, even Google’s adoption of liquid cooling has proceeded row-by-row. Consequently, data centers angling to make the switch to liquid cooling must be prepared to manage two cooling systems simultaneously during the transition period, a juggling act that can be both costly and complicated.
Liquid cooling also remains somewhat of a Wild West as far as industry standards are concerned. Data centers must prepare their own IT equipment for liquid cooling, a charge that has slowed industry-wide adoption considerably.
Of course, the risk of corrosion or electrocution has also put off many data centers from exploring liquid cooling. Though systems that use dielectric fluid present no risk of electrocution, many systems still use chilled or warm water, which can do serious damage to servers and other critical components.
At the end of the day, it bears asking why data centers are starting to consider a return to liquid cooling. The short answer is that the underlying computing demands of a variety of emerging technologies are tipping cost-benefit analyses that were previously toss-ups in liquid cooling's favor.
Most notably, liquid cooling is an absolute necessity for organizations looking to capitalize on the rise of AI. For instance, the Graphics Processing Units (GPUs) that serve as hardware accelerators for most machine learning tools have a much higher thermal design power (TDP) than traditional CPUs.
But AI isn’t the only technological advancement demanding more from data center cooling infrastructures. High-frequency trading systems and blockchain-based applications produce computationally intensive workloads that require ultrahigh-performance CPUs and GPUs. Not only are servers running these kinds of workloads harder to cool, but this increased demand for computing capacity tends to lead to higher rack density—a recipe for overheating.
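The density math behind that overheating risk is easy to sketch: essentially all of the electrical power a server draws is released as heat the cooling system must remove. A rough back-of-the-envelope model, using hypothetical per-server wattages for illustration:

```python
def rack_heat_load_kw(servers_per_rack: int, watts_per_server: float) -> float:
    """Approximate heat a cooling system must remove per rack, in kW.

    Assumes essentially all electrical power drawn by the servers
    is converted to heat, which is a standard worst-case estimate.
    """
    return servers_per_rack * watts_per_server / 1000

# Illustrative figures: a rack of 20 CPU-only servers at ~500 W each,
# versus the same rack populated with GPU-accelerated nodes at ~2,000 W each.
print(rack_heat_load_kw(20, 500))   # 10.0 kW per rack
print(rack_heat_load_kw(20, 2000))  # 40.0 kW per rack
```

Heat loads in the low tens of kilowatts per rack are commonly cited as the practical ceiling for forced-air cooling, which is why dense GPU racks push operators toward liquid.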
Ultimately, while liquid cooling may not have reached every data center or colocation facility just yet, abundant evidence suggests this is where the industry is headed. As such, it’s only a matter of time before data center professionals will have to take liquid cooling very seriously.