Microsoft immerses servers in boiling liquid to cool them


Microsoft has been exploring innovative ways to cool its data center servers for several years. In the past, the company made waves by cooling an offshore data center with seawater through Project Natick. Now, it has unveiled a two-phase liquid cooling solution that it says enables even higher server densities.

The new system uses a non-conductive refrigerant fluid. Microsoft doesn’t identify it precisely, but it sounds similar to 3M’s Novec 1230, with a very low boiling point of around 122F (Novec 1230 boils at 120.6F). Heat from the servers boils the refrigerant, creating a cloud of vapor that rises and comes into contact with a cooled condenser in the tank lid. The condensed liquid then rains back down into the tank, replenishing the immersed servers with freshly cooled fluid in a closed loop. Heat is also transferred from the server tank to a dry cooler outside the enclosure and dissipated there. Immersion cooling works because direct contact with a non-conductive fluid dissipates heat far more effectively than a conventional air cooler.

“We are the first cloud provider to run two-phase immersion cooling in a production environment,” said Husam Alissa, lead hardware engineer on the Microsoft team for advanced data center development in Redmond, Washington.

Ioannis Manousakis, a senior software developer at Azure, is shown removing a Xeon server from a two-phase immersion cooling tank. Photo by Gene Twedt for Microsoft.

Microsoft’s blog post describes the growth of immersion cooling as a positive development, highlighting that it can reduce server power consumption by 5 to 15 percent. The company also notes that immersion cooling lets operators direct bursty workloads to those specific servers, because the machines can be overclocked to serve requests more quickly.

Microsoft’s Project Natick experiment showed that filling a data center module with nitrogen and sinking it into the sea can be quite beneficial: the submerged servers suffered one-eighth the failure rate of identical servers on land. The absence of moisture and oxygen is believed to be responsible for the superior underwater reliability, and this system should enjoy similar benefits. If the liquid cooling system proves sustainable, the company plans to deploy it in data centers with low-latency, high-performance, and minimal-maintenance requirements.

Microsoft’s blog post claims that the adoption of immersion cooling allows data centers to follow a “Moore’s Law” of their own, because the change will lower power consumption and allow for higher server density, but this claim seems a bit of a stretch. The reason companies are evaluating techniques like immersion cooling is that CPUs and GPUs now struggle to deliver higher performance without consuming ever-increasing amounts of power. CPUs can now draw up to 300W in the socket, while data center GPUs scale up to 700W. CPUs continue to get more efficient, but adding more cores and additional capabilities on the die increases their absolute power consumption even with those gains.

An interesting question is whether immersion cooling will ever reach the consumer market. Technologies that debut in data centers often make their way into personal computing over time, but building an affordable aftermarket kit that lets a home user adopt this type of cooling is a difficult task. A solution like this would never be cheap, but there could be a market for it in boutique gaming PCs and high-end workstations.

Featured image by Gene Twedt for Microsoft.
