Cooling the future: New system to save energy in data centers

The cooling system dissipates heat from server chips through phase change, such as boiling a liquid into vapor in a thin, porous layer. Credit: University of Missouri

Artificial intelligence (AI) is becoming a major part of our daily lives.

But as AI grows, so does the need for data centers to power it. Keeping these centers cool takes a huge amount of energy.

This problem will only get bigger as high-powered AI computers become more common.

That’s why Chanwoo Park, a researcher at the University of Missouri, is working on a new cooling system to reduce energy use in data centers.

Park’s work is published in the journal Applied Thermal Engineering.

“Cooling and chip manufacturing go hand-in-hand,” said Park, a professor of mechanical and aerospace engineering at the Mizzou College of Engineering.

“Without proper cooling, components overheat and fail. Energy-efficient data centers will be key to the future of AI computing.”

Data centers are large facilities full of servers that store and process data. They are like giant computer hubs running websites, mobile apps, and cloud data.

However, they consume a lot of power. In 2022, data centers used more than 4% of all electricity in the U.S., with 40% of that energy spent on cooling equipment. As the demand for data centers increases, even more energy will be needed.
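Those two percentages can be turned into a rough back-of-envelope estimate. The 4% and 40% shares come from the article; the total U.S. electricity figure of roughly 4,000 TWh in 2022 is an outside assumption used purely for illustration.

```python
# Back-of-envelope estimate of U.S. data center cooling energy (2022).
# The 4% and 40% shares are from the article; the ~4,000 TWh total
# U.S. electricity figure is an assumed, illustrative value.
US_ELECTRICITY_TWH = 4000.0   # assumed total U.S. electricity use, TWh
DATA_CENTER_SHARE = 0.04      # "more than 4% of all electricity"
COOLING_SHARE = 0.40          # "40% of that energy spent on cooling"

data_center_twh = US_ELECTRICITY_TWH * DATA_CENTER_SHARE
cooling_twh = data_center_twh * COOLING_SHARE

print(f"Data centers: ~{data_center_twh:.0f} TWh")  # ~160 TWh
print(f"Cooling alone: ~{cooling_twh:.0f} TWh")     # ~64 TWh
```

Under those assumptions, cooling alone would account for tens of terawatt-hours per year, which is why even modest efficiency gains matter at this scale.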

Currently, data centers are cooled either by fans that move air across equipment or by liquid cooling that carries heat away from computer racks.

Park and his team are developing a new two-phase cooling system that efficiently dissipates heat from server chips through phase change, such as boiling a liquid into vapor in a thin, porous layer.

This system can operate passively without consuming energy when less cooling is needed. Even in active mode, where a pump is used, it consumes only a tiny amount of energy.

“The liquid goes in different directions and evaporates on a thin metal surface,” Park said. “Using this boiling surface, we’re able to achieve very efficient heat transfer with low thermal resistance.”
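Two textbook relations help explain why boiling-based cooling is so effective: the heat absorbed by evaporation (latent heat) far exceeds the heat absorbed by simply warming a liquid, and a boiling surface's performance can be summarized by its thermal resistance, R = ΔT / Q. The sketch below uses standard water properties; the 10 K temperature rise and the 5 K / 200 W operating point are assumed, illustrative figures, not values from Park's system.

```python
# Why phase change moves so much heat: compare the energy absorbed per
# gram by boiling vs. by sensible (single-phase) heating of the liquid.
# Water properties are textbook values; DELTA_T is an assumed figure.
H_FG = 2257.0    # latent heat of vaporization of water, J/g (at 100 degC)
C_P = 4.186      # specific heat of liquid water, J/(g*K)
DELTA_T = 10.0   # assumed coolant temperature rise in single-phase cooling, K

heat_two_phase = H_FG * 1.0          # J absorbed by boiling 1 g of liquid
heat_single_phase = C_P * DELTA_T    # J absorbed by warming 1 g by 10 K

print(f"Boiling 1 g:  {heat_two_phase:.0f} J")
print(f"Warming 1 g:  {heat_single_phase:.1f} J")
print(f"Ratio: ~{heat_two_phase / heat_single_phase:.0f}x")

# "Low thermal resistance" means a small chip-to-coolant temperature
# difference per watt dissipated: R = delta_T / Q. For an assumed 5 K
# difference while removing 200 W:
r_th = 5.0 / 200.0   # K/W
print(f"Thermal resistance: {r_th} K/W")
```

On these numbers, each gram of boiled coolant absorbs roughly 50 times the heat of the same gram merely warmed by 10 K, which is why a thin evaporating layer can keep pace with a hot chip while moving very little fluid.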

The system also includes a mechanical pump that activates to absorb more heat only when needed. Early tests show that two-phase cooling techniques drastically reduce the amount of energy needed to keep equipment cool.

Park’s team is now building the cooling system, designed to easily connect and disconnect within server racks. Park hopes these systems will be in use within the next decade, just as AI-powered computers become mainstream.

“Eventually, there will be limitations with current cooling systems, and that’s a problem,” Park said. “We’re trying to get ahead of the curve and have something ready and available for the future of AI computing. This is a futuristic cooling system.”

Park’s work aligns with the goals of the Center for Energy Innovation, a building being constructed on campus to help researchers solve challenges related to rising energy concerns and rapid AI growth. The center aims to use advanced technology to improve energy production, storage, and efficiency.

“The center will allow us to explore additional ideas and innovations around energy-efficient processes,” Park said. “These are complex problems that require different areas of expertise. I look forward to future collaborations.”

If successful, the new cooling system could substantially cut the energy data centers spend on cooling, helping make the rapid growth of AI more sustainable.