Chilling out your chips: How cold could be the future of computing


Today’s computer chips use a lot of energy and generate a great deal of heat.

Cooling them down with fans or liquid systems helps, but what if we embraced the cold instead of fighting it?

A new international study led by Dr. Qing-Tai Zhao from Forschungszentrum Jülich suggests that future electronics might run better—and more efficiently—at extremely low temperatures.

In fact, this “cryogenic computing” approach could cut energy use by up to 80%.

The idea sounds futuristic, but it’s built on solid science. Modern transistors, the tiny switches inside chips, need their control voltage swept over a certain range to flip between on and off.

At room temperature, physics sets a floor of about 60 millivolts for each tenfold change in a transistor’s current (the so-called subthreshold swing limit). But this floor scales with temperature: in the cold, electrons behave more predictably and need less voltage to switch.

In theory, cooling chips to near absolute zero could shrink this swing to just 1 millivolt, saving enormous amounts of power and producing much less heat.
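The 60-millivolt and 1-millivolt figures follow from the Boltzmann limit on subthreshold swing, ln(10)·kT/q. A quick back-of-envelope check (a sketch of the textbook formula, not a calculation from the study itself) shows how the limit shrinks with temperature:

```python
import math

# Physical constants (SI units, CODATA values)
K_B = 1.380649e-23      # Boltzmann constant, J/K
Q   = 1.602176634e-19   # elementary charge, C

def subthreshold_limit_mV(temp_kelvin):
    """Boltzmann limit on subthreshold swing, ln(10)*kT/q,
    in millivolts per decade of drain current."""
    return math.log(10) * K_B * temp_kelvin / Q * 1000

# Room temperature, liquid nitrogen, liquid helium
for T in (300, 77, 4):
    print(f"{T:>3} K: ~{subthreshold_limit_mV(T):.1f} mV per decade")
# 300 K: ~59.5 mV per decade
#  77 K: ~15.3 mV per decade
#   4 K: ~0.8 mV per decade
```

The 4-kelvin value of roughly 0.8 mV per decade is the theoretical ideal behind the “just 1 millivolt” claim; as the article notes below, real devices fall short of it.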

Even at 77 Kelvin (around -196°C), which is achievable using liquid nitrogen, energy savings of up to 70% are possible. If chips are cooled to 4 Kelvin using liquid helium, savings could reach 80%. That’s true even after accounting for the energy used to keep them cold.

However, things aren’t so simple in practice. At these frigid temperatures, problems hidden at room temperature begin to appear.

Tiny imperfections in chip materials can cause energy to “leak” through transistors that should be off. Quantum effects, like electrons tunneling through barriers, also interfere with the perfect switch-off behavior scientists hoped for. These issues prevent chips from reaching their full cold-efficiency potential.

But the researchers see a way forward. By using new or specially adapted materials that behave better in the cold, engineers could design transistors specifically for low-temperature use. These “cold super-transistors” would rely on technologies like nanowires, advanced insulation, and materials with very narrow energy gaps—features that let them switch with even less energy.

This kind of chip design could transform high-powered data centers, which use thousands of chips and massive amounts of electricity. Cryogenic computing is also a perfect fit for quantum computers and space exploration, where extremely cold temperatures are already standard.

Big players like TSMC—the world’s largest chip manufacturer—are already involved. And researchers believe that combining traditional, quantum, and brain-inspired (neuromorphic) chips in a single ultra-cold system could lead to powerful, energy-efficient computers for the future.

As Zhao puts it, the key to cooler, faster, greener computing might lie not in fighting the cold—but in fully embracing it.