
As artificial intelligence continues to evolve, large language models like GPT-4 and Gemini are becoming more powerful.
But with that power comes a massive appetite for electricity, particularly in the huge data centers that run these systems.
Now, researchers at Oregon State University’s College of Engineering have developed a potential answer: a new type of chip that cuts the energy needed for high-speed data transmission roughly in half.
The project, led by doctoral student Ramin Javadi and Associate Professor Tejasvi Anand, was recently presented at the IEEE Custom Integrated Circuits Conference in Boston.
According to Anand, the root of the problem is that as the demand for data processing increases, the energy needed to send even a single bit of information is not decreasing at the same pace.
This imbalance is what drives the enormous power usage in data centers.
To tackle this, the team designed a new chip that dramatically cuts the energy needed to process the signals carrying that data.
Large language models depend on this constant back-and-forth traffic, which travels over copper-based wireline links inside data centers.
At high data rates, these signals degrade as they travel, so the data is often corrupted by the time it reaches its destination. To fix the resulting errors, most receivers use a component called an equalizer, which unfortunately consumes a lot of power.
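To make that concrete, here is a minimal sketch in Python of the kind of degradation and equalization being described. Everything in it is invented for illustration: the channel weights, noise level, and 7-tap least-squares equalizer are stand-ins, and real wireline equalizers are analog or mixed-signal circuits rather than software.

```python
import numpy as np

rng = np.random.default_rng(0)

# Transmit a random stream of two-level (PAM-2) symbols: +1 or -1.
tx = rng.choice([-1.0, 1.0], size=5000)

# Toy model of a lossy copper wireline: each received sample is a weighted mix
# of the current symbol and the two previous ones (inter-symbol interference),
# plus a little noise. These tap weights are invented for the example.
h = np.array([0.6, 0.55, 0.1])
rx = np.convolve(tx, h)[:len(tx)] + 0.02 * rng.standard_normal(len(tx))

# A conventional receiver "un-smears" the waveform with an equalizer: a short
# filter over the current and previous samples, fitted here by least squares
# on a known 500-symbol training prefix.
taps = 7
idx = np.arange(taps - 1, len(tx))
X = np.stack([rx[i - taps + 1:i + 1] for i in idx])
w = np.linalg.lstsq(X[:500], tx[idx[:500]], rcond=None)[0]
equalized = X @ w

raw_errors = int(np.sum(np.sign(rx[idx]) != tx[idx]))
eq_errors = int(np.sum(np.sign(equalized) != tx[idx]))
print(f"bit errors without equalization: {raw_errors} / {len(idx)}")
print(f"bit errors with 7-tap equalizer: {eq_errors} / {len(idx)}")
```

Even in this toy model, cleaning up the waveform means running a multiply-accumulate filter on every received sample, which hints at why equalization takes up such a large share of a receiver's power budget.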
The new chip changes this process by applying artificial intelligence principles directly on the silicon. Javadi explained that it features an on-chip classifier, trained to recognize and correct errors more efficiently than conventional methods.
This means it can clean up corrupted data with much less energy, saving power without sacrificing performance.
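As a loose analogy only (the source does not describe the classifier's structure, features, or training method), the sketch below swaps the equalizer from the previous example for a small trained classifier, here ordinary logistic regression, that decides each bit directly from a window of raw, corrupted samples after learning from a known preamble.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same toy wireline model as the previous sketch: PAM-2 symbols smeared by
# inter-symbol interference plus noise. All values are invented.
tx = rng.choice([-1.0, 1.0], size=5000)
h = np.array([0.6, 0.55, 0.1])
rx = np.convolve(tx, h)[:len(tx)] + 0.02 * rng.standard_normal(len(tx))

# Instead of filtering the waveform first, a classifier looks at a window of
# raw samples and decides the bit directly. Here that classifier is plain
# logistic regression trained on a known 500-symbol preamble.
taps = 7
idx = np.arange(taps - 1, len(tx))
X = np.stack([rx[i - taps + 1:i + 1] for i in idx])
y = tx[idx]
X_train, y_train = X[:500], y[:500]

w = np.zeros(taps)
b = 0.0
lr = 0.5
for _ in range(2000):
    z = np.clip(X_train @ w + b, -30.0, 30.0)
    p_correct = 1.0 / (1.0 + np.exp(-y_train * z))   # prob. assigned to the true bit
    grad = -y_train * (1.0 - p_correct)              # gradient of the logistic loss
    w -= lr * (X_train.T @ grad) / len(y_train)
    b -= lr * grad.mean()

decided = np.where(X @ w + b > 0.0, 1.0, -1.0)

raw_errors = int(np.sum(np.sign(rx[idx]) != y))
clf_errors = int(np.sum(decided != y))
print(f"bit errors with a plain threshold:  {raw_errors} / {len(idx)}")
print(f"bit errors with trained classifier: {clf_errors} / {len(idx)}")
```

The contrast is conceptual rather than a claim about the actual circuit: a trained decision-maker can recover the data without first reconstructing a clean waveform, which is, at a high level, the behavior the researchers attribute to their on-chip classifier.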
This AI-driven approach represents a major shift in how data is handled in large-scale computing environments.
By integrating smarter error correction directly into the chip, the researchers were able to slash the power needed for high-speed data transmission. Cutting that energy use in half is not a small improvement; it could have major implications for the energy efficiency of data centers around the world.
Javadi and Anand are already working on the next version of the chip, which they expect will push energy savings even further.
If successful, their technology could help make AI models like GPT-4 and Gemini more sustainable and less costly to operate, opening the door for even larger models and more advanced AI applications in the future.
Source: Oregon State University.