Scientists just broke the 100-Gigabit speed barrier for future Internet and AI


A new electronic system developed by researchers at Hanyang University in South Korea has achieved a major milestone in data transmission, reaching speeds of 108 gigabits per second.

This breakthrough could dramatically improve the performance of data centers, artificial intelligence systems, and future internet technologies, helping them move massive amounts of information faster and more efficiently than ever before.

The project was led by graduate researcher Sangwan Lee and Associate Professor Jaeduk Han from the university’s Department of Electronic Engineering.

Their work focuses on a data-transmission technique called PAM-8 signalling, which uses eight distinct amplitude levels to pack three bits into every transmitted symbol. Squeezing more information into each signal is essential for reaching speeds above 100 gigabits per second.
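To make the arithmetic concrete, here is a minimal sketch of the general PAM-8 scheme (an illustration of the signalling idea, not the Hanyang team's circuit): every group of three bits becomes one of eight evenly spaced amplitude levels, so a 108-gigabit-per-second link only needs 36 billion symbols per second.

```python
# Illustrative PAM-8 mapper: each 3-bit group becomes one of 8 levels.
# A sketch of the signalling scheme in general, not the paper's design.

def pam8_encode(bits):
    """Group a bit string into 3-bit symbols and map them to levels -7..+7."""
    assert len(bits) % 3 == 0
    symbols = []
    for i in range(0, len(bits), 3):
        value = int(bits[i:i + 3], 2)   # interpret 3 bits as 0..7
        symbols.append(2 * value - 7)   # evenly spaced levels -7..+7
    return symbols

# 108 Gb/s over PAM-8 needs 108 / 3 = 36 Gbaud, versus 54 Gbaud for PAM-4
# or 108 Gbaud for plain two-level (NRZ) signalling.
print(pam8_encode("000111101"))  # → [-7, 7, 3]
```

The trade-off is that the eight levels sit much closer together than in simpler schemes, which is why the receiver must be so precise.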

However, using such complex signals requires an extremely precise and stable receiver to avoid data errors.

To meet this challenge, the research team created a new type of receiver “frontend” system that improves how signals are handled when they enter an electronic device.

The system was built using 28-nanometre semiconductor technology and combines high performance with good energy efficiency.

It can handle large signal swings while consuming only about 210 milliwatts in total, an energy efficiency of 1.95 picojoules per bit.
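The efficiency figure follows directly from the numbers above: energy per bit is simply total power divided by data rate.

```python
# Energy per bit = power / data rate, using the figures reported above.
power_w = 0.210    # 210 milliwatts
rate_bps = 108e9   # 108 gigabits per second

energy_per_bit_j = power_w / rate_bps
print(f"{energy_per_bit_j * 1e12:.2f} pJ/bit")
```

This works out to roughly 1.94 picojoules per bit, matching the reported 1.95 pJ/bit to rounding.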

One of the key innovations is a “multi-path architecture”. Instead of sending the entire signal through a single route, the system divides it into several smaller paths.

Each path processes a part of the signal’s range, which reduces the burden on individual components.

This approach allowed the researchers to greatly improve performance, effectively doubling signal accuracy while increasing power consumption by only around 20 percent.
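The multi-path idea can be sketched roughly as follows. In a simple model, each path holds only a subset of the decision thresholds needed to tell the eight PAM-8 levels apart, so no single component has to resolve the full signal swing. The threshold split below is purely illustrative; the paper's actual partitioning and circuit details are not given here.

```python
# Rough sketch of a multi-path slicer: the 7 decision thresholds that
# separate 8 PAM-8 levels are divided across 3 paths, so each path
# resolves only part of the amplitude range. Split is illustrative.

PATH_THRESHOLDS = [
    [-6, -4],    # path 0 resolves the lowest levels
    [-2, 0, 2],  # path 1 resolves the middle levels
    [4, 6],      # path 2 resolves the highest levels
]

def decode_level(sample):
    """Combine per-path slicer decisions into one PAM-8 level (-7..+7)."""
    crossings = sum(sample > t for path in PATH_THRESHOLDS for t in path)
    return 2 * crossings - 7

print([decode_level(x) for x in (-6.8, 0.9, 5.2)])  # → [-7, 1, 5]
```

Combining the paths' partial decisions reconstructs the full symbol, while each path individually works with a smaller, easier-to-handle range.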

The team also solved a major problem related to signal loss. When very fast signals travel through cables or circuit boards, they tend to weaken and become distorted. To fix this, receivers often use what is called a feed-forward equalizer, or FFE, to correct the signal.

In most designs, the equalizer sits directly in the main signal path and must handle the full-strength signal, which can cause compression and reduce accuracy.

The Hanyang University team developed a new structure that separates the equalizing function from the main signal path.

This lets it calculate corrections using only a small, weakened version of the signal, avoiding distortion and maintaining high precision even with large inputs.
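In its simplest digital form, a feed-forward equalizer is a short filter that subtracts weighted copies of neighbouring samples to undo the smearing a channel introduces. The toy example below illustrates that general principle only; the tap weights are made up, and the team's design is an analog structure operating off the main signal path rather than this digital sketch.

```python
# Minimal feed-forward equalizer (FFE) sketch: a short FIR filter that
# cancels inter-symbol interference. Tap weights are illustrative only.

def ffe(samples, taps):
    """Apply an FIR equalizer: out[n] = sum_k taps[k] * samples[n - k]."""
    out = []
    for n in range(len(samples)):
        acc = 0.0
        for k, t in enumerate(taps):
            if n - k >= 0:
                acc += t * samples[n - k]
        out.append(acc)
    return out

# A channel that smears each symbol into the next one (post-cursor ISI)...
channel = lambda s: [s[n] + 0.3 * (s[n - 1] if n else 0) for n in range(len(s))]
sent = [1.0, -1.0, 1.0, 1.0]
received = channel(sent)

# ...is largely cancelled by an FFE with a matching negative post-cursor tap.
equalized = ffe(received, taps=[1.0, -0.3])
```

After equalization, every sample lies within 0.1 of the transmitted value, whereas the raw received signal was off by up to 0.3.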

This technology is expected to play a critical role in next-generation communication systems, including data centers, AI server clusters, and advanced networking equipment such as future 800G and 1.6T Ethernet systems.

By allowing servers and processors to exchange data more quickly, it could significantly speed up the training of large AI models and the handling of huge data sets.

Looking ahead, the researchers believe their work could help support the development of faster supercomputers, more immersive virtual and augmented reality, better real-time translation, smarter medical systems, and safer autonomous vehicles.

At the same time, their energy-efficient design could reduce the massive power demands of data centers, helping to make the digital world more sustainable.

Source: KSR.