Brain-inspired AI could slash energy use and boost speed, scientists say


Artificial intelligence may soon get a major upgrade—by learning from the human brain.

Researchers at the University of Surrey have developed a new brain-inspired approach that could make AI systems faster, more energy-efficient, and even smarter, all without losing accuracy.

The breakthrough, published in Neurocomputing, comes from Surrey’s Nature-Inspired Computation and Engineering (NICE) group.

The team discovered that mimicking the brain’s efficient structure—where neurons are connected in a sparse, organized way—can dramatically improve how AI models work.

Modern AI systems, like ChatGPT or image recognition networks, rely on artificial neural networks (ANNs).

These are made up of layers of digital “neurons” connected to each other.

But today’s systems tend to connect every neuron in one layer to all the neurons in the next.

That setup allows for complex learning but wastes enormous amounts of energy and computing power.
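To get a sense of the scale, here is a minimal sketch of how many weights a single fully connected layer contains. The layer sizes are illustrative assumptions, not figures from the Surrey study.

```python
# Illustrative only: parameter count of one fully connected ("dense") layer.
# The layer sizes below are arbitrary examples, not numbers from the paper.

inputs = 784      # e.g. a 28x28 image flattened into 784 input neurons
outputs = 300     # neurons in the next layer

# In a dense layer, every input neuron connects to every output neuron,
# so the number of weights is the product of the two layer sizes.
dense_weights = inputs * outputs
print(f"Dense connections: {dense_weights:,}")   # 235,200 weights in this one layer

# Every one of those weights must be stored, multiplied and updated at each
# training step, which is where much of the energy and compute cost comes from.
```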

The Surrey researchers tackled this inefficiency with a new technique called Topographical Sparse Mapping (TSM).

Instead of connecting every neuron to every other one, TSM links each neuron only to nearby or related neurons—similar to how the human brain’s visual system works. This selective wiring cuts down unnecessary connections, saving both energy and time while maintaining accuracy.
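The sketch below illustrates the general idea of topographical sparse wiring rather than the authors' exact scheme: each output neuron is assigned a position on the input "map" and connects only to inputs within a small local window. The layer sizes and window width are assumptions chosen for illustration.

```python
import numpy as np

# Simplified sketch of topographically sparse connectivity (not the paper's
# exact method): each output neuron keeps only connections to inputs that
# lie near its position on the input map.

inputs, outputs, window = 784, 300, 15   # illustrative sizes

mask = np.zeros((inputs, outputs), dtype=bool)
for j in range(outputs):
    centre = int(round(j * (inputs - 1) / (outputs - 1)))   # map output j onto the input axis
    lo, hi = max(0, centre - window), min(inputs, centre + window + 1)
    mask[lo:hi, j] = True                                   # keep only local connections

kept = mask.sum()
print(f"Connections kept: {kept:,} of {mask.size:,} "
      f"({100 * (1 - kept / mask.size):.1f}% sparsity)")

# During training, the layer's weights would be multiplied by this mask so
# that the missing connections never contribute and are never updated.
```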

An improved version of the model, called Enhanced Topographical Sparse Mapping (ETSM), goes even further.

It introduces a “pruning” process that mimics how the brain refines its neural pathways as it learns. During training, ETSM removes weak or redundant connections, leaving behind only the most useful ones. This makes the system leaner, faster, and more efficient.
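One common way to implement this kind of pruning is to drop the smallest-magnitude weights as training proceeds. The sketch below shows that generic magnitude-based approach; ETSM's actual pruning criterion may differ, and the shapes and threshold here are illustrative assumptions.

```python
import numpy as np

# Generic magnitude-based pruning sketch (an assumption, not necessarily
# ETSM's exact rule): connections with the smallest weights are treated as
# weak or redundant and removed.

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(784, 300))   # weights of one layer

def prune_weak_connections(w, keep_fraction=0.05):
    """Zero out all but the largest-magnitude fraction of weights."""
    threshold = np.quantile(np.abs(w), 1.0 - keep_fraction)
    mask = np.abs(w) >= threshold
    return w * mask, mask

weights, mask = prune_weak_connections(weights, keep_fraction=0.05)
sparsity = 1.0 - mask.mean()
print(f"Sparsity after pruning: {sparsity:.1%}")   # roughly 95% of connections removed

# In practice the mask is also applied to the gradients, so pruned
# connections stay at zero for the rest of training.
```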

Dr. Roman Bauer, a senior lecturer at Surrey and the project’s supervisor, said that training large AI models today requires massive energy resources—sometimes more than a million kilowatt-hours of electricity, equivalent to what over a hundred U.S. homes use in a year.

“That simply isn’t sustainable at the rate AI continues to grow,” Bauer said. “Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

The team’s enhanced model achieved up to 99% sparsity, meaning it kept only about one in every hundred possible neural connections, while matching or even exceeding the accuracy of traditional models on benchmark tests. Despite being far smaller, it trained faster, required less memory, and consumed less than 1% of the energy used by conventional AI systems.

Lead author and Ph.D. student Mohsen Kamelian Rad explained that the key lies in copying how the brain organizes its neurons spatially. “When we mirror this topographical design, we can train AI systems that learn faster, use less energy, and perform just as accurately,” he said.

For now, the method focuses on improving the first layer of AI networks, but the researchers plan to extend it deeper into future models. They are also exploring how the approach could transform neuromorphic computing—a field that designs hardware to function like the brain itself.

If successful, it could lead to a new generation of AI that’s not only powerful but also planet-friendly.