Exoskeletons with AI could revolutionize mobility support

Illustration of the exoskeleton. Credit: RIKEN.

Exoskeletons—robotic devices that people can wear to help with movement—have long been seen as a promising technology for aging societies and people with mobility challenges.

They work by providing physical support, often amplifying strength or assisting with difficult motions.

But until now, most exoskeletons have been limited because they rely on preprogrammed movements.

The user has to “call up” these preset motions, which makes the devices hard to use in the unpredictable situations of daily life.

A research team from Japan’s RIKEN Guardian Robot Project has found a way to overcome this limitation by bringing artificial intelligence into the picture.

Their work, recently published in npj Robotics, shows how AI can make exoskeletons more flexible, efficient, and user-friendly.

Traditionally, exoskeletons detect a person’s intent to move by using sensors placed on muscles.

This method, called electromyography (EMG), measures the electrical signals that occur when muscles prepare for movement.

While effective, EMG requires careful placement and calibration of sensors, which takes time and makes the system less practical outside the lab.
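For readers curious what EMG-based intent detection looks like in practice, here is a minimal sketch of one conventional approach: rectify the raw signal, smooth it into an envelope, and trigger assistance when the envelope crosses a calibrated threshold. The sampling rate, window length, and threshold below are illustrative assumptions, not values from the RIKEN study.

```python
# Minimal sketch of a conventional EMG onset trigger (not from the paper):
# rectify the raw signal, smooth it into an envelope, and flag movement
# onset when the envelope crosses a calibrated threshold.
# Sampling rate, window length, and threshold are illustrative assumptions.
import numpy as np

def detect_onset(emg, fs=1000, window_ms=50, threshold=0.1):
    """Return the first sample index where the EMG envelope exceeds threshold."""
    rectified = np.abs(emg - np.mean(emg))                   # remove offset, rectify
    win = max(1, int(fs * window_ms / 1000))
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")   # moving-average envelope
    above = np.nonzero(envelope > threshold)[0]
    return int(above[0]) if above.size else None

# Example: a quiet baseline followed by a burst of muscle activity
signal = np.concatenate([0.01 * np.random.randn(500),
                         0.5 * np.random.randn(500)])
print(detect_onset(signal))  # roughly where the burst begins (~sample 500)
```

The need to calibrate that threshold (and to place the electrodes correctly) for every user and every session is exactly the practical burden the RIKEN team set out to avoid.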

The RIKEN team wanted a simpler, smarter solution. They developed a system that combines a visual sensor with AI. A small camera, placed near the user’s eyes, captures the environment from the user’s perspective.

At the same time, sensors on the knees and torso provide data about the body’s motion. A transformer-based AI model processes this information to decide how best to assist.
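To make the idea concrete, here is a rough sketch of how such a multimodal model could be wired up: pre-extracted camera features and knee/torso sensor readings are projected into a shared space, fused by a transformer encoder, and mapped to assistance commands. The module names, dimensions, and the choice of joint torques as the output are assumptions made for illustration; the paper’s actual architecture is not reproduced here.

```python
# Rough sketch (not the RIKEN implementation): fuse egocentric vision
# features with body-motion sensor readings using a transformer encoder
# to predict assistance commands. All names, dimensions, and the output
# format (per-joint assist torques) are illustrative assumptions.
import torch
import torch.nn as nn

class AssistPolicy(nn.Module):
    def __init__(self, img_feat_dim=512, imu_dim=12, d_model=128, n_joints=2):
        super().__init__()
        # Project each modality into a shared embedding space
        self.img_proj = nn.Linear(img_feat_dim, d_model)   # camera features
        self.imu_proj = nn.Linear(imu_dim, d_model)        # knee/torso sensors
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Map the fused representation to assist torques for each joint
        self.head = nn.Linear(d_model, n_joints)

    def forward(self, img_feats, imu_seq):
        # img_feats: (batch, T, img_feat_dim) pre-extracted visual features
        # imu_seq:   (batch, T, imu_dim) body-motion measurements
        tokens = torch.cat(
            [self.img_proj(img_feats), self.imu_proj(imu_seq)], dim=1)
        fused = self.encoder(tokens)          # (batch, 2T, d_model)
        return self.head(fused.mean(dim=1))   # (batch, n_joints)

# Example usage with random stand-in data
policy = AssistPolicy()
torques = policy(torch.randn(1, 10, 512), torch.randn(1, 10, 12))
print(torques.shape)  # torch.Size([1, 2])
```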

The researchers tested the system with everyday tasks, such as picking up an object and climbing a step. These activities involve different kinds of physical support, making them a good test of the exoskeleton’s adaptability.

The results were encouraging: users showed reduced muscle activity when wearing the AI-powered exoskeleton, a sign that the device was taking on part of the physical effort rather than leaving it all to the wearer.

One of the most exciting findings was that the assistive strategy learned from one user’s data could also be applied to another person.

This cross-user adaptability is something many exoskeleton systems struggle with, often requiring retraining or recalibration for each individual. The RIKEN team’s approach suggests a path toward exoskeletons that can quickly adjust to different users without extensive setup.

“This study represents an important step toward intelligent exoskeletons that can support a wide range of human activities in diverse environments,” said Jun Morimoto, one of the study’s authors.

His colleague, Jun-ichiro Furukawa, added that such technology could be transformative in health care, rehabilitation, and elderly care—helping people with injuries or mobility impairments regain independence and improve their quality of life.

With further refinement, AI-powered exoskeletons could move beyond the lab and into the real world, offering personalized, adaptive assistance in ways that earlier designs could not. The future of wearable robots may soon be one where the machine learns not just from the user, but with them.