New wearable tech could let you control robots with simple gestures—even while running

This wearable technology uses everyday gestures to reliably control robotic devices even under heavy motion noise, such as when the user is running, riding in a vehicle or working in turbulent environments. Credit: David Baillot/UC San Diego Jacobs School of Engineering.

Imagine being able to control a robot just by moving your arm, even if you’re running, riding in a car or being bounced around by ocean waves.

Engineers at the University of California San Diego have created a new wearable system that makes this possible.

Their invention solves a major problem that has held back gesture-controlled devices for years: they usually stop working well the moment the user starts moving too much.

The new system, described in the journal Nature Sensors, combines soft, stretchable electronics with artificial intelligence.

According to study co-author Xiangjun Chen, most wearable gesture sensors work fine when someone is sitting still, but once the wearer starts moving, the signals become messy and unreliable.

Everyday movements—like jogging, shaking, or even the natural sway of the body—create so much “noise” that devices can no longer understand what gesture the person is actually trying to make.

This has made such technologies hard to use in real life.

The UC San Diego team solved this by training an AI system to clean up messy signals in real time.

Their soft, flexible patch sticks to a cloth armband and includes muscle sensors, motion sensors, a Bluetooth controller and a stretchy battery. When the wearer makes a gesture, the sensors collect raw signals from the arm.

The AI system then removes interference from motion, figures out the intended gesture, and instantly sends a command to control a machine like a robotic arm.
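For the technically curious, a minimal Python (PyTorch) sketch of that sense, denoise, classify loop might look like the following. It illustrates the idea rather than the team's actual model: the channel counts, window length, network layers and gesture labels here are all assumptions.

```python
# A minimal sketch (not the authors' code) of the sense -> denoise -> classify
# pipeline described above. Channel counts, window length, network layers and
# gesture labels are all illustrative assumptions.
import torch
import torch.nn as nn

WINDOW = 200             # assumed samples per analysis window
EMG_CH, IMU_CH = 4, 6    # assumed: 4 muscle channels, 6-axis motion channels
GESTURES = ["rest", "fist", "wave_left", "wave_right"]  # hypothetical labels

class DenoiseClassify(nn.Module):
    """Denoises the muscle signal using the motion channels as a noise
    reference, then predicts the intended gesture from the cleaned signal."""
    def __init__(self):
        super().__init__()
        # The denoiser sees both muscle and motion channels, so it can learn
        # which parts of the muscle waveform are motion-correlated interference.
        self.denoise = nn.Sequential(
            nn.Conv1d(EMG_CH + IMU_CH, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, EMG_CH, kernel_size=7, padding=3),
        )
        self.classify = nn.Sequential(
            nn.Conv1d(EMG_CH, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(32, len(GESTURES)),
        )

    def forward(self, emg, imu):
        clean = self.denoise(torch.cat([emg, imu], dim=1))  # strip motion artifacts
        return self.classify(clean)                         # infer the gesture

# One inference step: a window of raw sensor data in, a robot command out.
model = DenoiseClassify().eval()
emg = torch.randn(1, EMG_CH, WINDOW)   # stand-in for streamed muscle signals
imu = torch.randn(1, IMU_CH, WINDOW)   # stand-in for streamed motion signals
with torch.no_grad():
    gesture = GESTURES[model(emg, imu).argmax(dim=1).item()]
print("send command:", gesture)        # e.g. forwarded over Bluetooth to the arm
```

Feeding the motion channels into the denoiser alongside the muscle channels is what would let such a network learn which parts of the waveform are interference rather than intent.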

What makes this approach special is that the AI was trained on data from real gestures recorded under real disturbances: running, shaking, and even the chaotic movement of ocean waves.

Because the AI has “seen” this noise before, it knows how to filter it out.
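A rough sketch of what that training recipe could look like (the mixing function, channel counts and signal-to-noise range below are illustrative assumptions, not details from the paper): each clean gesture recording is blended with a randomly scaled disturbance trace, so every training example already contains the kind of interference the device will meet in the field.

```python
# A minimal sketch of noise-augmented training data, assuming the general
# recipe described above; the function name, channel counts and SNR range
# are illustrative, not taken from the published study.
import numpy as np

rng = np.random.default_rng(0)

def augment(clean_emg, disturbances, snr_db_range=(-6.0, 6.0)):
    """Mix a clean gesture window with a randomly chosen, randomly scaled
    disturbance trace (e.g. recorded while running or under wave motion)."""
    idx = rng.integers(len(disturbances))            # pick one disturbance recording
    noise = disturbances[idx][:, :clean_emg.shape[1]]
    snr_db = rng.uniform(*snr_db_range)              # randomize signal-to-noise ratio
    # Scale the noise so the mixed window hits the sampled SNR.
    scale = np.sqrt(np.mean(clean_emg**2) /
                    (np.mean(noise**2) * 10**(snr_db / 10) + 1e-12))
    return clean_emg + scale * noise

# Stand-in data: 4 muscle channels x 200 samples per window.
clean = rng.standard_normal((4, 200))
running = rng.standard_normal((4, 200)) * 3.0    # stand-in "running" artifact trace
waves = rng.standard_normal((4, 200)) * 5.0      # stand-in "ocean wave" artifact trace
noisy_example = augment(clean, [running, waves])
```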

During tests, users wearing the device successfully controlled a robotic arm while running or while being exposed to strong vibrations.

The team even tested the system in UC San Diego’s unique ocean simulator, which can re-create real sea conditions. The device continued to work accurately and with very low delay, even when motions were extremely unpredictable.

This technology could have many uses. People in rehabilitation or with limited mobility could control assistive robots using simple gestures rather than fine hand movements.

Workers in hazardous environments could operate tools or robots without needing to stop and press buttons. Divers or underwater operators could control machines in rough seas. Everyday consumers might also benefit from more reliable gesture controls in phones, smartwatches and home devices.

The project began as a way to help military divers control underwater robots. But the researchers soon realized that motion interference is a universal problem in wearable tech. Their new method shows a promising path toward wearable systems that are flexible, wireless and smart enough to adapt to constantly changing real-world conditions.

By teaching the device to “learn” from complex environments and individual users, the researchers believe this invention brings us much closer to intuitive human-machine interaction that actually works in everyday life.