Scientists create a robot that “feels” its way through tough terrain


Imagine going for a hike through a dense forest. You feel the crunch of leaves under your feet, hear the snap of branches, and sense the unevenness of the ground.

Your senses help you adjust your steps, stay balanced, and avoid obstacles.

Now, imagine a robot doing the same thing—not just seeing the path with cameras but also feeling it with sensors that mimic touch, balance, and even sound.

This is exactly what researchers at Duke University have achieved with a new framework called WildFusion.

Led by Boyuan Chen, the Dickinson Family Assistant Professor at Duke, the team has developed WildFusion to help robots move confidently through complex outdoor environments like forests, disaster zones, and off-road trails.

Unlike traditional robots that rely only on vision-based systems like cameras or LiDAR, WildFusion gives robots a richer understanding of their surroundings.

It allows them to “sense” their way through rough terrain, much like humans do. This new method was recently published on the arXiv preprint server and has been accepted for presentation at the IEEE International Conference on Robotics and Automation (ICRA) 2025 in Atlanta.

The secret behind WildFusion is its combination of multiple sensory inputs. Traditional robots mostly rely on visual data to understand their path.

But in the real world, especially in places like dense forests or disaster zones, vision alone is not enough. Visibility might be blocked by tall grass, fallen trees, or uneven ground. WildFusion solves this problem by adding two more senses: touch and vibration.

Mounted on a four-legged robot, WildFusion is equipped with an RGB camera, LiDAR, inertial sensors, and, most interestingly, contact microphones and tactile sensors.

While the camera and LiDAR handle the usual tasks of mapping out the area and measuring distances, the microphones and touch sensors bring in a whole new level of awareness.

As the robot walks, the contact microphones pick up the subtle sounds created by each step.

They can distinguish the crunch of dry leaves from the soft give of mud or the hard scrape of gravel. This auditory information helps the robot understand what kind of surface it is walking on, even if its vision is blocked.
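To make the idea concrete, here is a toy Python sketch, not the team's actual pipeline: it reduces a short contact-microphone clip to two simple features (how loud and how "crackly" the footstep sounds) and matches them against made-up prototype values for a few surfaces.

```python
# Minimal sketch (not the WildFusion pipeline): classifying terrain from a short
# clip of contact-microphone audio with two simple features and nearest-centroid
# matching. The prototype numbers below are illustrative placeholders.
import numpy as np

def audio_features(clip: np.ndarray) -> np.ndarray:
    """Return [RMS energy, zero-crossing rate] for a mono audio clip."""
    rms = np.sqrt(np.mean(clip ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(clip)))) / 2.0
    return np.array([rms, zcr])

# Hypothetical per-terrain feature prototypes (placeholder values).
PROTOTYPES = {
    "dry_leaves": np.array([0.30, 0.40]),  # loud and crackly
    "mud":        np.array([0.05, 0.05]),  # quiet and dull
    "gravel":     np.array([0.25, 0.20]),  # loud, scraping
}

def classify_terrain(clip: np.ndarray) -> str:
    f = audio_features(clip)
    return min(PROTOTYPES, key=lambda k: np.linalg.norm(f - PROTOTYPES[k]))

# Example: a noisy synthetic footstep clip stands in for real microphone data.
rng = np.random.default_rng(0)
print(classify_terrain(0.3 * rng.standard_normal(4096)))
```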

At the same time, the tactile sensors measure the force applied to each of its feet, helping the robot determine if it is on stable ground or slipping. If it starts to wobble or slide, the inertial sensors quickly detect the change, allowing the robot to adjust its movement and regain balance. This advanced sensory awareness allows the robot to adapt its steps in real time, choosing the safest and most stable paths.
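A rough sense of how such a slip check might look in code, again as an illustrative sketch rather than Duke's controller: a foot that should be bearing weight but measures almost no force, or a sudden sideways jolt from the inertial sensors, would both raise a flag. The thresholds are placeholders.

```python
# Assumed slip-detection logic, not the robot's real controller.
from dataclasses import dataclass

@dataclass
class FootState:
    in_stance: bool        # the gait plan expects this foot to be loaded
    normal_force_n: float  # force measured by the foot's tactile sensor, in newtons

def detect_slip(feet: list[FootState], lateral_accel_ms2: float,
                min_stance_force_n: float = 15.0,
                max_lateral_accel_ms2: float = 2.5) -> bool:
    """A stance foot carrying almost no load, or a sudden sideways acceleration
    reported by the IMU, both suggest the robot is starting to slip."""
    unloaded_stance_foot = any(f.in_stance and f.normal_force_n < min_stance_force_n
                               for f in feet)
    sliding_body = abs(lateral_accel_ms2) > max_lateral_accel_ms2
    return unloaded_stance_foot or sliding_body

# Example: one foot should be planted but reads only 3 N of force.
feet = [FootState(True, 3.0), FootState(True, 60.0),
        FootState(False, 0.0), FootState(False, 0.0)]
print(detect_slip(feet, lateral_accel_ms2=0.4))  # -> True
```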

What sets WildFusion apart is the way it processes all this information. At its core is a deep learning model inspired by human intuition. Instead of treating the environment as a collection of isolated points, the model understands surfaces and objects as continuous forms.

This means the robot can predict safe paths even if some of its sensor data is incomplete or noisy—much like how a person might step confidently over a patch of leaves, even if they can’t see every rock or root beneath them.
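The article does not spell out the model's internals, so the toy sketch below only illustrates the general principle with a simple stand-in: a handful of sparse, noisy terrain scores are smoothed into a continuous function that can be queried at any coordinate, including places the robot has never measured.

```python
# Toy illustration (not the actual WildFusion model): turning sparse, noisy
# terrain measurements into a continuous field. A Gaussian-weighted average
# stands in for the learned model.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sparse observations: (x, y) positions and a traversability
# score in [0, 1] at each, e.g. gathered from the robot's footsteps so far.
obs_xy = rng.uniform(0.0, 10.0, size=(30, 2))
obs_score = np.clip(0.8 - 0.05 * obs_xy[:, 0] + 0.1 * rng.standard_normal(30), 0, 1)

def traversability(query_xy: np.ndarray, length_scale: float = 1.5) -> float:
    """Smoothly interpolate the sparse scores: nearby observations count more."""
    d2 = np.sum((obs_xy - query_xy) ** 2, axis=1)
    w = np.exp(-d2 / (2 * length_scale ** 2))
    return float(np.sum(w * obs_score) / (np.sum(w) + 1e-9))

# The field is defined everywhere, even between or beyond the measurements,
# which is what lets a planner score footholds the robot has never visited.
print(round(traversability(np.array([2.0, 5.0])), 2))
print(round(traversability(np.array([9.0, 1.0])), 2))
```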

The researchers put WildFusion to the test at Eno River State Park in North Carolina. The robot navigated through thick forests, grassy fields, and gravel trails, demonstrating impressive stability and decision-making.

According to Yanbaihui Liu, a Ph.D. student in Chen’s lab and the lead author of the study, it was rewarding to see the robot move so smoothly across unpredictable terrain. For the first time, a robot was able to “feel” its way through the environment, making real-time decisions based on what it sensed beneath its feet and around its path.

Looking to the future, the team at Duke University plans to expand WildFusion’s abilities even further. They aim to add more sensors, like those that detect temperature or humidity, to help the robot adapt even better to harsh and changing environments. This could make robots more reliable in search-and-rescue missions, environmental monitoring, and even space exploration.

With WildFusion, Duke University has opened a new chapter in robotic navigation, moving us closer to a future where robots can confidently explore the world as naturally as we do.