Have you ever wondered how insects can travel far from their homes and still find their way back?
This question is not only fascinating in biology but also crucial for developing AI for tiny, autonomous robots.
Inspired by how ants navigate, researchers at TU Delft have created a new strategy for tiny, lightweight robots to autonomously find their way back home.
Ants use a combination of visual recognition of their environment and counting their steps to navigate. The TU Delft team used these insights to develop an autonomous navigation strategy for small robots.
This new method allows tiny robots to return home after long journeys using minimal computation and memory—only 0.65 kilobytes per 100 meters.
In the future, these small robots could be used for various tasks, like monitoring stock in warehouses or detecting gas leaks in industrial sites.
The researchers published their findings in Science Robotics.
Small robots, weighing anywhere from tens of grams to a few hundred grams, have the potential for many real-world applications.
They are safe because of their light weight, can navigate narrow spaces, and can be deployed in large numbers if made cheaply. For example, they could quickly cover large areas in greenhouses to detect pests or diseases early.
However, making these tiny robots operate independently is challenging. Larger robots can rely on external aids like GPS for navigation, but GPS doesn’t work well indoors or in cluttered environments.
Installing and maintaining indoor navigation aids like wireless beacons is expensive or impractical, especially in scenarios like search-and-rescue operations.
Tiny drones have very limited computing power and memory, making it difficult for them to navigate on their own. Current navigation technologies, designed for larger robots like self-driving cars, require heavy and power-hungry sensors that small robots cannot carry.
Insects, which navigate over comparable distances with very limited sensing and computing resources, offer inspiration. They combine tracking of their own movement (odometry) with simple visual cues (view memory) to navigate.
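To make the odometry part concrete, here is a minimal dead-reckoning sketch in Python. The 2D step model, the function name, and the variable names are illustrative assumptions, not the drone's actual implementation, which fuses onboard motion estimates.

```python
import numpy as np

def update_position(position, step_length, heading_rad):
    # Dead reckoning: integrate each movement (distance travelled + heading)
    # into a running estimate of where the robot is relative to its start.
    dx = step_length * np.cos(heading_rad)
    dy = step_length * np.sin(heading_rad)
    return position + np.array([dx, dy])

# Example: travelling around a 1 m square should bring the estimate back
# near the origin; the vector pointing home is simply the negative of the
# current position estimate.
position = np.zeros(2)
for heading in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2):
    position = update_position(position, 1.0, heading)
print(position)  # roughly [0, 0]; in practice drift accumulates, which the
                 # visual snapshots described next help to correct
```

Pure dead reckoning drifts over long routes, which is exactly why the visual snapshots described below are needed as a correction.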
The TU Delft team mimicked this with a “snapshot” model, where the robot takes occasional snapshots of its environment. When close to a snapshot location, the robot compares its current view to the snapshot and adjusts its path to minimize differences, much like following a trail of stones.
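As an illustration of this snapshot-homing idea, here is a minimal sketch assuming grayscale panoramic images stored as NumPy arrays. The sum-of-absolute-differences measure, the trial-move search, and the function names are assumptions for clarity, not the exact procedure from the paper.

```python
import numpy as np

def view_difference(current_view, snapshot):
    # How dissimilar is the current panoramic image from the stored snapshot?
    # (Illustrative measure: sum of absolute pixel differences.)
    return np.abs(current_view.astype(float) - snapshot.astype(float)).sum()

def homing_step(get_view, move, snapshot, step=0.1):
    # Try a small trial move in each horizontal direction, keep the one that
    # makes the current view most similar to the snapshot, and undo the rest.
    best_move, best_diff = None, view_difference(get_view(), snapshot)
    for dx, dy in [(step, 0), (-step, 0), (0, step), (0, -step)]:
        move(dx, dy)
        diff = view_difference(get_view(), snapshot)
        if diff < best_diff:
            best_move, best_diff = (dx, dy), diff
        move(-dx, -dy)  # undo the trial move
    if best_move is not None:
        move(*best_move)  # commit the move that reduced the difference most
    return best_diff      # a small value means we are near the snapshot spot
```

Repeating such steps until the image difference stops shrinking drives the robot toward the snapshot location, after which it can head for the next, earlier snapshot on its way home.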
Dr. Tom van Dijk, the study’s first author, compares it to the fairy tale of Hansel and Gretel: “Hansel left stones on the ground to find his way back home. Similarly, our robots use visual snapshots.”
The team tested their insect-inspired strategy on a 56-gram drone called “CrazyFlie,” equipped with an omnidirectional camera. The drone successfully covered distances of up to 100 meters using only 0.65 kilobytes of memory.
All visual processing was done on a tiny computer called a "microcontroller," commonly found in cheap electronic devices.
This new navigation strategy is an essential step towards using tiny autonomous robots in real-world applications. Although it doesn’t generate detailed maps, it allows robots to return to their starting point, which is sufficient for many tasks.
For instance, drones could monitor stock in warehouses or crops in greenhouses, returning to base after collecting data for further processing.
Professor Guido de Croon from TU Delft says, “While the functionality is limited compared to state-of-the-art methods, it is often enough for many applications. This insect-inspired approach is a promising development for the future of tiny autonomous robots.”
Source: Delft University of Technology.