Researchers at TU Delft have drawn inspiration from the natural world to develop an innovative autonomous navigation strategy for tiny, lightweight robots. Inspired by how ants visually recognize their environment and count their steps to find their way back home, this strategy could drastically improve the efficiency and application of small autonomous robots.
The TU Delft researchers designed an insect-inspired navigation system that allows tiny robots to return home after long journeys with minimal computation and memory requirements—just 0.65 kilobytes per 100 meters. This breakthrough has the potential to extend the use of small autonomous robots in various fields, such as warehouse inventory monitoring and detecting gas leaks at industrial sites.
Tiny robots, weighing from tens of grams to a few hundred grams, offer numerous practical applications. Their lightweight design makes them safe, even if they accidentally collide with people. Their small size enables them to navigate narrow spaces, and if they can be produced inexpensively, they could be deployed in large numbers to cover extensive areas quickly, such as in greenhouses for early pest or disease detection.
However, the challenge lies in enabling these tiny robots to operate autonomously. Larger robots often rely on external infrastructure for navigation, such as GPS satellites outdoors or wireless communication beacons indoors. These methods can be unreliable in cluttered environments and costly to maintain, making them unsuitable for small robots.
The AI required for autonomous navigation has traditionally been developed for larger robots, like self-driving cars, which use heavy, power-hungry sensors such as LiDAR. These sensors are impractical for tiny robots. Vision-based approaches, although efficient, often require creating detailed 3D maps of the environment, demanding significant processing power and memory.
To overcome these limitations, researchers turned to nature. Insects, especially ants, navigate using minimal sensing and computing resources, combining motion tracking (odometry) with visually guided behaviors based on low-resolution, omnidirectional vision (view memory). The “snapshot” model suggests that insects occasionally take snapshots of their environment. When they are near the snapshot location, they compare their current view to the snapshot and move to minimize the differences, allowing them to navigate home.
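The snapshot model described above can be captured in a few lines. The following toy Python sketch (an illustration, not the researchers' code) shows the core idea: `render_view` is a hypothetical stand-in for the robot's omnidirectional camera, and the robot greedily tries small moves, keeping whichever one makes its current view most similar to the stored snapshot.

```python
import numpy as np

def view_difference(current, snapshot):
    # Sum of absolute pixel differences between two low-resolution views.
    return int(np.abs(current.astype(int) - snapshot.astype(int)).sum())

def homing_step(position, snapshot, render_view, step=0.1):
    # Try a small move in each of four directions and keep the one whose
    # resulting view most closely matches the stored snapshot.
    moves = [np.array([step, 0.0]), np.array([-step, 0.0]),
             np.array([0.0, step]), np.array([0.0, -step])]
    best = min(moves,
               key=lambda m: view_difference(render_view(position + m), snapshot))
    return position + best
```

Repeating `homing_step` drives the robot toward the point where the view difference is smallest, i.e. the place where the snapshot was taken; the real system works with low-resolution omnidirectional images so this comparison stays cheap enough for a tiny onboard computer.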
Tom van Dijk, the study’s first author, likens this method to Hansel and Gretel’s fairy tale, where Hansel drops stones to find his way back home. For robots, snapshots serve as these metaphorical stones. The key is to use enough snapshots to ensure the robot can navigate home without getting lost, but not so many that it consumes excessive memory.
Professor Guido de Croon explains, “The main insight underlying our strategy is that you can space snapshots much further apart if the robot travels between snapshots based on odometry. This allows the robot to travel much further, as it flies faster when navigating between snapshots based on odometry.”
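De Croon's insight can be illustrated with a toy route-following loop (a hypothetical Python sketch, not the team's implementation): the drone hops to each stored snapshot location using odometry alone, which is fast but drifts over distance, and switches to snapshot-based visual homing only once it believes it is close, cancelling the accumulated drift before the next hop.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    odo_position: tuple   # position recorded by odometry when the snapshot was taken
    snapshot: bytes       # compressed low-resolution view (placeholder)

def follow_route_home(waypoints, fly_to, visual_home):
    # Retrace the outbound route in reverse: a coarse odometry hop to each
    # snapshot location, then visual homing to remove the drift that
    # odometry accumulates between snapshots.
    for wp in reversed(waypoints):
        fly_to(wp.odo_position)   # fast, but position error grows with distance
        visual_home(wp.snapshot)  # precise, but only works near the snapshot
```

Because visual homing corrects the error at every waypoint, the snapshots can be spaced far apart, which is what keeps the memory footprint so small.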
This navigation strategy enabled a 56-gram “CrazyFlie” drone equipped with an omnidirectional camera to cover distances up to 100 meters using only 0.65 kilobytes of memory. All visual processing occurred on a microcontroller, a small and inexpensive computer.
“The proposed insect-inspired navigation strategy is an important step toward applying tiny autonomous robots in the real world,” says Professor de Croon. Although the system’s functionality is more limited than state-of-the-art navigation methods—it doesn’t generate a map and only allows the robot to return to its starting point—it is sufficient for many applications. For instance, drones could be used for stock tracking in warehouses or crop monitoring in greenhouses, flying out to gather data and returning to a base station for post-processing.
This insect-inspired navigation strategy represents a significant advancement in the field of autonomous robotics, making tiny robots more practical and versatile for real-world applications.
By Impact Lab

