Imagine a serene scene: the setting sun bathes a cornfield in golden hues, where towering corn stalks sway gently in the breeze. Amidst this idyllic scene, you’d find children exploring corn mazes, farmers tending to their crops, and, somewhat unexpectedly, robots efficiently plucking ripe ears of corn for the fall harvest.

Yes, robots.

Though it may sound unconventional, the integration of robots into our agricultural landscape is steadily progressing, thanks to increasingly sophisticated computer vision software. These robots are now taking on everyday farm tasks, from harvesting fruit to eliminating crop-damaging weeds.

As the agricultural industry faces a persistent shortage of human labor, there’s hope that machines can help bolster crop yields, ensure a steady supply of fresh produce to our tables, and minimize food waste.

However, for these robot farmhands to be effective, they must navigate complex and often confusing farmlands. Unfortunately, robots aren’t natural navigators; they frequently get lost, especially in intricate terrains. Roboticists call this the “kidnapped robot problem”: a robot displaced to an unfamiliar spot must work out where it is, much like kids struggling to find their way out of a corn maze.

In a recent study published in Science Robotics, researchers led by Dr. Barbara Webb at the University of Edinburgh sought to enhance robot navigation by endowing robots with memory, drawing inspiration from an unexpected source: ants. These tiny creatures exhibit remarkable navigational skills, quickly adapting to new environments and effortlessly recalling familiar locations, even when traversing dense vegetation.

Using images gathered from a roaming robot, the research team developed an algorithm based on the brain processes of ants during navigation. When implemented on hardware designed to mimic the brain’s computations, this novel approach outperformed state-of-the-art computer vision systems in navigation tasks.

“Insect brains, in particular, provide a powerful combination of efficiency and effectiveness,” noted the researchers.

Solving the navigational challenge not only provides robot farmworkers with an internal compass but also opens the door to enhancing how robots, including self-driving cars, interact with our world.

The Ant’s Secret

Navigating a crop field is difficult because it lacks distinguishing landmarks; it’s hard to determine one’s location and direction amid the uniform environment. Robots encounter similar difficulties in the wild, struggling to recognize the same scene under changing lighting or weather conditions. Their algorithms are slow to adapt, impeding autonomous navigation in complex settings.

Ants, despite having far smaller brains than humans, excel at learning and navigating intricate new environments. They effortlessly remember previous routes regardless of adverse conditions, making their navigational abilities a source of fascination.

Ants don’t require precise knowledge of their location during navigation; instead, they rely on recognizing familiar places. This is akin to exploring a new town from a hotel—you don’t need to know your exact position on the map; you just need to remember the path to the café for breakfast.
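
To make that idea concrete, here is a minimal Python sketch of route following by visual familiarity. It illustrates the principle rather than the researchers’ actual model: the simple pixel-difference score, the function names, and the toy data are assumptions made for the example.

```python
import numpy as np

def familiarity(view, memory_bank):
    """Score how familiar a view looks: the best (smallest) pixel-wise
    difference against any view stored during the training run."""
    diffs = [np.mean(np.abs(view - m)) for m in memory_bank]
    return -min(diffs)  # higher score = more familiar

def choose_heading(views_by_heading, memory_bank):
    """Turn toward whichever candidate heading looks most like a remembered view."""
    scores = {h: familiarity(v, memory_bank) for h, v in views_by_heading.items()}
    return max(scores, key=scores.get)

# Toy usage: store views from a training run, then pick the most familiar heading.
rng = np.random.default_rng(0)
memory_bank = [rng.random((32, 32)) for _ in range(10)]               # remembered route views
candidates = {angle: rng.random((32, 32)) for angle in (-30, 0, 30)}  # views while scanning
print(choose_heading(candidates, memory_bank))
```

The point is that no map or coordinates are involved: the robot simply keeps whatever view looks most like something it has seen before.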

Taking cues from ant brains, the research team constructed a neuromorphic robot in three key steps. First, they developed software inspired by the neural circuits in ant brains, particularly the “mushroom bodies,” which are vital for learning visual information from the surroundings. Next, they employed event cameras, which report per-pixel changes in brightness much as a biological eye responds to changing light, rather than capturing full frames. Finally, they utilized SpiNNaker, a computer chip designed to mimic brain functions, to encode memory.
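
The mushroom-body idea can be sketched in a few lines of code: a fixed random projection spreads each view across a large, sparsely active layer of “Kenyon cells,” and learning depresses the output weights of active cells so that previously seen views evoke a weaker novelty response. The layer sizes, sparsity level, and learning rate below are illustrative guesses, not the parameters used in the study.

```python
import numpy as np

class MushroomBodySketch:
    """Toy familiarity network loosely modeled on insect mushroom bodies."""

    def __init__(self, n_input=1024, n_kc=10000, sparsity=0.05, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.proj = rng.standard_normal((n_kc, n_input))  # fixed random projection
        self.w_out = np.ones(n_kc)                        # Kenyon cell -> output weights
        self.k = int(sparsity * n_kc)                     # number of active cells per view
        self.lr = lr

    def _kenyon_code(self, view):
        """Sparse binary code: only the k most strongly driven cells fire."""
        a = self.proj @ view.ravel()
        code = np.zeros_like(a)
        code[np.argpartition(a, -self.k)[-self.k:]] = 1.0
        return code

    def novelty(self, view):
        """High output = unfamiliar view; low output = something like this was seen before."""
        return float(self.w_out @ self._kenyon_code(view))

    def learn(self, view):
        """Depress the weights of active cells so this view looks familiar next time."""
        code = self._kenyon_code(view)
        self.w_out -= self.lr * code * self.w_out

# Toy usage: after learning a view, the same view should score as less novel.
rng = np.random.default_rng(1)
mb = MushroomBodySketch()
view = rng.random((32, 32))
before = mb.novelty(view)
mb.learn(view)
print(before, mb.novelty(view))  # the second score should be lower
```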

Integrating these components, the team created their ant-like system. As a proof of concept, they equipped a mobile robot with this system: the aptly named TurtleBot3 Burger, about the size of an extra-large hamburger. The robot captured images using event cameras as it navigated challenging terrain, reporting “events” as it detected changes in its surroundings.
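
An event camera does not record full frames; each pixel fires an “event” only when its brightness changes enough. The snippet below approximates that behavior in software by thresholding log-brightness changes between two frames; the threshold value is a placeholder, and real sensors do this asynchronously in hardware rather than frame by frame.

```python
import numpy as np

def frames_to_events(prev_frame, new_frame, threshold=0.15):
    """Emit +1/-1 per-pixel 'events' where log brightness changes by more than
    a threshold, and stay silent where the scene is unchanged."""
    eps = 1e-6  # avoid log(0) for dark pixels
    delta = np.log(new_frame + eps) - np.log(prev_frame + eps)
    events = np.zeros_like(delta, dtype=np.int8)
    events[delta > threshold] = 1    # brightness increased
    events[delta < -threshold] = -1  # brightness decreased
    return events

# Toy usage: two consecutive grayscale frames in [0, 1]; only the changed corner fires.
rng = np.random.default_rng(2)
prev = rng.random((32, 32))
new = prev.copy()
new[:4, :4] += 0.5  # something moved in one corner
print(np.count_nonzero(frames_to_events(prev, new)))
```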

The robot traversed approximately 20 feet through vegetation of varying heights and learned from its journeys, a typical range for an ant navigating its route. In multiple tests, the AI model efficiently processed data from the trips and adapted when the route was altered. In contrast, a conventional algorithm struggled to recognize the same route and could only follow it when given the exact same video recording, lacking the ability to generalize.

Efficient Robo-Brains

Traditional AI models are notorious for their energy consumption, but neuromorphic systems offer a more efficient alternative. SpiNNaker, the hardware behind this system, cuts energy use by running many computations in parallel on the chip, which also reduces the lag in processing incoming data.

As a next step, the research team plans to delve deeper into ant brain circuits, exploring neural connections between different brain regions and groups. This could further enhance a robot’s efficiency, with the ultimate goal of building robots that interact with the world as adeptly as ants do.

By Impact Lab