Self-driving cars are sometimes prone to crashes because their visual systems struggle to process static or slow-moving objects in three-dimensional space. That limitation is reminiscent of the monocular vision found in many insects, which excels at motion tracking but offers little depth perception. The praying mantis, however, stands out for its exceptional vision, thanks to the binocular depth perception its two eyes provide.
Inspired by the praying mantis, researchers at the University of Virginia School of Engineering and Applied Science have developed artificial compound eyes that address significant limitations in current visual data collection and processing systems. These limitations include accuracy issues, data processing lag times, and the need for substantial computational power.
“After studying how praying mantis eyes work, we realized that replicating their biological capabilities required developing new technologies,” said Byungjoon Bae, a Ph.D. candidate in the Charles L. Brown Department of Electrical and Computer Engineering.
Biomimetic Compound Eyes
The team’s innovative “eyes” integrate microlenses with multiple photodiodes, which produce an electrical current when exposed to light. Flexible semiconductor materials allowed the researchers to mimic the convex shape and faceted arrangement of mantis eyes. This biomimetic approach provides a wide field of view and superior depth perception.
“Creating a sensor in hemispherical geometry while maintaining functionality is a state-of-the-art achievement,” Bae said. “The system delivers precise spatial awareness in real time, essential for applications interacting with dynamic surroundings.”
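The paper’s hemispherical sensor is not a simple two-camera rig, but the geometric principle behind binocular depth perception can be illustrated with standard pinhole triangulation. The sketch below is a minimal, self-contained example; the focal length, baseline, and disparity values are invented for illustration and are not taken from the study.

```python
# Illustrative triangulation for binocular (stereo) depth perception.
# This is a textbook pinhole-camera model, not the paper's sensor pipeline;
# all numeric values below are made up for the example.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified pair of viewpoints.

    focal_px     -- focal length in pixels
    baseline_m   -- separation between the two viewpoints in meters
    disparity_px -- horizontal shift of a feature between the two views, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the sensor")
    return focal_px * baseline_m / disparity_px

# Example: a feature shifted 8 px between viewpoints 2 cm apart, f = 700 px
print(depth_from_disparity(focal_px=700, baseline_m=0.02, disparity_px=8))  # 1.75 m
```

The key intuition is that nearby objects shift more between the two viewpoints than distant ones, so disparity alone encodes distance; this is the stereopsis the article attributes to the mantis.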
Potential applications include low-power vehicles and drones, self-driving cars, robotic assembly, surveillance and security systems, and smart home devices. Bae, whose adviser is Kyusang Lee, an associate professor in the department, is the first author of the team’s recent paper in Science Robotics.
Power Efficiency and Edge Computing
One key finding was that the system could reduce power consumption by a factor of more than 400 compared with traditional visual systems. Unlike approaches that send visual data to the cloud for processing, Lee’s system processes it in real time at the edge, avoiding the time and resource costs of data transfer and external computation while minimizing energy use.
“The technological breakthrough lies in integrating flexible semiconductor materials, conformal devices that preserve the exact angles within the device, an in-sensor memory component, and unique post-processing algorithms,” Bae said.
The sensor array continuously monitors changes in the scene, identifying which pixels have changed and encoding this information into smaller data sets for processing. This approach mirrors how insects perceive the world through visual cues, using phenomena like motion parallax and stereopsis to understand their surroundings.
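The article does not detail the team’s in-sensor memory or post-processing algorithms, but the general idea of encoding only changed pixels can be sketched with simple frame differencing. Everything below, including the threshold and array sizes, is a hypothetical illustration rather than the paper’s method.

```python
import numpy as np

def encode_changes(prev: np.ndarray, curr: np.ndarray, threshold: int = 10):
    """Return (rows, cols, deltas) for pixels whose intensity changed by more
    than `threshold` between two frames -- a sparse, event-like encoding."""
    delta = curr.astype(np.int16) - prev.astype(np.int16)
    rows, cols = np.nonzero(np.abs(delta) > threshold)
    return rows, cols, delta[rows, cols]

# Two synthetic 64x64 frames: identical except for a small bright "object"
# that appears in the second frame.
rng = np.random.default_rng(0)
frame0 = rng.integers(0, 200, size=(64, 64), dtype=np.uint8)
frame1 = frame0.copy()
frame1[20:24, 30:34] = 255  # simulated object entering a 4x4 patch

rows, cols, deltas = encode_changes(frame0, frame1)
print(f"{rows.size} of {frame0.size} pixels encoded as change events")  # 16 of 4096
```

Transmitting only the 16 changed pixels instead of all 4,096 is what shrinks the data set, and it is this kind of sparsity that makes in-sensor processing cheap relative to shipping full frames to external hardware.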
“The seamless fusion of advanced materials and algorithms enables real-time, efficient, and accurate 3D spatiotemporal perception,” said Lee, a prolific early-career researcher in thin-film semiconductors and smart sensors.
“Our work represents a significant scientific insight that could inspire other engineers and scientists by demonstrating a clever, biomimetic solution to complex visual processing challenges,” he added.
By harnessing the unique vision of the praying mantis, the researchers have paved the way for more efficient and effective visual systems in autonomous technologies, potentially revolutionizing how machines interact with the world.
By Impact Lab