Soft robotics is a rapidly growing field that holds enormous potential in applications where traditional rigid robots would be unsafe or unwieldy. But, building a soft robot comes with a number of unique challenges, particularly when it comes to actuation and position sensing. Fortunately, a newly developed soft robotic finger with its own sense of self-perception may dramatically improve the situation.
This work comes from a team of researchers at the Bioinspired Robotics and Design Lab at the University of California San Diego and others around the globe. It’s intended to give soft robots the kind of positional sensing that rigid robots get almost for free. Because a traditional robot’s frame is inflexible, it’s relatively simple to determine its exact position — you only need to measure the angle at each joint. But, due to their inherent flexibility, that’s not so easy with soft robots.
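To see why joint angles are all a rigid robot needs, consider forward kinematics for a simple planar arm. The sketch below is illustrative only (the link lengths and angles are made up, not from the paper): given each joint angle, the end-effector position follows directly by summing up the links.

```python
import math

def forward_kinematics(lengths, angles):
    """Planar forward kinematics: each joint angle is measured relative
    to the previous link; returns the (x, y) end-effector position."""
    x = y = theta = 0.0
    for link_length, joint_angle in zip(lengths, angles):
        theta += joint_angle  # accumulate orientation along the chain
        x += link_length * math.cos(theta)
        y += link_length * math.sin(theta)
    return x, y

# A two-link arm with both joints straight points along the x-axis.
print(forward_kinematics([1.0, 1.0], [0.0, 0.0]))  # -> (2.0, 0.0)
```

A soft robot has no discrete joints to measure, so this closed-form calculation simply isn't available — hence the need for a learned model.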
The researchers’ solution was to train a neural network to find correlations between the readings from flex sensors embedded in the soft robotic finger and ground-truth data from a motion capture system. The flex sensors were placed somewhat arbitrarily, so their readings would be extremely difficult to interpret with explicitly programmed logic. But, by using the neural network, the system learns to match those sensor readings to the poses it sees in the motion capture data.
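The idea can be sketched as a small regression problem. Everything below is a hypothetical stand-in — the synthetic "sensor" data, network size, and training setup are assumptions for illustration, not the researchers' actual model: a one-hidden-layer network learns to map a few flex-sensor readings to a 2-D fingertip position, standing in for the motion-capture labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 3 flex-sensor readings per sample, and the 2-D
# fingertip position a motion-capture system would report. The "true"
# sensor-to-position mapping is an arbitrary smooth function here.
X = rng.uniform(-1, 1, size=(500, 3))
Y = np.column_stack([np.sin(X[:, 0]) + 0.5 * X[:, 1],
                     np.cos(X[:, 2]) * X[:, 1]])

# One hidden layer of 32 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 2)); b2 = np.zeros(2)
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    return H, H @ W2 + b2         # predicted (x, y) positions

_, pred = forward(X)
loss_before = np.mean((pred - Y) ** 2)

for _ in range(2000):
    H, pred = forward(X)
    err = 2 * (pred - Y) / len(X)          # d(MSE)/d(pred)
    dW2 = H.T @ err; db2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)         # backprop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, pred = forward(X)
loss_after = np.mean((pred - Y) ** 2)
print(f"MSE before: {loss_before:.3f}  after: {loss_after:.4f}")
```

The point of the sketch is that no one writes down the sensor-to-position formula: the network absorbs whatever arbitrary placement the sensors happen to have, as long as paired motion-capture labels are available during training.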