3D printed skin reacts to texture and shape like our skin
Robots can be programmed to lift a car and even help perform some surgeries, but when it comes to picking up an object they have not touched before, such as an egg, they often fail miserably. Now, engineers have come up with an artificial fingertip that overcomes that limitation. The advance enables machines to sense the texture and shape of surfaces much as a human fingertip does.
The researchers are “bringing the fields of natural and artificial touch closer together … a necessary step to improve robotic touch,” says Mandayam Srinivasan, a touch researcher at University College London who was not involved with the work.
Engineers have long sought to make robots as dexterous as people. One approach involves equipping them with artificial nerves. But, “The current state of robotic touch is generally far inferior to human tactile abilities,” Srinivasan says.
So, when researchers at the University of Bristol began designing an artificial fingertip in 2009, they used human skin as a guide. Their first fingertip—assembled by hand—was about the size of a soda can. By 2018, they had switched to 3D printing. That made it possible to make the tip and all its components about the size of an adult’s big toe and more easily create a series of layers approximating the multilayered structure of human skin. More recently, the scientists have incorporated neural networks into the fingertip, which they call TacTip. The neural networks help a robot quickly process what it’s sensing and react accordingly—seemingly just like a real finger.
In our fingertips, a layer of nerve endings deforms when skin contacts an object and tells the brain what’s happening. These nerves send “fast” signals to help us avoid dropping something and “slow” signals to convey an object’s shape.
TacTip’s equivalent signals come from an array of pinlike projections underneath a rubbery surface layer that move when the surface is touched. The array’s pins are like a hairbrush’s bristles: stiff but bendable. Beneath that array is, among other things, a camera that detects when and how the pins move. The amount of bending of the pins provides the slow signal, and the speed of bending provides the fast signal. The neural network translates those signals into the fingertip’s actions, making it grip more tightly, for example, or adjust the angle of the fingertip.
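The two signal channels described above can be sketched in a few lines of code. This is a simplified illustration, not the published TacTip pipeline: it assumes the camera has already been processed into per-frame pin-tip positions, and all function and variable names here are invented for the example.

```python
import numpy as np

def pin_signals(rest_pins, prev_pins, curr_pins, dt):
    """Toy model of the fingertip's two signal channels.

    rest_pins, prev_pins, curr_pins: (N, 2) arrays of pin-tip positions
    (in camera pixels) at rest and in two consecutive frames; dt is the
    frame interval in seconds.
    """
    # "Slow" signal: how far each pin has bent from its rest position,
    # analogous to nerve endings conveying sustained pressure and shape.
    slow = np.linalg.norm(curr_pins - rest_pins, axis=1)
    # "Fast" signal: how quickly each pin is bending between frames,
    # analogous to the rapid signals that help us catch a slipping object.
    fast = np.linalg.norm(curr_pins - prev_pins, axis=1) / dt
    return slow, fast
```

In the real system a neural network maps signals like these to motor actions; here the sketch stops at the sensing step.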
“A lot of our sense of touch is shaped by the mechanics [of the skin],” says Sliman Bensmaia, a neuroscientist at the University of Chicago who studies the neuronal basis of touch. “What this approach does is really tackle that head on.”
In the new work, University of Bristol engineer Nathan Lepora and colleagues put the artificial tip through its paces, testing it the same way researchers assess a person’s sense of touch. They measured the output from the camera as the fingertip touched corduroylike materials that had gaps and ridges of different heights and densities. Not only could the artificial fingertip detect the gaps and ridges, its output closely matched the neuronal signaling patterns of human fingertips undergoing the same tests, the team reports today in the Journal of the Royal Society Interface.
The artificial fingertip was not quite as sensitive as the real McCoy, however. A human can detect a gap as narrow as a pencil lead, whereas TacTip needed it to be twice as wide to notice it, Lepora notes. But he thinks that resolution will improve once he and his colleagues develop a thinner outer surface.
In a second project, Lepora’s team added more pins and a microphone to TacTip. The microphone mimics another set of nerve endings deep within our skin that sense vibrations felt as we run our fingers across a surface. These nerve endings enhance our ability to feel how rough a surface is.
The microphone did likewise when the researchers tested the enhanced fingertip’s ability to differentiate among 13 fabrics. Again, the signals from the microphone and the camera mimicked those recorded from human fingers doing this test, Lepora notes.
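One way to picture the fabric-discrimination test is as a classification problem over combined camera and microphone features. The study used neural networks; the sketch below substitutes a much simpler nearest-centroid classifier, and the feature layout, fabric names, and function names are all assumptions made for illustration.

```python
import numpy as np

def classify_fabric(features, centroids):
    """Nearest-centroid stand-in for the study's neural-network classifier.

    features: 1D vector concatenating camera-derived features (e.g. mean
    pin deflection) with microphone-derived features (e.g. vibration
    band energies) recorded while stroking a fabric.
    centroids: dict mapping fabric name -> reference feature vector.
    """
    # Pick the fabric whose reference vector is closest in Euclidean distance.
    dists = {name: np.linalg.norm(features - ref)
             for name, ref in centroids.items()}
    return min(dists, key=dists.get)
```

For example, a swatch whose combined feature vector sits nearest the “denim” reference would be labeled denim, regardless of whether the camera or the microphone contributed the distinguishing features.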
The studies impress Levent Beker, a mechanical engineer at Koç University who works on wearable sensors. “A robotic hand can [now] sense pressure and texture information similar to a human finger,” he says.
“It’s a very interesting approach that I don’t think anyone else has taken,” Bensmaia adds. “It’s very cool.” However, the signals from the artificial and natural fingertips are not quite the same, as the signaling in real skin is more intense. “It’s only moderately skinlike.”
Still, Bensmaia thinks this fingertip can help robots detect, pick up, and manipulate objects. And the deformable, rubbery fingertip should give a bionic hand a leg—or hand—up on current devices with stiff metal fingers and toes, he says.
Today’s robots must be precisely programmed to attach a particular car part, and they, as well as hand prostheses, have trouble holding on to hard objects, such as a pen or a toothbrush. Fingertips like TacTip could enable robots and prostheses to handle objects of all shapes and sizes without such programming, Lepora says. But Bensmaia points out that “it’s not clear to what extent it can be miniaturized.”
Lepora is optimistic TacTip will shrink. Cameras and microphones are getting smaller all the time, and improved 3D printing techniques are enabling thinner layers. Both he and Bensmaia think such smaller devices might approximate human “feel” even more because they would be able to detect finer textures and thus be more dexterous.
And on a basic level, this research is helping clarify how touch works in humans, says Robert Shepherd, a materials scientist at Cornell University. Lepora and his colleagues have basically figured out how the skin’s nerve endings translate what they sense, enabling a hand to catch a slipping ball or pick up an origami crane without crushing it, he says. “People like me and others need to be more knowledgeable about this stuff.”