Researchers in Germany have developed a method that gives robots a built-in sense of touch by combining their existing internal force-torque sensors with machine learning algorithms. The approach, developed by a team at the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, DLR), allows robots to sense and interpret human touch without expensive artificial skins or additional external sensors.

“The intrinsic sense of touch we proposed in this work can serve as the basis for an advanced category of physical human-robot interaction that has not been possible yet, enabling a shift from conventional modalities towards adaptability, flexibility, and intuitive handling,” the researchers stated.

As robotic systems continue to advance, their potential as collaborators in fields such as manufacturing, space exploration, healthcare, and daily life assistance grows. Human-robot interaction (HRI) is a critical area of research, focusing on the integration of human problem-solving and reasoning with robotic precision.

Various modalities of HRI, including vision-based, voice-recognition, and physical-contact approaches, have been explored, but achieving truly intuitive physical interaction has remained challenging. For robots to interact safely and efficiently with humans, a sense of touch is essential, allowing them to identify and respond to physical contacts with precision.

Traditionally, force-torque sensors in robots are used for control, but explicit tactile sensing is needed for detailed contact information. While advances have been made with tactile skins and sensors, these solutions often face challenges related to coverage, wiring, robustness, and real-time capability. Equipping robots with the necessary sensors for physical interaction can become costly and complex, especially when dealing with large or curved surfaces.

The DLR team overcame these challenges by leveraging the existing equipment in the Safe Autonomous Robotic Assistant (SARA) system, a robotic arm with force-torque sensors built into its joints and base. Because these internal sensors register the forces and torques acting on the arm, the robot can detect and respond to physical interactions without any external touch sensors.
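To illustrate the general principle (this is a minimal sketch, not the DLR implementation), contact can be inferred from internal joint sensing by comparing the measured joint torques with the torques a dynamic model predicts for the current motion; whatever remains is attributed to external contact. The threshold and model values below are illustrative assumptions.

```python
import numpy as np

def external_torque_residual(tau_measured, tau_model):
    """Residual torque attributed to external contact (per joint)."""
    return np.asarray(tau_measured) - np.asarray(tau_model)

def contact_detected(residual, threshold=0.5):
    """Flag contact when any joint's residual exceeds a noise threshold (Nm)."""
    return np.any(np.abs(residual) > threshold)

# Example: a 7-joint arm where joint 4 feels an unexpected 1.2 Nm torque.
tau_model = np.zeros(7)  # torques expected from gravity and the commanded motion
tau_measured = np.array([0.1, 0.0, -0.05, 1.2, 0.02, 0.0, 0.03])
residual = external_torque_residual(tau_measured, tau_model)
print(contact_detected(residual))  # True -> someone touched the arm
```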

By utilizing these sensors, the robot can determine where and in what order different forces were applied to its surface. The researchers then combined this capability with deep learning algorithms to interpret the applied touch. In their demonstrations, the robot successfully recognized numbers or letters traced on its surface, using neural networks to predict each character.
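The study reports that neural networks were used to classify the traced characters, but the exact architecture is not described here. The sketch below, which feeds a sequence of estimated 2D contact points into a small recurrent classifier, is therefore purely illustrative; the input format, network size, and class set are assumptions.

```python
import torch
import torch.nn as nn

class TraceClassifier(nn.Module):
    """Illustrative classifier for characters traced on a robot's surface."""

    def __init__(self, num_classes=36, hidden=64):
        super().__init__()
        # Each timestep is an (x, y) contact location estimated on the surface.
        self.encoder = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)  # e.g. digits 0-9 plus letters A-Z

    def forward(self, trace):
        # trace: (batch, timesteps, 2) sequence of contact coordinates
        _, (h_n, _) = self.encoder(trace)
        return self.head(h_n[-1])  # class logits per traced stroke

# Example: classify one traced stroke of 50 contact samples.
model = TraceClassifier()
stroke = torch.randn(1, 50, 2)  # placeholder contact trajectory
predicted_class = model(stroke).argmax(dim=-1)
```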

The team further extended this mechanism by creating virtual “buttons” or sliders on the robot’s surfaces, which could trigger specific commands or movements. This approach not only endows the system with an intuitive and accurate sense of touch but also significantly expands the range of possible physical human-robot interactions.
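One way to picture the virtual-button idea is as a lookup from the estimated contact location to a named command. The regions and command names in this sketch are invented for illustration and are not taken from the DLR system.

```python
# Hypothetical button regions, given as coordinate ranges (in metres) on a link surface.
VIRTUAL_BUTTONS = {
    "start_task": {"x": (0.00, 0.05), "y": (0.00, 0.05)},
    "pause":      {"x": (0.06, 0.11), "y": (0.00, 0.05)},
    "go_home":    {"x": (0.00, 0.05), "y": (0.06, 0.11)},
}

def resolve_button(contact_x, contact_y):
    """Return the command whose surface region contains the contact point."""
    for command, region in VIRTUAL_BUTTONS.items():
        (x_min, x_max), (y_min, y_max) = region["x"], region["y"]
        if x_min <= contact_x <= x_max and y_min <= contact_y <= y_max:
            return command
    return None  # touch landed outside any button

print(resolve_button(0.08, 0.02))  # -> "pause"
```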

By Impact Lab