A team of researchers at the University of Bristol has developed AnyRotate, a highly dexterous four-fingered robotic hand with artificial tactile fingertips. The hand can sense and rotate objects about any axis and in any orientation, even when the hand is upside down, a capability not previously demonstrated.

Achieving this level of dexterity required extracting and using rich touch information for precise motor control. The researchers believe that improving the dexterity of robotic hands could significantly advance automated tasks such as handling supermarket goods or sorting recycling waste. The details of their research are available on GitHub.

In-hand manipulation with multi-fingered hands remains a key challenge in robot manipulation, owing to actuation complexity, the need for precise control, and environmental uncertainty. Recent strides, notably by OpenAI, rely on vision systems that are prone to self-occlusion and require multiple cameras. Proprioception and touch sensing have emerged as pivotal for spatial manipulation, even during motion, demanding an understanding of complex contact physics and gravity-invariant grasping.

According to the researchers, tactile sensing, crucial for detailed robot-object interaction, still faces a gap between simulation and reality that limits the use of high-resolution tactile data. Richer tactile representations promise greater dexterity and expanded capabilities in in-hand tasks. Using advanced tactile sensing, the team developed a robot system that rotates objects in hand about multiple axes, unaffected by gravity, by combining goal-oriented reinforcement learning (RL) with dense tactile feedback.
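To make the idea concrete, a goal-oriented policy of this kind typically consumes a flat observation vector built from proprioception, per-fingertip tactile features, and the goal rotation axis. The sketch below is a minimal illustration under assumed dimensions (16 joints, four fingertips, a 2-D contact pose and scalar force per fingertip); the names and sizes are hypothetical, not taken from the AnyRotate code.

```python
# Hypothetical sketch: assembling a goal-conditioned observation for an
# in-hand rotation policy. Feature names and dimensions are illustrative
# assumptions, not the AnyRotate implementation.

def build_observation(joint_positions, contact_poses, contact_forces, goal_axis):
    """Concatenate proprioception, dense tactile features, and the goal
    rotation axis into one flat observation vector for the RL policy."""
    obs = []
    obs.extend(joint_positions)          # proprioception (e.g. 16 joint angles)
    for pose, force in zip(contact_poses, contact_forces):
        obs.extend(pose)                 # per-fingertip contact pose (2-D here)
        obs.append(force)                # per-fingertip contact force (scalar)
    obs.extend(goal_axis)                # desired rotation axis (3-D unit vector)
    return obs

# Example: 16 joints + 4 fingertips * (2 pose + 1 force) + 3 goal = 31 features.
obs = build_observation([0.0] * 16, [(0.0, 0.0)] * 4, [1.0] * 4, [0.0, 0.0, 1.0])
```

Conditioning on the goal axis is what lets a single policy rotate objects about any axis, rather than training one policy per rotation direction.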

“In Bristol, our artificial tactile fingertip uses a 3D-printed mesh of pin-like papillae on the underside of the skin, based on copying the internal structure of human skin,” said Professor Nathan Lepora in a statement.

The researchers first established a simulation-to-real framework and developed a detailed tactile representation to train a precise policy for multi-axis object manipulation. They trained an observation model to predict contact pose and force from tactile images, which proved crucial for stable manipulation in noisy environments. For real-world deployment, they equipped a four-fingered, fully actuated robot hand with tactile sensors on its fingertips to ensure stable and accurate in-hand object rotation. In physical tests, the team's dense tactile policy handled a range of objects, rotating them in various directions better than simpler tactile baselines.
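The observation-model step, mapping a tactile image to a contact pose and force estimate, can be illustrated with a toy stand-in. The paper uses a learned network for this; the sketch below instead uses a simple intensity centroid as the pose estimate and total intensity as a crude force proxy, purely to show the input-to-output shape of the idea.

```python
# Toy stand-in for a learned tactile observation model: estimate a contact
# pose (centroid) and force proxy (total intensity) from a tactile image.
# The real system trains a network for this; this heuristic is illustrative.

def predict_contact(tactile_image):
    """Given a 2-D grid of tactile pixel intensities (list of lists of
    floats), return ((row, col) contact centre, force proxy)."""
    total = 0.0
    row_sum = 0.0
    col_sum = 0.0
    for r, row in enumerate(tactile_image):
        for c, value in enumerate(row):
            total += value
            row_sum += r * value
            col_sum += c * value
    if total == 0.0:
        return None, 0.0                  # no contact detected
    centre = (row_sum / total, col_sum / total)
    return centre, total                  # total intensity as force proxy
```

Compressing a high-resolution tactile image into a low-dimensional pose-and-force estimate is one way to narrow the simulation-to-reality gap the article mentions, since the compact representation is easier to match across the two domains than raw pixels.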

Surprisingly, even without an explicit slip-detection module, the policy trained on rich tactile input could sense when objects started to move and adjust its grasp accordingly. According to the researchers, this demonstrates how detailed tactile sensing enhances a robot's ability to manipulate objects securely in hand, underscoring its crucial role in robotics.
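What an explicit slip detector would have looked like can clarify what the learned policy picked up implicitly: incipient slip shows up in tactile signals as the contact centre drifting between frames. The heuristic below is a hypothetical example of such a classical detector, not part of the AnyRotate system, which needed no such module.

```python
# Hypothetical classical slip heuristic, shown only for contrast: flag
# incipient slip when the tactile contact centre drifts more than a
# threshold between frames. AnyRotate's learned policy had no such module.

def contact_shift(prev_centre, curr_centre, threshold=0.5):
    """Return True if the contact centre moved more than `threshold`
    pixels between consecutive tactile frames (assumed slip cue)."""
    dr = curr_centre[0] - prev_centre[0]
    dc = curr_centre[1] - prev_centre[1]
    return (dr * dr + dc * dc) ** 0.5 > threshold
```

That the trained policy reacts to such drift without being given this rule suggests the dense tactile representation already encodes the slip cue.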

This study showcased a general policy using advanced tactile sensing to rotate objects in hand across any axis and direction, marking a milestone in robotic dexterity. While dense touch performed well, it struggled with box-shaped or elongated objects due to similar tactile feedback from different grasping points. Enhancing tactile representations with tactile images, contact force fields, or integrating vision could improve robustness.

The Allegro Hand’s actuation limitations under certain orientations highlight the need for more capable and affordable hardware. The goal of effortless object manipulation in any orientation through tactile feedback echoes human dexterity, emphasizing the importance of tactile sensing for future advances in robotics.

By Impact Lab