Thousands of individuals born deafblind may soon be able to understand live, real-world conversations for the first time, thanks to pioneering research from Nottingham Trent University (NTU). The university’s Advanced Textiles Research Group (ATRG) is developing a pair of AI-driven smart gloves that translate spoken language into tactile signals, allowing wearers to interpret communication through their fingertips.

The technology uses artificial intelligence to listen to conversations in real time. Rather than relying on the wearer's sight or hearing, the system interprets speech and delivers a summarized version through haptic actuators embedded in the gloves. These actuators, small vibration units positioned on the tops of the fingers, convey coded messages through variations in vibration strength, frequency, and duration. The code effectively mimics the braille alphabet, enabling the wearer to feel words, grammar, and numbers.
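As a concrete illustration, here is a minimal Python sketch of how a braille-style vibration code of this kind could be structured. It is not NTU's actual software: the six-actuator layout, the Pulse structure, and the amplitude and timing values are all assumptions made for the example.

```python
# A minimal sketch (not NTU's firmware) of a braille-style haptic code:
# each character maps to a 6-dot braille cell, and each raised dot is
# rendered as a pulse on one of six assumed finger-mounted actuators.

from dataclasses import dataclass

# Braille cells as 6-bit patterns (dot 1 = least significant bit).
# Only a few letters are shown for illustration.
BRAILLE = {
    "a": 0b000001,  # dot 1
    "b": 0b000011,  # dots 1-2
    "c": 0b001001,  # dots 1-4
    "d": 0b011001,  # dots 1-4-5
    "e": 0b010001,  # dots 1-5
}

@dataclass
class Pulse:
    actuator: int     # which of the six actuators to drive
    amplitude: float  # vibration strength, 0.0-1.0 (illustrative scale)
    duration_ms: int  # pulse length in milliseconds

def encode_char(ch: str, amplitude: float = 0.8, duration_ms: int = 120) -> list[Pulse]:
    """Translate one character into simultaneous pulses on the actuators
    corresponding to the raised dots of its braille cell."""
    cell = BRAILLE.get(ch.lower())
    if cell is None:
        return []  # unmapped character: render nothing in this sketch
    return [Pulse(dot, amplitude, duration_ms)
            for dot in range(6) if cell & (1 << dot)]

def encode_word(word: str) -> list[list[Pulse]]:
    """A word becomes a timed sequence of cells, read character by
    character much like reading braille."""
    return [encode_char(ch) for ch in word]

if __name__ == "__main__":
    for frame in encode_word("bead"):
        print(frame)
```

Varying the amplitude and duration fields per pulse is one plausible way the strength, frequency, and duration variations described above could carry additional meaning, such as punctuation or emphasis.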

The project is designed to support people who cannot hear or lip-read, offering a wearable, intuitive way to participate in everyday social interaction. The gloves are also intended to convey other types of information, such as navigation assistance, phone notifications, emergency alerts, and even interpretations of music or visual art. Because the actuators are embedded in the fabric, the gloves remain discreet and easy to integrate into daily life.

The technology specifically addresses the needs of an estimated 450,000 people in the UK living with severe deafblindness, as well as the roughly 15 million people affected by the condition worldwide. Within the UK, around 24,000 children and young people were born with deafblindness or developed it early in life.

Smart textiles researcher and Ph.D. candidate Malindu Ehelagasthenna initiated the concept, proposing textile-integrated haptics as a way to improve communication access for individuals with combined sensory loss. The gloves may also offer significant benefits to people with partial vision and hearing, helping them navigate their surroundings and access information more easily.

The technology has already progressed to the prototype stage. A working model was presented at the 2024 International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles, held in March. The prototype demonstrated that spoken content can be continuously received, processed, and transmitted as haptic feedback.
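That receive, process, transmit loop can be sketched schematically as well. Everything below is a stand-in: StubMicrophone, StubGloves, transcribe, and summarize are hypothetical placeholders for components the team has not published, and encode_word is reused from the earlier sketch.

```python
# A schematic, assumed version of the continuous pipeline the prototype
# demonstrated; none of these components reflect NTU's implementation.

import time

class StubMicrophone:
    """Stand-in audio source; a real system would stream from a microphone."""
    def read(self) -> bytes:
        return b"\x00" * 1024  # pretend audio chunk

class StubGloves:
    """Stand-in output; real hardware would drive the finger actuators."""
    def play(self, frame) -> None:
        print("haptic frame:", frame)

def transcribe(chunk: bytes) -> str:
    # Placeholder for a real-time speech-to-text model.
    return "bead"

def summarize(text: str) -> str:
    # Placeholder for the AI step that condenses speech for tactile reading.
    return text

def run_once(mic: StubMicrophone, gloves: StubGloves) -> None:
    chunk = mic.read()                      # receive: capture live audio
    message = summarize(transcribe(chunk))  # process: speech to short text
    for frame in encode_word(message):      # transmit: braille-coded pulses
        gloves.play(frame)
        time.sleep(0.12)                    # inter-character gap (illustrative)

run_once(StubMicrophone(), StubGloves())
```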

Beyond aiding communication, the system gives deafblind individuals real-time, independent access to information, reducing reliance on intermediaries. It also represents a major advance in inclusive technology, showing how wearable AI and e-textiles can be combined to bridge accessibility gaps in a way that is both practical and transformative.

By Impact Lab