A team of engineers at Northwestern University has developed an innovative wearable device that stimulates the skin to produce a range of complex sensations, offering more immersive and realistic sensory experiences. This breakthrough in bioelectronics has significant implications for gaming, virtual reality (VR), and healthcare. In particular, the device could help individuals with visual impairments “feel” their surroundings or offer enhanced feedback for those with prosthetic limbs.

The study, recently published in Nature, builds on work first introduced in 2019 by Northwestern bioelectronics pioneer John A. Rogers. His previous research led to the development of “epidermal VR,” a skin-interfaced system that communicates touch via miniature vibrating actuators. This new device takes that concept to the next level by allowing multi-directional sensations, such as pressure, vibration, and even twisting motions.

Rogers’ team has developed miniaturized actuators that are far more advanced than the simple buzzers used in their 2019 study. These tiny actuators can deliver controlled forces across a wide range of frequencies, sustaining a constant force without requiring continuous power. An upgraded version of the device allows the actuators to produce gentle twisting motions in addition to vertical forces, increasing the realism of the sensations experienced.

“Our new miniaturized actuators for the skin are far more capable than the simple ‘buzzers’ we used previously,” said Rogers. “They can deliver a range of forces, including twisting motion, which makes the sensory feedback feel much more natural and immersive.”

Rogers co-led the research with his Northwestern colleagues Yonggang Huang, Professor of Mechanical Engineering, and Hanqing Jiang of Westlake University, along with Zhaoqian Xie of Dalian University of Technology in China. Jiang’s team was responsible for creating the small structures that enable the twisting motion.

The device features a hexagonal array of 19 small magnetic actuators encapsulated in a flexible silicone mesh. These actuators can create a variety of sensations, such as pressure, vibration, and twisting, based on data received from a smartphone via Bluetooth. The device translates environmental data into tactile feedback, effectively replacing one sense, such as vision, with another, such as touch.
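
To make that data flow concrete, the sketch below shows one way per-actuator commands for a 19-element array might be represented and assembled into a frame. It is purely illustrative: the `Mode` names, the `ActuatorCommand` structure, and the indexing of the array are assumptions, not the team’s actual firmware or protocol.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical sensation modes mirroring the sensations described
# in the article: pressure, vibration, and twisting.
class Mode(Enum):
    PRESSURE = "pressure"
    VIBRATION = "vibration"
    TWIST = "twist"

@dataclass
class ActuatorCommand:
    index: int        # 0..18, one entry per actuator in the array
    mode: Mode        # which sensation this actuator should produce
    intensity: float  # normalized drive level, 0.0 (off) to 1.0 (max)

def build_frame(intensities, mode=Mode.VIBRATION):
    """Turn 19 normalized intensities into one command frame."""
    assert len(intensities) == 19, "the array has 19 actuators"
    return [ActuatorCommand(i, mode, max(0.0, min(1.0, v)))
            for i, v in enumerate(intensities)]

# Example: drive a single actuator (index 9 is an arbitrary stand-in
# for a central element) at half intensity, leaving the rest off.
frame = build_frame([0.5 if i == 9 else 0.0 for i in range(19)])
print([cmd for cmd in frame if cmd.intensity > 0])
```

In a real system, a frame like this would be serialized and sent over the Bluetooth link; that transport layer is omitted here.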

To maximize energy efficiency, the device uses a “bistable” design. This clever system allows each actuator to rest in either of two stable positions without continuous power input. The actuators press against the skin to store energy, and when they move back up, the stored energy is released. This reduces the need for constant energy consumption, allowing the device to run longer on a single battery charge.

“Instead of fighting against the skin, we use the energy stored in the skin as elastic energy, which we can recover and reapply during operation,” explained Matthew Flavin, the paper’s first author. “It’s like stretching a rubber band—compressing the skin stores energy that we can later use for delivering sensory feedback.”
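
A common way to picture bistability is a double-well potential: two energy minima correspond to the pressed and released positions, so no power is needed to hold either state, only to switch between them. The quartic potential and coefficients below are a textbook illustration, not values from the paper.

```python
# Double-well potential U(x) = a*x**4 - b*x**2 has two stable minima,
# so a bistable actuator can hold either position with zero input power.
a, b = 1.0, 2.0

def potential(x: float) -> float:
    return a * x**4 - b * x**2

# Setting dU/dx = 4a*x**3 - 2b*x = 0 puts the minima at x = ±sqrt(b / (2a));
# the hump at x = 0 is the barrier the actuator is pushed over to switch.
x_min = (b / (2 * a)) ** 0.5
print(f"stable states at x = ±{x_min:.2f}, "
      f"switching barrier = {potential(0) - potential(x_min):.2f}")
```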

Flavin, who was a postdoctoral researcher in Rogers’ lab at the time, is now an assistant professor at the Georgia Institute of Technology.

To test the device’s effectiveness, the researchers blindfolded healthy subjects and had them navigate an obstacle course. As the participants moved through the course, the device provided feedback in the form of intensifying tactile sensations, guiding them to avoid objects or improve balance.

In one experiment, the device communicated the proximity of objects by increasing the intensity of feedback as the subject neared an obstruction. After only a short period of training, participants were able to adjust their behavior in real time, with the device substituting mechanical feedback for visual input.
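
One plausible mapping consistent with that behavior, shown below, ramps intensity up as distance shrinks. The linear ramp and the 2 m sensing range are assumptions for illustration; the article does not specify the function the researchers used.

```python
def feedback_intensity(distance_m: float, max_range_m: float = 2.0) -> float:
    """Map obstacle distance to a normalized haptic intensity.

    Intensity is 0 at or beyond max_range_m and ramps linearly
    to 1.0 at contact; both choices are illustrative assumptions.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - (distance_m / max_range_m)

# The closer the obstacle, the stronger the sensation.
for d in (2.5, 1.5, 0.5, 0.1):
    print(f"{d:>4} m -> intensity {feedback_intensity(d):.2f}")
```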

“This system operates similarly to a white cane, but it integrates more information than a traditional aid,” Flavin said. “By substituting tactile feedback for vision, it helps users navigate their surroundings more effectively.”

Rogers emphasized the potential of this device as a form of “sensory substitution.” For people with vision impairments, the device could provide a primitive, but meaningful, sense of their environment. By using data from smartphone 3D imaging (LiDAR), the system could simulate the experience of vision through haptic patterns delivered directly to the skin.
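
As a rough sketch of that idea, the snippet below collapses a LiDAR-style depth image into 19 intensities, one per actuator, with nearer surfaces producing stronger feedback. The coarse-grid downsampling and the mapping of grid cells onto the hexagonal array are assumptions; the device’s actual spatial encoding is not described here.

```python
import numpy as np

def depth_to_haptics(depth: np.ndarray, max_range: float = 4.0) -> np.ndarray:
    """Collapse a depth image (meters) into 19 actuator intensities.

    Illustrative only: the image is split into a 4x5 grid and the
    first 19 cells stand in for the hexagonal actuator layout.
    """
    h, w = depth.shape
    rows, cols = 4, 5  # 20 cells; the first 19 are used
    cells = [depth[r * h // rows:(r + 1) * h // rows,
                   c * w // cols:(c + 1) * w // cols]
             for r in range(rows) for c in range(cols)]
    mean_depth = np.array([float(cell.mean()) for cell in cells[:19]])
    # Nearer surfaces map to stronger sensations, clipped to [0, 1].
    return np.clip(1.0 - mean_depth / max_range, 0.0, 1.0)

# Example with a synthetic 40x50 depth frame.
rng = np.random.default_rng(0)
frame = rng.uniform(0.5, 4.0, size=(40, 50))
print(depth_to_haptics(frame).round(2))
```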

The wearable device developed by Rogers and his team is a significant step forward in the evolution of bioelectronics. While its primary applications may be in VR, gaming, and healthcare, the possibilities for using this technology to enhance sensory experiences and improve accessibility for people with disabilities are immense. The ability to translate environmental data into tactile feedback opens up new frontiers in wearable devices that could change how we interact with the world around us.

By Impact Lab