Most haptic technologies today are limited to simple vibrations—barely scratching the surface of what human skin can perceive. Our skin is equipped with a sophisticated network of sensors that can detect pressure, stretching, vibration, and more. Now, engineers at Northwestern University have taken a major leap forward, developing a breakthrough technology that recreates the complexity of human touch with unprecedented precision.

Published recently in Science, this new device is compact, lightweight, and fully wireless. It adheres directly to the skin and applies force in any direction to mimic a wide range of tactile sensations. From pressure and vibration to twisting and sliding, it delivers realistic touch feedback that’s customizable, dynamic, and nuanced—something existing haptics have never achieved.

Unlike conventional devices that simply “poke” at the skin, this innovative actuator allows for fully controlled, multidirectional motion. It’s powered by a small rechargeable battery and connects via Bluetooth to virtual reality headsets and smartphones. The flexible design makes it easy to place anywhere on the body, integrate into wearable electronics, or use in arrays for more immersive applications.

“Almost all haptic actuators really just poke at the skin,” said John A. Rogers, lead designer of the device and a professor at Northwestern. “But skin is receptive to much more sophisticated senses of touch. We built a tiny actuator that can push, twist, slide, and stretch the skin in any direction. This allows us to finely control the complex sensation of touch in a fully programmable way.”

Rogers, a pioneer in bioelectronics, collaborated with Yonggang Huang, a fellow Northwestern professor, to lead the project. Co-first authors Kyoung-Ho Ha, Jaeyoung Yoo, and Shupeng Li helped develop the device, building on years of research into tactile technologies.

While visual and auditory technologies have advanced rapidly—think ultra-HD displays and spatial audio—haptic technology has struggled to keep pace. The main reason? The complexity of human touch.

Touch involves various mechanoreceptors located at different depths within the skin. These receptors each respond to specific stimuli—pressure, vibration, or stretch—and send rich, detailed information to the brain. Simulating that complexity requires a device that can target each receptor type with the right kind of stimulus, delivered at the right intensity and speed.

“Part of the reason haptic technology lags video and audio in its richness and realism is that the mechanics of skin deformation are complicated,” said J. Edward Colgate, a co-author and haptics expert at Northwestern. “Skin can be poked in or stretched sideways. Stretching can happen quickly or slowly, and across surfaces like the entire palm. Replicating that is tough.”

To meet this challenge, the Northwestern team developed the first actuator with full freedom of motion (FOM). This allows the device to move and apply forces in all directions, engaging all types of skin mechanoreceptors—either individually or in combination.

“It’s a big step toward managing the complexity of the sense of touch,” said Colgate. “This is the first compact device that can poke, pull, or stretch the skin in different ways, at different speeds, and in arrays to create intricate tactile effects.”

Just a few millimeters in size, the actuator contains a small magnet nested within a coil system. When current runs through the coils, the resulting magnetic fields interact with the magnet to produce forces in any chosen direction. These forces can replicate sensations like squeezing, tapping, pinching, or sliding.
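To make the coil-and-magnet idea concrete, here is a minimal sketch of how per-coil currents could map to a directional force and how one might solve for the currents needed to push or slide in a given direction. It assumes a simplified linear model with three orthogonal coils; the force constants are hypothetical and not taken from the paper.

```python
import math

# Hypothetical force constants (newtons per ampere) for three
# orthogonal coils -- illustrative values, not from the paper.
K = (0.8, 0.8, 1.2)  # x, y, z

def force_vector(currents):
    """Map per-coil currents (A) to a 3D force vector (N) in a
    simplified linear model: F_axis = K_axis * I_axis."""
    return tuple(k * i for k, i in zip(K, currents))

def drive_for_direction(direction, magnitude):
    """Solve for the coil currents that produce `magnitude` newtons
    along the given direction vector."""
    norm = math.sqrt(sum(d * d for d in direction))
    unit = tuple(d / norm for d in direction)
    return tuple(magnitude * u / k for u, k in zip(unit, K))

# A lateral "slide" along x and a normal "poke" into the skin along z:
slide = drive_for_direction((1, 0, 0), 0.5)
poke = drive_for_direction((0, 0, 1), 0.5)
```

In the real device the coupling between coils and magnet is far more complex (hence the team's computational modeling to maximize force per mode while minimizing interference), but the basic control problem has this shape: choose currents to realize a desired force direction and magnitude.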

“Achieving both a compact design and strong force output is crucial,” said Huang, who led the theoretical work. “We used detailed computational models to ensure each mode of motion generates maximum force with minimal interference.”

To further enhance realism, the device includes an onboard accelerometer that tracks orientation, motion, and acceleration. If worn on the hand, for example, it can tell whether the palm is facing up or down and adjust feedback accordingly.
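The orientation-aware adjustment described above can be sketched in a few lines: read the gravity component from the accelerometer, classify the pose, and scale the feedback. The axis convention, thresholds, and gain values here are all illustrative assumptions, not the authors' implementation.

```python
def palm_orientation(accel_z, g=9.81, threshold=0.5):
    """Classify hand pose from the z-axis gravity component (m/s^2).

    Assumes (hypothetically) that the device's z-axis points out of
    the back of the hand, so gravity reads near +g when the palm
    faces down and near -g when it faces up.
    """
    if accel_z > threshold * g:
        return "palm_down"
    if accel_z < -threshold * g:
        return "palm_up"
    return "sideways"

def adjusted_gain(base_gain, orientation):
    """Scale haptic output per pose (hypothetical scale factors)."""
    scale = {"palm_down": 1.0, "palm_up": 0.7, "sideways": 0.85}
    return base_gain * scale[orientation]
```

A real controller would also fuse accelerometer data over time to separate gravity from hand motion, but the gist is the same: feedback is conditioned on how the device is oriented and moving.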

This opens the door to incredibly lifelike interactions—like feeling the difference between silk and corduroy while online shopping, or using your sense of touch to explore virtual spaces.

“If you run your finger along a piece of silk, it will have less friction and slide faster than when touching burlap,” said Rogers. “You can imagine shopping for clothes online and wanting to feel that texture. This device makes it possible.”

Beyond simulating physical sensations, the system can also convey information through touch. The team has already demonstrated converting music into tactile signals—allowing users to “feel” different instruments based on vibration direction and rhythm.
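One plausible way to structure such a music-to-touch conversion is to split the audio into frequency bands and assign each band its own vibration direction and amplitude. The band split and direction assignments below are hypothetical, chosen only to illustrate the idea; the paper's actual mapping is not reproduced here.

```python
def music_to_haptics(band_energies):
    """Map energy in (bass, mid, treble) bands to per-direction
    vibration amplitudes in [0, 1].

    Hypothetical assignment: bass drives normal (poking) vibration,
    mids and trebles drive the two lateral (stretching) directions,
    so different instruments land on distinct skin sensations.
    """
    directions = {"bass": "normal", "mid": "lateral_x", "treble": "lateral_y"}
    peak = max(band_energies.values()) or 1.0  # avoid divide-by-zero
    return {
        directions[band]: round(energy / peak, 3)
        for band, energy in band_energies.items()
    }

# A bass-heavy frame: strong normal vibration, weaker lateral ones.
cmd = music_to_haptics({"bass": 0.9, "mid": 0.3, "treble": 0.1})
```

Running this per audio frame would yield a stream of directional vibration commands whose rhythm follows the music, which is the kind of mapping the quote above describes.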

“We broke down the characteristics of music and mapped them to haptic sensations without losing any subtlety,” Rogers explained. “It’s just one example of how touch can complement other senses, like hearing or sight.”

The implications are vast. The technology could enhance virtual reality, improve spatial navigation for visually impaired users, offer touch-based cues in remote healthcare, and even let people with hearing loss feel the rhythm and emotion in music.

By bringing a true sense of touch to the digital world, this breakthrough device represents a giant leap toward more immersive, accessible, and meaningful human-computer interaction.

By Impact Lab