Brain-computer interfaces (BCIs) have long held the promise of hands-free control over digital devices using nothing but thought. But until now, most of these systems have been bulky, fragile, and functionally tethered to lab settings—especially because they struggle to maintain stable contact with the scalp when the user is in motion.
A new breakthrough, however, may soon change that.
Researchers have developed a tiny, wearable neural interface, just 0.04 inches (about one millimeter) across, that fits between a user's hair follicles and keeps working during movement. Thanks to a series of microneedles that painlessly anchor the device to the scalp, the interface maintains a steady readout of brain activity even as users walk, run, or go about their day.
In a study published in Proceedings of the National Academy of Sciences, the team demonstrated the device by enabling users to control augmented reality (AR) video calls—no hands or voice commands required. The interface worked continuously for up to 12 hours, offering stable performance while the wearer moved freely.
Current BCIs are often used in medical or research contexts—helping people with paralysis control wheelchairs or study brain patterns in controlled settings. While invasive implants offer the most precise data by placing electrodes directly inside the brain, they come with serious risks and are unlikely to receive widespread approval for everyday use.
That’s why many researchers are focusing on non-invasive approaches like electroencephalography (EEG), which reads brain signals through electrodes placed on the scalp. But EEG has a drawback: it requires very stable electrode-to-skin contact, which is easily disrupted by movement.
The new interface solves this with an ingenious design. Each sensor is shaped like a tiny cross with five microscale spikes, which are coated in PEDOT, a conductive polymer. These microneedles gently pierce the stratum corneum, the outermost layer of dead skin cells that normally blocks electrical signals, reaching the living epidermis below for clearer and more consistent signal collection.
The sensor is attached to a flexible copper wire system that absorbs any jostling from movement without disturbing the scalp contact. From there, the brain signals are processed and sent wirelessly to external devices.
To showcase the potential of this brain-computer interface, the researchers integrated it with Nreal AR glasses, enabling a user to control video calls simply by looking at flickering icons. The system uses a technique called steady-state visual evoked potentials (SSVEPs), in which the brain's visual cortex produces rhythmic electrical activity at the same frequency as a flickering image the viewer is focusing on.
By placing different flickering icons next to call-related buttons (like answer, decline, or end), the interface could detect what the user was looking at and execute the appropriate command—with a 96.4% accuracy rate in real time, even as the user moved.
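To give a sense of how SSVEP decoding works in general, here is a minimal illustrative sketch, not the team's actual pipeline: it compares the spectral power of an EEG window at each icon's candidate flicker frequency and returns the command tied to the strongest one. The frequencies, command names, and sampling rate below are assumptions chosen for demonstration only.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) mapped to call controls.
# The frequencies actually used in the study are not specified here.
COMMANDS = {9.0: "answer", 11.0: "decline", 13.0: "end"}

def classify_ssvep(eeg_window, fs, freqs=tuple(COMMANDS), harmonics=2):
    """Return the command whose flicker frequency carries the most power.

    eeg_window : 1-D array of EEG samples from a visual-cortex channel.
    fs         : sampling rate in Hz.
    freqs      : candidate flicker frequencies in Hz.
    harmonics  : number of harmonics summed per candidate.
    """
    n = len(eeg_window)
    # Windowed power spectrum of the EEG segment.
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(n))) ** 2
    bins = np.fft.rfftfreq(n, d=1.0 / fs)

    def power_at(f):
        # Sum power in a narrow band around f and its harmonics.
        total = 0.0
        for h in range(1, harmonics + 1):
            total += spectrum[np.abs(bins - h * f) < 0.25].sum()
        return total

    best = max(freqs, key=power_at)
    return COMMANDS[best]

# Example: a 2-second synthetic window dominated by an 11 Hz flicker response.
fs = 250
t = np.arange(0, 2, 1 / fs)
fake_eeg = np.sin(2 * np.pi * 11.0 * t) + 0.5 * np.random.randn(len(t))
print(classify_ssvep(fake_eeg, fs))  # should print "decline"
```

Real systems typically add filtering, multiple channels, and more robust detectors, but the core idea is the same: the strongest frequency in the visual-cortex signal reveals which flickering icon the user is watching.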
Notably, while traditional EEG electrodes lost contact within hours, the microneedle-based interface remained secure and functional for 12 hours straight.
Even better, the sensor was built using manufacturing methods suited for mass production, raising the possibility of commercial applications in the near future. Beyond digital control, the device could also serve as a wearable health monitor, constantly tracking neurological signals for signs of stress, fatigue, or illness.
“This advance provides a pathway for the practical and continuous use of BCI in everyday life, enhancing the integration of digital and physical environments,” the research team noted.
With further development, this small but powerful interface could help usher in an era where our brains communicate seamlessly with our tech—even while we’re on the move.
By Impact Lab