Researchers at UC Berkeley have created a device that pairs wearable sensors with artificial-intelligence software to recognize hand gestures. The system determines which gesture a person intends to make from patterns of electrical signals in the forearm. Researchers say the device paves the way for improved prosthetic control and interaction with electronic devices.
The device could usher in a new era of controlling computers without a keyboard or playing games without a controller, and the system even has the potential to replace steering wheels inside cars. A more likely use, however, is enabling amputees to control prosthetic devices or interact with electronics.
UC Berkeley doctoral student Ali Moin says reading hand gestures is one way to improve human-computer interaction. Cameras and computer vision can also serve that purpose, but Moin says the system his team has developed additionally preserves the user's privacy. The team created a flexible armband able to read electrical signals from 64 points on the forearm.
The signals are fed into a chip programmed with an AI algorithm that associates the signals gathered from the forearm with specific hand gestures. Researchers taught the algorithm to recognize 21 individual hand gestures, including a thumbs-up, a fist, and a flat hand.
Researchers first had to teach the algorithm how electrical signals in the arm correspond to hand gestures, which required users to wear the cuff while making the gestures one by one. The device uses a type of advanced AI called a hyperdimensional computing algorithm, which is able to update itself with new information. That capability allows the AI to incorporate fresh signals into its model, letting it correctly predict hand gestures even if the user's arm gets sweaty or is raised above the head. The device is not yet ready to become a commercial product, but the researchers believe it could be with a few tweaks.
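To make the idea concrete, the pipeline described above can be sketched as a small hyperdimensional computing classifier. This is a minimal illustration under stated assumptions, not the Berkeley team's implementation: the 10,000-dimensional hypervectors, the random-projection encoder, and all names (`encode`, `HDClassifier`, and so on) are hypothetical choices for the sketch. What it does show is the property the article highlights: each gesture is stored as an accumulated prototype vector, and a new sample (say, from a sweaty arm) can be folded into that prototype without retraining from scratch.

```python
import numpy as np

D = 10_000          # hypervector dimensionality (assumed; typical for HDC)
N_CHANNELS = 64     # matches the armband's 64 forearm electrodes
rng = np.random.default_rng(0)

# Fixed random projection that encodes a 64-channel signal snapshot
# into a high-dimensional bipolar (+1/-1) hypervector.
projection = rng.standard_normal((D, N_CHANNELS))

def encode(features):
    """Map a 64-value forearm-signal vector to a {-1, +1} hypervector."""
    return np.sign(projection @ features)

class HDClassifier:
    """Prototype-based hyperdimensional classifier with online updates."""

    def __init__(self):
        self.prototypes = {}  # gesture label -> accumulated hypervector

    def train(self, features, label):
        # Accumulate the encoded sample into the gesture's prototype.
        hv = encode(features)
        self.prototypes[label] = self.prototypes.get(label, np.zeros(D)) + hv

    def predict(self, features):
        # Pick the gesture whose prototype is most similar (cosine) to
        # the encoded input.
        hv = encode(features)
        return max(
            self.prototypes,
            key=lambda g: np.dot(self.prototypes[g], hv)
            / (np.linalg.norm(self.prototypes[g]) + 1e-9),
        )

    def update(self, features, label):
        # Online update: fold a fresh, possibly drifted sample into the
        # existing prototype -- no retraining from scratch required.
        self.train(features, label)
```

In use, the cuff's calibration step corresponds to calling `train` once per gesture while the user performs it, and the self-updating behavior corresponds to calling `update` when signal conditions change.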