NASA has developed a computer program that can read words before they are spoken by analysing nerve signals in the mouth and throat.
Preliminary results show that the button-sized sensors, which attach under the chin and on either side of the Adam’s apple, can pick up nerve signals from the tongue, throat, and vocal cords and recognise words that are spoken silently.
“Biological signals arise when reading or speaking to oneself with or without actual lip or facial movement,” says Chuck Jorgensen, a neuroengineer at NASA’s Ames Research Center in Moffett Field, California, who is in charge of the research.
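The article does not say how the software turns these nerve signals into words. One common approach to this kind of problem is to extract features from a window of signal samples and match them against per-word templates learned from training examples; the Python sketch below illustrates that idea. The feature set, the nearest-template classifier, and the synthetic data are all assumptions made for illustration, not NASA’s actual method.

```python
import numpy as np

# Illustrative sketch only: classify a window of nerve-signal (EMG-like)
# samples by comparing its features to per-word templates learned from
# examples. This does not reflect NASA's actual software.

def features(window: np.ndarray) -> np.ndarray:
    """Reduce a raw signal window to a small feature vector:
    mean absolute amplitude, variance, and zero-crossing rate."""
    zero_crossing_rate = np.mean(np.abs(np.diff(np.sign(window))) > 0)
    return np.array([np.mean(np.abs(window)), np.var(window), zero_crossing_rate])

def train(examples: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average the feature vectors of each word's training windows
    to form one template per word."""
    return {word: np.mean([features(w) for w in windows], axis=0)
            for word, windows in examples.items()}

def classify(window: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the word whose template is nearest (Euclidean distance)
    to the window's features."""
    f = features(window)
    return min(templates, key=lambda word: np.linalg.norm(f - templates[word]))

# Toy usage with synthetic noise standing in for real sensor data.
rng = np.random.default_rng(0)
examples = {
    "stop": [rng.normal(0, 1.0, 256) for _ in range(5)],
    "go":   [rng.normal(0, 3.0, 256) for _ in range(5)],
}
templates = train(examples)
print(classify(rng.normal(0, 3.0, 256), templates))  # likely "go"
```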
The sensors have already been used to perform simple web searches, and they may one day let space-walking astronauts communicate, send commands to rovers on other planets, allow injured astronauts to control machines, and give a voice to people who cannot speak.