MIT scientists, led by an Indian-origin student, have developed a computer system that can transcribe words that users say in their heads. The system consists of a wearable device and an associated computing system.
Electrodes in the device pick up neuromuscular signals in the jaw and face that are triggered by internal verbalisations — saying words ‘in your head’ — but are undetectable to the human eye.
The idea that internal verbalisations have physical correlates has been around since the 19th century, and it was seriously investigated in the 1950s.
The researchers' first step was to determine which locations on the face are the sources of the most reliable neuromuscular signals. They conducted experiments in which subjects were asked to subvocalise a series of words four times, with an array of 16 electrodes at different facial locations each time.
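The article does not describe how the researchers scored the electrode locations, but the selection step can be illustrated with a minimal sketch: rank each channel by how strongly its readings vary between words relative to how much they vary between repetitions of the same word, then keep the seven highest-scoring channels. All data, channel indices, and the scoring function below are invented for illustration; only the counts (16 electrodes, four repetitions, seven selected locations) come from the article.

```python
# Hypothetical sketch of electrode selection: rank 16 channels by an
# F-like reliability ratio and keep the top seven. Synthetic data stands
# in for the real recordings.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_words, n_reps = 16, 5, 4  # 5-word vocabulary is invented

# Simulate recordings: a few channels carry a word-dependent signal,
# the rest are mostly noise. The "informative" set is made up.
informative = {1, 4, 7, 9, 12, 13, 15}
signals = np.empty((n_words, n_reps, n_channels))
for w in range(n_words):
    for ch in range(n_channels):
        mean = (w + 1.0) if ch in informative else 0.0
        signals[w, :, ch] = rng.normal(mean, 0.5, size=n_reps)

def channel_reliability(x):
    """Per-channel ratio: variance between words / variance within repetitions."""
    word_means = x.mean(axis=1)                 # (words, channels)
    between = word_means.var(axis=0)            # spread across different words
    within = x.var(axis=1).mean(axis=0) + 1e-9  # noise across repetitions
    return between / within

scores = channel_reliability(signals)
best7 = sorted(np.argsort(scores)[-7:].tolist())
print("top 7 channels:", best7)
```

On this synthetic data the ratio cleanly separates the signal-bearing channels from the noise-only ones, mirroring the finding that seven particular locations consistently distinguished subvocalised words.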
The researchers wrote code to analyse the resulting data and found that signals from seven particular electrode locations were consistently able to distinguish subvocalised words. They then developed a prototype of a wearable silent-speech interface, which wraps around the back of the neck like a telephone headset and has tentacle-like curved appendages that touch the face at seven locations on either side of the mouth and along the jaws. They collected data on a few computational tasks with limited vocabularies of about 20 words each.
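The article does not specify the classifier the system uses, but the transcription step for a small fixed vocabulary can be sketched with a simple stand-in, a nearest-centroid classifier over seven-channel feature vectors. The vocabulary words, signal signatures, and noise levels below are all invented; only the seven-channel input and the small-vocabulary setting come from the article.

```python
# Hypothetical sketch of the classification step: a nearest-centroid
# classifier over 7-channel feature vectors for a small vocabulary.
# All data here is synthetic; the real system's model is not described.
import numpy as np

rng = np.random.default_rng(1)
vocab = ["call", "reply", "add", "multiply", "undo"]  # invented word list
n_channels, n_train = 7, 12

# Each word gets a distinct (made-up) 7-channel signature plus noise.
signatures = rng.normal(0, 1, size=(len(vocab), n_channels))

def make_samples(word_idx, n):
    """Simulate n noisy repetitions of one subvocalised word."""
    return signatures[word_idx] + rng.normal(0, 0.3, size=(n, n_channels))

# "Train": average the repetitions of each word into a centroid.
centroids = np.stack([make_samples(i, n_train).mean(axis=0)
                      for i in range(len(vocab))])

def transcribe(sample):
    """Return the vocabulary word whose centroid is nearest to the sample."""
    dists = np.linalg.norm(centroids - sample, axis=1)
    return vocab[int(np.argmin(dists))]

# Classify a fresh noisy utterance of "multiply".
test_sample = signatures[3] + rng.normal(0, 0.3, size=n_channels)
print(transcribe(test_sample))
```

A design note: averaging repetitions into centroids also shows why performance should improve with more training data, as the article says, since each centroid becomes a less noisy estimate of the word's true signal.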
In that study, the system had an average transcription accuracy of about 92 per cent. However, the system's performance should improve with more training data, which could be collected during its ordinary use. In ongoing work, the researchers are collecting a wealth of data on more elaborate conversations, in the hope of building applications with much more expansive vocabularies.