Scientists have made further progress in decoding what a person is saying from their brainwaves as they attempt speech.
Brainwaves are patterns of electrical activity in the brain, and they underpin all aspects of brain function, including the thinking that leads to speech, as well as emotions and behaviour.
Research on artificial intelligence-backed technologies has long been underway to help individuals who have suffered speech-limiting brain conditions speak again.
In a new study published in Nature Neuroscience, scientists trained algorithms to translate their subjects' brain patterns into sentences in real time, with word error rates as low as 3%.
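The 3% figure is a word error rate (WER), the standard metric for scoring transcriptions: the number of word substitutions, insertions, and deletions needed to turn the decoded sentence into the reference sentence, divided by the reference length. A minimal illustrative sketch of the calculation (the function and example sentences below are hypothetical, not from the study):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance (Levenshtein) divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table: d[i][j] is the edit distance between
    # the first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word out of four gives a WER of 0.25 (25%).
print(word_error_rate("the quick brown fox", "the quick brown box"))  # 0.25
```

A WER of 3% therefore means roughly three word-level mistakes per hundred reference words, comparable to professional human transcription.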
Four study participants read sentences aloud while electrodes recorded their brain activity, which was then fed into a computing system to create a data representation.
Joseph Makin, a machine learning specialist at the University of California, San Francisco (UCSF), and his colleagues, who carried out the research, had sought to improve accuracy.
The team successfully decoded the representation word by word to form full sentences, unlike previous efforts, which recovered only “fragments” and had “limited success” in decoding neural activity.
The researchers acknowledged a caveat, however: the speech to be decoded was limited to 30-50 sentences.
But they also said the decoder is not simply classifying sentences based on their structure, as its performance improved when sentences that were not used in the tests were added to the training set.
According to them, this indicates that the machine interface is identifying single words, not just whole sentences, and could potentially decode sentences never encountered in any training set.
“Although we should like the decoder to learn and exploit the regularities of the language, it remains to show how many data would be required to expand from our tiny languages to a more general form of English,” said the researchers.
Copyright 2023 TheCable. All rights reserved. This material, and other digital content on this website, may not be reproduced, published, broadcast, rewritten or redistributed in whole or in part without prior express written permission from TheCable.