⚡️🧠🤖 NEW: A brain-computer interface decodes imagined speech in real time with up to 74% accuracy.
Researchers at Stanford have, for the first time, decoded the brain activity that occurs when people imagine speaking (known as "inner speech"). These silent thoughts were decoded in real time with up to 74% accuracy using a brain-computer interface (BCI).
· Microelectrodes were implanted in the motor cortex of four participants with severe paralysis (caused, among other conditions, by ALS or brainstem stroke).
· Participants were asked either to speak sentences aloud or to imagine them. The brain showed similar activation patterns in both cases, though the activity for inner speech was weaker.
· With the help of an AI model, sentences from a vocabulary of up to 125,000 words were recognized with up to 74% accuracy.
· Interestingly, the system even picked up content that participants never read aloud, such as numbers when they were asked to count pink circles.
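The pipeline sketched in the bullets above (neural activity → phoneme-like units → vocabulary-constrained word choice) can be illustrated with a toy example. Everything here is hypothetical: the tiny two-word vocabulary, the identity-matrix "classifier", and the overlap-based word lookup are stand-ins for the study's actual implanted recordings, neural-network decoder, and 125,000-word language model.

```python
# Hypothetical, heavily simplified sketch of a speech-decoding pipeline:
# neural features -> per-timestep phoneme probabilities -> best vocabulary word.
# Not the study's method; an illustration of the general idea only.
import numpy as np

PHONEMES = ["HH", "EH", "L", "OW", "W", "ER", "D"]
VOCAB = {"hello": ["HH", "EH", "L", "OW"], "world": ["W", "ER", "L", "D"]}

def decode_phonemes(features, weights):
    """Classify each time step of 'neural' features into a phoneme via a
    linear layer + softmax (a toy stand-in for a recurrent decoder)."""
    logits = features @ weights                               # (T, n_phonemes)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    return [PHONEMES[i] for i in probs.argmax(axis=1)]

def nearest_word(phoneme_seq):
    """Pick the vocabulary word whose phoneme spelling overlaps most with
    the decoded sequence (a toy stand-in for language-model search)."""
    def overlap(word_phones):
        return sum(p in phoneme_seq for p in word_phones)
    return max(VOCAB, key=lambda w: overlap(VOCAB[w]))

n = len(PHONEMES)
weights = np.eye(n)                      # toy classifier: features are scores
target = [PHONEMES.index(p) for p in VOCAB["hello"]]
features = np.eye(n)[target] * 5.0       # fake features peaking at "hello"

print(nearest_word(decode_phonemes(features, weights)))  # prints "hello"
```

In the real system the classifier is trained on recorded motor-cortex activity, and the word search is constrained by a large vocabulary and a language model, which is what makes open-vocabulary sentence decoding feasible.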
