Researchers at Stanford have, for the first time, decoded the brain activity that occurs when people imagine speaking (known as "inner speech"). A brain-computer interface (BCI) decoded these silent thoughts in real time with up to 74% accuracy.

· Microelectrodes were implanted in the motor cortex of four participants with severe paralysis (caused by conditions including ALS and brainstem stroke).

· They were asked either to speak sentences aloud or to imagine speaking them. Both tasks produced similar activation patterns in the brain, although inner speech evoked weaker signals.

· An AI model then decoded the imagined sentences, drawn from a vocabulary of up to 125,000 words, with up to 74% accuracy (a simplified sketch of such a pipeline follows this list).

· Interestingly, even content participants were never instructed to verbalize was picked up, such as numbers when they silently counted pink circles.
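For intuition, here is a minimal sketch of what a two-stage decoding pipeline of this kind might look like: a decoder maps binned neural features to phoneme probabilities, and candidate words from a fixed vocabulary are scored against them. Everything below is synthetic and illustrative; the channel count, phoneme inventory, toy vocabulary, and the random linear "decoder" are assumptions, not details from the Stanford study, which used a trained neural network and a large-vocabulary language model.

```python
# Hypothetical sketch of a neural-features -> phonemes -> words decoder.
# All numbers and the linear decoder are placeholders, not study details.
import numpy as np

N_CHANNELS = 256   # microelectrode channels (assumed)
N_PHONEMES = 40    # roughly the English phoneme set plus silence
WINDOW_BINS = 20   # time bins of binned firing rates per window (assumed)

rng = np.random.default_rng(0)

def neural_to_phoneme_probs(features: np.ndarray) -> np.ndarray:
    """Stand-in for the trained decoder: a random linear layer + softmax.
    features: (T, N_CHANNELS) binned firing rates -> (T, N_PHONEMES)."""
    W = rng.normal(size=(N_CHANNELS, N_PHONEMES)) * 0.1
    logits = features @ W
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)

# Tiny stand-in vocabulary; the real system searched up to 125,000 words.
# Phoneme IDs here are purely illustrative.
VOCAB = {
    "pink":   [0, 1, 2, 3],
    "circle": [4, 5, 6, 7, 8],
    "count":  [9, 10, 11, 12],
}

def word_log_likelihood(phoneme_probs: np.ndarray, phoneme_ids: list[int]) -> float:
    """Score a word by aligning its phonemes evenly across the window;
    a crude substitute for the beam search used in real BCI decoders."""
    T = phoneme_probs.shape[0]
    idx = np.linspace(0, T - 1, num=len(phoneme_ids)).astype(int)
    return float(np.sum(np.log(phoneme_probs[idx, phoneme_ids] + 1e-9)))

# Simulate one window of "inner speech" activity and decode it.
features = rng.normal(size=(WINDOW_BINS, N_CHANNELS))
probs = neural_to_phoneme_probs(features)
scores = {w: word_log_likelihood(probs, p) for w, p in VOCAB.items()}
best = max(scores, key=scores.get)
print(f"decoded word: {best!r}")
```

In a real system the linear layer would be a network trained on each participant's recordings, and the word scoring would be a language-model-guided beam search over the full vocabulary rather than an exhaustive scan of three words.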

Discussion

Before covid, I would have said this was a great thing to help the disabled and paralysed; now I realise how badly it will get misused by the authorities, and how people will suffer as a result...

I will just intentionally think of gibberish and nonsense.

I can just put a song on repeat in my mind.

Fuck them. I will just learn to stop thinking in words if that's what it takes.