9 Sources
[1]
A mind-reading brain implant that comes with password protection
A brain implant can decode a person's internal chatter -- but the device works only if the user thinks of a preset password. The mind-reading device, or brain-computer interface (BCI), accurately deciphered up to 74% of imagined sentences. The system began decoding users' internal speech -- the silent dialogue in people's minds -- only when they thought of a specific keyword. This ensured that the system did not accidentally translate sentences that users would rather keep to themselves.
The study, published in Cell on 14 August, represents a "technically impressive and meaningful step" towards developing BCI devices that accurately decode internal speech, says Sarah Wandelt, a neural engineer at the Feinstein Institutes for Medical Research in Manhasset, New York, who was not involved in the work. The password mechanism also offers a straightforward way to protect users' privacy, a crucial feature for real-world use, adds Wandelt.
BCI systems translate brain signals into text or audio and have become promising tools for restoring speech in people with paralysis or limited muscle control. Most devices require users to try to speak out loud, which can be exhausting and uncomfortable. Last year, Wandelt and her colleagues developed the first BCI for decoding internal speech, which relied on signals in the supramarginal gyrus, a brain region that plays a major part in speech and language. But there's a risk that these internal-speech BCIs could accidentally decode sentences users never intended to utter, says Erin Kunz, a neural engineer at Stanford University in California. "We wanted to investigate this robustly," says Kunz, who co-authored the new study.
First, Kunz and her colleagues analysed brain signals collected by microelectrodes placed in the motor cortex -- the region involved in voluntary movements -- of four participants. All four have trouble speaking, one because of a stroke and three because of motor neuron disease, a degeneration of the nerves that leads to loss of muscle control. The researchers instructed participants to either attempt to say a set of words or imagine saying them. Recordings of the participants' brain activity showed that attempted and internal speech originated in the same brain region and generated similar neural signals, but those associated with internal speech were weaker.
Next, Kunz and her colleagues used this data to train artificial intelligence models to recognize phonemes, the smallest units of speech, in the neural recordings. The team used language models to stitch these phonemes together to form words and sentences in real time, drawn from a vocabulary of 125,000 words. The device correctly interpreted 74% of sentences imagined by two participants who were instructed to think of specific phrases. This level of accuracy is similar to that of the team's earlier BCI for attempted speech, says Kunz. In some cases, the device also decoded numbers that participants imagined when they silently counted pink rectangles shown on a screen, suggesting that the BCI can detect spontaneous self-talk.
To address the risk of decoding sentences that users don't intend to say out loud, the researchers added a password to their system so that participants could control when it started decoding. When a participant imagined the password 'Chitty-Chitty-Bang-Bang' (the name of an English-language children's novel), the BCI recognized it with an accuracy of more than 98%.
The study is exciting because it unravels the neural differences between internal and attempted speech, says Silvia Marchesotti, a neuroengineer at the University of Geneva, Switzerland. She adds that it will be important to explore speech signals in brain regions other than the motor cortex. The researchers are planning to do this, in addition to improving the system's speed and accuracy. "If we look at other parts of the brain, perhaps we can also address more types of speech impairments," says Kunz.
[2]
Mind-reading AI can turn even imagined speech into spoken words
Scientists have decoded the brain activity of people as they merely imagined speaking, using implanted electrodes and AI. Brain-computer interfaces can already decode what people with paralysis are attempting to say by reading their neural activity. But Benyamin Meschede-Krasa at Stanford University says this in itself can require a fair amount of physical effort, so he and his colleagues sought a less energy-intensive approach. "We wanted to see whether there were similar patterns when someone was simply imagining speaking in their head," he says. "And we found that this could be an alternative, and indeed, a more comfortable way for people with paralysis to use that kind of system to restore their communication."
Meschede-Krasa and his colleagues recruited four people who were severely paralysed as a result of either amyotrophic lateral sclerosis or brainstem stroke. All the participants had previously had microelectrodes implanted into their motor cortex, which is involved in speech, for research purposes. The researchers asked each person to attempt to say a list of words and sentences, and also to just imagine saying them. They found that brain activity was similar for both attempted and inner speech, but activation signals were generally weaker for the latter.
The team trained an AI model to recognise those signals and decode them, using a vocabulary database of up to 125,000 words. To ensure the privacy of people's inner speech, the team programmed the AI to be unlocked only when they thought of the password Chitty Chitty Bang Bang, which it detected with 98 per cent accuracy. Through a series of experiments, the team found that just imagining speaking a word resulted in the model correctly decoding it up to 74 per cent of the time. This demonstrates a solid proof-of-principle for this approach, but it is less robust than interfaces that decode actual attempted speech, says team member Frank Willett, also at Stanford. Ongoing improvements to both the sensors and AI over the next few years could make this more accurate, he says. The participants expressed a significant preference for this system, which was faster and less laborious than those based on attempted speech, says Meschede-Krasa.
The concept takes "an interesting direction" for future brain-computer interfaces, says Mariska Vansteensel at UMC Utrecht in The Netherlands. But it does not yet clearly differentiate between attempted speech, what we want to be speech and the thoughts we want to keep to ourselves, she says. "I'm not sure if everyone was able to distinguish so precisely between these different concepts of imagined and attempted speeches." She also says the password would need to be turned on and off, in line with the user's decision of whether to say what they're thinking mid-conversation. "We really need to make sure that BCI [brain computer interface]-based utterances are the ones people intend to share with the world and not the ones they want to keep to themselves no matter what," she says.
Benjamin Alderson-Day at Durham University in the UK says there's no reason to consider this system a mind reader. "It really only works with very simple examples of language," he says. "I mean if your thoughts are limited to single words like 'tree' or 'bird,' then you might be concerned, but we're still quite a way away from capturing people's free-form thoughts and most intimate ideas."
Willett stresses that all brain-computer interfaces are regulated by federal agencies to ensure adherence to "the highest standards of medical ethics".
[3]
Scientists decode inner speech from brain activity with high accuracy
Cell Press, Aug 14 2025
Scientists have pinpointed brain activity related to inner speech -- the silent monologue in people's heads -- and successfully decoded it on command with up to 74% accuracy. Publishing August 14 in the Cell Press journal Cell, their findings could help people who are unable to audibly speak communicate more easily using brain-computer interface (BCI) technologies that begin translating inner thoughts when a participant says a password inside their head.
"This is the first time we've managed to understand what brain activity looks like when you just think about speaking. For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally," says lead author Erin Kunz of Stanford University.
BCIs have recently emerged as a tool to help people with disabilities. Using sensors implanted in brain regions that control movement, BCI systems can decode movement-related neural signals and translate them into actions, such as moving a prosthetic hand. Research has shown that BCIs can even decode attempted speech among people with paralysis. When users physically attempt to speak out loud by engaging the muscles related to making sounds, BCIs can interpret the resulting brain activity and type out what they are attempting to say, even if the speech itself is unintelligible.
Although BCI-assisted communication is much faster than older technologies, including systems that track users' eye movements to type out words, attempting to speak can still be tiring and slow for people with limited muscle control. The team wondered if BCIs could decode inner speech instead. "If you just have to think about speech instead of actually trying to speak, it's potentially easier and faster for people," says Benyamin Meschede-Krasa, the paper's co-first author, of Stanford University.
The team recorded neural activity from microelectrodes implanted in the motor cortex -- a brain region responsible for speaking -- of four participants with severe paralysis from either amyotrophic lateral sclerosis (ALS) or a brainstem stroke. The researchers asked the participants to either attempt to speak or imagine saying a set of words. They found that attempted speech and inner speech activate overlapping regions in the brain and evoke similar patterns of neural activity, but inner speech tends to show a weaker magnitude of activation overall.
Using the inner speech data, the team trained artificial intelligence models to interpret imagined words. In a proof-of-concept demonstration, the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74%. The BCI was also able to pick up what some inner speech participants were never instructed to say, such as numbers when the participants were asked to tally the pink circles on the screen.
The team also found that while attempted speech and inner speech produce similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from each other. Senior author Frank Willett of Stanford University says researchers can use this distinction to train BCIs to ignore inner speech altogether. For users who may want to use inner speech as a method for faster or easier communication, the team also demonstrated a password-controlled mechanism that would prevent the BCI from decoding inner speech unless temporarily unlocked with a chosen keyword.
In their experiment, users could think of the phrase "chitty chitty bang bang" to begin inner-speech decoding. The system recognized the password with more than 98% accuracy. While current BCI systems are unable to decode free-form inner speech without making substantial errors, the researchers say more advanced devices with more sensors and better algorithms may be able to do so in the future. "The future of BCIs is bright," Willett says. "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech."
Journal reference: Kunz, E. M., et al. (2025). Inner speech in motor cortex and implications for speech neuroprostheses. Cell. doi.org/10.1016/j.cell.2025.06.015.
[4]
"Mind-Reading" Tech Decodes Inner Speech With Up to 74% Accuracy - Neuroscience News
Summary: Scientists have, for the first time, decoded inner speech -- silent thoughts of words -- on command using brain-computer interface technology, achieving up to 74% accuracy. By recording neural activity from participants with severe paralysis, the team found that inner speech and attempted speech share overlapping brain activity patterns, though inner speech signals are weaker. Artificial intelligence models trained on these patterns could interpret imagined words from a vast vocabulary, and a password-based system ensured decoding only when desired. This breakthrough could pave the way for faster, more natural communication for people who cannot speak, with potential for even greater accuracy as technology advances. Scientists have pinpointed brain activity related to inner speech -- the silent monologue in people's heads -- and successfully decoded it on command with up to 74% accuracy. Publishing August 14 in the Cell Press journal Cell, their findings could help people who are unable to audibly speak communicate more easily using brain-computer interface (BCI) technologies that begin translating inner thoughts when a participant says a password inside their head. "This is the first time we've managed to understand what brain activity looks like when you just think about speaking," says lead author Erin Kunz of Stanford University. "For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally." BCIs have recently emerged as a tool to help people with disabilities. Using sensors implanted in brain regions that control movement, BCI systems can decode movement-related neural signals and translate them into actions, such as moving a prosthetic hand. Research has shown that BCIs can even decode attempted speech among people with paralysis. When users physically attempt to speak out loud by engaging the muscles related to making sounds, BCIs can interpret the resulting brain activity and type out what they are attempting to say, even if the speech itself is unintelligible. Although BCI-assisted communication is much faster than older technologies, including systems that track users' eye movements to type out words, attempting to speak can still be tiring and slow for people with limited muscle control. The team wondered if BCIs could decode inner speech instead. "If you just have to think about speech instead of actually trying to speak, it's potentially easier and faster for people," says Benyamin Meschede-Krasa, the paper's co-first author, of Stanford University. The team recorded neural activity from microelectrodes implanted in the motor cortex -- a brain region responsible for speaking -- of four participants with severe paralysis from either amyotrophic lateral sclerosis (ALS) or a brainstem stroke. The researchers asked the participants to either attempt to speak or imagine saying a set of words. They found that attempted speech and inner speech activate overlapping regions in the brain and evoke similar patterns of neural activity, but inner speech tends to show a weaker magnitude of activation overall. Using the inner speech data, the team trained artificial intelligence models to interpret imagined words. In a proof-of-concept demonstration, the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74%. 
The BCI was also able to pick up what some inner speech participants were never instructed to say, such as numbers when the participants were asked to tally the pink circles on the screen. The team also found that while attempted speech and inner speech produce similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from each other. Senior author Frank Willett of Stanford University says researchers can use this distinction to train BCIs to ignore inner speech altogether. For users who may want to use inner speech as a method for faster or easier communication, the team also demonstrated a password-controlled mechanism that would prevent the BCI from decoding inner speech unless temporarily unlocked with a chosen keyword. In their experiment, users could think of the phrase "chitty chitty bang bang" to begin inner-speech decoding. The system recognized the password with more than 98% accuracy. While current BCI systems are unable to decode free-form inner speech without making substantial errors, the researchers say more advanced devices with more sensors and better algorithms may be able to do so in the future. "The future of BCIs is bright," Willett says. "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech."
Funding: This work was supported by the Assistant Secretary of Defense for Health Affairs, the National Institutes of Health, the Simons Collaboration for the Global Brain, the A.P. Giannini Foundation, Department of Veterans Affairs, the Wu Tsai Neurosciences Institute, the Howard Hughes Medical Institute, Larry and Pamela Garlick, the National Institute on Deafness and Other Communication Disorders, the National Institute of Neurological Disorders and Stroke, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the Blavatnik Family Foundation, and the National Science Foundation.
Abstract: "Inner speech in motor cortex and implications for speech neuroprostheses"
Speech brain-computer interfaces (BCIs) show promise in restoring communication to people with paralysis but have also prompted discussions regarding their potential to decode private inner speech. Separately, inner speech may be a way to bypass the current approach of requiring speech BCI users to physically attempt speech, which is fatiguing and can slow communication. Using multi-unit recordings from four participants, we found that inner speech is robustly represented in the motor cortex and that imagined sentences can be decoded in real time. The representation of inner speech was highly correlated with attempted speech, though we also identified a neural "motor-intent" dimension that differentiates the two. We investigated the possibility of decoding private inner speech and found that some aspects of free-form inner speech could be decoded during sequence recall and counting tasks. Finally, we demonstrate high-fidelity strategies that prevent speech BCIs from unintentionally decoding private inner speech.
[5]
For Some Patients, the 'Inner Voice' May Soon Be Audible
Carl Zimmer writes the "Origins" column for The New York Times. He has written about neuroengineering since 1993. For decades, neuroengineers have dreamed of helping people who have been cut off from the world of language. A disease like amyotrophic lateral sclerosis, or A.L.S., weakens the muscles in the airway. A stroke can kill neurons that normally relay commands for speaking. Perhaps, by implanting electrodes, scientists could instead record the brain's electric activity and translate that into spoken words. Now a team of researchers has made an important advance toward that goal. Previously they succeeded in decoding the signals produced when people tried to speak. In the new study, published on Thursday in the journal Cell, their computer often made correct guesses when the subjects simply imagined saying words. Christian Herff, a neuroscientist at Maastricht University in the Netherlands who was not involved in the research, said the result went beyond the merely technological and shed light on the mystery of language. "It's a fantastic advance," Dr. Herff said. The new study is the latest result in a long-running clinical trial, called BrainGate2, that has already seen some remarkable successes. One participant, Casey Harrell, now uses his brain-machine interface to hold conversations with his family and friends. In 2023, after A.L.S. had made his voice unintelligible, Mr. Harrell agreed to have electrodes implanted in his brain. Surgeons placed four arrays of tiny needles on the left side, in a patch of tissue called the motor cortex. The region becomes active when the brain creates commands for muscles to produce speech. A computer recorded the electrical activity from the implants as Mr. Harrell attempted to say different words. Over time, with the help of artificial intelligence, the computer accurately predicted almost 6,000 words, with an accuracy of 97.5 percent. It could then synthesize those words using Mr. Harrell's voice, based on recordings made before he developed A.L.S. But successes like this one raised a troubling question: Could a computer accidentally record more than patients actually wanted to say? Could it eavesdrop on their inner voice? "We wanted to investigate if there was a risk of the system decoding words that weren't meant to be said aloud," said Erin Kunz, a neuroscientist at Stanford University and an author of the new study. She and her colleagues also wondered if patients might actually prefer using inner speech. They noticed that Mr. Harrell and other participants became fatigued when they tried to speak; could simply imagining a sentence be easier for them, and allow the system to work faster? "If we could decode that, then that could bypass the physical effort," Dr. Kunz said. "It would be less tiring, so they could use the system for longer." But it wasn't clear if the researchers could actually decode inner speech. In fact, scientists don't even agree on what "inner speech" is. Our brains produce language, picking out words and organizing them into sentences, using a constellation of regions that, together, are the size of a large strawberry. We can use the signals from the language network to issue commands to our muscles to speak, or use sign language, or type a text message. But many people also have the feeling that they use language to perform the very act of thinking. After all, they can hear their thoughts as an inner voice. Some researchers have indeed argued that language is essential for thought. 
But others, pointing to recent studies, maintain that much of our thinking does not involve language at all, and that people who hear an inner voice are just perceiving a kind of sporadic commentary in their heads. "Many people have no idea what you're talking about when you say you have an inner voice," said Evelina Fedorenko, a cognitive neuroscientist at M.I.T. "They're like, 'You know, maybe you should go see a doctor if you're hearing words in your head.'" (Dr. Fedorenko said she has an inner voice, while her husband does not.)
Dr. Kunz and her colleagues decided to investigate the mystery for themselves. The scientists gave participants seven different words, including "kite" and "day," then compared the brain signals when participants attempted to say the words and when they only imagined saying them. As it turned out, imagining a word produced a pattern of activity similar to that of trying to say it, but the signal was weaker. The computer did a pretty good job of predicting which of the seven words the participants were thinking. For Mr. Harrell, it didn't do much better than random guesses would have, but for another participant it picked the right word more than 70 percent of the time.
The researchers put the computer through more training, this time specifically on inner speech. Its performance improved significantly, including for Mr. Harrell. Now when the participants imagined saying entire sentences, such as "I don't know how long you've been here," the computer could accurately decode most or all of the words. Dr. Herff, who has done his own studies on inner speech, was surprised that the experiment succeeded. Before, he would have said that inner speech is fundamentally different from the motor cortex signals that produce actual speech. "But in this study, they show that, for some people, it really isn't that different," he said.
Dr. Kunz emphasized that the computer's current performance involving inner speech would not be good enough to let people hold conversations. "The results are an initial proof of concept more than anything," she said. But she is optimistic that decoding inner speech could become the new standard for brain-computer interfaces. In more recent trials, the results of which have yet to be published, she and her colleagues have improved the computer's accuracy and speed. "We haven't hit the ceiling yet," she said.
As for mental privacy, Dr. Kunz and her colleagues found some reason for concern: On occasion, the researchers were able to detect words that the participants weren't imagining out loud. In one trial, the participants were shown a screen full of 100 pink and green rectangles and circles. They then had to determine the number of shapes of one particular color -- green circles, for instance. As the participants worked on the problem, the computer sometimes decoded the word for a number. In effect, the participants were silently counting the shapes, and the computer was hearing them. "These experiments are the most exciting to me," Dr. Herff said, because they suggest that language may play a role in many different forms of thought beyond just communicating. "Some people really seem to think this way," he said.
Dr. Kunz and her colleagues explored ways to prevent the computer from eavesdropping on private thoughts. They came up with two possible solutions. One would be to decode only attempted speech, while blocking inner speech. The new study suggests this strategy could work.
Even though the two kinds of thought are similar, they are different enough that a computer can learn to tell them apart. In one trial, the participants mixed attempted and imagined sentences in their minds. The computer was able to ignore the imagined speech.
For people who would prefer to communicate with inner speech, Dr. Kunz and her colleagues came up with a second strategy: an inner password to turn the decoding on and off. The password would have to be a long, unusual phrase, they decided, so they chose "Chitty Chitty Bang Bang," the name of a 1964 novel by Ian Fleming as well as a 1968 movie starring Dick Van Dyke. One of the participants, a 68-year-old woman with A.L.S., imagined saying "Chitty Chitty Bang Bang" along with an assortment of other words. The computer eventually learned to recognize the password with 98.75 percent accuracy -- and decoded her inner speech only after detecting the password.
"This study represents a step in the right direction, ethically speaking," said Cohen Marcus Lionel Brown, a bioethicist at the University of Wollongong in Australia. "If implemented faithfully, it would give patients even greater power to decide what information they share and when."
Dr. Fedorenko, who was not involved in the new study, called it a "methodological tour de force." But she questioned whether an implant could eavesdrop on many of our thoughts. Unlike Dr. Herff, she doesn't see a role for language in much of our thinking. Although the BrainGate2 computer successfully decoded words that patients consciously imagined saying, Dr. Fedorenko noted, it performed much worse when people responded to open-ended commands. For example, in some trials, the participants were asked to think about their favorite hobby when they were children. "What they're recording is mostly garbage," she said. "I think a lot of spontaneous thought is just not well-formed linguistic sentences."
[6]
Brain-computer interface shows promise for decoding inner speech in real time
Scientists have pinpointed brain activity related to inner speech -- the silent monolog in people's heads -- and successfully decoded it on command with up to 74% accuracy. Published in the journal Cell, their findings could help people who are unable to audibly speak communicate more easily using brain-computer interface (BCI) technologies that begin translating inner thoughts when a participant says a password inside their head. "This is the first time we've managed to understand what brain activity looks like when you just think about speaking," says lead author Erin Kunz of Stanford University. "For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally." BCIs have recently emerged as a tool to help people with disabilities. Using sensors implanted in brain regions that control movement, BCI systems can decode movement-related neural signals and translate them into actions, such as moving a prosthetic hand. Research has shown that BCIs can even decode attempted speech among people with paralysis. When users physically attempt to speak out loud by engaging the muscles related to making sounds, BCIs can interpret the resulting brain activity and type out what they are attempting to say, even if the speech itself is unintelligible. Although BCI-assisted communication is much faster than older technologies, including systems that track users' eye movements to type out words, attempting to speak can still be tiring and slow for people with limited muscle control. The team wondered if BCIs could decode inner speech instead. "If you just have to think about speech instead of actually trying to speak, it's potentially easier and faster for people," says Benyamin Meschede-Krasa, the paper's co-first author, of Stanford University. The team recorded neural activity from microelectrodes implanted in the motor cortex -- a brain region responsible for speaking -- of four participants with severe paralysis from either amyotrophic lateral sclerosis (ALS) or a brainstem stroke. The researchers asked the participants to either attempt to speak or imagine saying a set of words. They found that attempted speech and inner speech activate overlapping regions in the brain and evoke similar patterns of neural activity, but inner speech tends to show a weaker magnitude of activation overall. Using the inner speech data, the team trained artificial intelligence models to interpret imagined words. In a proof-of-concept demonstration, the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74%. The BCI was also able to pick up what some inner speech participants were never instructed to say, such as numbers when the participants were asked to tally the pink circles on the screen. The team also found that while attempted speech and inner speech produce similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from each other. Senior author Frank Willett of Stanford University says researchers can use this distinction to train BCIs to ignore inner speech altogether. For users who may want to use inner speech as a method for faster or easier communication, the team also demonstrated a password-controlled mechanism that would prevent the BCI from decoding inner speech unless temporarily unlocked with a chosen keyword. 
In their experiment, users could think of the phrase "chitty chitty bang bang" to begin inner-speech decoding. The system recognized the password with more than 98% accuracy. While current BCI systems are unable to decode free-form inner speech without making substantial errors, the researchers say more advanced devices with more sensors and better algorithms may be able to do so in the future. "The future of BCIs is bright," Willett says. "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech."
[7]
New Brain Interface Interprets Inner Monologues With Startling Accuracy
Scientists decoded the silent inner thoughts of four people with paralysis, a breakthrough that could transform assistive speech. Scientists can now decipher brain activity related to the silent inner monologue in people's heads with up to 74% accuracy, according to a new study. In new research published today in Cell, scientists from Stanford University decoded imagined words from four participants with severe paralysis due to ALS or brainstem stroke. Aside from being absolutely wild, the findings could help people who are unable to speak communicate more easily using brain-computer interfaces (BCIs), the researchers say.
"This is the first time we've managed to understand what brain activity looks like when you just think about speaking," lead author Erin Kunz, a graduate student in electrical engineering at Stanford University, said in a statement. "For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally."
Previously, scientists have managed to decode attempted speech using BCIs. When people physically attempt to speak out loud by engaging the muscles related to speech, these technologies can interpret the resulting brain activity and type out what they're trying to say. But while effective, the current methods of BCI-assisted communication can still be exhausting for people with limited muscle control.
The new study is the first to directly take on inner speech. To do so, the researchers recorded activity in the motor cortex -- the region responsible for controlling voluntary movements, including speech -- using microelectrodes implanted in the four participants. The researchers found that attempted and imagined speech activate similar, though not identical, patterns of brain activity. They trained an AI model to interpret these imagined speech signals, decoding sentences from a vocabulary of up to 125,000 words with as much as 74% accuracy. In some cases, the system even picked up unprompted inner thoughts, like numbers participants silently counted during a task.
For people who want to use the new technology but don't always want their inner thoughts on full blast, the team added a password-controlled mechanism that prevented the BCI from decoding inner speech unless the participants thought of a password ("chitty chitty bang bang" in this case). The system recognized the password with more than 98% accuracy.
While 74% accuracy is high, the current technology still makes a substantial number of errors. But the researchers are hopeful that soon, more sensitive recording devices and better algorithms could boost their performance even more. "The future of BCIs is bright," Frank Willett, assistant professor in the department of neurosurgery at Stanford and the study's senior author, said in a statement. "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech."
[8]
Scientists Say They've Found a Way to Vocalize the "Inner Voices" of People Who Can't Speak
New advances in brain-computer interface (BCI) technology may make speech for those who've lost the ability to do so easier than ever before. In a new, groundbreaking study published in the journal Cell, researchers from Stanford University claimed that they have found a way to decode the "inner speech" of those who can no longer vocalize, making it far less difficult to talk with friends and family than previous BCIs that required them to exert ample effort when trying to speak. Stanford neuroscientist and coauthor Erin Kunz told the New York Times that the idea of translating inner speech stemmed from care for subjects of BCI experiments, many of whom have diseases like amyotrophic lateral sclerosis (ALS) that weaken their airway muscles and eventually make speech all but impossible. Generally speaking, BCIs for people with ALS and other speech-inhibiting disorders require them to attempt to speak and let a computer do the rest of the work, but as Kunz and her colleagues noticed, they often seemed worn down under the strain of such attempts. What if, the scientists wondered, the BCIs could simply translate their thoughts to be said out loud directly? "If we could decode that, then that could bypass the physical effort," Kunz told the NYT. "It would be less tiring, so they could use the system for longer." Casey Harrell, an ALS patient and volunteer in the long-running BCI clinical trial, had already done the hard work of attempting speech while the electrodes in his brain recorded his neurological activity before becoming one of the four subjects in the inner speech portion of the study. Last summer, Harrell's journey back to speech made headlines after his experimental BCI gave him back the ability to talk using only his brainwaves, his attempts at speech, and old recordings from podcast interviews he coincidentally happened to have given before ALS made such a feat impossible. In the newer portion of the study, however, researchers found that their computers weren't great at decoding what words he was thinking. Thusly, they went back to the drawing board and began training their bespoke AI models to successfully link thoughts to words, making the computer able to translate such complex sentences as "I don't know how long you've been here" with far more accuracy. As they began working with something as private as thoughts, however, the researchers discovered something unexpected: sometimes, the computer would pick up words the study subjects were not imagining saying aloud, essentially broadcasting their personal thoughts that were not meant to be shared. "We wanted to investigate if there was a risk of the system decoding words that weren't meant to be said aloud," Kunz told the NYT -- and it seems that they got their answer. To circumvent such an invasion of mental privacy -- one of the more dystopian outcomes people fear from technologies like BCIs -- the Stanford team selected a unique "inner password" that would turn the decoding on and off. This mental safe word would have to be unusual enough that the computer wouldn't erroneously pick up on it, so they went with "Chitty Chitty Bang Bang," the title of a 1964 fantasy novel by Ian Fleming. Incredibly, the password seemed to work, and when participants imagined it before and after whatever phrase they wanted to be played aloud, the computer complied 98.75 percent of the time. 
Though this small trial was meant, as Kunz said, to be "proof-of-concept," it's still a powerful step forward, while simultaneously ensuring the privacy of those who would like only for some of their thoughts to be said out loud.
[9]
How a brain-computer chip can read people's minds
Researchers said the technology could one day help people who cannot speak communicate more easily. An experimental brain implant can read people's minds, translating their inner thoughts into text. In an early test, scientists from Stanford University used a brain-computer interface (BCI) device to decipher sentences that were thought, but not spoken aloud. The implant was correct up to 74 per cent of the time. BCIs work by connecting a person's nervous system to devices that can interpret their brain activity, allowing them to take action - like using a computer or moving a prosthetic hand - with only their thoughts. They have emerged as a possible way for people with disabilities to regain some independence. Perhaps the most famous is Elon Musk's Neuralink implant, an experimental device that is in early trials testing its safety and functionality in people with specific medical conditions that limit their mobility. The latest findings, published in the journal Cell, could one day make it easier for people who cannot speak to communicate more easily, the researchers said. "This is the first time we've managed to understand what brain activity looks like when you just think about speaking," said Erin Kunz, one of the study's authors and a researcher at Stanford University in the United States. Working with four study participants, the research team implanted microelectrodes - which record neural signals - into the motor cortex, which is the part of the brain responsible for speech. The researchers asked participants to either attempt to speak or to imagine saying a set of words. Both actions activated overlapping parts of the brain and elicited similar types of brain activity, though to different degrees. They then trained artificial intelligence (AI) models to interpret words that the participants thought but did not say aloud. In a demonstration, the brain chip could translate the imagined sentences with an accuracy rate of up to 74 per cent. In another test, the researchers set a password to prevent the BCI from decoding people's inner speech unless they first thought of the code. The system recognised the password with around 99 per cent accuracy. The password? "Chitty chitty bang bang". For now, brain chips cannot interpret inner speech without significant guardrails. But the researchers said more advanced models may be able to do so in the future. Frank Willett, one of the study's authors and an assistant professor of neurosurgery at Stanford University, said in a statement that BCIs could also be trained to ignore inner speech. "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech," he said.
Scientists have developed a brain-computer interface that can decode inner speech with up to 74% accuracy, using a password system to protect user privacy. This breakthrough could revolutionize communication for people with severe speech impairments.
Scientists have achieved a significant milestone in brain-computer interface (BCI) technology by successfully decoding inner speech—the silent dialogue in people's minds—with up to 74% accuracy. This groundbreaking research, published in the journal Cell on August 14, 2025, represents a major step towards developing BCIs that can accurately interpret internal speech, potentially revolutionizing communication for individuals with severe speech impairments [1].
The study, led by researchers at Stanford University, utilized microelectrodes implanted in the motor cortex of four participants with severe paralysis due to amyotrophic lateral sclerosis (ALS) or brainstem stroke. The team discovered that attempted speech and inner speech activate overlapping brain regions and generate similar neural signals, although inner speech signals are generally weaker [2].
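To make that "similar pattern, weaker signal" distinction concrete, the toy Python sketch below uses entirely synthetic data and a made-up "motor-intent" axis (it is not the study's actual analysis): two classes of activity that share a direction but differ in strength can be separated by a simple linear projection.

```python
# Toy sketch, not the study's analysis: synthetic "neural" features in which
# attempted and imagined speech share the same pattern but differ in strength,
# separated by projecting onto a crude "motor-intent" axis.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_trials = 64, 200                     # hypothetical sizes

speech_pattern = rng.normal(size=n_channels)       # shared speech-related pattern
attempted = 1.0 * speech_pattern + rng.normal(scale=0.5, size=(n_trials, n_channels))
imagined = 0.4 * speech_pattern + rng.normal(scale=0.5, size=(n_trials, n_channels))

X = np.vstack([attempted, imagined])
y = np.array([1] * n_trials + [0] * n_trials)      # 1 = attempted, 0 = imagined

# Crude "motor-intent" axis: difference of class means; threshold at the midpoint.
axis = attempted.mean(axis=0) - imagined.mean(axis=0)
scores = X @ axis
midpoint = (scores[y == 1].mean() + scores[y == 0].mean()) / 2
accuracy = ((scores > midpoint).astype(int) == y).mean()
print(f"separation accuracy on synthetic data: {accuracy:.2f}")
```

The published analysis is far more sophisticated, but the same intuition (a shared pattern expressed at different magnitudes) is what lets a decoder be trained to ignore inner speech entirely if the user prefers.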
Using this data, the researchers trained artificial intelligence models to recognize phonemes—the smallest units of speech—in the neural recordings. These models were then able to reconstruct words and sentences in real time, drawing from a vocabulary of 125,000 words [3].
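As an illustration of that two-stage idea, the minimal Python sketch below simulates per-frame phoneme probabilities and collapses them into a word with a toy vocabulary lookup. The phoneme inventory, probabilities, and lookup table are hypothetical stand-ins for the study's real-time neural decoder and its 125,000-word language model.

```python
# Minimal, hypothetical sketch of phoneme-then-language-model decoding.
# Nothing here is the study's real model; the frame probabilities are simulated.
import numpy as np

PHONEMES = ["K", "AY", "T", "D", "EY", "SIL"]           # tiny toy inventory
VOCAB = {("K", "AY", "T"): "kite", ("D", "EY"): "day"}   # toy stand-in vocabulary

def greedy_phoneme_decode(frame_probs):
    """Collapse per-frame phoneme probabilities into a phoneme sequence:
    argmax per frame, merge repeats, drop silence (a CTC-style heuristic)."""
    best = [PHONEMES[i] for i in frame_probs.argmax(axis=1)]
    collapsed = [p for i, p in enumerate(best) if i == 0 or p != best[i - 1]]
    return tuple(p for p in collapsed if p != "SIL")

def phonemes_to_words(seq):
    """Look the phoneme sequence up in the toy vocabulary; a real system would
    use a language model to score many candidate segmentations instead."""
    return VOCAB.get(seq, "<unknown>")

# Simulated frame-by-frame probabilities that favor K, AY, T in order.
frames = np.array([
    [0.80, 0.05, 0.05, 0.03, 0.02, 0.05],
    [0.70, 0.10, 0.05, 0.05, 0.05, 0.05],
    [0.05, 0.85, 0.02, 0.03, 0.02, 0.03],
    [0.05, 0.05, 0.80, 0.03, 0.02, 0.05],
    [0.10, 0.05, 0.05, 0.05, 0.05, 0.70],
])
print(phonemes_to_words(greedy_phoneme_decode(frames)))   # -> "kite"
```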
To address concerns about unintended decoding of private thoughts, the team implemented a password-controlled mechanism. Participants could activate the decoding process by thinking of a specific phrase—in this case, "Chitty-Chitty-Bang-Bang." The system recognized this mental password with over 98% accuracy, ensuring that users maintain control over when their inner speech is translated [4].
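The password can be pictured as a simple gate in front of the decoder: nothing is emitted until a dedicated detector reports the unlock phrase with high confidence, and output is suppressed again when the phrase is detected a second time. The Python sketch below illustrates only that gating logic with made-up confidence values; the study's actual detector operates on neural signals, and the class name and threshold here are illustrative.

```python
# Hedged sketch of the gating idea only, with invented inputs: decoded text is
# released solely while the (hypothetical) password detector has toggled the
# gate open. The 0.98 threshold mirrors the >98% detection accuracy reported.
PASSWORD_CONFIDENCE_THRESHOLD = 0.98

class PasswordGatedDecoder:
    def __init__(self, threshold=PASSWORD_CONFIDENCE_THRESHOLD):
        self.threshold = threshold
        self.unlocked = False

    def step(self, password_prob, decoded_text):
        """password_prob: detector's confidence that the unlock phrase was just
        imagined; decoded_text: the decoder's current output chunk."""
        if password_prob >= self.threshold:
            self.unlocked = not self.unlocked   # toggle decoding on/off
            return None                         # never emit the password itself
        return decoded_text if self.unlocked else None

gate = PasswordGatedDecoder()
stream = [(0.01, "random inner chatter"), (0.99, "chitty chitty bang bang"),
          (0.02, "i need help"), (0.99, "chitty chitty bang bang"),
          (0.01, "private thought")]
for prob, text in stream:
    out = gate.step(prob, text)
    if out is not None:
        print(out)   # prints only "i need help"
```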
This research has significant implications for individuals with severe speech and motor impairments. By decoding inner speech, BCIs could potentially provide a faster and less physically demanding method of communication compared to systems that rely on attempted speech or eye movements [5].
Moreover, the study sheds light on the nature of inner speech itself. The findings suggest that inner speech and attempted speech share similar neural patterns, contributing to ongoing debates about the role of language in thought processes.
While the current system demonstrates impressive accuracy in controlled settings, researchers acknowledge that decoding free-form inner speech without substantial errors remains a challenge. Future developments may include exploring speech signals in brain regions beyond the motor cortex and improving the system's speed and accuracy.
As BCI technology continues to advance, it holds the promise of restoring more natural and fluid communication for those who have lost the ability to speak. However, ethical considerations surrounding privacy and the potential for misuse of such technology will likely remain at the forefront of discussions as these systems evolve.