AT2k Design BBS Message Area
Casually read the BBS message area using an easy-to-use interface. Messages are categorized exactly as they are on the BBS. You may post new messages or reply to existing ones!


Previous Message | Next Message | Back to Slashdot | Return to Home Page
Local Database: Slashdot [166 / 176] (RSS)
From: VRSS
To: All
Subject: New Brain Device Is First To Read Out Inner Speech
Date/Time: August 15, 2025 10:40 PM

Feed: Slashdot
Feed Link: https://slashdot.org/
---

Title: New Brain Device Is First To Read Out Inner Speech

Link: https://science.slashdot.org/story/25/08/15/1...

An anonymous reader quotes a report from Scientific American: After a brain
stem stroke left him almost entirely paralyzed in the 1990s, French
journalist Jean-Dominique Bauby wrote a book about his experiences -- letter
by letter, blinking his left eye in response to a helper who repeatedly
recited the alphabet. Today people with similar conditions often have far
more communication options. Some devices, for example, track eye movements or
other small muscle twitches to let users select words from a screen. And on
the cutting edge of this field, neuroscientists have more recently developed
brain implants that can turn neural signals directly into whole words.

These brain-computer interfaces (BCIs) largely require users to physically
attempt to speak, however -- and that can be a slow and tiring process. But
now a new development in neural prosthetics changes that, allowing users to
communicate simply by thinking what they want to say. The new system relies
on much of the same technology as the more common "attempted speech" devices.
Both use sensors implanted in a part of the brain called the motor cortex,
which sends motion commands to the vocal tract. The brain activation detected
by these sensors is fed into a machine-learning model that interprets which
brain signals correspond to which sounds for an individual user, then uses
those data to predict which word the user is attempting to say.

But the motor cortex doesn't light up only when we attempt to speak; it is
also involved, to a lesser extent, in imagined speech. The researchers took
advantage of this to develop their "inner speech" decoding device and
published the results on Thursday in Cell. The team studied three people with
amyotrophic lateral sclerosis (ALS) and one with a brain stem stroke, all of
whom had previously had the sensors implanted. Using this new "inner speech"
system, the participants needed only to think a sentence they wanted to say
and it would appear on a screen in real time. While previous inner-speech
decoders were limited to only a handful of words, the new device allowed
participants to draw from a dictionary of 125,000 words. To help keep private
thoughts private, the researchers implemented a code phrase, "chitty chitty
bang bang," that participants could use to prompt the BCI to start or stop
transcribing.
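The pipeline the article describes -- per-user neural features mapped by a trained model to words, with a code phrase toggling transcription on and off -- can be sketched in miniature. Everything below (the feature vectors, the nearest-centroid "model", the class and function names) is a hypothetical toy for illustration, not the published system:

```python
import numpy as np

# Start/stop phrase mentioned in the article; here treated as just
# another decodable vocabulary entry that toggles transcription.
CODE_PHRASE = "chitty chitty bang bang"

class ToyInnerSpeechDecoder:
    """Hypothetical stand-in for the per-user decoding model: maps a
    neural feature vector to the nearest word 'centroid' learned
    during a calibration session."""

    def __init__(self, calibration: dict):
        # calibration: word -> representative feature vector (assumed shape)
        self.words = list(calibration)
        self.centroids = np.stack([calibration[w] for w in self.words])
        self.listening = False  # transcription gated off by default

    def decode(self, features: np.ndarray) -> str:
        # Nearest-centroid classification over the calibrated vocabulary.
        dists = np.linalg.norm(self.centroids - features, axis=1)
        return self.words[int(np.argmin(dists))]

    def step(self, features: np.ndarray):
        """Process one time step; return a word only while transcribing."""
        word = self.decode(features)
        if word == CODE_PHRASE:
            # The code phrase flips the gate instead of being transcribed.
            self.listening = not self.listening
            return None
        return word if self.listening else None
```

In this toy, imagined words are ignored until the code-phrase vector is seen, which mirrors the privacy mechanism described above; the real system's model and feature extraction are, of course, far more sophisticated.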

Read more of this story at Slashdot.

---
VRSS v2.1.180528

VADV-PHP

VADV-PHP Copyright © 2002-2025 Steve Winn, Aspect Technologies. All Rights Reserved.
Virtual Advanced Copyright © 1995-1997 Roland De Graaf.
v2.1.250224