ChangLabUCSF

@ChangLabUcsf

Followers 3K · Following 13 · Media 48 · Statuses 154

Chang Lab at UCSF, human brain, speech, brain-computer interfaces, neurosurgery

San Francisco, CA
Joined August 2020
@NeurosurgUCSF
UCSF Neurosurgery
4 months
For more than a century, scientists thought that Broca's area coordinated the muscle movements required to speak. But new @UCSF research from @ChangLabUCSF identifies the key role that the middle precentral gyrus (mPrCG) plays in this process. @jessierliu https://t.co/5NCmqBNT8m
Link preview (ucsf.edu): Researchers discovered that a different part of the brain handles stringing sounds and words together into coherent sentences. The information could help people who have had strokes and lost the...
@ChangLabUcsf
ChangLabUCSF
4 months
This work wouldn’t have been possible without our two amazing participants, Bravo-1 and Bravo-3, their family and caregivers, and the support of our team!
@ChangLabUcsf
ChangLabUCSF
4 months
Rather than encoding specific vocal-tract movements during attempted speech, activity at these shared electrodes could be decoded into words even before speech was attempted, supporting the emerging role of the middle precentral gyrus in speech-motor planning.
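As a rough illustration of what "decoded into words prior to attempted speech" means in practice, here is a toy sketch: train a word classifier on neural feature windows taken before speech onset and check for above-chance accuracy. All data are simulated and the model is an arbitrary stand-in, not the paper's pipeline.

```python
# Toy sketch: can word identity be decoded from planning-period activity?
# Everything here is simulated for illustration; not the paper's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_words, trials_per_word, n_ch = 4, 50, 64

# Give each word a distinct (simulated) pre-speech activity pattern.
templates = rng.normal(0, 1, (n_words, n_ch))
X = np.vstack([templates[w] + rng.normal(0, 1.5, (trials_per_word, n_ch))
               for w in range(n_words)])
y = np.repeat(np.arange(n_words), trials_per_word)

# Above-chance accuracy from pre-onset windows would indicate that these
# electrodes carry word information before any attempted movement.
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"pre-speech decoding accuracy: {acc:.2f} (chance = {1/n_words:.2f})")
```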
@ChangLabUcsf
ChangLabUCSF
4 months
We found that electrodes with activity during all three functions (attempted speech, listening, reading) were typically in a specific part of the SMC called the middle precentral gyrus.
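One simple way to picture this analysis: test each electrode for task-evoked activity against rest, then intersect the significant sets across the three tasks. The sketch below uses simulated data; the test and threshold are illustrative assumptions, not the study's methods.

```python
# Sketch: find electrodes responsive during speech, listening, AND reading
# by testing each task against rest and intersecting the significant sets.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_ch, n_trials = 128, 60
rest = rng.normal(0, 1, (n_trials, n_ch))

def responsive(task_data, baseline, alpha=0.05):
    """Set of electrodes whose task activity exceeds baseline."""
    t, p = ttest_ind(task_data, baseline, axis=0)
    return set(np.where((p < alpha) & (t > 0))[0])

def simulate(active_channels):
    """Simulated task block with elevated activity on chosen channels."""
    data = rng.normal(0, 1, (n_trials, n_ch))
    data[:, list(active_channels)] += 1.0
    return data

# Channels 0-9 respond to everything ("shared"); others to one task only.
speak  = simulate(range(0, 20))
listen = simulate(list(range(0, 10)) + list(range(40, 50)))
read   = simulate(list(range(0, 10)) + list(range(80, 90)))

shared = responsive(speak, rest) & responsive(listen, rest) & responsive(read, rest)
print(sorted(shared))  # in the study, such electrodes clustered in the mPrCG
```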
@ChangLabUcsf
ChangLabUCSF
4 months
Specificity to attempted speech was achieved by incorporating listening and reading data into training of the speech detection model and adding a dedicated speech verification model.
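A minimal sketch of that two-stage idea, assuming simulated feature windows and generic classifiers (logistic regression as a stand-in for the actual models): the detector sees listening and reading windows as explicit negatives, and a separate verifier must also agree before anything is decoded.

```python
# Sketch of the two-stage design: a detector trained with listening and
# reading windows as negatives, plus a dedicated verification model.
# Data, model choice, and thresholds are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_ch = 96
attempt = rng.normal(1.0, 1.0, (300, n_ch))  # attempted-speech windows
listen  = rng.normal(0.3, 1.0, (300, n_ch))
read    = rng.normal(0.3, 1.0, (300, n_ch))
rest    = rng.normal(0.0, 1.0, (300, n_ch))

# Stage 1: detection, with the distractor tasks folded in as negatives.
X = np.vstack([attempt, listen, read, rest])
y = np.r_[np.ones(300), np.zeros(900)]
detector = LogisticRegression(max_iter=1000).fit(X, y)

# Stage 2: a dedicated verifier focused on the hardest confusions,
# attempted speech vs. the distractor tasks.
Xv = np.vstack([attempt, listen, read])
yv = np.r_[np.ones(300), np.zeros(600)]
verifier = LogisticRegression(max_iter=1000).fit(Xv, yv)

def gate(window, t_det=0.5, t_ver=0.9):
    """Pass a window to the decoder only if both stages fire."""
    if detector.predict_proba(window[None])[0, 1] < t_det:
        return False
    return verifier.predict_proba(window[None])[0, 1] >= t_ver

print(gate(attempt[0]), gate(listen[0]), gate(read[0]))
```

The conservative second threshold trades a little sensitivity for specificity, which is the preference the participants expressed.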
@ChangLabUcsf
ChangLabUCSF
4 months
In two people with severe paralysis and anarthria, we designed a speech decoding system that maintained accuracy and complete specificity (zero false positives) to volitional speech attempts over 63.2 minutes of reading, listening, and internal thought.
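Specificity over a long non-speech stretch reduces to counting detector activations that fall outside labeled attempted-speech periods. A tiny scoring sketch, with hypothetical names and data:

```python
# Tiny scoring sketch: count detector activations that fall outside
# labeled attempted-speech periods. Names and data are hypothetical.
def false_positives(activations, speech_intervals):
    """activations: detection times (s); speech_intervals: (start, end) pairs."""
    def in_speech(t):
        return any(s <= t <= e for s, e in speech_intervals)
    return [t for t in activations if not in_speech(t)]

acts = [12.0, 305.5, 2101.0]      # hypothetical detection times (s)
speech = [(300.0, 320.0)]         # labeled attempt intervals
print(false_positives(acts, speech))  # -> [12.0, 2101.0]
# Over 63.2 min of reading/listening/internal thought with no attempts,
# the target is an empty list: zero false positives.
```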
@ChangLabUcsf
ChangLabUCSF
4 months
Decoders that respond only to volitional speech attempts, without false activations, are highly desired by potential users. Our participants indicated they largely preferred a highly specific system, even at the expense of speed.
@ChangLabUcsf
ChangLabUCSF
4 months
Speech decoders are trained to decode intended speech from neural activity, often recorded from the sensorimotor cortex (SMC). However, the SMC (and other areas) is also involved in common tasks such as listening and reading, which could unintentionally activate a decoder.
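The failure mode can be pictured with a toy example: a detector trained only on speech vs. rest will fire on listening windows whenever listening modulates the same channels. Simulated data; logistic regression is an illustrative stand-in.

```python
# Sketch of the failure mode: a detector trained only on speech vs. rest
# can fire on listening windows that drive overlapping channels.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_ch = 96
speak = rng.normal(1.0, 1.0, (300, n_ch))
rest  = rng.normal(0.0, 1.0, (300, n_ch))
# Listening modulates the same channels, just more weakly.
listen = rng.normal(0.6, 1.0, (300, n_ch))

naive = LogisticRegression(max_iter=1000).fit(
    np.vstack([speak, rest]), np.r_[np.ones(300), np.zeros(300)])

fp_rate = naive.predict(listen).mean()
print(f"naive detector fires on {fp_rate:.0%} of listening windows")
```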
@ChangLabUcsf
ChangLabUCSF
4 months
Check out our latest work developing speech decoding systems that maintain their specificity to volitional speech attempts, co-led by @asilvaalex1 and @jessierliu
@ChangLabUcsf
ChangLabUCSF
4 months
Our latest research on the neural basis of speech-motor sequencing is now published in @NatureHumBehav! Check out this high-level explainer video and read the full paper here:
Link preview (nature.com): Nature Human Behaviour - Liu et al. examine the role of sustained neural activity in the planning and production of speech sequences, revealing a key role for the middle precentral gyrus.
@NeurosurgUCSF
UCSF Neurosurgery
4 months
New in @NatureHumBehav, @jessierliu, Lingyun Zhao, PhD & @ChangLabUCSF show that the middle precentral gyrus coordinates the muscle movements required to speak, challenging the longstanding view that Broca's area controls this process: https://t.co/NS0WpDVuuV
@ChangLabUcsf
ChangLabUCSF
8 months
Our latest work on the neural mechanism for stopping speech production is published! See a brief summary below and the original paper linked in the post at the bottom.
@NatureHumBehav
Nature Human Behaviour
9 months
In natural conversations, people can stop speaking at any time. How? Using high-density electrocorticography, Zhao et al. find a distinct neural signal in the human premotor cortex that inhibits speech output to achieve abrupt stopping. @ChangLabUcsf https://t.co/LJnry8VzYp
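As a cartoon of the gating idea (not the study's analysis): decoded output streams until a hypothetical premotor stop signal crosses threshold, at which point production halts abruptly.

```python
# Cartoon of inhibitory gating: stream decoded words until a simulated
# premotor stop signal crosses a threshold. Purely illustrative.
def stream_with_stop(decoded_words, stop_signal, threshold=0.8):
    """Yield words until the stop signal exceeds the threshold."""
    out = []
    for word, s in zip(decoded_words, stop_signal):
        if s >= threshold:   # abrupt stop: inhibit further output
            break
        out.append(word)
    return out

words = ["i", "would", "like", "some", "water"]
stop  = [0.1, 0.2, 0.1, 0.9, 0.1]  # hypothetical stop-signal trace
print(stream_with_stop(words, stop))  # -> ['i', 'would', 'like']
```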
@ChangLabUcsf
ChangLabUCSF
1 year
Our models also showed stable performance without retraining for ~2 months, and these results were achieved ~4 years after ECoG implantation. We hope these findings can be scaled to more patients in the near future!
@ChangLabUcsf
ChangLabUCSF
1 year
We leveraged this finding to demonstrate transfer learning across languages. Data collected in a first language could significantly expedite training a decoder in the second language, saving time and effort for the user.
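The transfer idea can be sketched with a generic online classifier: pretrain on plentiful first-language data, then continue training on a small second-language set. SGDClassifier's partial_fit is used here as a stand-in, and the data are simulated under the shared-representation assumption described in this thread.

```python
# Sketch of cross-language transfer: pretrain on simulated "English"
# trials, then fine-tune on a small "Spanish" set. Illustrative only.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
n_ch, n_syll = 64, 4
templates = rng.normal(0, 1, (n_syll, n_ch))  # shared articulatory codes

def trials(n_per_class, noise=1.5):
    X = np.vstack([templates[s] + rng.normal(0, noise, (n_per_class, n_ch))
                   for s in range(n_syll)])
    return X, np.repeat(np.arange(n_syll), n_per_class)

X_en, y_en = trials(200)        # plentiful first-language data
X_es, y_es = trials(10)         # only a little second-language data
X_test, y_test = trials(100)

cold = SGDClassifier(loss="log_loss", random_state=0).fit(X_es, y_es)
warm = SGDClassifier(loss="log_loss", random_state=0)
warm.partial_fit(X_en, y_en, classes=np.arange(n_syll))  # pretrain
warm.partial_fit(X_es, y_es)                             # fine-tune

print("cold start:", cold.score(X_test, y_test))
print("transfer:  ", warm.score(X_test, y_test))
```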
@ChangLabUcsf
ChangLabUCSF
1 year
Cortical activity instead represented the intended vocal-tract movements of the participant, irrespective of the language. This allowed us to train a model that generalized across a shared set of English and Spanish syllables.
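That generalization claim can be pictured as a train-on-one-language, test-on-the-other experiment over the shared syllable set. A simulated sketch, assuming language-independent movement codes:

```python
# Sketch of the generalization test: train a syllable classifier on
# simulated "English" productions, evaluate on "Spanish" productions of
# the same shared syllables. Data and model are illustrative stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_ch, n_syll = 64, 6
codes = rng.normal(0, 1, (n_syll, n_ch))  # language-independent movement codes

def productions(n, noise=1.2):
    X = np.vstack([codes[s] + rng.normal(0, noise, (n, n_ch))
                   for s in range(n_syll)])
    return X, np.repeat(np.arange(n_syll), n)

X_en, y_en = productions(100)   # English tokens of the shared syllables
X_es, y_es = productions(100)   # Spanish tokens of the same syllables

clf = LogisticRegression(max_iter=1000).fit(X_en, y_en)
print(f"train EN -> test ES accuracy: {clf.score(X_es, y_es):.2f} "
      f"(chance = {1/n_syll:.2f})")
```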
@ChangLabUcsf
ChangLabUCSF
1 year
Despite the participant learning English later in life, we found that cortical activity was largely shared across languages, with no clear differences in magnitude or language-selective neural populations.
@ChangLabUcsf
ChangLabUCSF
1 year
Our system decoded cortical activity, measured with ECoG over the IFG and SMC, into English and Spanish sentences. The intended language was not set by the user; instead, it was freely decoded from cortical activity and language models.
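The "freely decoded" language choice can be cartooned as scoring candidate decodes under per-language language models and keeping the likelier one. The unigram "models" below are made up for illustration; the real system combines neural and language-model evidence.

```python
# Cartoon of free language selection: score candidate decodes under
# per-language language models and keep the higher-probability one.
import math

lm = {  # made-up unigram probabilities, for illustration only
    "english": {"i": .05, "am": .03, "thirsty": .001},
    "spanish": {"yo": .05, "tengo": .02, "sed": .002},
}

def log_prob(words, model, floor=1e-6):
    return sum(math.log(model.get(w, floor)) for w in words)

def pick_language(candidates):
    """candidates: {language: decoded word sequence}."""
    return max(candidates,
               key=lambda lang: log_prob(candidates[lang], lm[lang]))

decoded = {"english": ["i", "am", "thirsty"],
           "spanish": ["yo", "tengo", "sed"]}
print(pick_language(decoded))
```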
@ChangLabUcsf
ChangLabUCSF
1 year
Speech decoding has primarily been shown for monolinguals, but half the world is bilingual, with each language contributing to a person's personality and worldview. There is a need to develop decoders that let bilinguals communicate in both languages.
@ChangLabUcsf
ChangLabUCSF
1 year
Work led by @asilvaalex1 with mentors and co-authors @jessierliu, @SeanMetzger5, @bhaya_ilina, @KayloLittlejohn, @AtDavidMoses, Max Dougherty, Margaret Seaton, and Edward Chang
@ChangLabUcsf
ChangLabUCSF
1 year
Excited to share our work on developing a bilingual speech neuroprosthesis that decodes cortical activity into English and Spanish sentences in a person with paralysis. Out today in @natBME!
@ChangLabUcsf
ChangLabUCSF
2 years
Reminder to those at #Cosyne2024 that @asilvaalex1 will be presenting his work on a bilingual speech neuroprosthesis! Poster Session 2 #131 at 12:30!