Nuttida Rungratsameetaweemana
@nuttidarungrat
Followers: 426 · Following: 255 · Media: 12 · Statuses: 84
Assistant Professor @columbiaBME | Postdoc in Sejnowski Lab @salkinstitute | PhD @UCSDNeuro
Middlebury/UCSD/Salk/Columbia
Joined May 2020
A fantastic course with a great tradition! Honored to direct it together with @roozbeh_kiani. Thanks to Xiao-Jing Wang, Steve Baccus and all the previous directors, and of course, thanks to the @MBLScience team. Apply here:
mbl.edu
MCN introduces students to the computational and mathematical techniques that are used to address how the brain solves problems at levels of neural organization ranging from single membrane channels...
Methods in Computational Neuroscience (MCN) 🧠💻 July 24 - Aug. 20, 2024 Applications due: Feb. 28, 2024
Absolutely honored to be featured in this interview series, especially in conversation with the amazing @meg_kirch 🌟 Also, special thanks to @cprofaci and the team for this opportunity!
Check out our latest profile! Dr. Nuttida Rungratsameetaweemana (@nuttidarungrat) uses experimentation & modeling to uncover the neural computations underlying cognition. Follow the link for the full interview! 🧠 https://t.co/0apzb1cwme
#WomenInSTEM #WomenInNeuroscience
📢Come work with us at @SickKidsNews in Toronto! My lab is hiring a Research Tech. You will get to participate in cutting-edge research on learning and memory. And you'll be using the coolest, most advanced imaging tools in Systems Neuroscience! 👇 https://t.co/Ak6kZ8k5mP
linkedin.com
Happy to share our new work on decision boundary switching 🧠 with @maggiehende @serences! Link: https://t.co/DEljJ2CQcc We would love to hear your feedback on the preprint!
biorxiv.org
Everyday visual search tasks require objects to be categorized according to behavioral goals. For example, when searching for an apple at the supermarket, one might first find the Granny Smith apples...
Excited to share our new preprint on the neural mechanisms of flexible shape categorization! With @serences & @nuttidarungrat Link: https://t.co/XpR1j4BFUS See 🧵👇
SPECIAL SEMINAR! #NashNeuroscience & @PaulKennyPhD host Provost's Fellow @columbiaBME Dr. Nuttida Rungratsameetaweemana (@nuttidarungrat) as she presents "Cortical computations for adaptive learning in biological and artificial systems". Monday, September 18th, 1pm ➡️ NOT TO MISS!
I'm recruiting a Ph.D. student for the upcoming school year (Fall 2024). If you are interested in studying the neural bases of attention and working memory, I would love to hear from you! You can submit an application from now through Dec 1, 2023.
We hope this work offers an interesting perspective on WM computation and on information processing in biological networks more generally. Please feel free to reach out with any questions. We would also love to hear your feedback on the preprint. Code and data release to come soon!
The results showed that noise prolonged training time but did not affect synaptic dynamics, suggesting that the slow decay dynamics induced by noise are specific to WM computation.
Finally, we asked if the modulatory effects of noise during training were specific to WM dynamics. To answer this, we trained our RNNs on two non-WM tasks: a two-alternative forced-choice task and a context-dependent sensory integration task.
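As a toy illustration of one of these control tasks, here is a minimal trial generator for a two-alternative forced-choice (2AFC) task: a noisy evidence stream whose mean sign determines the correct choice. The function name and all task parameters are illustrative assumptions, not the authors' exact task design.

```python
# Toy 2AFC trial generator: a noisy evidence stream whose mean sign
# sets the target choice. Parameters are illustrative assumptions.
import numpy as np

def make_2afc_trial(T=200, coherence=0.2, seed=0):
    rng = np.random.default_rng(seed)
    sign = rng.choice([-1.0, 1.0])                          # correct choice
    evidence = sign * coherence + rng.normal(0.0, 1.0, T)   # noisy input stream
    target = np.full(T, sign)                               # desired network output
    return evidence, target

evidence, target = make_2afc_trial()
```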
We found that these dynamics allowed the networks to maintain higher activity for the preferred stimulus and greater suppression for the non-preferred stimulus, facilitating the maintenance of stimulus-specific information for longer delay intervals.
We next identified the dominant units responsible for WM maintenance in RNNs trained with and without noise, and showed that in the noise-trained networks these units exhibited slower synaptic dynamics and stronger inhibitory signaling.
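One simple way to flag such dominant units, sketched below with placeholder data, is to rank units by their delay-period activity; the actual criterion used in the preprint may differ.

```python
# Sketch: rank units by mean |activity| during an assumed delay window
# and call the top-scoring ones 'dominant'. The activity matrix here is
# a random placeholder for real (time, units) trial data.
import numpy as np

rng = np.random.default_rng(4)
rates = rng.normal(size=(300, 100))        # stand-in (time, units) activity
delay = slice(100, 250)                    # assumed delay-period window
score = np.abs(rates[delay]).mean(axis=0)  # per-unit delay activity
dominant = np.argsort(score)[-10:]         # indices of the top-10 units
print("dominant unit indices:", dominant)
```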
We next focused on understanding the role of slow inhibitory signaling in memory maintenance in the networks trained with noise. Using linear stability analysis, we found that RNNs trained with noise had more slowly relaxing modes, making them more robust.
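In outline, this analysis linearizes the rate dynamics around a fixed point and reads per-mode relaxation timescales off the Jacobian eigenvalues. The sketch below uses a random weight matrix and assumes the origin is a fixed point; a trained network would supply the weights and fixed point instead.

```python
# Linear stability sketch for tau * dx/dt = -x + W @ tanh(x):
# at a fixed point x_fp the Jacobian is J_ij = (-delta_ij + W_ij * sech^2(x_fp_j)) / tau,
# and each stable mode relaxes with time constant -1 / Re(eigenvalue).
import numpy as np

rng = np.random.default_rng(3)
N, tau = 100, 50.0
W = rng.normal(0, 0.9 / np.sqrt(N), (N, N))  # placeholder for trained weights
x_fp = np.zeros(N)                           # assume the origin is a fixed point

J = (-np.eye(N) + W * (1.0 - np.tanh(x_fp) ** 2)) / tau
eigvals = np.linalg.eigvals(J)
relax_times = -1.0 / eigvals.real            # per-mode relaxation times (ms)
stable = eigvals.real < 0
print(f"slowest stable mode relaxes over ~{relax_times[stable].max():.1f} ms")
```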
By analyzing the distribution of the optimized synaptic decay time constants (τ), we found that the incorporation of noise during WM training led to an increase in synaptic decay time constants of the inhibitory units in RNNs.
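The comparison itself is straightforward once per-unit τ values are read out of a trained model; below is a placeholder sketch of that readout, with random values standing in for the optimized time constants and E/I labels.

```python
# Sketch: compare optimized synaptic decay time constants (tau) between
# excitatory and inhibitory units. Values and labels are placeholders
# for parameters extracted from a trained RNN.
import numpy as np

rng = np.random.default_rng(2)
taus = rng.uniform(20.0, 120.0, size=200)   # stand-in optimized taus (ms)
is_inhibitory = rng.random(200) < 0.2       # stand-in E/I labels (~80/20 split)

print(f"mean tau, inhibitory: {taus[is_inhibitory].mean():.1f} ms")
print(f"mean tau, excitatory: {taus[~is_inhibitory].mean():.1f} ms")
```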
Next, we investigated how the noise facilitated stable maintenance of stimulus information by examining the optimized model parameters. We hypothesized that the internal noise enhances working memory dynamics by selectively modulating inhibitory signaling.
To better understand the underlying neural dynamics, we used PHATE and found that noise-trained RNNs exhibited distinct and informative bifurcations in neural trajectories, highlighting the role of internal noise in facilitating sensory encoding and WM computation.
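PHATE is an off-the-shelf dimensionality-reduction method (the `phate` Python package); a minimal embedding call looks like the sketch below, with random data standing in for stacked (trials × time, units) activity.

```python
# Sketch: embed population activity with PHATE to visualize trajectory
# bifurcations across conditions. Requires `pip install phate`; the
# data here are random placeholders for real RNN activity.
import numpy as np
import phate

activity = np.random.default_rng(1).normal(size=(600, 100))
embedding = phate.PHATE(n_components=2).fit_transform(activity)
# embedding: (600, 2) coordinates; coloring points by stimulus condition
# reveals where trajectories diverge (bifurcate).
```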
We found that introducing noise to the RNNs during training enhances training efficiency and robustness. By varying the amount of injected noise, we showed that the 'optimal' level of noise improves contextualized sensory encoding and representation of stimuli.
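A rough feel for such a sweep, with the important caveat that the preprint varies noise during training, while this toy sweep only injects noise at run time in a fixed random network:

```python
# Toy sweep over internal-noise amplitude, scoring how strongly a fixed
# random network retains a brief cue. Illustrative only: training with
# noise (as in the preprint) is omitted here.
import numpy as np

rng = np.random.default_rng(5)
N, T, dt, tau = 100, 300, 5.0, 50.0
W = rng.normal(0, 0.9 / np.sqrt(N), (N, N))
w_in = rng.normal(0, 1.0, N)

def cue_retention(sigma):
    x = np.zeros(N)
    for t in range(T):
        stim = 1.0 if t < 50 else 0.0            # brief cue, then delay
        noise = sigma * rng.normal(size=N)
        x = x + (dt / tau) * (-x + W @ np.tanh(x) + w_in * stim) + np.sqrt(dt / tau) * noise
    return float(np.tanh(x) @ w_in / np.linalg.norm(w_in))  # overlap with cue direction

for sigma in (0.0, 0.05, 0.1, 0.2, 0.5):
    print(f"sigma={sigma:.2f}  retention={cue_retention(sigma):+.3f}")
```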
We first built biologically inspired RNNs and trained them to perform a working memory (WM) task (i.e., maintaining information over a brief period). We next injected 'internal' noise signals into the networks and studied if/how they affect WM computation.
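For concreteness, here is a minimal NumPy sketch of this kind of setup: a rate RNN receives a brief cue, Gaussian 'internal' noise is injected into the currents, and activity must carry the cue through a delay. All sizes, time constants, and the noise placement are illustrative assumptions, not the authors' exact model.

```python
# Minimal noisy rate-RNN trial for a delayed-maintenance WM task.
# Parameters and noise placement are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T, dt, tau = 100, 300, 5.0, 50.0           # units, steps, step (ms), synaptic tau (ms)
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights
w_in = rng.normal(0, 1.0, N)                  # input weights
sigma = 0.1                                   # internal-noise amplitude

x = np.zeros(N)                               # synaptic currents
rates = []
for t in range(T):
    stim = 1.0 if t < 50 else 0.0             # brief cue, then delay
    noise = sigma * rng.normal(size=N)        # injected 'internal' noise
    r = np.tanh(x)                            # firing rates
    x = x + (dt / tau) * (-x + W @ r + w_in * stim) + np.sqrt(dt / tau) * noise
    rates.append(r)
rates = np.asarray(rates)                     # (T, N) activity over the trial
```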
Growing evidence suggests that 1) 'cortical noise' holds information that can impact behavior and that 2) noise enhances robustness and stability in deep learning models. Here, we asked if noise can improve cognitive computation in biologically plausible networks.
Excited to share our new results! "Random Noise promotes slow heterogeneous synaptic dynamics important for working memory computation" with Robert Kim, @ThiparatC, & Terry Sejnowski. Link: https://t.co/a1QBMeuQ5e
@ColumbiaBME @salkinstitute
biorxiv.org
Recurrent neural networks (RNNs) based on model neurons that communicate via continuous signals have been widely used to study how cortical neurons perform cognitive tasks. Training such networks to...
Please RT — We have open PostDoc, PhD and RA positions in Computational Neuroscience at @UCLA! Interested in developing theories and models on neural computations, coding, and dynamics in a highly collaborative environment? Apply to join us! For details: https://t.co/ib3XvbAmwi