Valentin Schmutz Profile
Valentin Schmutz

@schmutz_val

Followers: 218 · Following: 95 · Media: 4 · Statuses: 54

Theoretical Neuroscientist | 🔎 Emergent neural population dynamics | Postdoc in Carandini-Harris lab @UCL | PhD from Gerstner lab @EPFL

London, England
Joined November 2023
@schmutz_val
Valentin Schmutz
1 year
Concentration of measure, a notion from probability that is oddly little-known in neurotheory, can explain how a heterogeneous population of spiking neurons can approximate rate-based dynamics. This is shown in our new preprint from @compneuro_epfl.
5 replies · 32 reposts · 125 likes
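For intuition (my own sketch, not the preprint's construction): concentration of measure says that averages over many weakly dependent, heterogeneous units fluctuate on a scale of about 1/√N, so the population activity of a large heterogeneous spiking network can become effectively deterministic, i.e. rate-like. A minimal illustration with independent Poisson neurons and heterogeneous rates:

```python
import numpy as np

# Illustration only: N independent Poisson neurons with heterogeneous rates.
# The population-averaged spike count concentrates around its mean as N grows,
# with fluctuations shrinking like 1/sqrt(N).
rng = np.random.default_rng(0)
T = 1.0  # observation window (s)

for N in (100, 1_000, 10_000):
    rates = rng.uniform(1.0, 20.0, size=N)   # heterogeneous firing rates (Hz)
    counts = rng.poisson(rates * T)          # spike counts over [0, T]
    pop_rate = counts.mean() / T             # population-averaged rate (Hz)
    sd = np.sqrt(rates.mean() / (N * T))     # predicted fluctuation scale
    print(f"N={N:>6}: population rate = {pop_rate:6.3f} Hz "
          f"(mean {rates.mean():6.3f}, fluctuation ~ {sd:.3f})")
```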
@schmutz_val
Valentin Schmutz
6 months
RT @DynamicsSIAM: Review article: "Nonlinear partial differential equations in neuroscience: from modelling to mathematical theory" (by Jos….
[Link card — arxiv.org: "Many systems of partial differential equations have been proposed as simplified representations of complex collective behaviours in large networks of neurons. In this survey, we briefly discuss..."]
0 replies · 36 reposts · 0 likes
@schmutz_val
Valentin Schmutz
8 months
RT @bsimsek13: 📢 I'm on the faculty job market this year! My research explores the foundations of deep learning and analyzes learning and….
0 replies · 22 reposts · 0 likes
@schmutz_val
Valentin Schmutz
8 months
RT @janeliaconf: A big thanks to the early-career researchers who joined us this week @HHMIJanelia for the Junior Scientist Workshop on The….
0 replies · 8 reposts · 0 likes
@schmutz_val
Valentin Schmutz
9 months
RT @soledad_gcogno: Had a great time chatting with @BjksPodcast about so many things, including my journey from 🇦🇷 to 🇳🇴, my lab and our re….
0 replies · 2 reposts · 0 likes
@schmutz_val
Valentin Schmutz
10 months
Congrats Jacob @jzavatoneveth! Looking forward to reading your future work.
@NIH_CommonFund
NIH Common Fund
10 months
Early Independence Awardee Jacob Zavatone-Veth of @Harvard's Society of Fellows is researching how neural networks model large-scale #NeuralData to advance our understanding of #DeepLearning. Read more:
0 replies · 0 reposts · 3 likes
@schmutz_val
Valentin Schmutz
10 months
RT @zdeborova: This mini-review is based on Hugo Cui's PhD thesis. My advice to him was: "Write something you wo….
0 replies · 66 reposts · 0 likes
@schmutz_val
Valentin Schmutz
10 months
RT @EnnyvBeest: [1/5] Our paper “Tracking neurons across days with high-density probes” is now out in @naturemethods!
0 replies · 69 reposts · 0 likes
@schmutz_val
Valentin Schmutz
10 months
RT @roxana_zeraati: Headed to @BernsteinNeuro Conference this weekend and interested in how biological computation is performed across diff….
0 replies · 14 reposts · 0 likes
@schmutz_val
Valentin Schmutz
10 months
RT @EnnyvBeest: Excited for Nano42 on 'circuit dynamics across brain regions during navigation (across species)' at #SfN2024. Come check ou….
0 replies · 1 repost · 0 likes
@schmutz_val
Valentin Schmutz
11 months
6. The proof combines deterministic and probabilistic mean-field methods from interacting particle systems with results from the theory of dense graph limits (graphons) and L^p spaces. This work was done at @PSUScience with the support of @compneuro_epfl.
0 replies · 0 reposts · 2 likes
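For readers unfamiliar with graphons, the standard background (Lovász, 2012) is that a graphon is a symmetric measurable function serving as the limit object of a dense graph sequence, with convergence measured in the cut norm:

```latex
W : [0,1]^2 \to [0,1], \qquad
\|W\|_{\square} = \sup_{S,\,T \subseteq [0,1]}
  \left| \int_{S \times T} W(x,y)\, \mathrm{d}x\, \mathrm{d}y \right|.
```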
@schmutz_val
Valentin Schmutz
11 months
5. This mean-field PDE is a spatially extended version of the mean-field equation rigorously derived by De Masi, Galves, Löcherbach, and Presutti (2015) and conjectured by Gerstner in the '90s.
1 reply · 0 reposts · 3 likes
@schmutz_val
Valentin Schmutz
11 months
4. We show that the stochastic dynamics of any sequence of networks of leaky integrate-and-fire neurons with "escape noise" converges (up to the extraction of a subsequence) to the deterministic solution to a spatially-extended mean-field PDE:
[image: the limit PDE]
1 reply · 0 reposts · 3 likes
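The PDE itself is in the image above; as a sketch only, mean-field equations for escape-noise LIF networks are typically transport equations for the membrane-potential density. The notation below is assumed for illustration (ρ the density at position x and potential u, f the escape rate, w the connectivity kernel, u_r the reset potential), not necessarily the preprint's exact form:

```latex
\partial_t \rho(t,x,u)
  + \partial_u\!\left[\left(-u + \int w(x,y)\, r(t,y)\, \mathrm{d}y\right) \rho(t,x,u)\right]
  = -f(u)\,\rho(t,x,u) + r(t,x)\,\delta_{u_r}(u),
\qquad
r(t,x) = \int f(u)\,\rho(t,x,u)\, \mathrm{d}u.
```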
@schmutz_val
Valentin Schmutz
11 months
3. To our knowledge, this surprising mathematical fact hasn't been conjectured, or even hinted at, in theoretical neuroscience. This is probably because it stems from relatively recent results in graph theory (see Large Networks and Graph Limits by L. Lovász, 2012).
2 replies · 0 reposts · 4 likes
@schmutz_val
Valentin Schmutz
11 months
2. This means that neural field-like equations are universal limits of large networks with O(1/N) synaptic weight scaling. Networks do not need to have an explicit spatial structure to converge to a neural field-like equation; O(1/N) weight scaling suffices for that.
1 reply · 0 reposts · 2 likes
@schmutz_val
Valentin Schmutz
11 months
1. Synaptic weight scaling in O(1/N) self-induces a form of (implicit) spatial structure in networks of spiking neurons, as the number of neurons N tends to infinity. This is what D.T. Zhou, P.-E. Jabin and I prove in:
[Link card — arxiv.org: "The dynamics of spatially-structured networks of $N$ interacting stochastic neurons can be described by deterministic population equations in the mean-field limit. While this is known, a general..."]
3 replies · 7 reposts · 37 likes
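A numerical illustration of the claim (my own sketch, not code from the paper): sample a dense random network whose mean weights are w(x_i, x_j)/N for a hypothetical kernel w, run a simple rate dynamics, and compare it with the neural-field discretization built from the same kernel.

```python
import numpy as np

# Sketch only: a rate network with dense O(1/N) random weights sampled from a
# kernel w(x, y) tracks, for large N, the neural-field equation discretized
# with the same kernel.
rng = np.random.default_rng(1)
N, dt, steps = 2000, 0.01, 500
phi = np.tanh                                       # transfer function

xs = (np.arange(N) + 0.5) / N                       # latent positions in [0, 1]
w = lambda x, y: 1.0 + np.cos(2 * np.pi * (x - y))  # hypothetical kernel in [0, 2]

# Network: Bernoulli synapses with probability w/2 and weight 2/N, so that
# E[W_ij] = w(x_i, x_j) / N; field: deterministic kernel quadrature.
P = w(xs[:, None], xs[None, :]) / 2.0
W = (rng.random((N, N)) < P) * (2.0 / N)
K = w(xs[:, None], xs[None, :]) / N

u_net = np.zeros(N)
u_field = np.zeros(N)
for _ in range(steps):
    u_net = u_net + dt * (-u_net + W @ phi(u_net) + 1.0)
    u_field = u_field + dt * (-u_field + K @ phi(u_field) + 1.0)

print("max |network - field| =", np.abs(u_net - u_field).max())
```

Here the fluctuations of the random synapses wash out as N grows; the paper's result is stronger, covering arbitrary dense networks via subsequential graphon limits.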
@schmutz_val
Valentin Schmutz
11 months
RT @d_g_clark: 1/ Excited to share new work with @MarschallOwen, @AlexVanMeegen, and Ashok Litwin-Kumar! "Connectivity Structure and Dynami….
0 replies · 29 reposts · 0 likes
@schmutz_val
Valentin Schmutz
11 months
RT @ClaudiaMerger: We computed a fluctuation correction for the spread of disease. The corrections suppress a spurious self-feedback effect….
[Link card — journals.aps.org: "The susceptible-infected-recovered (SIR) model and its variants form the foundation of our understanding of the spread of diseases. Here, each agent can be in one of three states (susceptible,..."]
0 replies · 2 reposts · 0 likes
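For context (the retweeted paper's fluctuation corrections are not reproduced here), the baseline SIR model it improves on is the classic mean-field ODE system, sketched below with hypothetical parameters:

```python
# Baseline SIR model only; the retweeted paper computes corrections to this
# mean-field description, which are not shown here.
beta, gamma = 0.3, 0.1      # infection and recovery rates (hypothetical)
S, I, R = 0.99, 0.01, 0.0   # population fractions, S + I + R = 1
dt = 0.1

for _ in range(1000):
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR

print(f"final: S={S:.3f}, I={I:.3f}, R={R:.3f}")   # sum stays ~1
```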
@schmutz_val
Valentin Schmutz
1 year
RT @FlaviohMar: 📕 Recovering network weights from a set of input-output neural activations 👀. Ever wondered if this is even possible? 🤔 Chec….
0 replies · 12 reposts · 0 likes
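The retweet is truncated, so as a toy version of the question only (my assumed setup, not the paper's method): in the noiseless linear case y = Wx, enough input-output pairs pin down W by least squares.

```python
import numpy as np

# Toy setup (the paper's actual method is not visible in the truncated tweet):
# recover a weight matrix from input-output pairs in the linear case y = W x.
rng = np.random.default_rng(2)
n_out, n_in, n_samples = 5, 10, 200
W_true = rng.normal(size=(n_out, n_in))

X = rng.normal(size=(n_in, n_samples))   # one input per column
Y = W_true @ X                           # corresponding noiseless outputs

# Least squares on the transposed system X^T W^T = Y^T.
W_hat = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T
print("max recovery error:", np.abs(W_hat - W_true).max())
```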
@schmutz_val
Valentin Schmutz
1 year
RT @maxime_beau: It is an honour to have been awarded @ucl’s Jon Driver prize for my PhD work in the @NeuralCompLab. I am grateful to @dimv….
0 replies · 6 reposts · 0 likes
@schmutz_val
Valentin Schmutz
1 year
RT @KiaNobre: Yale Psychology Department is hiring! Open-rank position for a colleague interested in developing and employing novel analyti….
0 replies · 62 reposts · 0 likes