Matthijs Pals

@matthijs_pals

Followers: 321
Following: 355
Media: 8
Statuses: 108

Using deep learning to elucidate neural representations and dynamics @MackeLab

Tübingen, Germany
Joined November 2021
@matthijs_pals
Matthijs Pals
2 years
Now out in @PLOSCompBiol 🎉🥳 https://t.co/eicLeuTMp3 Summary in gifs 👇
journals.plos.org
Author summary Many of our actions are rhythmic—walking, breathing, digesting and more. It is not surprising that neural activity can have a strong oscillatory component. Indeed, such brain waves are...
@mackelab
Machine Learning in Science
3 years
Neural oscillations are abundant! But what is their role? Oscillations could serve as a clock signal for phase-coding information. In https://t.co/qxWgSPfPtJ we studied the dynamical systems underlying this kind of coding. With @matthijs_pals, @jakhmack and Omri Barak [1/5]
2
13
46
@adriamilcar
Adrián F. Amil
11 months
🎉Finally published @PLOSCompBiol ! Why do neurons use low-frequency oscillations for encoding? Why not use higher frequencies for better sampling resolution? We identify a speed-precision trade-off driven by noise, showing that theta (3–8 Hz) maximizes bits/s! Check it out 👇
8
104
566
@matthijs_pals
Matthijs Pals
11 months
We reduce the cost of finding all fixed points in piecewise-linear low-rank RNNs from O(2^N) to O(N^R)! This is part of our recent NeurIPS paper, presented tomorrow: https://t.co/WyEjJDOYdl Special thanks to @gloecklermanuel for helping write out a proof! 7/7
@mackelab
Machine Learning in Science
11 months
We show how to generate long sequences of realistic neural data with stochastic low-rank RNNs - and how to find their fixed points. Poster #3908 (East; Wed 11 Dec 11:00 PT). By @matthijs_pals @aesagtekin, @fel_p8, @gloecklermanuel and @jakhmack ➡️ https://t.co/0ABsHAy9R5   2/4
0
0
6
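A compact end-to-end sketch of the recipe the thread below walks through (an illustration with an assumed model form, not the paper's code): for a rank-R piecewise-linear RNN with dynamics dx/dt = -x + U V^T ReLU(x - b), fixed points lie in span(U), so we enumerate only the sign patterns realized on that R-dim subspace (here crudely, by sampling; the paper enumerates the hyperplane arrangement exactly) and then solve a small R x R linear system per candidate region, keeping self-consistent solutions.

```python
import numpy as np

rng = np.random.default_rng(4)
N, R = 60, 2
U = rng.normal(size=(N, R)) / np.sqrt(N)   # assumed low-rank factors: W = U V^T
V = rng.normal(size=(N, R)) / np.sqrt(N)
b = rng.normal(size=N) * 0.1               # unit thresholds

# (i) candidate sign patterns realized on span(U): sample points z in the R-dim
#     subspace (x = U z) and record which side of each hyperplane u_i . z = b_i
#     they fall on. Sampling may miss very small regions; it is only illustrative.
Z = rng.normal(size=(20000, R)) * 5.0
patterns = {tuple(row) for row in (Z @ U.T > b).astype(int)}
print(f"{len(patterns)} candidate regions instead of 2^{N}")

# (ii) per-region fixed-point solve, reduced to R x R: with x = U z,
#      0 = -x + U V^T D_s (x - b) gives (I_R - V^T D_s U) z = -V^T D_s b.
fixed_points = []
for s in patterns:
    D = np.diag(np.array(s, dtype=float))
    A = np.eye(R) - V.T @ D @ U
    rhs = -V.T @ D @ b
    try:
        z = np.linalg.solve(A, rhs)
    except np.linalg.LinAlgError:
        continue                                     # singular system: skip
    x = U @ z
    if np.all((x > b) == np.array(s, dtype=bool)):   # solution must lie in its own region
        fixed_points.append(x)

print(f"{len(fixed_points)} fixed point(s) found")
```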
@matthijs_pals
Matthijs Pals
11 months
But not all of the 2^N regions do! Each neuron partitions the R-dim subspace of dynamics with a hyperplane (here a line). N hyperplanes can partition R-dim space into at most O(N^R) regions. We can thus reduce our search space by only solving for fixed points in these! 6/7
1
0
4
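The counting fact behind this step: N hyperplanes partition R-dimensional space into at most sum_{k=0}^{R} C(N, k) regions (attained when the hyperplanes are in general position), which is O(N^R) and far smaller than 2^N when R << N. A small check:

```python
from math import comb

def max_regions(n_hyperplanes, dim):
    """Maximum number of regions N hyperplanes can create in dim-dimensional space."""
    return sum(comb(n_hyperplanes, k) for k in range(min(dim, n_hyperplanes) + 1))

N, R = 100, 2
print(max_regions(N, R))   # 1 + 100 + 4950 = 5051
print(2**N)                # about 1.3e30
```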
@matthijs_pals
Matthijs Pals
11 months
Some of the 2^N regions with linear dynamics intersect the subspace in which the dynamics unfold (span U), such as the one here: 5/7
1
0
4
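One way to make "intersects the subspace" concrete (a sketch with assumed names and model form, not the paper's code): writing points in the subspace as x = U z, the region with sign pattern s intersects span(U) iff the inequalities u_i^T z > b_i (active units) and u_i^T z <= b_i (inactive units) are jointly feasible, which a small LP can decide.

```python
import numpy as np
from scipy.optimize import linprog

def region_intersects_subspace(s, U, b, eps=1e-9):
    """True if {z : u_i.z > b_i where s_i=1, u_i.z <= b_i where s_i=0} is non-empty."""
    s = np.asarray(s, dtype=bool)
    A_ub = np.where(s[:, None], -U, U)     # flip rows for units required to be active
    b_ub = np.where(s, -b - eps, b)        # strict '>' handled with a small margin
    res = linprog(c=np.zeros(U.shape[1]), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * U.shape[1], method="highs")
    return res.status == 0                 # status 0 = a feasible point was found

rng = np.random.default_rng(3)
N, R = 10, 2
U = rng.normal(size=(N, R))
b = rng.normal(size=N)
s = (rng.random(N) > 0.5).astype(int)
print(region_intersects_subspace(s, U, b))
```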
@matthijs_pals
Matthijs Pals
11 months
In low-rank RNNs, dynamics are constrained to the linear subspace spanned by the left singular vectors U of the recurrent weight matrix (e.g., work from Ostojic's lab) 4/7
1
0
6
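A quick numerical check of this claim under assumed low-rank dynamics dx/dt = -x + U V^T ReLU(x - b): because (I - P) U = 0 for the projector P onto col(U), the component of x orthogonal to col(U) obeys d/dt x_perp = -x_perp and decays away, so fixed points (and the long-run autonomous dynamics) live in the R-dimensional subspace span(U).

```python
import numpy as np

rng = np.random.default_rng(2)
N, R = 50, 2
U = rng.normal(size=(N, R)) / np.sqrt(N)
V = rng.normal(size=(N, R)) / np.sqrt(N)
b = rng.normal(size=N)

P = U @ np.linalg.pinv(U)        # orthogonal projector onto col(U)

x = rng.normal(size=N)           # start with a component outside the subspace
dt = 0.05
for t in range(400):             # simple Euler integration of the assumed dynamics
    x = x + dt * (-x + U @ (V.T @ np.maximum(x - b, 0.0)))
    if t % 100 == 0:
        print(f"step {t:4d}  ||x_perp|| = {np.linalg.norm(x - P @ x):.2e}")
```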
@matthijs_pals
Matthijs Pals
11 months
This requires solving 2^N systems of equations. Can we be more efficient? Turns out we can - if the RNNs are also low-rank! 3/7
1
0
4
@matthijs_pals
Matthijs Pals
11 months
We can go through all those regions, one by one, and solve the corresponding systems of linear equations (see, e.g., work from Curto's and Durstewitz's labs) 2/7
1
0
6
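A brute-force sketch of this per-region solve for a tiny network (an illustration, not the paper's code), assuming dynamics dx/dt = -x + W ReLU(x - b): within region s a fixed point satisfies (I - W D_s) x = -W D_s b with D_s = diag(s), and the solution only counts if it actually lies in region s.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 8
W = rng.normal(size=(N, N)) / np.sqrt(N)
b = rng.normal(size=N)

fixed_points = []
for s in itertools.product([0, 1], repeat=N):       # all 2^N sign patterns
    D = np.diag(np.array(s, dtype=float))
    A = np.eye(N) - W @ D
    rhs = -W @ D @ b
    try:
        x = np.linalg.solve(A, rhs)
    except np.linalg.LinAlgError:
        continue                                     # singular: no isolated fixed point here
    if np.all((x > b) == np.array(s, dtype=bool)):   # consistency: x really lies in region s
        fixed_points.append(x)

print(f"{len(fixed_points)} fixed points found out of {2**N} regions checked")
```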
@matthijs_pals
Matthijs Pals
11 months
How to find all fixed points in piecewise-linear recurrent neural networks (RNNs)? A short thread 🧵 In RNNs with N units and ReLU(x-b) activations, the phase space is partitioned into 2^N regions by hyperplanes at x=b 1/7
1
31
169
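A minimal sketch of the setup in this thread, assuming continuous-time dynamics of the form dx/dt = -x + W ReLU(x - b) (the exact model in the paper may differ): the sign pattern of x - b labels which of the 2^N linear regions a state lies in, and within one region the dynamics are affine.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4                                        # number of units
W = rng.normal(size=(N, N)) / np.sqrt(N)     # recurrent weights (generic here)
b = rng.normal(size=N)                       # unit thresholds

def f(x):
    """Assumed piecewise-linear dynamics: dx/dt = -x + W ReLU(x - b)."""
    return -x + W @ np.maximum(x - b, 0.0)

def region(x):
    """Sign pattern of x - b: which of the 2^N linear regions x lies in."""
    return tuple((x > b).astype(int))

x = rng.normal(size=N)
print("region of x:", region(x))             # e.g. (1, 0, 1, 0)
print("dx/dt at x :", f(x))
# Within a fixed region s, ReLU(x - b) = D_s (x - b) with D_s = diag(s),
# so the dynamics there are affine: dx/dt = -x + W D_s (x - b).
```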
@sbi_devs
sbi developers
1 year
The sbi package is growing into a community project 🌎 To reflect this and the algorithms, neural nets, and diagnostics that have been added since its initial release, we have written a new software paper. Reach out if you want to get involved:
arxiv.org
Scientists and engineers use simulators to model empirically observed phenomena. However, tuning the parameters of a simulator to ensure its outputs match observed data presents a significant...
1
11
33
@mackelab
Machine Learning in Science
1 year
Want to stay up-to-date with exciting #ml4science #ai4science research happening in our lab? Come have a look at @mackelab.bsky.social :).
0
1
5
@_rdgao
Richard Gao
1 year
Want a tool that uses ML to generate REALLY good fake brain recordings? You're getting one. Julius' paper on diffusion models for brain data is published! Works with all kinds of densely sampled, multichannel continuous signals (LFP, EEG, etc.) https://t.co/CG7aD3uLAu
8
76
325
@mackelab
Machine Learning in Science
1 year
Apropos: Inspired to do a PhD or Postdoc in #ML4Science/#AI4Science? We have multiple openings to work on ML and AI tools for scientific discovery, in neuroscience and beyond! Full details: https://t.co/oNwIeQ3g8v Students: Apply by Nov 15, directly to IMPRS-IS or @ELLISforEurope
@MPI_IS
Intelligent Systems
1 year
Researching #AI #MachineLearning #Robotics #HCI? Join our elite doctoral program - a partnership with MPI-IS, @uni_stuttgart & @uni_tue! Applications accepted until Nov 15, 2024 at https://t.co/veVpmjl215
1
9
37
@mackelab
Machine Learning in Science
1 year
At Poster III-69 @matthijs_pals will explain how to fit RNNs to neural data - and use them as generative models. Want to understand the fit models? We show how to obtain all fixed points in low-rank piecewise-linear RNNs.
1
1
3
@mackelab
Machine Learning in Science
1 year
We’re at Bernstein Conference next week with lots of new work to share: 10 posters, 1 workshop talk, and don’t miss @jakhmack’s invited talk on Wednesday! If you’re excited about machine learning for (neuro)science, come chat with us—we’re hiring PhD students & postdocs!
1
8
21
@mackelab
Machine Learning in Science
1 year
We’re stoked to share: “A Practical Guide to Sample-based Statistical Distances for Evaluating Generative Models in Science”. Now out in TMLR: https://t.co/FNmK52nGkG This was an incredibly special project for us, as it involved the **entire** lab getting together!
3
54
218
@_rdgao
Richard Gao
1 year
Back in 2022, @roxana_zeraati & I organized a Cosyne workshop on neural timescales, and after working on it for the last 2 years together, it's now a review paper! https://t.co/LgVdULAQ2U w/ @SelfOrgAnna & @jakhmack (2nd blogpost to turn into a real review paper this year lol)
3
69
280
@Tomdonoghue
Tom Donoghue
1 year
📜🎉 We have a new preprint: an overview & comparison of measures of 'aperiodic' neural activity! This project explores different ideas & many methods used to study non-oscillatory features of intra- & extra-cranial electrophysiological recordings! https://t.co/4FCkv1TJlk
biorxiv.org
Neuro-electrophysiological recordings contain prominent aperiodic activity – meaning irregular activity, with no characteristic frequency – which has variously been referred to as 1/f (or 1/f-like...
4
48
135
@matthijs_pals
Matthijs Pals
1 year
Want to train neuroscience models consisting of single cells, recurrent neural networks (RNNs), or huge feedforward networks - all with detailed biophysics? @deismic_'s Jaxley has your back! 👇
@mackelab
Machine Learning in Science
1 year
How can we train biophysical neuron models on data or tasks? We built Jaxley, a differentiable, GPU-based biophysics simulator, which makes this possible even when models have thousands of parameters! Led by @deismic_, collab with @CellTypist @ppjgoncalves
0
0
8
@lappalainenjk
Janne Lappalainen
1 year
Biggest joy and honour leading this project at the intersection of visual neuroscience and ML to a successful finish! Paper:
nature.com
Nature - A study demonstrates how experimental measurements of only the connectivity of a biological neural network can be used to predict neural responses across the fly visual system at...
8
44
166
@_rdgao
Richard Gao
1 year
My #AI4Neuro magnum opus: Discovery of spiking network model parameters constrained by neural recordings, using simulation-based inference & generative “AI”. (aka the answer to “how the f did you end up in Tübingen?”) Here's what we have in store: https://t.co/UpnjVaLp8G
10
54
212