DurstewitzLab
@DurstewitzLab
Followers: 2K · Following: 3K · Media: 68 · Statuses: 1K
Scientific machine learning, AI & data analysis, dynamical systems theory, applications in (computat.) neuroscience & psychiatry. @durstewitzlab.bsky.social
Mannheim+Heidelberg
Joined May 2019
Despite being extremely lightweight (only 0.1% of the parameters and 0.6% of the training corpus size of its closest competitor), it also outperforms major TS foundation models such as the Chronos variants on real-world short-term TS forecasting, with minimal inference times (0.2%) ...
Our #DynamicalSystems #FoundationModel was accepted to #NeurIPS2025 with outstanding reviews (6555) – the first model that can forecast, *zero-shot* and without any fine-tuning, the *long-term statistics* of time series given a context. Test it on #HuggingFace: https://t.co/FrbK5Kx9t2 ...
huggingface.co
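For readers who want to try the zero-shot pattern referred to above, here is a minimal, self-contained sketch of the workflow: a context window goes in, a long rollout comes out, and quality is judged by long-term statistics rather than pointwise error. The forecaster below is a trivial autoregressive stand-in, not the actual DynaMix model or its HuggingFace interface; consult the model card behind the link for the real API.

```python
# Sketch of the zero-shot forecasting pattern: context in, long rollout out.
# NOTE: `zero_shot_forecast` is a simple least-squares AR stand-in, NOT the
# DynaMix model or its HuggingFace API.
import numpy as np

def zero_shot_forecast(context, horizon, order=5):
    """Placeholder forecaster: AR(order) fit to the context, then iterated
    rollout. A real foundation model would be called here instead."""
    T, d = context.shape
    X = np.hstack([context[t:T - order + t] for t in range(order)])  # (T-order, order*d)
    y = context[order:]                                              # (T-order, d)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    window = context[-order:].copy()
    preds = []
    for _ in range(horizon):
        nxt = window.reshape(1, -1) @ coef                           # next step
        preds.append(nxt.ravel())
        window = np.vstack([window[1:], nxt])                        # slide window
    return np.array(preds)

# Example: 512-step context from a noisy oscillation, 2000-step rollout.
t = np.arange(512)
context = np.stack([np.sin(0.1 * t), np.cos(0.1 * t)], axis=1) + 0.01 * np.random.randn(512, 2)
forecast = zero_shot_forecast(context, horizon=2000)
print(forecast.shape)  # (2000, 2)
```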
Relevant publications: https://t.co/mje21ShyWC
https://t.co/R5sSZtd9Yq
https://t.co/KEmmAbr4Tl
https://t.co/iu59ez66vu
nature.com
Nature Communications - Whether neurocomputational mechanisms that speed up human learning in changing environments also exist in other species remains unclear. Here, the authors show that both...
We have openings for several fully-funded positions (PhD & PostDoc) at the intersection of AI/ML, dynamical systems, and neuroscience within a BMFTR-funded Neuro-AI consortium, at Heidelberg University & Central Institute of Mental Health (see below): https://t.co/CZNXECQETe
Our new preprint compares naïve baselines, network models (incl. PLRNN-based SSMs), and Transformers on 3x40‑day EMA+EMI datasets. PLRNNs gave the most accurate forecasts, yielded interpretable networks, and flagged “sad” & “down” as top leverage points. https://t.co/9trDupOR4A
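For context, a minimal numpy sketch of the kind of PLRNN-based state-space model referred to above (diagonal autoregression matrix, off-diagonal coupling, ReLU nonlinearity, linear observations). Parameters here are random placeholders, not values fitted to the EMA data in the preprint.

```python
# Minimal sketch of a PLRNN-based state-space model (standard formulation:
# diagonal A, off-diagonal W, ReLU nonlinearity, linear-Gaussian observations).
# Parameters are drawn at random purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 5                                    # latent dimension, observed EMA items

A = np.diag(rng.uniform(0.3, 0.8, M))          # diagonal auto-regression
W = rng.normal(0, 0.05, (M, M)); np.fill_diagonal(W, 0.0)   # off-diagonal coupling
h = rng.normal(0, 0.1, M)                      # bias
B = rng.normal(0, 0.5, (N, M))                 # observation matrix

def plrnn_step(z):
    """One latent step: z_t = A z_{t-1} + W ReLU(z_{t-1}) + h."""
    return A @ z + W @ np.maximum(z, 0.0) + h

def generate(T, z0=None, obs_noise=0.1):
    z = np.zeros(M) if z0 is None else z0
    xs = []
    for _ in range(T):
        z = plrnn_step(z)
        xs.append(B @ z + obs_noise * rng.normal(size=N))
    return np.array(xs)                        # (T, N) simulated EMA-like observations

x = generate(40)                               # e.g. one 40-day stretch of daily items
print(x.shape)
```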
Got provisional approval for two major grants in Neuro-AI & Dynamical Systems Reconstruction, on learning & inference in non-stationary environments, OOD generalization, and DS foundation models. To all AI/math enthusiasts: expect job announcements (PhD/PostDoc) soon! Feel free to get in touch.
We wrote a little #NeuroAI piece on in-context learning & neural dynamics vs. continual learning & plasticity, two mechanisms for flexibly adapting to changing environments: https://t.co/UR20TGtJ8L We relate this to non-stationary rule learning with rapid jumps. Feedback welcome!
arxiv.org
Modern AI models, such as large language models, are usually trained once on a huge corpus of data, potentially fine-tuned for a specific task, and then deployed with fixed parameters. Their...
Fantastic work by Florian Bähner, Hazem Toutounji, Tzvetan Popov, and many others - I'm just the person advertising!
How do animals learn new rules? By systematically testing different behavioral strategies, guided by selective attention to rule-relevant cues: https://t.co/Bxr8xalkmr Akin to in-context learning in AI, strategy selection depends on the animals' "training set" (prior experience).
Into population dynamics? Coming to #CNS2025 but not quite ready to head home? Come join us at the Symposium on "Neural Population Dynamics and Latent Representations"! 🧠 🗓️ July 10th 📍 @ScuolaSantAnna, Pisa (and online). Free registration: 👉 https://t.co/NMfX7U3LH4
We dive a bit into the reasons why current time series FMs not trained for DS reconstruction fail, and conclude that a DS perspective on time series forecasting & models may help to advance the #TimeSeriesAnalysis field. (6/6)
Remarkably, DynaMix not only generalizes zero-shot to novel DS, but it can even generalize to new initial conditions and regions of state space not covered by the in-context information. (5/6)
And no, it's based neither on Transformers nor on Mamba – it's a new type of mixture-of-experts architecture built on the recently introduced AL-RNN (https://t.co/vrFY4WejRW), specifically trained for DS reconstruction. #AI (4/6)
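As a rough illustration of the ingredients named above: a sketch of an AL-RNN step (a ReLU applied to only a subset of the latent units, which is our reading of the linked paper) and one possible way to mix several such experts. DynaMix's actual gating and training procedure are described in the paper; everything below is illustrative only.

```python
# Hedged sketch: (i) an AL-RNN step, where only P of M latent units pass
# through a ReLU, and (ii) one way a mixture of such experts could be combined
# via context-dependent weights. Not the actual DynaMix architecture.
import numpy as np

rng = np.random.default_rng(1)
M, P, K = 10, 3, 4                 # latent dim, # nonlinear units, # experts

def phi(z):
    """Partial nonlinearity: ReLU on the last P units, identity on the rest."""
    out = z.copy()
    out[-P:] = np.maximum(out[-P:], 0.0)
    return out

class ALRNNExpert:
    def __init__(self):
        self.A = np.diag(rng.uniform(0.3, 0.8, M))
        self.W = rng.normal(0, 0.05, (M, M))
        self.h = rng.normal(0, 0.1, M)
    def step(self, z):
        return self.A @ z + self.W @ phi(z) + self.h

experts = [ALRNNExpert() for _ in range(K)]

def mixture_step(z, weights):
    """Convex combination of the experts' predictions (toy gating only)."""
    preds = np.stack([e.step(z) for e in experts])     # (K, M)
    return weights @ preds                             # (M,)

# Toy gate: softmax over fixed scores; in DynaMix the weights would be
# produced from the context signal.
scores = rng.normal(size=K)
weights = np.exp(scores) / np.exp(scores).sum()

z = rng.normal(size=M)
for _ in range(100):
    z = mixture_step(z, weights)
print(z)
```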
It often even outperforms TS FMs on forecasting diverse empirical time series, like weather or traffic, typically used to train TS FMs. This is surprising, because DynaMix's training corpus consists *solely* of simulated limit cycles & chaotic systems, with no empirical data at all! (3/6)
Unlike TS FMs, DynaMix exhibits #ZeroShotLearning of the long-term statistics of unseen DS, including attractor geometry & power spectra, without *any* re-training, just from a context signal. It does so with only 0.1% of the parameters of Chronos & 10x faster inference times than the closest competitor. (2/6)
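Two measures in this spirit are sketched below: a Hellinger distance between normalized power spectra and a binned state-space overlap. These are illustrative implementations, close in spirit to (but not necessarily identical with) the measures used in the papers linked in this thread.

```python
# Quantifying agreement of *long-term statistics* instead of pointwise error:
# Hellinger distance between normalized power spectra, plus a simple binned
# state-space occupation overlap (KL divergence). Illustrative definitions.
import numpy as np

def spectrum_distance(x_true, x_gen, eps=1e-12):
    """Mean Hellinger distance between normalized power spectra per dimension."""
    d = 0.0
    for i in range(x_true.shape[1]):
        p = np.abs(np.fft.rfft(x_true[:, i]))**2
        q = np.abs(np.fft.rfft(x_gen[:, i]))**2
        p, q = p / (p.sum() + eps), q / (q.sum() + eps)
        d += np.sqrt(max(0.0, 1.0 - np.sum(np.sqrt(p * q))))
    return d / x_true.shape[1]

def state_space_overlap(x_true, x_gen, bins=20):
    """KL divergence between binned occupation histograms of the two orbits."""
    lo = np.minimum(x_true.min(0), x_gen.min(0))
    hi = np.maximum(x_true.max(0), x_gen.max(0))
    edges = [np.linspace(lo[i], hi[i], bins + 1) for i in range(x_true.shape[1])]
    p, _ = np.histogramdd(x_true, bins=edges)
    q, _ = np.histogramdd(x_gen, bins=edges)
    p = (p + 1e-9) / (p.sum() + 1e-9 * p.size)   # smoothed, normalized
    q = (q + 1e-9) / (q.sum() + 1e-9 * q.size)
    return np.sum(p * np.log(p / q))

# Toy usage on random data (real use: ground-truth vs. generated trajectories).
x1, x2 = np.random.randn(4096, 3), np.random.randn(4096, 3)
print(spectrum_distance(x1, x2), state_space_overlap(x1, x2))
```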
Can time series #FoundationModels like Chronos zero-shot generalize to unseen #DynamicalSystems (DS)? No, they cannot. But *DynaMix* can, the first FM based on principles of DS reconstruction, capturing the long-term evolution of out-of-domain DS: https://t.co/fL1CLATTpB (1/6)
Our revised #ICLR2025 paper & code for a foundation model architecture for dynamical systems is now online: https://t.co/R5sSZtd9Yq ... including additional examples of how this may be used for identifying drivers (control parameters) of non-stationary processes. And please switch platform!
Interested in interpretable #AI foundation models for #DynamicalSystems reconstruction? In a new paper we move in this direction, training common latent DSR models with system-specific features on data from multiple different dynamic regimes and DS: https://t.co/uaf1RcyTBi 1/4
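One plausible minimal realization of the "common model + system-specific features" idea, purely for illustration: shared dynamics weights plus a low-dimensional feature vector per system that is mapped onto the latent bias. The paper's actual hierarchical parameterization is more elaborate; all names and shapes below are made up.

```python
# Hedged sketch of shared dynamics weights modulated by per-system features.
import numpy as np

rng = np.random.default_rng(2)
M, F, S = 8, 3, 5                        # latent dim, feature dim, # systems

A = np.diag(rng.uniform(0.3, 0.8, M))    # shared across systems
W = rng.normal(0, 0.05, (M, M))          # shared across systems
G = rng.normal(0, 0.2, (M, F))           # maps system features -> bias shift
features = rng.normal(size=(S, F))       # one low-dimensional vector per system

def step(z, sys_idx):
    h = G @ features[sys_idx]            # system-specific component
    return A @ z + W @ np.maximum(z, 0.0) + h

# The same shared weights, conditioned on different feature vectors,
# settle into different dynamics per system:
for j in range(S):
    z = rng.normal(size=M)
    for _ in range(200):
        z = step(z, j)
    print(f"system {j}: final state norm = {np.linalg.norm(z):.2f}")
```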
Transfer & few-shot learning for dynamical systems ... our paper just got accepted for #ICLR2025 @iclr_conf! Thread below; a strongly updated version will be available soon ... and don't forget to move to bsky! https://t.co/JXywJ9Nqe9
Periodic reminder that MSE or explained variance are not good statistics for assessing the quality of dynamical systems reconstructions, because of exponential trajectory divergence in chaotic systems. And please follow us at https://t.co/JXywJ9Nqe9!
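A concrete demonstration of this point: a "perfect" model of the Lorenz-63 system whose initial condition is off by only 1e-8 still accrues a huge MSE over long horizons, even though both orbits lie on the very same attractor and share identical long-term statistics.

```python
# Two Lorenz-63 trajectories starting 1e-8 apart diverge exponentially, so
# pointwise MSE becomes large even though both orbits trace out the same
# attractor. Hence MSE / explained variance say little about DSR quality.
import numpy as np

def lorenz(x, s=10.0, r=28.0, b=8.0/3.0):
    return np.array([s*(x[1]-x[0]), x[0]*(r-x[2])-x[1], x[0]*x[1]-b*x[2]])

def simulate(x0, T=5000, dt=0.01):
    xs = [np.asarray(x0, float)]
    for _ in range(T - 1):
        x = xs[-1]
        # simple RK4 integration step
        k1 = lorenz(x); k2 = lorenz(x + 0.5*dt*k1)
        k3 = lorenz(x + 0.5*dt*k2); k4 = lorenz(x + dt*k3)
        xs.append(x + dt/6.0*(k1 + 2*k2 + 2*k3 + k4))
    return np.array(xs)

traj_a = simulate([1.0, 1.0, 1.0])
traj_b = simulate([1.0, 1.0, 1.0 + 1e-8])    # "perfect model", tiny init error

mse = np.mean((traj_a - traj_b)**2, axis=1)
print("MSE over first 100 steps :", mse[:100].mean())    # ~0
print("MSE over last 1000 steps :", mse[-1000:].mean())  # large, same attractor
```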