DurstewitzLab (@DurstewitzLab)
Followers: 2K · Following: 3K · Media: 66 · Statuses: 1K

Scientific machine learning, AI & data analysis, dynamical systems theory, applications in (computational) neuroscience & psychiatry. @durstewitzlab.bsky.social

Mannheim + Heidelberg · Joined May 2019
@DurstewitzLab · 6 days
Mathematics & kindness.
Science girl (@gunsnrosesgirl3) · 6 days
Which important skill is slowly fading?
@DurstewitzLab · 6 days
We wrote a little #NeuroAI piece about in-context learning & neural dynamics vs. continual learning & plasticity, both mechanisms to flexibly adapt to changing environments. We relate this to non-stationary rule learning with rapid jumps. Feedback welcome!
@DurstewitzLab · 16 days
Fantastic work by Florian Bähner, Hazem Toutounji, Tzvetan Popov & many others – I'm just the person advertising!
@DurstewitzLab · 16 days
How do animals learn new rules? By systematically testing different behavioral strategies, guided by selective attention to rule-relevant cues. Akin to in-context learning in AI, strategy selection depends on the animals' "training set" (prior experience).
@DurstewitzLab · 21 days
RT @russo_eleon: Into population dynamics? Coming to #CNS2025 but not quite ready to head home? Come join us at the Symposium on "Neural…
@DurstewitzLab · 2 months
We dive a bit into the reasons why current time series FMs, which are not trained for DS reconstruction, fail, and conclude that a DS perspective on time series forecasting & models may help to advance the #TimeSeriesAnalysis field. (6/6)
@DurstewitzLab · 2 months
Remarkably, DynaMix not only generalizes zero-shot to novel DS, but it can even generalize to new initial conditions and regions of state space not covered by the in-context information. (5/6)
@DurstewitzLab · 2 months
And no, it’s neither based on Transformers nor on Mamba – it’s a new type of mixture-of-experts architecture based on the recently introduced AL-RNN, specifically trained for DS reconstruction. #AI (4/6)
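For readers unfamiliar with the AL-RNN: below is a minimal NumPy sketch of what an "almost-linear" recurrent update can look like, assuming the common formulation in which only a small subset of latent units is passed through a ReLU while the rest evolve linearly. All parameter values here are made up for illustration; the exact AL-RNN parameterization and the mixture-of-experts wiring used by DynaMix are specified in the respective papers.

```python
# Minimal sketch (my paraphrase with assumed parameters, not the paper's code):
# an "almost-linear" RNN latent step where only the last P of M latent units
# are rectified and all others evolve linearly.
import numpy as np

def al_rnn_step(z, A_diag, W, h, P):
    """One latent update: diagonal linear dynamics plus coupling of partly rectified units."""
    phi = z.copy()
    phi[-P:] = np.maximum(phi[-P:], 0.0)   # nonlinearity on only P of the M units
    return A_diag * z + W @ phi + h

rng = np.random.default_rng(1)
M, P = 8, 2                                # 8 latent units, 2 of them nonlinear
A_diag = 0.8 * np.ones(M)                  # contractive diagonal linear part
W = rng.standard_normal((M, M))
W *= 0.15 / np.linalg.norm(W, 2)           # weak coupling so the free rollout stays bounded
h = 0.01 * rng.standard_normal(M)

z = rng.standard_normal(M)
latents = []
for _ in range(200):                       # short free-running rollout
    z = al_rnn_step(z, A_diag, W, h, P)
    latents.append(z.copy())
print(np.array(latents).shape)             # (200, 8) latent trajectory
```

Restricting the ReLU to P units means the latent dynamics have at most 2^P linear regions, which is what makes symbolic analysis of the reconstructed system tractable.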
@DurstewitzLab · 2 months
It often even outperforms TS FMs on forecasting diverse empirical time series, like weather or traffic, typically used to train TS FMs. This is surprising, because DynaMix’s training corpus consists *solely* of simulated limit cycles & chaotic systems – no empirical data at all! (3/6)
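To illustrate what a purely simulation-based training corpus can look like, here is a hypothetical toy example (not the actual DynaMix training set; the systems, parameters and sampling used there are described in the paper) that collects trajectories from a classic limit-cycle system (van der Pol) and a classic chaotic system (Rössler).

```python
# Hypothetical illustration of a simulation-only corpus of limit cycles and chaotic systems.
import numpy as np
from scipy.integrate import solve_ivp

def van_der_pol(t, s, mu=2.0):              # limit-cycle example
    x, y = s
    return [y, mu * (1.0 - x ** 2) * y - x]

def roessler(t, s, a=0.2, b=0.2, c=5.7):     # chaotic example
    x, y, z = s
    return [-y - z, x + a * y, b + z * (x - c)]

def simulate(rhs, s0, T=200.0, dt=0.05):
    t_eval = np.arange(0.0, T, dt)
    sol = solve_ivp(rhs, (0.0, T), s0, t_eval=t_eval, rtol=1e-8, atol=1e-8)
    return sol.y.T                            # (time, dim) trajectory

rng = np.random.default_rng(0)
corpus = []
for _ in range(4):                            # a few random initial conditions per system
    corpus.append(simulate(van_der_pol, rng.uniform(-2, 2, size=2)))
    corpus.append(simulate(roessler, rng.uniform(-5, 5, size=3)))
print(len(corpus), corpus[0].shape, corpus[1].shape)  # 8 trajectories, (4000, 2) and (4000, 3)
```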
@DurstewitzLab · 2 months
Unlike TS FMs, DynaMix exhibits #ZeroShotLearning of the long-term statistics of unseen DS, including attractor geometry & power spectrum, without *any* re-training, just from a context signal. It does so with only 0.1% of the parameters of Chronos & 10x faster inference times than the closest competitor. (2/6)
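As a concrete illustration of what "long-term statistics" means here: the sketch below uses standard NumPy/SciPy tools, not the lab's own evaluation code, to compare two common summaries – agreement of power spectra and overlap of binned state-space occupancies as a crude proxy for attractor geometry. The function names and toy data are assumptions for illustration only.

```python
# Minimal sketch: quantifying agreement of long-term statistics between a ground-truth
# trajectory x_true and a model-generated one x_gen, both hypothetical (T, d) arrays.
import numpy as np
from scipy.signal import welch

def spectrum_distance(x_true, x_gen, fs=1.0, nperseg=512):
    """Hellinger distance between normalized power spectra, averaged over dimensions."""
    dists = []
    for i in range(x_true.shape[1]):
        _, p = welch(x_true[:, i], fs=fs, nperseg=nperseg)
        _, q = welch(x_gen[:, i], fs=fs, nperseg=nperseg)
        p, q = p / p.sum(), q / q.sum()
        dists.append(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))
    return float(np.mean(dists))

def geometry_overlap(x_true, x_gen, bins=10):
    """Overlap of binned state-space occupancies (crude proxy for attractor geometry)."""
    H_true, edges = np.histogramdd(x_true, bins=bins)
    H_gen, _ = np.histogramdd(x_gen, bins=bins, range=[(e[0], e[-1]) for e in edges])
    return float(np.minimum(H_true / H_true.sum(), H_gen / H_gen.sum()).sum())

# Toy usage with surrogate data (replace with real and generated trajectories):
rng = np.random.default_rng(0)
t = np.arange(20000)
x_true = np.stack([np.sin(0.03 * t), np.cos(0.03 * t), np.sin(0.06 * t)], axis=1)
x_gen = x_true + 0.05 * rng.standard_normal(x_true.shape)
print(spectrum_distance(x_true, x_gen), geometry_overlap(x_true, x_gen))
```

Measures of this kind stay meaningful even when individual trajectories decorrelate, which is exactly the regime where pointwise forecasting errors stop being informative.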
@DurstewitzLab · 2 months
Can time series #FoundationModels like Chronos zero-shot generalize to unseen #DynamicalSystems (DS)? No, they cannot. But *DynaMix* can – the first FM based on principles of DS reconstruction, capturing the long-term evolution of out-of-domain DS. (1/6)
@DurstewitzLab · 5 months
Our revised #ICLR2025 paper & code for a foundation model architecture for dynamical systems are now online, including additional examples of how this may be used for identifying drivers (control parameters) of non-stationary processes. And please switch platform!
@DurstewitzLab · 9 months
Interested in interpretable #AI foundation models for #DynamicalSystems reconstruction? In a new paper we move into this direction, training common latent DSR models with system-specific features on data from multiple different dynamic regimes and DS. (1/4)
@DurstewitzLab · 6 months
Transfer & few-shot learning for dynamical systems – our paper just got accepted for #ICLR2025 @iclr_conf! Thread below; a strongly updated version will be available soon. And don't forget to move to bsky!
@DurstewitzLab · 7 months
Periodic reminder that MSE or explained variance are not good statistics for assessing the quality of dynamical systems reconstructions, because of exponential trajectory divergence in chaotic systems. And please follow us!
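To see why exponential divergence ruins MSE as a reconstruction measure, here is a minimal textbook-style demo (not the lab's evaluation code): two Lorenz trajectories started a tiny perturbation apart separate exponentially, so after a few Lyapunov times their pointwise error is as large as for unrelated points on the attractor, even though the underlying dynamics are identical.

```python
# Why pointwise MSE is a poor reconstruction measure for chaotic systems:
# exponential divergence of two nearly identical Lorenz trajectories.
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_trajectory(s0, dt=0.01, steps=6000):
    traj = np.empty((steps, 3))
    s = np.array(s0, dtype=float)
    for i in range(steps):
        k1 = lorenz(s)
        k2 = lorenz(s + 0.5 * dt * k1)
        k3 = lorenz(s + 0.5 * dt * k2)
        k4 = lorenz(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = s
    return traj

ref = rk4_trajectory([1.0, 1.0, 1.0])
pert = rk4_trajectory([1.0, 1.0, 1.0 + 1e-9])   # same system, tiny perturbation

dist = np.linalg.norm(ref - pert, axis=1)
for horizon in (1000, 2000, 4000, 6000):
    mse = np.mean((ref[:horizon] - pert[:horizon]) ** 2)
    print(f"horizon {horizon:5d}: separation {dist[horizon - 1]:10.2e}, MSE {mse:10.2e}")
# The separation grows roughly exponentially (slope ~ largest Lyapunov exponent on a log
# scale) until it saturates at the attractor diameter; from then on the MSE mostly reflects
# initial-condition uncertainty, not reconstruction quality.
```

Geometry- or spectrum-based long-term measures, as in the sketch further up, are insensitive to this initial-condition effect.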
@DurstewitzLab · 7 months
Can really recommend this excellent talk by Christoph Bergmeir at … yesterday, on the inability of recent Transformer- & LLM-based time series models to beat even simple baselines *if you do the stats and testing right*! A lesson in careful statistical evaluation.
@DurstewitzLab · 7 months
2) A scalable generative model for dynamical system reconstruction from neuroimaging data (@GeorgiaKoppe).
@DurstewitzLab · 7 months
Don't miss out on our 2 #NeurIPS2024 papers on dynamical systems reconstruction today & tomorrow: 1) Almost-Linear RNNs Yield Highly Interpretable Symbolic Codes in Dynamical Systems Reconstruction.
@DurstewitzLab · 7 months
RT @russo_eleon: 📢PhD position @BristolUni (with Ross Purple and @seanfw, UK) and joint supervision @SantAnnaPisa (with @russo_eleon, IT) o…
@DurstewitzLab · 8 months
Happy our team is among this year’s recipients of the Samsung Global Research Outreach Awards! We will take #DynamicalSystems reconstruction to the next level, large-scale – looking forward to the collaboration with the Samsung team!
@DurstewitzLab · 8 months
Hey everyone, please follow us on bluesky – it's a wonderful place!