Jonas Beck Profile
Jonas Beck

@__jnsbck__

Followers
102
Following
190
Media
12
Statuses
52

Probabilistic Inference in Mechanistic Models. PhD student with @CellTypist, @uni_tue.

Tübingen, Germany
Joined January 2022
@__jnsbck__
Jonas Beck
7 months
RT @sbi_devs: 🚀 Join the 4th SBI Hackathon! 🚀 The last hackathon was a fantastic milestone in forming a collaborative open-source communit…
0
2
0
@__jnsbck__
Jonas Beck
11 months
RT @KadhimKyra: Looking forward to the #Bernsteinconference tomorrow where I'll be presenting work developing a preliminary model of the mo…
0
2
0
@__jnsbck__
Jonas Beck
11 months
Thrilled to be at #BernsteinConference next week to present my most recent work⬇️ on how probabilistic numerical methods can help with parameter inference in Hodgkin-Huxley models. If you’re curious or want to chat about fitting biophysical models, come by Poster 10 on Tue 2:15.
@__jnsbck__
Jonas Beck
1 year
Interested in reliable parameter inference for ODEs with probabilistic solvers? I am excited to present my latest work w. @nathanaelbosch, @deismic_, @KadhimKyra, @jakhmack, @PhilippHennig5, @CellTypist, @ml4science at #ICML2024 this year. 1/n
0
1
10
@__jnsbck__
Jonas Beck
1 year
We built a simulator for biophysical neuron models that is:
- all Python
- accelerated
- fully differentiable
- capable of morphological detail
- scalable to >1M params
Very proud to be part of this! Much respect to @deismic_, who put most of it together. See what it can do ⬇️
@mackelab
Machine Learning in Science
1 year
How can we train biophysical neuron models on data or tasks? We built Jaxley, a differentiable, GPU-based biophysics simulator, which makes this possible even when models have thousands of parameters! Led by @deismic_, collab with @CellTypist @ppjgoncalves
0
3
21
@__jnsbck__
Jonas Beck
1 year
RT @lappalainenjk: Biggest joy and honour leading this project at the intersection of visual neuroscience and ML to a successful finish! Pa…
nature.com
Nature - A study demonstrates how experimental measurements of only the connectivity of a biological neural network can be used to predict neural responses across the fly visual system at...
0
44
0
@__jnsbck__
Jonas Beck
1 year
RT @sbi_devs: We just released a new version of `sbi`, and this one has _a ton_ of new features! Many of these features are thanks to more…
0
19
0
@__jnsbck__
Jonas Beck
1 year
RT @mackelab: @gloecklermanuel and @deismic_ are at ICML to present our work on `All-in-one simulation-based inference`. Join us at ORAL 6F…
0
5
0
@__jnsbck__
Jonas Beck
1 year
RT @alfcnz: It’s 2020. @samgreydanus pushes on @arXiv a remarkable unpublished paper. 4 years later, @hippopedoid helps getting it publishe…
0
12
0
@__jnsbck__
Jonas Beck
1 year
RT @mackelab: After long and exhausting travels, the MackeLab has arrived in Vienna for ICML.
0
2
0
@__jnsbck__
Jonas Beck
1 year
If you want to find out more, check out our code and paper, or come say hi at #ICML2024 (Hall C 4-9 #1313, Tuesday 11:30) to talk PN or ML + Neuro. @deismic_ and I will be there. 12/12
github.com
Contribute to berenslab/DiffusionTempering development by creating an account on GitHub.
5
0
0
@__jnsbck__
Jonas Beck
1 year
Hence, you can now do inference on ODEs where the original method (top) previously struggled, such as a fairly complex HH model with 6 free parameters. 11/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
This improves the reliability of parameter estimates on a variety of ODEs, like the Lotka-Volterra system or Hodgkin-Huxley models of varying complexity, and comes at only a small computational overhead compared to the original probabilistic ODE solver with a few tweaks (OURS+). 10/n
1
0
0
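For readers unfamiliar with the benchmark systems named above, here is a minimal, self-contained sketch of the Lotka-Volterra predator-prey ODE with a classical RK4 integrator. The parameter values and step sizes are illustrative choices of mine, not the paper's setup.

```python
import math

def lotka_volterra(state, alpha=1.5, beta=1.0, gamma=3.0, delta=1.0):
    """Predator-prey vector field: prey x, predators y."""
    x, y = state
    return (alpha * x - beta * x * y, delta * x * y - gamma * y)

def rk4_step(f, state, dt):
    """One classical Runge-Kutta (RK4) step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def simulate(state=(1.0, 1.0), dt=0.01, steps=1000):
    """Integrate the system for `steps` RK4 steps of size `dt`."""
    traj = [state]
    for _ in range(steps):
        state = rk4_step(lotka_volterra, state, dt)
        traj.append(state)
    return traj

def invariant(state, alpha=1.5, beta=1.0, gamma=3.0, delta=1.0):
    """The system conserves V(x,y) = delta*x - gamma*ln(x) + beta*y - alpha*ln(y);
    a good integrator keeps it (nearly) constant along the trajectory."""
    x, y = state
    return delta * x - gamma * math.log(x) + beta * y - alpha * math.log(y)
```

Parameter inference then means recovering alpha, beta, gamma, delta from noisy observations of such a trajectory — exactly the setting where loss surfaces become awkward for more complex systems like Hodgkin-Huxley.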
@__jnsbck__
Jonas Beck
1 year
We show that for a 1D pendulum this controlled 'smoothing' effectively avoids local minima during optimization, e.g. a pendulum with infinite period (green), and reliably seeks out the global optimum (dashed) thanks to successively better initializations (C). 9/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
By iteratively reducing the diffusion parameter along a predetermined schedule and initializing with the previous parameter estimate at every step, we force the optimizer to find good parameter estimates at successively lower levels of uncertainty. 8/n
1
0
0
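The schedule-plus-warm-start mechanics can be illustrated with a 1-D toy, not the paper's actual probabilistic-solver objective: a multimodal loss whose Gaussian-smoothed family is available in closed form plays the role of the k^2-smoothed loss surface. All function names here (`descend`, `tempered_descent`) are mine.

```python
import math

def loss(theta, sigma=0.0):
    """Gaussian-smoothed version of the multimodal toy loss
    f(theta) = sin(5*theta) + 0.1*(theta - 2)**2.
    Smoothing sin(5t) with N(0, sigma^2) scales it by exp(-12.5*sigma^2)."""
    return math.exp(-12.5 * sigma**2) * math.sin(5 * theta) + 0.1 * (theta - 2) ** 2

def grad(theta, sigma=0.0):
    """Analytic gradient of the smoothed loss."""
    return 5 * math.exp(-12.5 * sigma**2) * math.cos(5 * theta) + 0.2 * (theta - 2)

def descend(theta, sigma, lr=0.01, steps=2000):
    """Plain gradient descent at a fixed smoothing level."""
    for _ in range(steps):
        theta -= lr * grad(theta, sigma)
    return theta

def tempered_descent(theta0, schedule=(2.0, 1.0, 0.5, 0.25, 0.1, 0.0)):
    """Anneal the smoothing parameter along a predetermined schedule,
    warm-starting each stage from the previous estimate."""
    theta = theta0
    for sigma in schedule:
        theta = descend(theta, sigma)
    return theta
```

Starting from theta = 0, plain descent on the unsmoothed loss stalls in a local minimum near theta ≈ -0.3, while the tempered run tracks successively sharper surfaces into the global basin near theta ≈ 2.2.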
@__jnsbck__
Jonas Beck
1 year
We therefore propose diffusion tempering, a novel regularization technique for probabilistic numerical methods which dramatically increases the robustness of the optimization. 7/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
In practice, increasing k^2 amounts to 'smoothing out' the loss surface, making the joint optimization very vulnerable to local minima. 6/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
Namely, the optimizer can increase the likelihood that a bad ODE solution (grey) could have generated the data (black) by increasing the calibration/diffusion parameter (k^2), rather than by finding better parameters for the ODE. 5/n
1
0
0
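The pathology can be made concrete with a toy Gaussian likelihood, where k^2 plays the role of the calibration/diffusion parameter: for a fixed, deliberately bad trajectory, inflating k^2 alone lowers the negative log-likelihood, so the optimizer never has to improve the ODE parameters. This is a simplified stand-in for the solver's actual calibrated likelihood.

```python
import math

def gaussian_nll(residuals, k2):
    """Negative log-likelihood of residuals under N(0, k2)."""
    n = len(residuals)
    return 0.5 * n * math.log(2 * math.pi * k2) + sum(r * r for r in residuals) / (2 * k2)

# Residuals of a deliberately bad "ODE solution" (an all-zero trajectory)
# against data y(t) = sin(t): the fit never changes, only k2 does.
ts = [0.1 * i for i in range(100)]
bad_residuals = [math.sin(t) - 0.0 for t in ts]

tight = gaussian_nll(bad_residuals, k2=0.01)  # confident solver: huge penalty
inflated = gaussian_nll(bad_residuals,
                        k2=sum(r * r for r in bad_residuals) / len(bad_residuals))
# inflated << tight: the misfit is "explained away" by uncertainty alone.
```

The inflated value of k^2 (the mean squared residual) is exactly the variance that maximizes the likelihood for a fixed mean, which is why joint calibration can hide a bad solution instead of fixing it.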
@__jnsbck__
Jonas Beck
1 year
We find that this is because the uncertainty of the probabilistic integrator is calibrated at the same time that the ODE parameters are being optimized, and this leads to undesirable behaviour. 4/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
This approach has been shown to outperform classical least-squares regression with Runge-Kutta (RK) solvers. However, in practice these methods still struggle on more complex nonlinear dynamical systems, like the Hodgkin-Huxley model. 3/n
1
0
0
@__jnsbck__
Jonas Beck
1 year
Inferring parameters of an IVP (initial value problem) from noisy measurements is often challenging due to uncertainty about both the ODE solution and the data. Probabilistic ODE solvers therefore recast the solution of IVPs as Bayesian inference. 2/n
1
0
0
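The key idea of the Bayesian recast is that a probabilistic solver returns a posterior over the solution (a mean trajectory plus a solver variance), so the data likelihood can combine solver uncertainty and measurement noise. A minimal sketch under assumed Gaussian forms — the trajectory, variances, and data below are made up for illustration, not from any real solver:

```python
import math

def nll_with_solver_uncertainty(data, mean, solver_var, noise_var):
    """NLL of data under y_i ~ N(mean_i, solver_var_i + noise_var):
    both sources of uncertainty enter the likelihood."""
    nll = 0.0
    for y, m, s2 in zip(data, mean, solver_var):
        v = s2 + noise_var
        nll += 0.5 * math.log(2 * math.pi * v) + (y - m) ** 2 / (2 * v)
    return nll

# Toy "posterior" for a hypothetical probabilistic solve of y' = -y, y(0) = 1:
ts = [0.5 * i for i in range(10)]
mean = [math.exp(-t) for t in ts]          # posterior mean trajectory
solver_var = [1e-4 * t for t in ts]        # solver uncertainty grows along the solve
data = [math.exp(-t) + 0.01 for t in ts]   # observations offset by a constant error

nll = nll_with_solver_uncertainty(data, mean, solver_var, noise_var=0.01**2)
```

Setting `solver_var` to zero recovers an ordinary Gaussian observation likelihood; including it changes the objective, which is precisely the hook that the joint calibration (and its failure mode) in the tweets above acts on.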