Leon Klein Profile
Leon Klein

@leonklein26

Followers
440
Following
2K
Media
13
Statuses
133

PhD student @FU_Berlin @bifoldberlin, working on enhancing MD with ML. Former Visiting Researcher @MSFTResearch and ML Intern @DEShawResearch.

Joined January 2021
@leonklein26
Leon Klein
1 year
Excited to share our latest preprint: "Transferable Boltzmann Generators"! We propose a framework based on flow matching and demonstrate transferability on dipeptides. Work done with amazing @FrankNoeBerlin. Check it out here:
10
17
127
@ian_dunn_
Ian Dunn
2 months
I'm excited to share FlowMol3, the 3rd (and final) version of our flow matching model for 3D de novo small-molecule generation! FlowMol3 achieves state-of-the-art performance across a broad range of evaluations while having ≈10x fewer parameters than comparable models.
1
3
16
@FrankNoeBerlin
Frank Noe
3 months
After a 4-year journey, we are super happy to see this paper out in @NatureComms - @ElezKatarina et al: High-throughput molecular dynamics + active @machinelearning enable efficient identification of an experimentally-validated broad coronavirus inhibitor. https://t.co/Q3ioRNKnJP
nature.com
Nature Communications - Approaches making virtual and experimental screening more resource-efficient are vital for identifying effective inhibitors from a vast pool of potential drugs but remain...
2K
41
257
@BigAmeya
Ameya Daigavane
4 months
Really excited to (finally) share the updated JAMUN preprint and codebase! We perform Langevin molecular dynamics in a smoothed space which allows us to take larger integrator steps. This requires learning a score function only at a single noise level, unlike diffusion models.
3
5
45
@CecClementi
Cecilia Clementi
4 months
Our development of machine-learned transferable coarse-grained models is now out in Nature Chemistry! https://t.co/HGngd8Vpop I am so proud of my group for this work! Particularly first authors Nick Charron, Klara Bonneau, @sayeg84, and Andrea Guljas.
nature.com
Nature Chemistry - The development of a universal protein coarse-grained model has been a long-standing challenge. A coarse-grained model with chemical transferability has now been developed by...
3
39
122
@AlexanderTong7
Alex Tong
4 months
Come check out SBG happening now! W-115 11-1:30 with @charliebtan @bose_joey Chen Lin @leonklein26 @mmbronstein
0
15
91
@bose_joey
Joey Bose
4 months
Main Conference: 📜 Title: Scalable Equilibrium Sampling with Sequential Boltzmann Generators 🕐 When: Wed 16 Jul, 11 a.m. to 1:30 p.m. PDT 🗺️ Where: West Exhibition Hall B2-B3, W-115 🔗 arXiv: https://t.co/VKQluYOffP w/ @charliebtan @WillLin1028 @leonklein26 @mmbronstein
arxiv.org
Scalable sampling of molecular states in thermodynamic equilibrium is a long-standing challenge in statistical physics. Boltzmann generators tackle this problem by pairing normalizing flows with...
1
2
7
@bose_joey
Joey Bose
4 months
👋 I'm at #ICML2025 this week, presenting several papers with my awesome collaborators! Please do reach out if you'd like to grab a coffee ☕️ or catch up again! Papers in 🧵 below 👇:
1
7
62
@bose_joey
Joey Bose
4 months
🎉 Personal update: I'm thrilled to announce that I'm joining Imperial College London @imperialcollege as an Assistant Professor of Computing @ICComputing starting January 2026. My future lab and I will continue to work on building better Generative Models 🤖, the hardest
98
33
626
@smnlssn
Simon Olsson
4 months
New pre-print from PhD student Hang Zou on warm-starting the variational quantum eigensolver using flows: Flow-VQE! Flow-VQE is parameter transfer on steroids: it learns how to solve a family of related problems, dramatically reducing the aggregate compute cost!
1
5
48
@AdamEFoster
Adam Foster
5 months
I am very happy to share Orbformer, a foundation model for wavefunctions using deep QMC that offers a route to tackle strongly correlated quantum states!
arxiv.org
Reliable description of bond breaking remains a major challenge for quantum chemistry due to the multireferential character of the electronic structure in dissociating species. Multireferential...
7
30
92
@jenseisert
Jens Eisert
6 months
The abelian state hidden subgroup problem: Learning stabilizer groups and beyond https://t.co/KwK9pGu85o Identifying the #symmetry properties of quantum states is a central theme in quantum information theory and quantum many-body physics. In this work, we investigate quantum
5
10
59
@leonklein26
Leon Klein
6 months
Excited to have contributed to this amazing work by @LVaitl! https://t.co/Ti8pxrH0mu
@LVaitl
Lorenz Vaitl
6 months
Ever felt like Boltzmann Generators trained with Flow Matching were doing fine, just not good enough? We slapped Path Gradients on top, and things got better. No extra samples, no extra compute, no changes to the model. Just gradients you already have access to.
1
1
18
@GabriCorso
Gabriele Corso
7 months
🚀 Excited to release a major update to the Boltz-1 model: Boltz-1x! Boltz-1x introduces inference-time steering for much higher physical quality, CUDA kernels for faster, more memory-efficient inference and training, and more! 🔥🧵
6
109
439
@HLawrenceCS
Hannah Lawrence
7 months
Equivariant functions (e.g. GNNs) can't break symmetries, which can be problematic for generative models and beyond. Come to poster #207 Saturday at 10AM to hear about our solution: SymPE, or symmetry-breaking positional encodings! w/Vasco Portilheiro, Yan Zhang, @sekoumarkaba
2
15
99
@GabriCorso
Gabriele Corso
7 months
Happy to finally release our work on "Composing Unbalanced Flows for Flexible Docking and Relaxation" (FlexDock) that we will be presenting as an oral at #ICLR2025! 🤗✈️🇸🇬 A thread! 🧵
2
33
184
@smnlssn
Simon Olsson
7 months
Check out cool new work from our group in collaboration with Pfizer and AstraZeneca, led by Julian Cremer and Ross Irwin, on FLOWR, a flow-based ligand generation approach, and a highly sanitized benchmark dataset, SPINDR, for the SBDD community!
@BiologyAIDaily
Biology+AI Daily
7 months
FLOWR – Flow Matching for Structure-Aware de novo and Conditional Ligand Generation 1. FLOWR introduces a new generative framework for structure-based ligand design using flow matching instead of diffusion, achieving up to 70x faster inference while improving ligand validity,
2
13
55
@FrankNoeBerlin
Frank Noe
8 months
Simulating full quantum mechanical ground- and excited-state surfaces with deep quantum Monte Carlo by Zeno Schätzle, Bernat Szabo and Alice Cuzzocrea. https://t.co/UEy1q3LmLC 🧵⬇️
3
24
96
@charliebtan
charliebtan
9 months
New preprint! 🚨 We scale equilibrium sampling to hexapeptides (in Cartesian coordinates!) with Sequential Boltzmann Generators! 📈 🤯 Work with @bose_joey, @WillLin1028, @leonklein26, @mmbronstein and @AlexanderTong7 Thread 🧵 1/11
3
22
76
@FrankNoeBerlin
Frank Noe
9 months
The BioEmu-1 model and inference code are now public under MIT license!!! Please go ahead, play with it and let us know if there are issues. https://t.co/K7wwHmCt2o
github.com
Inference code for scalable emulation of protein equilibrium ensembles with generative deep learning - microsoft/bioemu
@FrankNoeBerlin
Frank Noe
11 months
Super excited to preprint our work on developing a Biomolecular Emulator (BioEmu): Scalable emulation of protein equilibrium ensembles with generative deep learning from @MSFTResearch AI for Science. #ML #AI #NeuralNetworks #Biology #AI4Science https://t.co/yzOy6tAoPv
5
95
356
@AlexanderTong7
Alex Tong
9 months
Check out our new preprint on improving sampling in masked diffusion models! A drop-in replacement for the standard random sampling order, without any additional training, improves performance across the board. 🚀
@pengzhangzhi1
Fred Zhangzhi Peng
9 months
New Paper Alert! 🚀 We introduce Path Planning (P2), a sampling approach to optimizing token unmasking order in Masked Diffusion Models (MDMs). SOTA results across language, math, code, and biological sequences (Protein and RNA), all without training. https://t.co/PIsgCS1Hqg 🧵👇
1
11
82