
Stephan Mandt @ AISTATS’25
@StephanMandt
Followers: 3K · Following: 1K · Media: 39 · Statuses: 494
AI Professor @UCIrvine | Formerly @blei_lab, @Princeton | #GenAI, #Compression, #AI4Science | General Chair @aistats_conf 2025 | AI Resident @ChanZuckerberg
Irvine, California
Joined March 2015
RT @YiboYang: I had the pleasure of giving a talk and sharing some recent work on diffusion + compression (together with @justuswill and @S…
RT @unnatjain2010: ✨New edition of our community-building workshop series!✨ Tomorrow at @CVPR, we invite speakers to share their stories…
Thanks for the great collaboration!
ICML '25 paper on variational guidance for diffusion models accepted. Happy to share that our diffusion model guidance paper with @farrinsofian, @kpandey008, @felixDrRelax, and @StephanMandt, on casting control for guidance as variational inference with auxiliary variables, was…
TL;DR: Guidance = variational optimal control. I'm excited to share the outcomes of this collaboration with @Tkaraletsos at the @ChanZuckerberg_ Initiative. All credit to my amazing students @farrinsofian and @kpandey008!
🚀 News! Our recent #ICML2025 paper “Variational Control for Guidance in Diffusion Models” introduces a simple yet powerful method for guidance in diffusion models — and it doesn’t need model retraining or extra networks. 📄 Paper: 💻 Code:
RT @liyzhen2: #AISTATS2025 day 3 keynote by Akshay Krishnamurthy about how to do theory research on inference-time compute 👍 @aistats_conf…
Back from @aistats_conf in Thailand to my Zurich sabbatical—sipping coffee in the same spots Einstein once did. What a journey! Huge thanks to my entire AISTATS team: reviewers, ACs, senior ACs, and Chairs. It’s been amazing to work with you!
RT @aistats_conf: And last but not least, the Best Student Paper Award at #AISTATS 2025 goes to Daniel Marks and Dario Paccagnan for "Pic…
RT @aistats_conf: The #AISTATS 2025 Test of Time Award goes to… 🥁 Chen-Yu Lee, Saining Xie, Patrick Gallagher, Zhengyou Zhang, Zhuow…
Congrats!!
Big congrats to Charles Margossian and Lawrence Saul for winning the #AISTATS 2025 Best Paper Award! "Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix"
RT @aistats_conf: #AISTATS2025 is off to a strong start! First keynote: Chris Holmes rethinks Bayesian inference through the lens of predic…
RT @Tkaraletsos: Proud to share TranscriptFormer - a generative model for single-cell transcriptomics, trained on 112 million cells across…
Congratulations to @MetodJazbec and the team on winning the ICLR QUESTION workshop's Best Paper Award! 🎓
Great to see that our Generative Uncertainty won the best paper award at the ICLR QUESTION workshop (…). If you're interested in what Bayesian/ensembling methods can bring to the world of diffusion models, check out the paper 👇
RT @akristiadi7: AABI 2025 has just concluded. We had a great time talking to each other and listening to great talks about probabilis…
Just gave a talk on Scientific Inference with Diffusion Models at @ETH_AI_Center, sharing our recent work—from test-time control and distributional matching to uncertainty calibration. Great crowd, thoughtful questions, nice view. Thanks, Julia Vogt, for hosting!
RT @canaesseth: Make sure to get your tickets to AABI if you are in Singapore on April 29 (just after #ICLR2025) and interested in probabil…
RT @f_immorlano: I am so excited to share our latest research paper "Transferring climate change physical knowledge" published in @PNASNews…
Thrilled to share that my student Justus Will and former student @YiboYang had their work selected as an ICLR 2025 Oral (top 2%)! Presenting the first runtime-efficient progressive coding method using diffusion models. 👇
Excited to present some recent work on developing "Progressive Compression with Universally Quantized Diffusion Models", accepted as an Oral at ICLR '25. 🧵 1/4
Introducing the first #NeuralCompression benchmark for astrophysics data! #ICLR2025 🚀🌌
Dear ICLR x Astro(physics) / AI+Science / Data Compression community: Our conference paper AstroCompress will be presented at ICLR 2025. 320 GB of ML-ready data. ML codecs could unlock at least 5% more data from multi-billion-dollar telescopes like JWST!