
Stephan Mandt @ AISTATS’25
@StephanMandt
3K Followers · 1K Following · 39 Media · 496 Statuses
AI Professor @UCIrvine | Formerly @blei_lab, @Princeton | #GenAI, #Compression, #AI4Science | General Chair @aistats_conf 2025 | AI Resident @ChanZuckerberg
Irvine, California
Joined March 2015
Huge thanks to Laura Manduchi, Clara Meister & Kushagra Pandey, who led the 2-year effort of writing “On the Challenges and Opportunities in Generative AI” involving 27 authors. Coming out of a 2023 Dagstuhl Seminar I co-organized with @vincefort, @liyzhen2 & @sirbayes.
Exciting news! Our paper "On the Challenges and Opportunities in Generative AI" has been accepted to TMLR 2025. 📄
I had the pleasure of giving a talk and sharing some recent work on diffusion + compression (together with @justuswill and @StephanMandt) at the Learn to Compress workshop at #isit2025. Here are my slides: https://t.co/yycVA3gkkG Thanks again for the invitation!
✨New edition of our community-building workshop series!✨ Tomorrow at @CVPR, we invite speakers to share their stories, values, and approaches for navigating a crowded and evolving field, especially for early-career researchers. Cheeky title🤭: How to Stand Out in the
In this #CVPR2025 edition of our community-building workshop series, we focus on supporting the growth of early-career researchers. Join us tomorrow (Jun 11) at 12:45 PM in Room 209 Schedule: https://t.co/1fKzplQrU5 We have an exciting lineup of invited talks and candid
Thanks for the great collaboration!
ICML 25 paper on variational guidance for diffusion models accepted. Happy to share that our diffusion model guidance paper with @farrinsofian, @kpandey008, @felixDrRelax, and @StephanMandt on casting control for guidance as variational inference with auxiliary variables was
TL;DR: Guidance = variational optimal control. I'm excited to share the outcomes of this collaboration with @Tkaraletsos at the @ChanZuckerberg_ Initiative. All credit to my amazing students @farrinsofian and @kpandey008!
🚀 News! Our recent #ICML2025 paper “Variational Control for Guidance in Diffusion Models” introduces a simple yet powerful method for guidance in diffusion models — and it doesn’t need model retraining or extra networks. 📄 Paper: https://t.co/nixanKxs9W 💻 Code:
#AISTATS2025 day 3 keynote by Akshay Krishnamurthy on how to do theory research on inference-time compute 👍 @aistats_conf
Back from @aistats_conf in Thailand to my Zurich sabbatical—sipping coffee in the same spots Einstein once did. What a journey! Huge thanks to my entire AISTATS team: reviewers, ACs, senior ACs, and Chairs. It’s been amazing to work with you!
And last but not least... the Best Student Paper Award at #AISTATS 2025 goes to Daniel Marks and Dario Paccagnan for "Pick-to-Learn and Self-Certified Gaussian Process Approximations". Congratulations!
The #AISTATS 2025 Test of Time Award goes to ... 🥁 ... Chen-Yu Lee, Saining Xie, Patrick Gallagher, Zhengyou Zhang, Zhuowen Tu, for "Deeply Supervised Nets"! Congratulations!
Congrats!!
Big congrats to Charles Margossian and Lawrence Saul for winning the #AISTATS 2025 Best Paper Award! "Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix"
#AISTATS2025 is off to a strong start! First keynote: Chris Holmes rethinks Bayesian inference through the lens of predictive distributions—introducing tools like martingale posteriors. 🌴🌴🤖🎓
Proud to share TranscriptFormer - a generative model for single-cell transcriptomics, trained on 112 million cells across 12 species spanning 1.5 billion years of evolution. TranscriptFormer is distinct in being both generative and multi-species. Built entirely on raw gene
Congratulations to @MetodJazbec and the team on winning the ICLR QUESTION workshop's Best Paper Award! 🎓
Great to see that our Generative Uncertainty won the best paper award at the ICLR QUESTION workshop ( https://t.co/5O1iUK2Kmv). If you're interested in what Bayesian/ensembling methods can bring to the world of diffusion models, check out the paper 👇 https://t.co/41SII4NAUq
AABI 2025 has just concluded. We had a great time exchanging ideas and listening to great talks on probabilistic ML and its applications in Singapore 🇸🇬 Looking forward to AABI 2026! Stay tuned!
Just gave a talk on Scientific Inference with Diffusion Models at @ETH_AI_Center, sharing our recent work—from test-time control and distributional matching to uncertainty calibration. Great crowd, thoughtful questions, nice view. Thanks, Julia Vogt, for hosting!
Make sure to get your tickets to AABI if you are in Singapore on April 29 (just after #ICLR2025) and interested in probabilistic modeling, inference, and decision-making! Tickets (free but limited!): https://t.co/NR8N2JalB4 More info: https://t.co/uC2LkSi1KB
#ICLR2025 #ML
I am so excited to share our latest research paper "Transferring climate change physical knowledge" published in @PNASNews! 🧵 1/4 https://t.co/pErMsf4J8Q
Thrilled to share that my student Justus Will and former student @YiboYang had their work selected as an ICLR 2025 Oral (top 2%)! Presenting the first runtime-efficient progressive coding method using diffusion models. 👇
Excited to present some recent work on developing "Progressive Compression with Universally Quantized Diffusion Models", accepted as an Oral at ICLR'25. 🧵1/4
Introducing the first #NeuralCompression benchmark for astrophysics data! #ICLR2025 🚀🌌
Dear ICLR x Astro(physics) / AI+Science / Data Compression community: Our conference paper AstroCompress will be presented at ICLR 2025. 320 GB of ML-ready data. ML codecs could unlock at least 5% more data from multi-billion-dollar telescopes like JWST! https://t.co/yJtTBufnhE