Wei Guo @ NeurIPS
@WeiGuo01
Followers: 259 · Following: 2K · Media: 4 · Statuses: 61
ML PhD @ Georgia Tech (previously Math Undergrad @ PKU): Sampling (MCMC) / Diffusion Models / Generative AI / Machine Learning Theory
Atlanta, GA
Joined July 2022
We proudly present the Non-equilibrium Annealed Adjoint Sampler (NAAS), a diffusion sampler for Boltzmann distributions that leverages informative reference dynamics, yielding strong initial samples and requiring only small corrective control to reach the target. #NeurIPS
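Schematically (our notation, not necessarily the paper's), a sampler of this kind simulates a controlled SDE whose drift is an informative reference plus a learned correction,

\[
\mathrm{d}X_t = \bigl[\, b_{\mathrm{ref}}(X_t, t) + u_\theta(X_t, t) \,\bigr]\,\mathrm{d}t + \sigma_t\, \mathrm{d}W_t,
\qquad X_T \sim \pi \propto e^{-E},
\]

where b_ref encodes the annealed reference dynamics and u_θ is the corrective control; the more informative the reference, the smaller the control that remains to be learned.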
I will present these two papers tomorrow with my collaborator Yuchen; please come by if interested! I'm happy to chat about diffusion/flow-based models, diffusion LLMs, RL, sampling theory, and related topics. I'm also eagerly looking for an internship next summer!
Heading to SD today! I'll be attending #NeurIPS2025 from Dec 2 to Dec 7. Excited to meet old and new friends, and I'd love to chat about diffusion models, diffusion LLMs, RL, and any other related topics. Feel free to DM if you want to chat. I will present two posters
You are cordially invited to drop by if interested in discrete diffusion models!
Thrilled to share that I will be at @NeurIPSConf this week to present our work on fast solvers for discrete diffusion models (https://t.co/zoxaLM5yCS) with my awesome peer collaborators @Yinuo_Ren @YuchenZhu_ZYC @WeiGuo01!
Inference of discrete diffusion models benefits from higher-order numerical schemes, and here's how. I will present this paper at NeurIPS 2025; code is available at https://t.co/H919mef68w. Happy to chat about generative modeling and sampling!
github.com: yuchen-zhu-zyc/DiscreteFastSolver ([NeurIPS 2025] Fast Solvers for Discrete Diffusion Models: Theory and Applications of High-Order Algorithms)
Many fast solvers for continuous diffusion models have been proposed; how about discrete ones? Please check out our paper: https://t.co/vl5XOR16bE, which accelerates the inference of discrete diffusion models on OpenWebText and ImageNet via high-order numerical schemes!
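To make the higher-order point concrete, here is a toy sketch (our illustration with a made-up generator; not the paper's algorithm): inference of a discrete diffusion model amounts to integrating the Kolmogorov forward equation dp/dt = p Q(t) of a continuous-time Markov chain, and an explicit midpoint rule converges at second order where an Euler-style rule is only first order.

```python
import numpy as np

# Toy illustration only: a hypothetical time-dependent CTMC generator Q(t)
# on n states (off-diagonal rates >= 0, rows sum to zero).
def Q(t, n=5):
    A = np.random.default_rng(0).random((n, n)) * (1.0 + t)  # rates grow with t
    np.fill_diagonal(A, 0.0)
    np.fill_diagonal(A, -A.sum(axis=1))  # diagonal = minus off-diagonal row sums
    return A

def euler_step(p, t, h):
    # First-order update of the forward equation dp/dt = p Q(t).
    return p + h * (p @ Q(t))

def midpoint_step(p, t, h):
    # Explicit midpoint rule: evaluate the drift at t + h/2; second order in h.
    k = p + 0.5 * h * (p @ Q(t))
    return p + h * (k @ Q(t + 0.5 * h))

def integrate(step, p0, T=1.0, steps=10):
    p, h = p0.copy(), T / steps
    for i in range(steps):
        p = step(p, i * h, h)
    return p

p0 = np.eye(5)[0]                                # start in state 0
ref = integrate(midpoint_step, p0, steps=20000)  # fine-grid reference
for n in (10, 20, 40):
    e1 = np.abs(integrate(euler_step, p0, steps=n) - ref).sum()
    e2 = np.abs(integrate(midpoint_step, p0, steps=n) - ref).sum()
    print(f"steps={n:3d}  euler_err={e1:.2e}  midpoint_err={e2:.2e}")
```

Halving the step size should roughly halve the Euler error but quarter the midpoint error; the paper's solvers target actual discrete diffusion generators, but the order-of-accuracy logic is the same idea.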
I'll be at #NeurIPS2025 to present 6 (main) + 1 (workshop) papers. Love to chat about: diffusion models / sampling; optimization for deep learning / theory; AI4Science; PhD/postdoc recruitment; dynamics / probability / numerics / anything. Find me at our posters, by email, or via DM. Thanks!
Happy to share that #ASBS got an #Oral presentation at #NeurIPS2025 (acceptance rate 0.3%). I'll present it next week in San Diego. We also release code for synthetic energies (DW, LJ): https://t.co/sj9xP7sgZ8 ... and we have more papers on adjoint-based diffusion samplers coming!
github.com: facebookresearch/adjoint_samplers (code for adjoint-based diffusion samplers)
Adjoint-based diffusion samplers have simple & scalable objectives without importance-weight complications. Like many others, though, they solve degenerate Schrödinger bridges, despite all being SB-inspired. Proudly introducing the #Adjoint #Schrödinger #Bridge #Sampler, a full SB-based sampler that
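For context (standard background, not a claim about the paper's exact formulation), the Schrödinger bridge problem seeks the path measure closest to a reference dynamics while matching both endpoint marginals:

\[
\min_{P}\ \mathrm{KL}\bigl(P \,\|\, Q_{\mathrm{ref}}\bigr)
\quad \text{s.t.} \quad P_0 = \mu, \quad P_T = \pi .
\]

"Degenerate" here refers to bridges in which one of these constraints is trivialized (e.g., a Dirac initial marginal), whereas a full SB-based sampler enforces both.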
My awesome student Lingkai Kong @konglk1203 is graduating & looking for a frontier-lab position. He developed SOTA methods for optimization, sampling & generative modeling on manifolds. E.g., manifold optimization underpins preconditioned optimizers for training large models. Grab him if you can!
Nonconvex optimization can be hard. Sampling, as a stochastic generalization, is not always easier. What about a case further complicated by nonconvex inequality & equality constraints? https://t.co/tbUi9HJbUf (#NeurIPS2025) introduces a new tool and samples exponentially fast!
arxiv.org: "Sampling from constrained statistical distributions is a fundamental task in various fields including Bayesian statistics, computational chemistry, and statistical physics. This article considers..."
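In symbols (our paraphrase of the setup): the task is to sample

\[
x \sim \pi(x) \propto e^{-U(x)}
\quad \text{on} \quad
\{\, x : g(x) \le 0,\ h(x) = 0 \,\},
\]

with the potential U, the inequality constraints g, and the equality constraints h all allowed to be nonconvex; "exponentially fast" refers to an exponential convergence rate of the sampler.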
I'm hiring 2 PhD students & 1 postdoc @GeorgiaTech for Fall '26. Motivated students, please consider us, especially those in:
* ML + Quantum
* Deep Learning + Optimization
PhD: see https://t.co/h4anjm6b8j
Postdoc: see https://t.co/548XVaahx3 & https://t.co/4ahNE7OOwV
Retweets appreciated!
Diffusion models are powerful, but adapting them at inference time without retraining is challenging. We introduce DriftLite, a lightweight, training-free, inference-time scaling method that actively steers inference and absorbs instability, preventing weight
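Schematically (our notation; the tweet is truncated, so this is only the abstract-level picture): inference-time steering of this kind adds a lightweight control to the frozen pretrained drift,

\[
\mathrm{d}X_t = \bigl[\, b_\theta(X_t, t) + u(X_t, t) \,\bigr]\,\mathrm{d}t + \sigma_t\, \mathrm{d}W_t,
\]

where b_θ stays fixed (no retraining) and u is computed on the fly to steer the dynamics toward the adapted target.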
Proud of my junior collaborators Kijung Jeon, Yuchen @YuchenZhu_ZYC, Wei @WeiGuo01, Jaemoo @jaemoo51133, Avrajit @GhoshAvrajit, Lianghe Shi, Yinuo @Yinuo_Ren, and Haoxuan @haoxuan_steve_c: 6 joint #NeurIPS2025 main-track papers! Lucky to have you. Wanna join us? Will post recruiting info soon.
Our theoretically grounded fast sampler for discrete diffusion models is now accepted at #NeurIPS2025!
Excited to announce our work is accepted at #NeurIPS2025! New LLaDA experiments and an open-source codebase coming soon! Thanks and congratulations to my great collaborators: @haoxuan_steve_c, @YuchenZhu_ZYC, @WeiGuo01, @YongxinChen1, Grant Rotskoff, @MoleiTaoMath, and
Thrilled to share that our work on the theory and applications of fast solvers for discrete diffusion models has been accepted at #NeurIPS2025! Many thanks to my terrific collaborators @Yinuo_Ren @YuchenZhu_ZYC @WeiGuo01 @YongxinChen1 Grant Rotskoff @MoleiTaoMath @lexing_ying!
This one-line modification yields the first high-order accurate algorithm for discrete diffusion model inference, both theoretically proven & empirically validated on large-scale datasets! Check out our paper: https://t.co/nraj4dR5MA
Thrilled to announce that MDNS is accepted at #NeurIPS2025!
New Paper Alert! Diffusion samplers are effective approaches for distributions on R^d, but can we extend this success to those on discrete state spaces? Proudly presenting the Masked Discrete Neural Sampler (MDNS), an MDM-based neural sampler
Finally, I'd like to thank the wonderful collaborators Yuchen Zhu @YuchenZhu_ZYC, Jaemoo Choi @jaemoo51133, Guan-Horng Liu @guanhorng_liu, Yongxin Chen @YongxinChen1, and Molei Tao @MoleiTaoMath for insightful discussions!
MDNS has many design choices to consider, including a warm-up trick for training on low-temperature distributions and preconditioning the neural network for special target distributions. The framework also extends to uniform discrete diffusion, but masked generally performs better.
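As a minimal illustration of the warm-up idea (a hypothetical schedule; the paper's trick may differ): when the target is a low-temperature distribution pi_beta ∝ exp(-beta * E), one can ramp the inverse temperature up during early training so the model first fits a smoother, easier target.

```python
def annealed_beta(step: int, warmup_steps: int, beta_target: float,
                  beta_init: float = 0.1) -> float:
    """Hypothetical linear warm-up of the inverse temperature.

    Early training steps see a smoother (higher-temperature) target
    pi_beta ∝ exp(-beta * E); beta then ramps up to the true target value.
    """
    if step >= warmup_steps:
        return beta_target
    return beta_init + (step / warmup_steps) * (beta_target - beta_init)
```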
This doesn't require relaxing discrete states into continuous ones, avoids backpropagation over the whole trajectory, and makes a replay buffer possible. Moreover, since we can mask the samples in different ways, it amortizes the O(D) cost of computing the weights (Radon-Nikodym derivatives).
Our key approach is the Weighted Denoising Cross-Entropy loss: given the current model, draw samples and compute their weights w.r.t. the target distribution, then optimize the CE loss by treating i.i.d. samples from the model as weighted samples from the target distribution.
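In schematic form (our notation; the paper's precise estimator will differ in details): with self-normalized weights proportional to the ratio of the target density to the current model's density, computed on the model's own samples, one minimizes

\[
\mathcal{L}(\theta)
= -\,\mathbb{E}_{x \sim p_{\bar\theta}}
\Bigl[\, w(x)\, \mathbb{E}_{t,\; x_t \mid x}\, \log p_\theta(x \mid x_t) \,\Bigr],
\qquad
w(x) \propto \frac{\pi(x)}{p_{\bar\theta}(x)},
\]

i.e., the usual denoising cross-entropy, but with samples from the current model p_θ̄ reweighted so they act like samples from the target π.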
Masked discrete diffusion models are powerful at learning categorical data distributions, but what if we only have an unnormalized target density and no samples? We propose MDNS, a scalable approach to train discrete neural samplers based on optimal control!