Wei Guo @ NeurIPS

@WeiGuo01

Followers 259 · Following 2K · Media 4 · Statuses 61

ML PhD @ Georgia Tech (previously Math Undergrad @ PKU): Sampling (MCMC) / Diffusion Models / Generative AI / Machine Learning Theory

Atlanta, GA
Joined July 2022
@JaemooChoi
Jaemoo Choi
8 days
📢 We proudly present Non-equilibrium Annealed Adjoint Sampler (NAAS), a diffusion sampler for Boltzmann distributions that leverages informative reference dynamics, yielding strong initial samples and requiring only small corrective control to reach the target. #NeurIPS
@WeiGuo01
Wei Guo @ NeurIPS
7 days
I will present these two papers tomorrow with my collaborator Yuchen, please come by if interested! I'm happy to chat about diffusion/flow-based models, diffusion LLMs, RL, sampling theory, and related topics. I'm also eagerly looking for an internship for next summer!
@YuchenZhu_ZYC
Yuchen Zhu ✈️ NeurIPS 25'
8 days
Heading to SD today! I'll be attending #NeurIPS2025 from Dec 2 - Dec 7. Excited to meet old and new friends, and would love to chat about diffusion models, diffusion LLMs, RL, and any other related topics. Welcome to DM if you want to ☕️ chat. I will present two posters
@WeiGuo01
Wei Guo @ NeurIPS
8 days
You are cordially invited to drop by if interested in discrete diffusion models!
@haoxuan_steve_c
Haoxuan (Steve) Chen
8 days
Thrilled to share that I will be at @NeurIPSConf from this week to present our work on fast solvers for discrete diffusion models (https://t.co/zoxaLM5yCS) with my awesome peer collaborators @Yinuo_Ren @YuchenZhu_ZYC @WeiGuo01!
@WeiGuo01
Wei Guo @ NeurIPS
13 days
Inference of discrete diffusion models benefits from higher-order numerical schemes, and here's how. I will present this paper at NeurIPS 2025; code is available at https://t.co/H919mef68w Happy to chat about generative modeling and sampling!
github.com
[NeurIPS 2025] Fast Solvers for Discrete Diffusion Models: Theory and Applications of High-Order Algorithms - yuchen-zhu-zyc/DiscreteFastSolver
@haoxuan_steve_c
Haoxuan (Steve) Chen
10 months
Many fast solvers for continuous diffusion models have been proposed - how about discrete ones? Pls check out our paper: https://t.co/vl5XOR16bE, which accelerates the inference of discrete diffusion models on OpenWebText and ImageNet via high-order numerical schemes!
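As a generic illustration of why higher-order stepping helps at inference (this is not the paper's solver; the generator Q, the step count, and the Heun update below are illustrative stand-ins), one can compare first-order (Euler) and second-order (Heun) integration of the Kolmogorov forward equation of a small continuous-time Markov chain:

```python
import numpy as np

# Generator of a 3-state continuous-time Markov chain (rows sum to 0).
Q = np.array([[-1.0,  0.7,  0.3],
              [ 0.4, -0.9,  0.5],
              [ 0.2,  0.6, -0.8]])
p0 = np.array([1.0, 0.0, 0.0])  # initial distribution

def integrate(n_steps, order, T=1.0):
    """Integrate dp/dt = p @ Q from 0 to T with a fixed step size."""
    h = T / n_steps
    p = p0.copy()
    for _ in range(n_steps):
        k1 = p @ Q
        if order == 1:
            p = p + h * k1                # Euler: O(h) global error
        else:
            k2 = (p + h * k1) @ Q
            p = p + h * (k1 + k2) / 2.0   # Heun: O(h^2) global error
    return p

def expm(A, terms=40):
    """Matrix exponential via truncated Taylor series (small matrices only)."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

p_exact = p0 @ expm(Q)
err_euler = np.abs(integrate(50, order=1) - p_exact).max()
err_heun = np.abs(integrate(50, order=2) - p_exact).max()
# At the same step count, the second-order scheme is markedly more accurate.
```

Both schemes preserve total probability mass exactly here (each update adds a vector whose entries sum to zero), so the comparison isolates pure discretization error.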
@MoleiTaoMath
Molei Tao
13 days
I'll be at #NeurIPS2025 to present 6(main)+1(workshop) papers. Love to chat about 🌟diffusion models / sampling 🌟optimization for deep learning / theory 🌟AI4Science 🎓PhD/postdoc recruitment 🌟dynamics / probability / numerics / anything. Find me at our posters, email, or DM. Thanks!
@guanhorng_liu
Guan-Horng Liu
14 days
Happy to share that #ASBS got an #Oral presentation at #NeurIPS2025 (acceptance rate 0.3%). I'll present it next week in San Diego. We also release code for synthetic energies (DW, LJ)👇 https://t.co/sj9xP7sgZ8 ... and we have more papers on adjoint-based diffusion samplers coming 🙂
github.com
code for adjoint-based diffusion samplers - facebookresearch/adjoint_samplers
@guanhorng_liu
Guan-Horng Liu
5 months
Adjoint-based diffusion samplers have simple & scalable objectives without importance-weight complications. Like many, though, they solve degenerate Schrödinger bridges, despite all being SB-inspired. 📢 Proudly introduce #Adjoint #Schrödinger #Bridge #Sampler, a full SB-based sampler that
@MoleiTaoMath
Molei Tao
20 days
My awesome student Lingkai Kong @konglk1203 is graduating & looking for a frontier lab. He developed SOTA methods for optimization, sampling & generative modeling on manifolds. E.g., manifold optimization underpins preconditioned optimizers for training large models. Grab him if you can!
@MoleiTaoMath
Molei Tao
28 days
Nonconvex optimization can be hard. Sampling, as a stochastic generalization, is not always easier. What about a case further complicated by nonconvex inequality & equality constraints? https://t.co/tbUi9HJbUf (#NeurIPS2025) introduces a new tool, and samples exponentially fast!
arxiv.org
Sampling from constrained statistical distributions is a fundamental task in various fields including Bayesian statistics, computational chemistry, and statistical physics. This article considers...
@MoleiTaoMath
Molei Tao
2 months
I'm hiring 2 PhD students & 1 postdoc @GeorgiaTech for Fall '26. Motivated students, please consider us, especially those in * ML+Quantum * DeepLearning+Optimization -PhD: see https://t.co/h4anjm6b8j -Postdoc: see https://t.co/548XVaahx3 & https://t.co/4ahNE7OOwV Retweet appreciated
@Yinuo_Ren
Yinuo Ren
2 months
Diffusion models are powerful 🔥, but adapting them at inference time without retraining is challenging ⁉️. We introduce DriftLite, a lightweight 🪶, training-free 😀, inference-time scaling method that actively steers inference and absorbs instability, preventing weight
@MoleiTaoMath
Molei Tao
2 months
Proud of my junior collaborators Kijung Jeon, Yuchen @YuchenZhu_ZYC, Wei @WeiGuo01, Jaemoo @jaemoo51133, Avrajit @GhoshAvrajit, Lianghe Shi, Yinuo @Yinuo_Ren, Haoxuan @haoxuan_steve_c - 6 joint #NeurIPS2025 main track papers! Lucky to have you. Wanna join us? Will post recruiting info soon.
@YuchenZhu_ZYC
Yuchen Zhu ✈️ NeurIPS 25'
2 months
Our work on a theoretically grounded fast discrete diffusion model sampler is now accepted at #NeurIPS2025 🔥🔥
@Yinuo_Ren
Yinuo Ren
2 months
Excited to announce our work is accepted at #NeurIPS2025! New LLaDA experiments and open-source codebase coming soon! Thanks and congratulations to my great collaborators: @haoxuan_steve_c, @YuchenZhu_ZYC, @WeiGuo01, @YongxinChen1, Grant Rotskoff, @MoleiTaoMath, and
@haoxuan_steve_c
Haoxuan (Steve) Chen
2 months
Thrilled to share that our work on the theory and application of fast solvers for discrete diffusion models has been accepted by #NeurIPS2025! Many thanks to my terrific collaborators @Yinuo_Ren @YuchenZhu_ZYC @WeiGuo01 @YongxinChen1 Grant Rotskoff @MoleiTaoMath @lexing_ying!
@Yinuo_Ren
Yinuo Ren
10 months
This one-liner modification yields the first high-order accurate algorithm for discrete diffusion model inference, both theoretically proven & empirically validated on large-scale datasets! Check out our paper: https://t.co/nraj4dR5MA
@YuchenZhu_ZYC
Yuchen Zhu ✈️ NeurIPS 25'
3 months
Thrilled to announce that MDNS is accepted at #NeurIPS2025 🥳
@YuchenZhu_ZYC
Yuchen Zhu ✈️ NeurIPS 25'
3 months
🚀New Paper Alert 🚀 🤔Diffusion samplers are effective approaches for distributions on R^d, but can we extend this success to those on discrete state spaces🧩? 📢 Proudly present Masked Discrete Neural Sampler (MDNS), an MDM-based neural sampler
@WeiGuo01
Wei Guo @ NeurIPS
3 months
Finally I'd like to thank the wonderful collaborators Yuchen Zhu @YuchenZhu_ZYC, Jaemoo Choi @jaemoo51133, Guan-Horng Liu @guanhorng_liu, Yongxin Chen @YongxinChen1, and Molei Tao @MoleiTaoMath for insightful discussions!
@WeiGuo01
Wei Guo @ NeurIPS
3 months
MDNS involves many design choices, including a warm-up trick for training low-temperature distributions and preconditioning the neural network for special target distributions. The framework also extends to uniform discrete diffusions, but the masked variant generally performs better.
@WeiGuo01
Wei Guo @ NeurIPS
3 months
This doesn't require relaxing discrete states into continuous ones, avoids backpropagation over the whole trajectory, and makes a replay buffer possible. Moreover, as we can mask the samples in different ways, it amortizes the O(D) cost of computing the weights (Radon-Nikodym derivatives).
@WeiGuo01
Wei Guo @ NeurIPS
3 months
Our key approach is the Weighted Denoising Cross-Entropy loss: given the current model, draw samples and compute their weights w.r.t. the target distribution, then optimize the CE loss by treating i.i.d. samples from the model as weighted samples from the target distribution.
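A minimal numerical sketch of that weighting step, hedged heavily: the Potts-style log_target, the uniform stand-in model, and the omission of the actual masking/denoising pass are all simplifications for illustration, not MDNS itself.

```python
import numpy as np

rng = np.random.default_rng(0)
D, S = 8, 4  # sequence length and vocabulary size

def log_target(x):
    # Hypothetical unnormalized log-density on {0,...,S-1}^D:
    # rewards sequences whose neighboring tokens agree (Potts-like).
    return float(np.sum(x[:-1] == x[1:]))

def model_log_prob(x):
    # Stand-in for the current model's log-likelihood (uniform here;
    # in MDNS this would come from the masked diffusion sampler).
    return -D * np.log(S)

def sample_model(n):
    # Stand-in for drawing i.i.d. sequences from the current model.
    return rng.integers(0, S, size=(n, D))

# Self-normalized importance weights of model samples w.r.t. the target.
xs = sample_model(256)
log_w = np.array([log_target(x) - model_log_prob(x) for x in xs])
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Weighted cross-entropy: average a per-sample CE term under the weights,
# treating the model's i.i.d. samples as weighted target samples.
per_sample_ce = np.array([-model_log_prob(x) for x in xs])
wdce = float(np.sum(w * per_sample_ce))
```

Because the weights are self-normalized, no normalizing constant of the target is ever needed, which is what makes training from an unnormalized density possible.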
@WeiGuo01
Wei Guo @ NeurIPS
3 months
Masked discrete diffusion models are powerful at learning categorical data distributions, but what if we only specify the unnormalized target density, without any samples? We propose MDNS, a scalable approach to training discrete neural samplers based on optimal control!