Alex Thiery
@alexxthiery
1K Followers · 3K Following · 52 Media · 353 Statuses
Associate Prof. of Statistics & Machine Learning, National University of Singapore (NUS)
Singapore
Joined April 2009
🔥 WANTED: Student Researcher to join me, @ValentinDeBort1, @thjashin, @liwenliang, @ArthurGretton at DeepMind London. You'll be working on Multimodal Diffusions for science. Apply here
I'm teaching Bayesian Optimization for the first time this semester, so finally had to implement it myself. Fun stuff, with a few gotchas along the way!
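For the curious, the loop behind a basic Bayesian optimisation run can be sketched in a few lines. This is a minimal illustration, not the course material: the toy objective, the fixed RBF kernel lengthscale, and the expected-improvement acquisition are all assumptions made here.

```python
# Minimal 1-D Bayesian optimisation: GP surrogate with a fixed RBF kernel,
# expected-improvement (EI) acquisition maximised over a candidate grid.
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Exact GP regression posterior mean and std at candidate points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ sol), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # EI for minimisation: expected amount by which we beat the current best.
    z = (best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

f = lambda x: np.sin(3 * x) + 0.5 * x      # toy objective to minimise on [0, 2]
rng = np.random.default_rng(0)
X = rng.uniform(0, 2, 3)                   # small random initial design
y = f(X)
grid = np.linspace(0, 2, 200)              # candidate locations for EI
for _ in range(15):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best x, f(x):", X[np.argmin(y)], y.min())
```

One of the classic gotchas shows up already: without the jitter (`noise`) on the kernel matrix diagonal, the solve becomes numerically singular as sampled points cluster.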
Art B. Owen, "Better bootstrap t confidence intervals for the mean." https://t.co/kqdORNEPTP
I'm looking for talented and ambitious PhD students to join me at Nanyang Technological University Singapore to work on safe and robust AI systems! Full scholarships covering tuition and a stipend are available, and are open to local and international students alike.
Scalable Monte Carlo for Bayesian Learning, by Professor Paul Fearnhead, Dr Christopher Nemeth, Professor Chris J. Oates and Dr Chris Sherlock. A clear and intuitive introduction to advanced topics in Markov chain Monte Carlo, with a focus on scalability. 📚 https://t.co/Ytszyn0mp1
Why do we keep sampling from the same distribution the model was trained on? We rethink this old paradigm by introducing Feynman-Kac Correctors (FKCs) – a flexible framework for controlling the distribution of samples at inference time in diffusion models, without re-training!
arxiv.org
While score-based generative models are the model of choice across diverse domains, there are limited tools available for controlling inference-time behavior in a principled manner, e.g. for...
🧵(1/6) Delighted to share our @icmlconf 2025 spotlight paper: the Feynman-Kac Correctors (FKCs) in Diffusion. Picture this: it's inference time and we want to generate new samples from our diffusion model. But we don't want to just copy the training data – we may want to sample
Lecture notes: "Data-driven approaches to inverse problems" (by Carola-Bibiane Schönlieb, Zakhar Shumaylov): https://t.co/C4XrUCMNZ9 [Comments: Notes from Machine Learning: From Data to Mathematical Understanding (CIME 2023)]
arxiv.org
Inverse problems are concerned with the reconstruction of unknown physical quantities using indirect measurements and are fundamental across diverse fields such as medical imaging, remote sensing,...
Interesting paper: "Sequential Monte Carlo approximations of Wasserstein--Fisher--Rao gradient flows" By: Francesca Crucinio & Sahani Pathiraja https://t.co/HCJ1DwPH5F
arxiv.org
We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which...
Geodesic path vs. KL(pi, target) gradient flow under the Fisher-Rao metric: can you tell which is which?
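The gradient-flow half of the quiz is easy to reproduce numerically. A sketch on a 1-D grid, with the caveat that the target behind the animation isn't shown, so a bimodal mixture stands in for it: under the Fisher-Rao metric the gradient flow of KL(p_t || pi) reads d/dt p_t = -p_t (log(p_t/pi) - E_{p_t}[log(p_t/pi)]), and it contracts the KL exponentially fast.

```python
# Fisher-Rao gradient flow of KL(p_t || pi), discretised on a 1-D grid with an
# explicit multiplicative Euler step. The bimodal target is an assumption.
import numpy as np

x = np.linspace(-6, 6, 1201)
dx = x[1] - x[0]
pi = 0.5 * np.exp(-0.5 * (x - 2)**2) + 0.5 * np.exp(-0.5 * (x + 2)**2)
pi /= pi.sum() * dx                   # bimodal target density
p = np.exp(-0.5 * x**2)
p /= p.sum() * dx                     # N(0, 1) initialisation

def kl(p, q):
    return np.sum(p * np.log(p / q)) * dx

kl0 = kl(p, pi)
dt = 0.05
for _ in range(200):                  # integrate the flow up to t = 10
    g = np.log(p / pi)
    g -= np.sum(p * g) * dx           # centre: subtract E_p[log(p/pi)]
    p *= np.exp(-dt * g)              # multiplicative step keeps p positive
    p /= p.sum() * dx                 # renormalise after the step
print("KL start:", kl0, "KL end:", kl(p, pi))
```

The multiplicative update is the natural discretisation here: it preserves positivity of the density by construction, so only renormalisation is needed after each step.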
One #postdoc position is still available at the National University of Singapore @NUSingapore to work on sampling, high-dimensional data-assimilation, and diffusion/flow models. Applications are open until the end of January. Details: https://t.co/yMILtiJTz6
Variational approximation with Gaussian mixtures is looking cute! So here it's just gradient descent on KL(q||p) for optimising the mixture's means, covariances & weights...
Gaussian approximation of a target distribution: mean-field versus full-covariance! Below shows a simple gradient descent on KL(q||p)
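A 1-D toy version of this mixture fit is easy to write down. The sketch below is not the code behind the animation: the target, initialisation, and step sizes are all made up, the KL is evaluated by quadrature on a grid (cheap in 1-D), and gradients are taken by central finite differences where the animation presumably uses autodiff.

```python
# Fit a two-component Gaussian mixture q to a bimodal target p by plain
# gradient descent on KL(q||p), optimising means, log-sigmas, and the weight
# logit. Since p is itself a two-Gaussian mixture, exact recovery is possible.
import numpy as np

x = np.linspace(-8, 8, 2001)
dx = x[1] - x[0]

def normal(x, m, s):
    return np.exp(-0.5 * ((x - m) / s)**2) / (s * np.sqrt(2 * np.pi))

p = 0.7 * normal(x, -2.0, 0.8) + 0.3 * normal(x, 2.5, 1.2)   # bimodal target

def kl(theta):
    m1, m2, ls1, ls2, a = theta          # means, log-sigmas, weight logit
    w = 1.0 / (1.0 + np.exp(-a))
    q = w * normal(x, m1, np.exp(ls1)) + (1 - w) * normal(x, m2, np.exp(ls2))
    return np.sum(q * np.log(q / p)) * dx

theta = np.array([-0.5, 0.5, 0.0, 0.0, 0.0])    # deliberately poor start
kl0 = kl(theta)
lr, eps = 0.2, 1e-5
for _ in range(3000):
    # Central finite-difference gradient of KL(q||p) in all 5 parameters.
    grad = np.array([(kl(theta + eps * e) - kl(theta - eps * e)) / (2 * eps)
                     for e in np.eye(5)])
    theta -= lr * grad
print("KL start:", kl0, "KL end:", kl(theta))
```

Parameterising the standard deviations by their logs and the weight by a logit keeps the gradient descent unconstrained, which is the usual trick for this kind of fit.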
"Doubly Stochastic Variational Bayes for non-Conjugate Inference" by Michalis Titsias and Miguel Lázaro-Gredilla https://t.co/APvDzE7VgD
proceedings.mlr.press
We propose a simple and effective variational inference algorithm based on stochastic optimisation that can be widely applied for Bayesian non-conjugate in...
Gaussian approximation of a target distribution: mean-field versus full-covariance! Below shows a simple gradient descent on KL(q||p)
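The mean-field vs full-covariance gap can be made concrete with Gaussians, where KL(q||p) has a closed form. A sketch under assumptions made here (a correlated 2-D Gaussian target, means held fixed at zero so the comparison isolates the covariances): gradient descent on KL(q||p) with respect to the covariance S has gradient 0.5 (P - S^{-1}), where P is the target precision; mean-field keeps only the diagonal of that gradient.

```python
# Mean-field vs full-covariance Gaussian approximation of a correlated 2-D
# Gaussian target, via gradient descent on the closed-form KL(q||p).
import numpy as np

Sigma_p = np.array([[1.0, 0.9], [0.9, 1.0]])   # strongly correlated target
P = np.linalg.inv(Sigma_p)                      # target precision matrix

S_full = np.eye(2)          # full-covariance variational covariance
s_mf = np.ones(2)           # mean-field (diagonal) variational variances
lr = 0.02
for _ in range(5000):
    # Gradient of KL(q||p) wrt the covariance is 0.5 * (P - inv(S)).
    S_full -= lr * 0.5 * (P - np.linalg.inv(S_full))
    # Mean-field sees only the diagonal of the same gradient.
    s_mf -= lr * 0.5 * (np.diag(P) - 1.0 / s_mf)

print("full-covariance q:", S_full)    # converges to Sigma_p exactly
print("mean-field variances:", s_mf)   # converge to 1/diag(P) = 0.19 < 1.0
```

The fixed points show the familiar picture: the full-covariance fit recovers the target, while the mean-field variances converge to the inverse precision diagonals, underestimating the true marginal variances whenever the target is correlated.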
📢 Hi friends! We’re launching a weekly online seminar on Monte Carlo methods, starting on October 1st with Prof. Persi Diaconis. Join us every Tuesday! For more details, visit our website: https://t.co/5AmhPmte7C or subscribe to our mailing list https://t.co/PdpquTV40Z. Welcome!
sites.google.com
Upcoming Seminar Presentations All seminars are on Tuesdays [ 8:30 am PT ] = [ 11:30 am ET ] = [ 4:30 pm London ] = [ 5:30 pm Paris ] = [ 0:30 am Beijing + 1d] Subscribe to our mailing list and...
We have two open post-doc positions. You don't have to be a Bayesian, but you should be interested in working at the intersection of DL, Bayes, and optimization. https://t.co/DJ4xwxjhWc Interest in understanding deep learning and continual lifelong learning is a plus!
I’m hiring a postdoc to work with me on exciting projects in generative modelling (AI) and/or uncertainty quantification. You'll be part of a great team, embedded in @AmlabUva and the UvA-Bosch Delta Lab. Apply here: https://t.co/nJ2phFRqIr RT appreciated! #ML #GenAI
Doing the approach above (with 100k points) on that double-torus example gives a relative error of about 0.5%
I give you 100k points independently & uniformly distributed on a surface in R^3. How do you estimate the total area without doing any triangulation etc.? Crude approach: compute the average distance to the (say) 100th nearest neighbour and do some algebra. Anything much better?
This morning fun: running 30k Random Walk Metropolis #mcmc chains in parallel to explore a double torus!
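Running tens of thousands of chains in parallel is a one-liner-per-step affair with vectorised numpy. A minimal sketch, not the code behind the animation: the double-torus potential isn't given, so a 2-D Gaussian mixture stands in as the target, and the step size is an assumption.

```python
# Vectorised Random Walk Metropolis: 30k chains updated in parallel,
# targeting a stand-in bimodal density (modes at (2,2) and (-2,-2)).
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Equal-weight mixture of two unit Gaussians; the log(0.5) constant
    # is dropped since Metropolis only needs log-density differences.
    a = -0.5 * np.sum((x - 2.0)**2, axis=-1)
    b = -0.5 * np.sum((x + 2.0)**2, axis=-1)
    return np.logaddexp(a, b)

n_chains, n_steps, step = 30_000, 500, 1.0
x = rng.normal(size=(n_chains, 2))        # all chains start near the origin
lp = log_target(x)
for _ in range(n_steps):
    prop = x + step * rng.normal(size=x.shape)     # Gaussian random-walk move
    lp_prop = log_target(prop)
    # Accept/reject all chains at once with the Metropolis ratio.
    accept = np.log(rng.uniform(size=n_chains)) < lp_prop - lp
    x[accept], lp[accept] = prop[accept], lp_prop[accept]

print("chain-ensemble mean:", x.mean(axis=0))
```

The whole ensemble shares each numpy call, so 30k chains cost barely more per step than one; only the target evaluation needs to be vectorised.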
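The algebra behind the crude estimator above: for points uniform on a surface of area A, a small disc of radius r around a point contains on average n·π·r²/A neighbours, so the distance r_k to the k-th nearest neighbour satisfies A ≈ π·n·E[r_k²]/k. A sketch testing this on a sphere (where the true area is known; the n, k, and radius below are illustrative, and the double torus from the thread isn't reproduced here):

```python
# k-NN surface-area estimator: A ≈ pi * n * E[r_k^2] / k, using Euclidean
# distances as a local proxy for geodesic ones. Validated on a sphere.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
n, k, R = 20_000, 100, 1.5
pts = rng.normal(size=(n, 3))
pts *= R / np.linalg.norm(pts, axis=1, keepdims=True)  # uniform on sphere of radius R

tree = cKDTree(pts)
dist, _ = tree.query(pts, k=k + 1)   # k+1: the nearest "neighbour" is the point itself
r_k = dist[:, k]                     # Euclidean distance to the k-th true neighbour

area_hat = np.pi * n * np.mean(r_k**2) / k
print("estimated area:", area_hat, "true area:", 4 * np.pi * R**2)
```

The small biases (Euclidean chord vs geodesic distance, curvature of the geodesic disc) are O(r_k²) relative corrections, which is consistent with the ~0.5% relative error reported above for the double torus.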