
malkin1729
@FelineAutomaton
Followers 112 · Following 17 · Media 6 · Statuses 27
Mathematician/informatician thinking probabilistically, expecting the same from you. ‘Tis categories in the mind and guns in their hands which keep us enslaved.
Edinburgh, Scotland
Joined September 2024
RT @nvimorozov: (1/n) The usual assumption in GFlowNet environments is acyclicity. Have you ever wondered if it can be relaxed? Does the ex….
RT @QueerinAI: 1/ 💻 Queer in AI is hosting a social at #ICML2025 in Vancouver on 📅 July 16, and you’re invited! Let’s network, enjoy food a….
An oasis of inclusive science and solidarity amid the monotonically increasing NeurIPS madness that I'm proud to be supporting in a small role this year.
🏳️🌈 Queer in AI is thrilled to announce another season of our affinity workshop at #NeurIPS2025! We announce a Call for Contributions to the workshop, with visa-friendly submissions due by 📅 July 31, 2025, all other submissions due by 📅 August 14, 2025. #QueerInAI #CallForPapers
A great pleasure to crash two Bayesian statistics conferences with a dose of diffusion wisdom — last week in Singapore, now in Cambridge — with the two authors of this very nice paper.
🚨 New paper: “Towards Adaptive Self-Normalized IS”. TL;DR: To estimate µ = E_p[f(θ)] when p(θ) has an intractable partition function, instead of doing MCMC on p(θ) or learning a parametric q(θ), we run MCMC directly on p(θ)|f(θ) − µ|, the variance-minimizing proposal.
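The quoted tweet above compresses the idea of self-normalized importance sampling (SNIS). A minimal sketch of the estimator it builds on, with a toy Gaussian target and proposal that are my own illustrative assumptions (not the paper's code or experiments):

```python
# Self-normalized importance sampling (SNIS) on a toy problem.
# Target p(theta) ∝ p_tilde(theta): here N(2, 1) with the normalizing
# constant deliberately dropped, since SNIS only needs p up to a constant.
import numpy as np

rng = np.random.default_rng(0)

def log_p_tilde(theta):
    # Unnormalized log-density of N(2, 1); partition function unknown.
    return -0.5 * (theta - 2.0) ** 2

def f(theta):
    # Quantity of interest: mu = E_p[f(theta)] = 2 for this toy target.
    return theta

# Proposal q = N(0, 3^2), chosen wide enough to cover the target.
def sample_q(n):
    return rng.normal(0.0, 3.0, size=n)

def log_q(theta):
    return -0.5 * (theta / 3.0) ** 2 - np.log(3.0 * np.sqrt(2.0 * np.pi))

def snis(n):
    theta = sample_q(n)
    log_w = log_p_tilde(theta) - log_q(theta)
    w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
    # Self-normalization cancels the unknown partition function of p.
    return np.sum(w * f(theta)) / np.sum(w)

mu_hat = snis(200_000)

# The tweet's variance-minimizing proposal is q*(theta) ∝ p_tilde(theta)|f(theta) - mu|,
# which depends on the unknown mu itself — hence (per the tweet) running MCMC
# on that density adaptively rather than sampling from a fixed q as done here.
```

With a fixed wide proposal, `mu_hat` lands close to the true mean of 2; the point of the adaptive scheme in the tweet is to replace the fixed `q` with samples from the µ-dependent optimal density.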
Great paper by @siddarthv66, @mh_steps, et al. on amortised inference in latent spaces of generative models, generalising our past work. Useful for alignment, planning in latent space, inference in probabilistic programs?
Is there a universal strategy to turn any generative model — GANs, VAEs, diffusion models, or flows — into a conditional sampler, or finetune it to optimize a reward function? Yes! Outsourced Diffusion Sampling (ODS), accepted to @icmlconf, does exactly that!
RT @josephdviviano: Ecstatic to show off some work my brilliant colleagues and I did at @iclr_conf this year! 🚀. We address the credit assi….
RT @sarthmit: 🚀 New Preprint! 🚀. In-Context Parametric Inference: Point or Distribution Estimators?. Thrilled to share our work on inferrin….
RT @TristanDeleu: My PhD thesis entitled "Generative Flow Networks: Theory and Applications to Structure Learning" is now available on Arxi….
RT @MarcinSendera: Happy to share one of my last works! If you are interested in diffusion samplers, please take a look🙃! Many thanks for a….
This delightful collaboration built upon my past work with @MarcinSendera and @jarridrb, and that of the brilliant @julberner and @lorenz_richter. Thanks to all! 9/9
Happy to share our latest work on #diffusion models without data: building theoretical bridges between existing methods, analysing their continuous-time asymptotics, and showing some cool practical implications. #MachineLearning 1/9